News

CRTC Network Management Hearings, Day Two: Open Internet Coalition, Zip.ca, CISP, Roks, Mezei

Day two of the CRTC network management hearing featured some great presentations from the Open Internet Coalition, Zip.ca, CISP, and two knowledgeable individuals – Jean-Francois Mezei and Jason Roks.  The presenters had some strong words about the lack of Canadian competition for high-speed Internet service, the debatable claims about the impact of P2P on congestion, and the overstated advertising claims.  Unfortunately, it would appear once again that the Commission has accepted the ISP claims regarding congestion and network costs, leaving the panelists with the challenge of overcoming those basic assumptions.

That said, the day featured some startling revelations, including Zip.ca's Rob Hall stating that, given bandwidth costs, it is currently cheaper to spend hundreds of thousands of dollars on postage to send DVDs through the mail than to distribute the same content electronically over the Internet.  Moreover, Jason Roks emphasized peering arrangements, stating that Bell is the only major Canadian ISP that refuses to peer with anyone else.

Potential solutions to come out of the day included:

1.   Establishing a test for acceptable traffic management.  The OIC three-part test focused on whether the traffic management furthers a pressing and substantial objective; is narrowly tailored to the objective; and is the least restrictive means of achieving the objective.
2.   Truth in advertising.  Emphasis on disclosure as well as possible limits on over-subscription.
3.   Regulated peering to bring greater efficiencies into the Canadian Internet.
4.   Strong anti-competitive action to stop any attempts to leverage network management or pricing plans for unfair advantage.

A full report on the day's proceedings is posted below, again thanks to Frances Munn.  Additional coverage from the National Post liveblog, CBC.ca, and CIPPIC's Twitter feed.

CRTC Net Neutrality Hearings: July 7, 2009

Open Internet Coalition

Opening Remarks: Jacob Glick, Google's Canada Policy Counsel

The Open Internet Coalition described their main purpose as working to keep the Internet fast, open, and available to everyone. In their submission, they made four main arguments:

1. Open Internet drives innovation.
They argued that robust access to an open Internet is important to public policy. They urged the Commission to act in a way that promotes the development of an open Internet since it is a key economic engine.

2. Practices that undermine the Internet's openness are bad for innovation.
The Coalition argued that application specific traffic management practices make the Internet less attractive to users. They pointed out that slower applications will change user behaviour and undermine the Internet's competitive market in applications.

3. It is okay to manage some internet traffic.
They argued that some traffic management is normal and okay. As the Internet has moved toward more multimedia content over time, congestion has become a greater problem. However, they emphasized that increased capacity has been the primary means of dealing with this evolution in the past.

4. Acceptable traffic management will pass the light-touch test derived from the interpretation of s. 27(2) and 36 of the Telecommunications Act.
They distinguished between useful traffic management and traffic management that discriminates. They claimed that evidence shows that carriers can manage their networks, reduce congestion, and keep the Internet open at the same time.

After noting that they believed discrimination between applications constituted discrimination under s. 27, they went on to explain their three-part “test” derived from s. 27 and s. 36 of the Telecommunications Act:

1. Does the traffic management practice further a pressing and substantial objective?
2. Is the traffic management practice narrowly tailored to address the objective?
3. Is the traffic management practice the least restrictive means to reach the objective?

The Coalition argued that debilitating network congestion could be a pressing and substantial objective. However, they pointed out that the evidence in this proceeding has not established the existence of debilitating network congestion. They also expressed doubt that applications like BitTorrent and P2P "exploit" the Internet.

Addressing step two of the test, they argued that throttling is almost never narrowly tailored. Further, they argued that throttling has negative effects on innovation and that there are better means of lessening congestion.

For the third arm of the test, they argued in favour of techniques that were more effective at reducing congestion and that did not discriminate between applications. As the foremost alternative, they pointed to increased network capacity. They also pointed to application-neutral and price-based levers.

Questions

Von Finckenstein opened the questions, asking the Open Internet Coalition to describe how they defined "openness." In response, the Coalition described "openness" as an Internet where users are free to engage with the World Wide Web and where application creators have the freedom to make new applications without having to go through a gatekeeper.

Von Finckenstein went on to address their three-part test, comparing it to the Oakes test. He asked what constituted a "pressing" and "substantial" objective. The Open Internet Coalition replied that the test was flexible and served as a guideline. They argued that its main strength was that it could be applied on a case-by-case basis. However, they admitted that it was ultimately a value judgement. They went on to point out that there should be a higher standard applied to a practice that discriminates between applications than one that is application neutral.

Timothy Denton continued with the questioning. He wanted to know what kind of system they imagined for the application of their test – e.g. whether they envisioned a complaint-based system. The Open Internet Coalition replied that they wanted the players to first know the rules of the game and, further, that they would have to seek specific exemptions. They went on to say that in other contexts, there could be complaints to the Commission on a case-by-case basis.

Denton asked for their view on price-based systems and the utility of a price-based system to alleviate congestion. In response, the Open Internet Coalition said they saw a variety of price based systems developing in the future. In addition, they expressed concern that price-based systems might have unintended consequences such as discouraging Internet use overall. However, they said that the Commission's role was to continue to promote competition, not to regulate fees.

Denton then asked whether their proposal to increase network capacity was an optimal solution. In response, the Coalition pointed out that increasing network capacity is key to the history of the Internet. In the past, increasing Internet capacity has encouraged innovation, leading to a win-win "virtuous circle." They cited an Internet2 study that demonstrated that adding capacity was a more beneficial way to deal with increased activity than managing the network. In addition, they pointed out that some traffic management was desirable as long as it satisfied their three-part test.

Denton asked about the future, wondering if the CRTC would be dealing with the traffic management controls on an ongoing basis or whether it would be a permanent part of their work. The Coalition responded that if clear guidelines were developed, the Commission would not be overburdened with requests. Further, they pointed to past experience where there have been periods of time with greater congestion. They claimed it would be an overreaction to allow the networks to control scarcity. Instead, policies should be developed that are aimed at increasing capacity and promoting innovation.

Leonard Katz took over the questioning and asked about the goal of increasing capacity. He wanted to know how increasing capacity benefits shareholders in a market-based economy. While he accepted the argument that it might lead to innovation, he was concerned that it would lead to greater costs for shareholders. In response, the Coalition pointed to a U.S. study that showed that costs have gone down in telecommunications but that prices have gone up. They implied that prices were based on market power rather than costs. Further, they argued that if increasing prices were necessary for more innovation, it was worth the trade-off.

Katz pressed them for evidence that consumers were willing to pay more for an open Internet. The Coalition said they had no studies or statistics either way, but maintained that consumers wanted open access to the Internet and they pointed to the millions of consumers who have joined their coalition. Further, the Coalition disagreed with the premise that the only available traffic management techniques were those that discriminated among applications. As well, while they thought added capacity was a better alternative, they were not asking the Commission to mandate it – it should be open to the providers with incentives.

Molnar took over the questioning, asking about the "virtuous circle" between increased capacity and the consumer. She wanted to know whether there were any obligations for an application maker to create an application in an efficient manner.  In response, the Coalition pointed out that there was a highly competitive market in the application sphere and that creators face pressures from users to provide fast applications. Molnar went on to wonder if P2P is an inefficient use of bandwidth. The Coalition replied that it serves no one to have a congested network. Consequently, there is a built in incentive to reduce congestion.

Molnar asked about this built in "incentive" for application providers to be efficient. The Open Internet Coalition replied that users would avoid network congesting applications and instead seek applications that did not cause congestion. Further, if the application is causing congestion, it is due to high user demand for the application. Consequently, the question is one of dealing with user demand. They argued in favour of methods that did not arbitrarily discriminate among applications for dealing with user demand.

Von Finckenstein wondered if it would be justified to use application discrimination in a congested Internet if there was no other choice. The Coalition did admit that under their test, application discrimination would be a last resort, but could be used if there was no other choice.

Molnar finished her questions by asking about technologies that allow the consumers – rather than the provider – to choose their applications. The Coalition responded that having the consumers in charge of their own Internet was one way of dealing with the congestion problem. They argued that the service the Internet provides is defined by the consumer – that is, the user should control the packets they send and receive – and that putting the consumer in charge was consistent with that.

Lamarre asked how privacy can be preserved on an open Internet. The Coalition responded that in the Canadian context, privacy is preserved through private sector incentives. They claimed that Canada is further along in terms of privacy than the United States. In addition, they pointed out that an open Internet is still subject to law. That is, applications created on the Internet would be subject to Canadian privacy laws.

Zip.ca

Zip.ca opened their submission by arguing that it was difficult to set universal rules because there had to be some way of looking at individual practices. Further, they encouraged the CRTC to find a way to identify infractions and settle them quickly. Zip.ca is a DVD delivery company that also uses the Internet to stream video to its customers. As this requires a fair amount of bandwidth, they were particularly concerned with congestion techniques that target applications.

They were also concerned with DPI technology. Since Zip.ca can follow what their viewers watch, they can provide recommendations to their customers. They worried that competitors with access to DPI technology could do the same thing and thus harm their business.

They also raised the issue of timing. As Zip.ca is a time-sensitive business, they expressed concern that rulings could take long periods of time – maybe even years – to be resolved. Such long periods would put their business in danger since they would not be able to deliver to their customers. They encouraged the CRTC to maintain a fast Internet system.

Further, they addressed some of the negative criticisms of BitTorrent, arguing that it is an efficient way of delivering content to customers. As such, BitTorrent can be used in legitimate ways and application discrimination would hinder this delivery capability.  They were also concerned about the idea of "walled gardens." Since much of their content is streamed from the U.S., such a policy could harm their business.  Finally, they argued that choice should lie with the consumer.

Questions

Von Finckenstein opened the questioning by inquiring into their business model. Zip.ca explained that their system of recommendations differentiated them from video on demand. They said they offer a system similar to their DVD delivery service, but with content downloaded over the Internet rather than received by mail.

Von Finckenstein expressed concern about Zip.ca as a third party being carried over the Internet and using a network such as Bell or Rogers. In response, Zip.ca disagreed with the idea that big carriers should have the right to give their own applications priority over theirs. Zip.ca argued that they offered the same kind of service as the big networks and that it was an unfair business practice for providers to favour their own services. They argued that rules to manage network traffic should not exempt the applications belonging to the carrier. They added that throttling based on caps could also lead to the same unfair business practice. While they said that volume caps make the most sense, they could be problematic if carriers could throttle outside applications like Zip.ca while allowing their own applications through unhindered.

Lamarre asked about the size of the bandwidth that Zip.ca would need to deliver movies to their customers. Zip.ca replied that it depended on the resolution and that, with new compression technologies, they can deliver a movie in under 1 GB. They said they were concerned that bandwidth caps would be set just below what they need to deliver a movie.

Lamarre inquired into the cost of changing the delivery system from mail to the Internet. In response, Zip.ca said that it costs more to deliver a video through the Internet than through the post. They hoped that it would be cheaper in the long-run, but that with higher resolutions coming out, it is currently more expensive.

Denton asked about the cost of delivering movies by mail. Zip.ca replied that it was about a dollar each way plus some handling fees. They reiterated that with bandwidth costs, it was still more expensive to deliver movies over the Internet.
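
For a rough sense of the comparison Zip.ca drew, here is a minimal sketch that works through the numbers. The postage figure comes from their answer above; the handling fee, movie size, and per-gigabyte bandwidth rate are illustrative assumptions, not figures from the hearing.

    # Rough, hypothetical comparison of mailing a DVD vs. delivering it online.
    # Only the postage figure comes from the hearing; the rest are assumptions.
    POSTAGE_EACH_WAY = 1.00       # dollars, "about a dollar each way"
    HANDLING = 0.50               # assumed handling cost per disc
    MOVIE_SIZE_GB = 1.0           # "under 1 GB" with newer compression
    BANDWIDTH_COST_PER_GB = 3.00  # assumed delivery cost per GB

    mail_cost = 2 * POSTAGE_EACH_WAY + HANDLING
    online_cost = MOVIE_SIZE_GB * BANDWIDTH_COST_PER_GB

    print(f"Mail:   ${mail_cost:.2f} per movie")    # $2.50
    print(f"Online: ${online_cost:.2f} per movie")  # $3.00
    # With these assumed rates the disc in the mail is still cheaper, which is
    # the point Zip.ca made: until per-GB costs fall, postage wins.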

Coalition of Internet Service Providers

They opened by expressing concern over the use of DPI while arguing that Internet Protocol flow management is in the public interest. They made three key points:

1. Encryption

They argued that when encryption is used for security reasons, it makes DPI inspection impossible and leads to congestion. Moreover, if ISPs could detect encrypted traffic, the encryption would be considered compromised and quickly replaced by a stronger form. They went on to argue that it only takes a small percentage of users engaged in P2P traffic to congest the network. Further, they argued that if traffic remains encrypted, there is no evidence that throttling will be able to reduce congestion.

2. Congestion Signaling

They argued that if there was a way for ISPs to signal to applications that their network is congested, the application would slow down or desist. They pointed to the work of Dr. Lawrence Roberts who advocated a system of flow management. Under this system, technology signals the presence of congestion by selectively dropping IP packets to slow down the system.
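
The flow-management approach attributed to Dr. Roberts is described only at a high level in the submission. The sketch below is a minimal, application-neutral illustration of the general idea of signalling congestion by probabilistically dropping packets as a queue fills (in the spirit of RED-style queue management); it is not a description of his actual system, and the thresholds are assumptions.

    import random

    # Minimal sketch: as the queue fills, drop packets with rising probability
    # so that well-behaved senders (e.g. TCP) interpret the loss as a signal
    # to slow down. Queue sizes and thresholds are arbitrary assumptions.
    QUEUE_LIMIT = 100   # assumed queue capacity, in packets
    DROP_START = 60     # start signalling once the queue is 60% full

    def should_drop(queue_len: int) -> bool:
        """Drop with probability rising from 0 to 1 between DROP_START and QUEUE_LIMIT."""
        if queue_len < DROP_START:
            return False
        if queue_len >= QUEUE_LIMIT:
            return True
        drop_prob = (queue_len - DROP_START) / (QUEUE_LIMIT - DROP_START)
        return random.random() < drop_prob

    # The deeper the queue, the more often packets are dropped -- the "signal".
    for depth in (40, 70, 95):
        drops = sum(should_drop(depth) for _ in range(1000))
        print(f"queue depth {depth}: dropped {drops}/1000 test packets")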

In addition, they criticized the current system of measuring gigabytes per month as arbitrary. Instead they pointed to gigabytes downloaded per hour or per minute as the real problem. They claimed that the Commission took an easy way out in ruling that it was non-discriminatory to set a monthly rate.

3. Wirespeed Aggregation

The Coalition described wirespeed as any device that processes data without reducing the overall transmission rate. In other words, the cable box itself never becomes a choke point. The system would require a big, fat "pipe" that is not congested.

The Coalition compared aggregated DSL to the widening of telephone networks. They argued that changing the architecture is the only way to solve the problem of Internet congestion. They urged the Commission to support wirespeed aggregation because it would lead to a system without traffic management and without discrimination against applications.

Questions

Von Finckenstein pointed out that their presentation was full of jargon and asked for a simple explanation of Dr. Roberts's idea for congestion signalling. In response, the ISP Coalition explained that given the characteristics of P2P, download speeds are limited by upload speeds. They argued in favour of pacing the packets so that the system can function properly. However, it would require new technology that can "signal" in times of congestion. The main idea is that after applications are signalled, they slow down.

Von Finckenstein asked if the technology was available today and why it is not being used. In response, the ISP Coalition said that while the technology is available, it does not work with the existing IP protocol; ISPs would have to make an investment in new technology.

Von Finckenstein wanted to know what would happen if the technology was deployed on a large basis. In response, the Coalition said that the technology would have to be deployed by the ISPs. However, since the technology remains expensive, some ISPs would be unwilling to make the investment since they are currently free to engage in traffic shaping. They added that under today's system, network operators have to engage in traffic management because the network is not strong enough to handle P2P applications.

Panel of Jason Roks and Vaxination Informatique (Jean-Francois Mezei)

Jean-Francois Mezei

Mezei is a self-employed citizen aiming to share his opinion on this issue. He began his presentation by emphasizing how much people depend on the Internet for their day-to-day lives. He compared it to a utility like electricity, where the supplier does not care how much the consumer uses.

He argued that in order to be competitive, a country needs a competitive telecommunications industry. He pointed out that if we don't have a competitive environment in telecommunications, Canadian businesses will leave the country and Canada will come to depend on telecommunications products developed by others.

He claimed that he got involved due to the myth that P2P and BitTorrent file sharing is "bad." He pointed out that when users are limited by speed – e.g. 5 Mbps – it is not possible to download eleven times more than other users.
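
Mezei's point that a speed cap already bounds how much anyone can download can be shown with simple arithmetic. The sketch below assumes a 5 Mbps connection running flat out for an entire month; the figures are illustrative, not from his submission.

    # How much can a 5 Mbps line download in a month, running flat out?
    SPEED_MBPS = 5
    SECONDS_PER_MONTH = 30 * 24 * 3600

    max_gb = SPEED_MBPS * SECONDS_PER_MONTH / 8 / 1000  # megabits -> gigabytes
    print(f"Theoretical ceiling: about {max_gb:,.0f} GB per month")  # ~1,620 GB
    # Every subscriber at the same speed tier has the same ceiling, which is
    # the sense in which the speed cap itself already limits consumption.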

He argued that the networks have no business saying that P2P unfairly takes up bandwidth. He pointed out that the now-defunct Bell store allowed full-speed 5 Mbps downloading without throttling, even though it used just as much bandwidth as a P2P download.

Further, he argued that P2P is more efficient than YouTube: while a YouTube video comes from a single source, a P2P download can come from several users at the same time. He described the situation as unfair, arguing that when it comes to throttling, the networks should look at how much bandwidth is actually being used rather than at the application itself.

He argued that ISPs do not throttle YouTube because it is already popular, while P2P remains an emerging technology that carriers are attempting to throttle before it becomes popular. YouTube, meanwhile, is putting out HD content that is neither regulated nor throttled. Finally, he argued that P2P is democratic – people can create their own media and distribute it themselves – whereas YouTube can be controlled.

He pointed to the approach of Montreal-based VIF Internet for dealing with heavy users: after hitting 100 GB in a month, a customer is given slower throughput, with no need for DPI.
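
The VIF approach Mezei describes needs nothing more than a byte counter per subscriber, with no packet inspection at all. A minimal sketch of that kind of policy follows; the 100 GB threshold is the figure he cited, while the speed tiers are assumed values for illustration.

    # Sketch of a VIF-style policy: count bytes per subscriber and drop the
    # speed tier after a monthly threshold. No DPI is involved.
    MONTHLY_THRESHOLD_GB = 100   # threshold cited by Mezei
    FULL_SPEED_MBPS = 5          # assumed full-speed tier
    REDUCED_SPEED_MBPS = 1       # assumed reduced tier for heavy users

    def allowed_speed_mbps(gb_used_this_month: float) -> int:
        """Return the subscriber's speed tier based only on volume used."""
        if gb_used_this_month > MONTHLY_THRESHOLD_GB:
            return REDUCED_SPEED_MBPS
        return FULL_SPEED_MBPS

    for usage in (20, 99, 150):
        print(f"{usage} GB used -> {allowed_speed_mbps(usage)} Mbps")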

He urged companies to stop advertising high speeds that are impossible to reach due to throttling and to instead advertise their true speeds to promote competition.

Jason Roks

Roks started with P2P and BitTorrent. He called BitTorrent a "shipping container" and expressed concern that someone could be blocked just because of the shape of the box. He called it "essentially the same" as any other file sharing application out there. He expressed scepticism at the idea that a technology could be banned, blocked, or hindered, and described such methods as stifling innovation. He pointed out that DPI cannot inspect encrypted traffic, arguing that it is therefore ineffectual. Finally, he argued that there was simply no way to stop file sharing and that it would not go away.

He said that there were two options – adapting to the new technology or stifling it. He argued that speeding up torrents is one method of dealing with congestion: since transfers sitting in the queue are what clog the network, speeding them up gets them out of the way.

He said that ISPs were responsible for congestion because it happens on their networks.  He claimed that while ISPs say that changing the system would cost a "lot of money," no ISP has ever said exactly how much money it would be. However, using inflated numbers, he claimed that an upgrade would cost them no more than $2 per user per month over the next three years.
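
Roks did not give the underlying figures behind his "no more than $2 per user per month" estimate. The sketch below only shows the shape of that arithmetic, using placeholder totals that are not his numbers.

    # The arithmetic shape of a "dollars per user per month" upgrade estimate.
    # Both inputs are hypothetical placeholders, not figures from the hearing.
    UPGRADE_COST_TOTAL = 500_000_000   # assumed total upgrade cost, dollars
    SUBSCRIBERS = 7_000_000            # assumed subscriber base
    MONTHS = 36                        # "over the next three years"

    per_user_per_month = UPGRADE_COST_TOTAL / (SUBSCRIBERS * MONTHS)
    print(f"Roughly ${per_user_per_month:.2f} per user per month")  # ~$1.98
    # Even a large capital figure spreads thinly across millions of
    # subscribers and 36 months, which is the point of the estimate.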

He addressed the marketing ploy that ISPs use to sell higher speeds. He pointed out that on the one hand they sell higher speeds, and then turn around and claim that there is too much data in the system. He argued that people pay a certain amount per month for a given speed and that it is unfair for ISPs to charge more – e.g. people who pay for 1 Mbps of speed should not have to pay extra for using it.

He pointed out that there are two types of bandwidth:

1. Peering bandwidth: Data exchanged within or between peered ISP networks, which does not cost anything extra. He claimed that it in fact speeds up the network and maintains geographic integrity. He pointed out that Bell is the only major company in Canada that does not peer.

2. Transit bandwidth: Data for requests that go outside the ISP's network, which costs the ISP money because it must pay to connect to another network (see the cost sketch below).
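
A rough way to see the cost distinction Roks was drawing: traffic that stays on-net or crosses a settlement-free peering link adds nothing to the ISP's per-gigabyte bill, while traffic handed to a transit provider does. The sketch below is illustrative only; the transit price is an assumption.

    # Illustrative split of monthly traffic into peered vs. transit gigabytes.
    # Peered/on-net traffic is treated as adding no per-GB cost, per Roks's
    # distinction; the transit price is an assumed placeholder.
    TRANSIT_COST_PER_GB = 0.05   # assumed dollars per GB paid to a transit provider

    def monthly_transit_bill(peered_gb: float, transit_gb: float) -> float:
        """Only the transit portion of the traffic incurs a per-GB charge."""
        return transit_gb * TRANSIT_COST_PER_GB

    print(monthly_transit_bill(peered_gb=800, transit_gb=200))   # 10.0
    print(monthly_transit_bill(peered_gb=0, transit_gb=1000))    # 50.0
    # The more traffic an ISP keeps on-net or exchanges at peering points,
    # the smaller its transit bill -- one reason refusing to peer is costly.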

In sum, he said there should be no traffic management aside from controlling malicious software. He urged traffic management only in times of congestion and only while the network is being upgraded. He also argued in favour of full disclosure and transparency on the part of the ISP for the end user.

Questions

Von Finckenstein said that he was confused about P2P because other sources he had heard from claimed that it took five times more bandwidth than other applications.  In response, the panel said that P2P remains a tool of early adopters. They pointed out that YouTube videos use as much bandwidth and have even expanded to HD. They also said that focusing only on P2P does not work in the long run as new technologies will come along. For instance, BitTorrent now uses a new technology that evades throttling. As such, throttling in itself makes the network inefficient as the targeted application evades it.

Von Finckenstein asked the panel to justify its recommendation that networks should upgrade their systems. The panel argued that ISPs are selling customers an insufficient network. They argued that if ISPs cannot afford to upgrade their systems to meet demand, they should not keep signing up new customers.

Leonard Katz addressed the issue of ISPs marketing the service with speeds "up to" a certain point and then delivering speeds that are much lower. He asked the panel if ISPs should instead be advertising a minimum speed. The panel agreed that it is a problem when ISPs advertise high speeds and then claim that they do not have the capacity for them. They urged the Commission to insist that ISPs publish their throttling speed. They explained that if ISPs were forced to advertise it, their throttling speed might go up. They referred to "up to" speeds as "false advertising" since the ISPs cannot deliver them. Further, requiring minimum speeds would provide ISPs with an incentive to upgrade their networks.

Katz asked why ISPs continued to throttle. In response, the panel said that ISPs have targeted P2P in particular for throttling. They pointed out that many of the big companies that also sell television subscriptions are the ones that target P2P. They claimed that the smaller networks – which are not threatened by downloadable television programs – can control their networks and do not complain about congestion.

Katz went on to ask Roks about the evolution of P2P. Roks pointed out that there are many different ways to use P2P and that sharing data is the point of the Internet. He also claimed that many of Bell's throttling practices are unclear. He argued that Bell is being selective and targeting certain P2P applications that perhaps pose a risk to their business. Roks also said that if Bell is going to be allowed to throttle, there should be better disclosure.

Molnar brought up the consumption model where users have to pay for their bandwidth use. She asked if the panel opposed the consumption model. In response, the panel said that it depended on how it was implemented. They said that the Internet is a utility and that it is fair for users to pay for what they are using. However, they pointed out that people are already limited by speed, because there is only so much bandwidth a user can consume at a given speed.

Molnar wondered if the consumption model was only useful in a "steady state" but no longer useful during times of unforeseen peak congestion. She asked if there were any more reasonable technological solutions to managing traffic. In response, the panel said that there are technologies that can manage usage, e.g. the Montreal VIF model. They pointed out that the Internet was designed to handle congestion and that the majority of users do not notice congestion on a daily basis. They said there should be a balance and that it is reasonable to expect a small amount of congestion.

Von Finckenstein asked about Bell's peering policies. Roks explained that ISPs who share their networks can increase speeds. Bell, in comparison, refuses to peer with anyone which makes the experience less efficient.

20 Comments

  1. I would like to dispute this claim of zip.ca that bittorrent and other p2p software can be used for purposes other than pirating. I am old, and I know better than them, and I so know that the internet is only used for email, porn and piracy and my grandson told me that email doesn’t use bittorrent or other p2p software.
    According to this here completely independant report, those “bittorrent” thingies are used by pirates!
    This “streaming video” or whatever they call it is obviously piracy, since they did not buy anything from a store. The only way the starving artists and the poor janitor who cleans up the sets can possibly make any money is if you buy cd’s and dvd’s from a physical store. These “legitimate music and videos” over the internet is all a myth, and there is no way to save our companies and our struggling artists using the internet. In fact, as these so called “legitimate” uses increase, our cd and dvd sales are going down! Down, not up!

    Also, make sure the cd or dvd has one of my authorized logos on it, otherwise it is a pirated copy! There are no legitimate cd’s or dvd’s that do not have a holographic logo from a major label on them.

    PS: Think of the children…
    That is all.

  2. Hey Crade, you appear to understand absolutely nothing about the Net Neutrality issue.
    I feel slightly guilty about having to use this wording but, you would be well advised to stay out of this one Grandpa, as you clearly do not understand what the whole issue is about. There is much more at stake here than BitTorrent. Have you not read anything about the issue on this site. If you delved a little deeper, you might not sound so ignorant.

  3. pps
    😉

  4. Devil's Advocate says:

    I’m sure Crade was being sarcastic
    I believe “Think of the children” should have been the tipper! : )

  5. jason roks says:

    FTR : I do not support “billing by usage”.
    FTR : I do not support “billing by usage”. I’ve paid for the usage based on the connection speed sold to me. if i’m sold 1Mbit per second then don’t be charging me more when I use it.

    fink? how is using what i bought ‘over-use’?

    as well the costs per GB charged by ISPs is unrealistic and unaccountable when the “real cost’ to ‘make bandwidth’ is POWER — @ a mere 3 cents per GB according to Google. what justifies the extra $4.95 im being charged.

    thnx for the summary.

    Jason
    0-

  6. P2P efficiencies
    So far, according to these write ups, nobody has made the point that it is more efficient for an ISP to have a large file distributed by peers within its network by P2P/Bittorrent than for all the same users to download individual copies of the same large file from outside the network. The comparison to YouTube came closest…. but really… don’t they have a whiteboard in the meetings? Could someone please draw a picture of how direct download works vs. P2P. Show the committee that when I download half a file from my neighbour on the same ISP, traffic is only passing through the switch at my local CO and doesn’t congest the larger network. That’s half a file that the ISP doesn’t have to pay to get from California.

    Someone should propose that the ISP crap-ware CDs that are given to subscribers contain a bittorrent client to help relieve congestion by distributing popular files within the ISP’s network.

  7. Michael Cowie says:

    Temporary solution
    You heard it here first – NEWS UPDATE.

    The DPI software that Bell uses only monitors certain ports. There is a way around this. I know, because I’m using it right now, during Bell’s “throttling time” to download a torrent from “LegalTorrents” and to download a build of Linux. 400K download or faster. I will not publish how I am doing this here, but it is possible, and ridiculously easy. However, it takes time to figure out – time most people do not have.

    Despite this news, it is sad to see that even here there is strong evidence of the publicity battle being won by the greedy and the powerful. As much as I despair at people referring to the previous poster as “Grandpa” (I will speak to you with respect sir) he is clearly one of the many Canadians who believes that there is little more at stake here than people getting access to something they shouldn’t have anyway. In my research in finding a way around Bell’s intrusive spying on me, I found one of the submissions to these proceedings. It was from a company that invested thousands in a new inter-office distribution system based on p2p, which is now next to useless (they said productivity had gone down by over 60%).

    So sir, before you start accusing the users of p2p of all being thieves and pornographers, please consider that there will be jobs lost and companies shut down if Bell has their way – not what Canada needs in a recession, when we should be building on our strengths. It is also quite sad to hear so many Canadians willing to accept the corporate argument that we are all, by nature, criminal, and that we need to be ‘managed”. I always thought Canadians had more pride than that.

  8. Michael Cowie says:

    and one other thing sir
    I am an artist by trade, and spent 2 years researching a system by which I could distribute my work at the lowest cost to me, with the largest audience and the greatest possible chance for a return. I settled on BitTorrent, and now all my work is wasted thanks to Bell.

    And if you do not know the difference between streaming video and piracy, you are clearly completely ignorant of the facts.

    If BitTorrent is used by “pirates”, and should therefore be banned, I would also like to see cars banned from the roads, because they are also often used by people for illegal activities – activities which often get people killed, like drunk driving. And if you would like to dispute that, I’ll direct you to speak to my cousin, whose 3 friends are now dead because of a car.

  9. Ian Gillan says:

    Attention M. Cowie… Beating a dead horse.
    Mr. Cowie,

    You do not need to state you defeated Bell’s throttle and won’t show anyone how.

    Bell Canada employees (managers specifically) and a paper submitted to the CRTC by Per Vices Corporation during the CAIP vs. Bell hearings explain and show how.

    Here is 12 pages of it for you to read through:
    http://www.dslreports.com/forum/r20755668-How-to-defeat-the-throttle

    Now everyone knows.

    This blog is like a person to person app is it not?

    Knowledge is not something to hide. It is shared Mr. Cowie.

    Also, this does not work all the time, nor does it work for everyone (depends on the config of the DPI box you have maybe). It’s a cat and mouse game.

    Port 21 will work for some and not others. Port 22 will work for some and not others. Using a VPN port will work for some and not others.

    Cat and mouse Mr. Cowie.

    Be happy with your Bell service.

  10. Disappointing Disposition
    I’m rather alarmed that the CRTC has taken the position against Net Neutrality, as indicated by almost every question they’ve asked thus far. I find it troubling that they have taken the claims of the likes of Sandvine and Juniper at face value, despite the overwhelming number of groups and organizations that are crying foul, pointing out that many of the claims made are demonstrably fallacious.
    I’m also shocked that more emphasis has not been placed on the fact that when two friends on the same ISP network send a file to each other without having to go through any peering points, it DOES NOT COST THE ISP ANY MONEY. "Oh but they have to power the router and such" – get lost you nit-picking little sh*t, you’re entirely missing the point. The point is that peer-to-peer holds the potential to be infinitely cheaper than the traditional download-from-the-server model; if a large group of people on network A are downloading torrent X, then they can share pieces of the file with each other, and that means that the pieces shared amongst those friends on network A do not count towards the cost the ISP would incur had those people all downloaded that file from a central server located off of the ISP’s network. In this way, P2P has the potential to save ISPs a significant PERCENTAGE of the bandwidth costs incurred by transmitting media.
    How does an Open Internet protect privacy?
    Um, because there would be no companies with DPI sniffing your packets tracking your usage? Because they wouldn’t be able to then turn around and sell that data? “But they’ve given us their word that they won’t” – Oh yeah, and keeping their word with you means more to them then their legal obligation to their shareholders does it? Puh-leez.

    I also believe that Internet is a utility, like power or water, and that it should be provided as such. How do you think people would feel if the policy suddenly changed from “when capacity is reached, add more” to “when capacity is reached, charged more and don’t upgrade capacity”? I don’t know about you but I’d be in the street with my pitchfork and torch.
    Upgrading capacity has been the ^underpinning^ of the advancement of the Internet. We would not HAVE the likes of Google Video and YouTube if it was not for the concept of upgrading network capacity when congestion became too much. The idea that telecom companies should suddenly be allowed to turn around and start restricting the content that we consume instead of providing bigger pipes to meet the demand, not only flies in the face of established tradition and precedent, it ^cripples^ innovation and devalues all of the content currently accessible because fewer people have even less access to it.
    How does it cripple innovation?
    If I want to write an application to allow my friend Bob to view my home videos from his computer on his network, do you think that the ISP in the middle squeezing the pipes between me and him is going to have a positive or negative affect on the design and implementation of my program? Consider it this way, if I can’t stream the video to him at a speed he can watch it at (a slow and steady approach), then it is preferable for me to send him the whole video (a speedy, bulk transfer) which (not only ^isn’t innovate^, but is counter-productive because one has to wait for the transfer to finish before watching) in itself causes more congestion than streaming because I’m sending as much data as I can as fast as I can because he can’t watch the video until the transfer is finished, as opposed to streaming which is only a nice steady low bandwidth rate.

  11. Patrick McNamara says:

    I am a legal torrenter.
    Har ye mates! They that use torrent be not all pirate.

    I distribute a LEGAL podcast through torrents (mainly though Mininova) and it’s been a very successful system. I’m somewhere around 700,000 downloads for about 125 files. (Better results in many cases than YouTube.) There are legal torrents, and I’ve tried to encourage more people to use them. There is even a site called Legaltorrents.com.

    The problem right now is that it’s so popular for illegal downloads. (Although technically they’re not really “pirating” unless they profit from the act.) However, there’s a number of lawsuits creating a change in the way many of the bigger sites are behaving. Piratebay was the biggest attention getter, but Mininova is also facing legal prosecution.

    In time it’s possible that torrent sites become dominated with legal torrents. But it’s also possible to find a lot of illegal material on YouTube, so it’s doubtful it will ever be completely eliminated. However, it would be a shame for a brilliant system like torrents to be destroyed simply because they’re labelled as illegal.

  12. Ken Chase says:

    Owner Heavy Computing, previous owner of Velocet/Datavaults/DSL.ca, Torix bandwidth peering exchange member since 1997
    My comment got too long and detailed, so i formatted it into a blog post

    http://notes.sizone.org/?p=17

    or if that link doesnt work

    notes.sizone.org/?p=17

    or notes dot sizone dot org post # 17.

    Take a read.

  13. Mississauga Resident says:

    The reason we lag behind…
    I’m surprised that a country that encourages multiculturalism would have to go through great “barriers” to share it. If someone was to decide between Canada versus (another country), they would think

    “hey Canada is the place to go cause they are about multiculturalism, great people, great beer(etc) and I have this great presentation about my country that I want to show everyone, but oh crap they have barriers, oh well I can always go somewhere else”. (Distributing via BitTorrent)

    So in essence, if we wanted to show how multicultural we really were, wouldn’t we want an open internet to allow us the freedom to show it.

  14. Von Finckenstein says:

    crtc seems to support anti-competition
    “Von Finckenstein expressed concern about Zip.ca as a third party being carried over the Internet and using a network such as Bell or Rogers.”

    That says it all right there. The default assumption of the chairman of the CRTC is that the telcos should use their control of Internet subscribers to monopolize video distribution.

  15. a zip user
    In regards to zip.ca

    Canada post will turn out to be the loser.

    Trickle down effect… Something needs to increase to maintain the standard.

    It’s not free!

    How does Canada post recoup the loss? Or should they be allowed to?

  16. Anyone else as surprised as I am that it’s cheaper in Canada to pay a staff to physically stuff envelopes and maintain 4 national warehouses than it is to maintain a small/medium data center?

  17. pat donovan says:

    grunt
    well well well..
    it looks like political reality has finally struck home here.

    a NEW form of supervision for the CRTC (brand new form of corp government.)
    an industry obsessed with control, not service) bell took a nasty pounding when they were forced to open up)
    self-serving crap, self-indulgent crap and pure ordinary crap
    from almost everyone.
    wanna take any bets they reduce the web to rubble? ‘The village had to be destroyed to save it’ kind of thing.
    pat

  18. No Shock says:

    Am I Surprised
    The CRTC has outlived its usefulness. It is obvious its interest is not in the people but in the interests of Rogers and Bell. I cannot for the life of me understand how the CRTC is allowed to exist on tax money from the people it is supposed to protect. Seeing as the CRTC looks to have already made up its mind, I say they get booted off the Government payroll. I already pay too much to my ISP, I shouldn’t have to pay the CRTC to protect them.

    Being a proud full blooded Canadian I hate to say this but… maybe we need some american giants to come in and create some competition.

    I wonder what would happen if we all got fed up enough to give up the internet. What would that do to the giant Telcos? I’m ready!

  19. FairnessFairy says:

    3rd World Internet on the Way
    One of the big concerns I have is the ISP knobs referring to “issues” with encrypted traffic and how it’s impossible to do DPI, then ultimately going on to claiming this is a big issue regarding congestion.

    Certain reseller DSL ISPs provide encrypted SSH tunnels to their users to bypass Bell’s throttling. Are ISPs asking permission to block non-443 encrypted traffic at their discretion?

    As for torrents – many major companies (IBM, Redhat, etc) have begun offering torrents for downloads (driver CDs, 100MB+ files) off their sites.

    Bell sux. Rogers swallows. Shaw writes a play about it

  20. lehmann says:

    Does anyone know if the online movie service is a go or not?