CRTC Network Management Hearings, Day Two: Open Internet Coalition, Zip.ca, CISP, Roks, Mezei
CRTC Net Neutrality Hearings: July 7, 2009
Open Internet Coalition
Opening Remarks: Jacob Glick, Google's Canada Policy Counsel
The Open Internet Coalition described their main purpose as working to keep the Internet fast, open, and available to everyone. In their submission, they made four main arguments:
1. Open Internet drives innovation.
They argued that robust access to an open Internet is important to public policy. They urged the Commission to act in a way that promotes the development of an open Internet since it is a key economic engine.
2. Practices that undermine the Internet's openness are bad for innovation.
The Coalition argued that application-specific traffic management practices make the Internet less attractive to users. They pointed out that slowed applications change user behaviour and undermine the Internet's competitive market in applications.
3. It is okay to manage some Internet traffic.
They argued that some traffic management is normal and acceptable. As the Internet has moved towards a greater multimedia format over time, congestion has become a greater problem. However, they emphasized that increased capacity has historically been the primary means of dealing with this evolution.
4. Acceptable traffic management will pass the light-touch test derived from the interpretation of s. 27(2) and 36 of the Telecommunications Act.
They distinguished between useful traffic management and traffic management that discriminates. They claimed that evidence shows that carriers can manage their networks, reduce congestion, and keep the Internet open at the same time.
After noting that they believed discrimination between applications constituted discrimination under s. 27, they went on to explain their three-part “test” derived from s. 27 and s. 36 of the Telecommunications Act:
1. Does the traffic management practice further a pressing and substantial objective?
2. Is the traffic management practice narrowly tailored to address the objective?
3. Is the traffic management practice the least restrictive means to reach the objective?
The Coalition argued that debilitating network congestion could be a pressing and substantial objective. However, they pointed out that the evidence in this proceeding has not established the existence of debilitating network congestion. They also expressed doubt that applications like BitTorrent and P2P "exploit" the Internet.
Addressing step two of the test, they argued that throttling is almost never narrowly tailored. Further, they argued that throttling has negative effects on innovation and that there are better means of lessening congestion.
For the third arm of the test, they argued in favour of techniques that were more effective at reducing congestion and that did not discriminate between applications. For the foremost alternative, they pointed to increased network capacity. They also pointed to application-neutral and price-based levers.
Von Finckenstein opened the questions, asking the Open Internet Coalition to describe how they defined "openness." In response, the Coalition described "openness" as an Internet where users are free to engage with the World Wide Web and where application creators have the freedom to make new applications without having to go through a gatekeeper.
Finckenstein went on to address their three-part test, comparing it to the Oakes test. He asked what constituted a "pressing" and "substantial" objective. The Open Internet Coalition replied that the test was flexible and served as a guideline. They argued that its main strength was that it could be applied on a case-by-case basis. However, they admitted that it was ultimately a value judgement. They went on to point out that there should be a higher standard applied to a practice that discriminates between applications than one that is application neutral.
Timothy Denton continued with the questioning. He wanted to know what kind of system they imagined for the application of their test - e.g. whether they envisioned a complaint-based system. The Open Internet Coalition replied that they wanted the players to first know the rules of the game and, further, that they would have to seek specific exemptions. They went on to say that in other contexts, there could be complaints to the Commission on a case-by-case basis.
Denton asked for their view on price-based systems and their utility in alleviating congestion. In response, the Open Internet Coalition said they saw a variety of price-based systems developing in the future. In addition, they expressed concern that price-based systems might have unintended consequences, such as discouraging Internet use overall. However, they said that the Commission's role was to continue to promote competition, not to regulate fees.
Denton then asked whether their proposal to increase network capacity was an optimal solution. In response, the Coalition pointed out that increasing network capacity is key to the history of the Internet. In the past, increasing Internet capacity has encouraged innovation, leading to a win-win "virtuous circle." They cited an Internet2 study that demonstrated that adding capacity was a more beneficial way to deal with increased activity than managing the network. In addition, they pointed out that some traffic management was desirable as long as it satisfied their three-part test.
Denton asked about the future, wondering whether the CRTC would deal with traffic management controls on an occasional basis or whether it would become a permanent part of its work. The Coalition responded that if clear guidelines were developed, the Commission would not be overburdened with requests. Further, they pointed to past experience, where there have been periods of greater congestion. They claimed it would be an overreaction to allow the networks to control scarcity. Instead, policies should be developed that are aimed at increasing capacity and promoting innovation.
Leonard Katz took over the questioning and asked about the goal of increasing capacity. He wanted to know how increasing capacity benefits shareholders in a market-based economy. While he accepted the argument that it might lead to innovation, he was concerned that it would lead to greater costs for shareholders. In response, the Coalition pointed to a U.S. study showing that costs in telecommunications have gone down while prices have gone up. They implied that prices were based on market power rather than costs. Further, they argued that if increasing prices were necessary for more innovation, it was worth the trade-off.
Katz pressed them for evidence that consumers were willing to pay more for an open Internet. The Coalition said they had no studies or statistics either way, but maintained that consumers wanted open access to the Internet, pointing to the millions of consumers who have joined their coalition. Further, the Coalition disagreed with the premise that the only available traffic management techniques were those that discriminate among applications. As well, while they thought added capacity was a better alternative, they were not asking the Commission to mandate it - the choice should be left to providers, with incentives.
Molnar took over the questioning, asking about the "virtuous circle" between increased capacity and the consumer. She wanted to know whether there were any obligations for an application maker to create an application in an efficient manner. In response, the Coalition pointed out that there is a highly competitive market in the application sphere and that creators face pressure from users to provide fast applications. Molnar went on to wonder if P2P is an inefficient use of bandwidth. The Coalition replied that it serves no one to have a congested network. Consequently, there is a built-in incentive to reduce congestion.
Molnar asked about this built-in "incentive" for application providers to be efficient. The Open Internet Coalition replied that users would avoid network-congesting applications and instead seek applications that did not cause congestion. Further, if an application is causing congestion, it is due to high user demand for that application. Consequently, the question is one of dealing with user demand. They argued in favour of methods of dealing with user demand that do not arbitrarily discriminate among applications.
Von Finckenstein wondered if it would be justifiable to use application discrimination on a congested Internet if there was no other choice. The Coalition admitted that under their test, application discrimination would be a last resort, but could be used if there was no other choice.
Molnar finished her questions by asking about technologies that allow the consumers - rather than the provider - to choose their applications. The Coalition responded that having consumers in charge of their own Internet was one way of dealing with the congestion problem. They argued that the service provided by the Internet is defined by the consumer; that is, the user should control their own packets, and putting the consumer in charge is consistent with that principle.
Lamarre asked how privacy can be preserved on an open Internet. The Coalition responded that in the Canadian context, privacy is preserved through private sector incentives. They claimed that Canada is further along in terms of privacy than the United States. In addition, they pointed out that an open Internet is still subject to law; that is, applications created on the Internet would be subject to Canadian privacy laws.
Zip.ca
Zip.ca is a DVD delivery company that also uses the Internet to stream video to its customers. As this requires a fair amount of bandwidth, they were particularly concerned with congestion techniques that target applications. They opened their submission by arguing that it was difficult to set universal rules, since there had to be some way of examining individual practices. They encouraged the CRTC to find a way to identify infractions and settle them quickly.
They were also concerned with DPI technology. Since Zip.ca can follow what their viewers watch, they can provide recommendations to their consumers. They worried that their competitors may use DPI technology to do the same thing and thus harm their business.
They also raised the issue of timing. As Zip.ca is a time-sensitive business, they expressed concern that rulings could take long periods of time - perhaps even years - to be resolved. Such delays would endanger their business, since they would not be able to deliver to their customers. They encouraged the CRTC to keep the resolution process fast.
Further, they addressed some of the criticisms of BitTorrent, arguing that it is an efficient way of delivering content to customers. BitTorrent can be used in legitimate ways, and application discrimination would hinder this delivery capability. They were also concerned about the idea of "walled gardens": since much of their content is streamed from the U.S., such a policy could harm their business. Finally, they argued that choice should lie with the consumer.
Von Finckenstein opened the questioning by inquiring into their business model. Zip.ca explained that their system of recommendations differentiated them from video on demand. They said they offer a service similar to DVD delivery, but with content downloaded over the Internet rather than received by mail.
Von Finckenstein expressed concern about Zip.ca, as a third party, being carried over a network such as Bell's or Rogers'. In response, Zip.ca rejected the idea that big carriers should have the right to give their own applications priority over those of third parties. Zip.ca argued that they offered the same kind of service as a big network and that it was an unfair business practice for providers to favour their own services; rules to manage network traffic should not exempt the applications belonging to the carrier. Throttling based on caps could lead to the same unfair practice: while they said that volume caps make the most sense, caps could be problematic if carriers could throttle outside applications like Zip.ca while allowing their own applications through unhindered.
Lamarre asked about the bandwidth Zip.ca would need to download movies to their customers. Zip.ca replied that it depended on the resolution of the video and that, with new compression technologies, they can deliver a movie in under 1 GB. They were concerned that bandwidth caps would be set just below the size needed to deliver a movie.
Lamarre inquired into the cost of changing the delivery system from mail to the Internet. In response, Zip.ca said that it costs more to deliver a video through the Internet than through the post. They hoped that it would be cheaper in the long-run, but that with higher resolutions coming out, it is currently more expensive.
Denton asked about the cost of delivering movies by mail. Zip.ca replied that it was about a dollar each way plus some handling fees. They reiterated that with bandwidth costs, it was still more expensive to deliver movies over the Internet.
Coalition of Internet Service Providers
They opened by expressing concern over the use of DPI, while arguing that Internet Protocol flow management is in the public interest. They made three key points:
1. Encryption
They argued that when encryption is used for security reasons, it makes DPI inspection impossible and leads to congestion. Moreover, if ISPs could detect encrypted traffic, the encryption would be worthless and quickly replaced by stronger encryption. They went on to argue that it takes only a small percentage of users engaged in P2P traffic to congest the network. Further, if traffic remains encrypted, there is no evidence that throttling will be able to reduce congestion.
2. Congestion Signaling
They argued that if there was a way for ISPs to signal to applications that their network is congested, the application would slow down or desist. They pointed to the work of Dr. Lawrence Roberts who advocated a system of flow management. Under this system, technology signals the presence of congestion by selectively dropping IP packets to slow down the system.
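The mechanism described here can be sketched in a few lines. This is an illustrative toy model, not Dr. Roberts's actual technology: a link drops packets with a probability that rises as its queue fills, and a well-behaved sender treats a drop as a signal to slow down. The class and parameter names are invented for illustration.

```python
import random

class CongestedLink:
    """Toy model of drop-based congestion signalling: as the queue
    fills, packets are dropped with rising probability. The decision
    looks only at congestion, never at packet contents, so no DPI
    is involved."""

    def __init__(self, capacity=100):
        self.capacity = capacity
        self.queue = 0

    def offer_packet(self):
        """Return True if the packet is accepted, False if dropped."""
        fill = self.queue / self.capacity
        if random.random() < fill:
            return False  # drop: a well-behaved sender backs off
        self.queue += 1
        return True

    def drain(self, n=1):
        """Packets leaving the link free up queue space."""
        self.queue = max(0, self.queue - n)
```

On this model an application that slows down when it sees drops keeps the queue short, which is the "signal and desist" behaviour the Coalition described.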
In addition, they criticized the current system of measuring gigabytes per month as arbitrary. Instead they pointed to gigabytes downloaded per hour or per minute as the real problem. They claimed that the Commission took an easy way out in ruling that it was non-discriminatory to set a monthly rate.
3. Wirespeed Aggregation
The Coalition described wirespeed as any device that processes data without reducing the overall transmission rate. In other words, the cable box itself never becomes a choke point. The system would require a big, fat "pipe" that is not congested.
The Coalition compared aggregated DSL to the widening of telephone networks. They argued that changing the architecture is the only way to solve the problem of Internet congestion. They urged the Commission to support wirespeed aggregation because it would lead to a system without traffic management and without discrimination against applications.
Von Finckenstein pointed out that their presentation was full of jargon and asked for a simple explanation of Dr. Roberts's idea for congestion signalling. In response, the ISP Coalition explained that under the characteristics of P2P, speeds are limited by the speeds of uploads. They argued in favour of pacing the packets so that the system can function properly. However, it would require new technology that can "signal" in times of congestion. The main idea is that after applications are signalled, they slow down.
Von Finckenstein asked if the technology was available today and why it is not being used. In response, the ISP Coalition said that while the technology exists, it does not yet work with the IP protocol; ISPs would have to invest in new technology.
Von Finckenstein wanted to know what would happen if the technology was deployed on a large basis. In response, the Coalition said that the technology would have to be deployed by the ISPs. However, since the technology remains expensive, some ISPs would be unwilling to make the investment, as they are currently free to engage in traffic shaping. They added that under today's system, network operators have to engage in traffic management because the network is not strong enough to handle P2P applications.
Panel of Jason Roks and Vaxination Informatique (Jean-Francois Mezei)
Mezei is a self-employed citizen aiming to share his opinion on this issue. He began his presentation by emphasizing how much people depend on the Internet for their day-to-day lives. He compared it to a utility like electricity, where the supplier does not care how much the consumer uses.
He argued that in order to be competitive, a country needs a competitive telecommunications industry. He pointed out that if we don't have a competitive environment in telecommunications, Canadian businesses will leave the country and Canada will come to depend on telecommunications products developed by others.
He claimed that he got involved because of the myth that P2P and BitTorrent file sharing are "bad." He pointed out that when users are limited by speed caps - e.g. 5 Mbps - it is not possible for one user to download eleven times more than other users.
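Mezei's point can be made concrete with back-of-the-envelope arithmetic: a fixed access speed puts a hard ceiling on how much data any user can move in a month, no matter how heavily they use the connection. The sketch below assumes the "5 MB" figure in his example means a 5 Mbps speed cap; the numbers are illustrative only.

```python
# Ceiling on monthly data transfer at a fixed access speed.
# Assumes the testimony's "5 MB" means a 5 Mbps speed cap.
speed_mbps = 5
seconds_per_month = 30 * 24 * 3600                   # 2,592,000 seconds
max_gb = speed_mbps / 8 * seconds_per_month / 1000   # Mbit/s -> MB/s -> GB

# Even running flat out 24/7, a user at this speed cannot exceed the ceiling.
print(f"At {speed_mbps} Mbps, the monthly maximum is {max_gb:.0f} GB")
```

The ratio between any two users at the same speed cap is therefore bounded by how many hours they run, not by which application they use.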
He argued that the networks have no business saying that P2P unfairly takes up bandwidth. He pointed out that the now-defunct Bell store allowed full 5 Mbps downloading without throttling, even though it used just as much bandwidth as a P2P download.
Further, he argued that P2P is more efficient than YouTube. Since a YouTube video comes from a single source, P2P is more efficient because pieces can come from several users at the same time. He described the situation as unfair: when it comes to throttling, the networks should look at how much bandwidth is actually being used rather than at the application itself.
He argued that ISPs do not throttle YouTube because of its popularity. P2P, in comparison, remains an emerging technology, and he theorized that carriers are attempting to throttle it before it becomes popular. Meanwhile, YouTube is putting out HD content that is neither regulated nor throttled. Finally, he argued that P2P is democratic - people can create their own media and distribute it themselves - whereas YouTube can be controlled.
He pointed to the Montreal-based VIF Internet system for dealing with heavy users: after hitting 100 GB in a month, a customer is given a slower throughput, and there is no need for DPI.
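The VIF approach as described needs nothing more than a per-subscriber byte counter - no packet inspection at all. A minimal sketch, with the class name and the full/slow rates assumed for illustration (only the 100 GB threshold comes from the testimony):

```python
class UsageBasedThrottle:
    """Sketch of a VIF-style policy: full speed until a monthly byte
    threshold is hit, then a slower rate for the rest of the month.
    Only a byte counter is tracked; packet contents are never examined."""

    def __init__(self, cap_gb=100, full_mbps=5.0, slow_mbps=0.5):
        # full_mbps and slow_mbps are invented placeholder rates.
        self.cap_bytes = cap_gb * 10**9
        self.full_mbps = full_mbps
        self.slow_mbps = slow_mbps
        self.used_bytes = 0

    def record(self, nbytes):
        """Add a subscriber's transferred bytes to the monthly total."""
        self.used_bytes += nbytes

    def current_rate(self):
        """Rate the subscriber gets right now, in Mbps."""
        if self.used_bytes < self.cap_bytes:
            return self.full_mbps
        return self.slow_mbps

    def reset_month(self):
        """Billing cycle rollover restores full speed."""
        self.used_bytes = 0
```

Because the policy keys on volume rather than on application, a heavy web user and a heavy P2P user are treated identically.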
He urged companies to stop advertising high speeds that are impossible to reach due to throttling and to instead advertise their true speeds to promote competition.
Roks started with P2P and BitTorrent. He called BitTorrent a "shipping container" and expressed concern that someone could be blocked just because of the shape of the box. He called it "essentially the same" as any other file sharing application out there. Further, he expressed scepticism at the idea that a technology could be banned, blocked, or hindered and described such methods as stifling innovation. He pointed out that DPI cannot manage encryption, arguing that it is therefore ineffectual. Further, he argued that there was simply no way to stop file sharing and that it would not go away.
He said that there were two options - adapting to the new technology or stifling it. He argued that speeding up torrents is one method of dealing with congestion: since sitting in the queue is what blocks the network, speeding torrents through the queue would get them out of the way.
He said that ISPs were responsible for congestion because it happened on their networks. He claimed that while ISPs say that changing the system would be a "lot of money," no ISP has ever said exactly how much money it would be. However, even using inflated numbers, he claimed that an upgrade would cost them no more than $2 per user per month over the next three years.
He addressed the marketing ploy ISPs use to sell higher speeds: on the one hand they sell higher speeds, and then they turn around and claim that there is too much data in the system. He argued that people pay a certain amount per month for a given speed and that it is unfair for ISPs to charge more - e.g. people who pay for a 1 Mbps speed should not have to pay extra for using it.
He pointed out that there are two types of bandwidth:
1. Peer Bandwidth: Data exchanged directly between ISPs' networks, which costs nothing extra. He claimed that peering in fact speeds up the network and maintains geographic integrity. He pointed out that Bell is the only major company in Canada that does not peer.
2. Transit Data: Data requested from outside an ISP's network, which costs the ISP money to hand off to another network.
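Roks's peer/transit distinction reduces to a simple cost model: peered bytes cost the ISP nothing at the margin, while transit bytes are paid for per gigabyte. A hypothetical sketch (the function name and the $0.05/GB rate are invented placeholders, not real prices):

```python
def delivery_cost(gb, peered, transit_rate_per_gb=0.05):
    """Marginal cost to the ISP of delivering `gb` gigabytes.

    Peered traffic is exchanged directly with another network at no
    marginal cost; transit traffic is paid for per gigabyte.
    The rate is an illustrative placeholder.
    """
    return 0.0 if peered else gb * transit_rate_per_gb
```

On this model, an ISP that peers widely pushes more of its traffic into the zero-cost bucket - which is Roks's argument for why a refusal to peer makes the network both slower and more expensive.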
In sum, he said there should be no traffic management aside from controlling malicious software. He urged traffic management only in times of congestion and only while the network is being upgraded. He also argued in favour of full disclosure and transparency on the part of the ISP for the end user.
Von Finckenstein said that he was confused about P2P because other sources he had heard from claimed that it took five times more bandwidth than other applications. In response, the panel said that P2P remains a tool of early adopters. They pointed out that YouTube videos use as much bandwidth and have even expanded to HD. They also said that focusing only on P2P does not work in the long run, as new technologies will come along; for instance, BitTorrent now uses a technology that avoids throttling. As such, throttling makes the network inefficient, since the targeted application evades it.
Von Finckenstein asked the panel to justify its recommendation that networks should upgrade their systems. The panel argued that ISPs are selling customers an insufficient network. They argued that if ISPs cannot afford to upgrade their systems to meet demand, they should not keep signing up new customers.
Leonard Katz addressed the issue of ISPs marketing the service with speeds "up to" a certain point and then delivering speeds that are much lower. He asked the panel if ISPs should instead be advertising a minimum speed. The panel agreed that it is a problem when ISPs advertise high speeds and then claim that they do not have the capacity for it. They urged the Commission to insist that ISPs publish their throttling speed. They explained that if ISPs were forced to advertise it, their throttling speed might go up. They referred to "up to" speeds as "false advertising" since the ISPs cannot deliver it. Further, requiring minimum speeds would provide them with an incentive to upgrade their networks.
Katz asked why ISPs continued to throttle. In response, the panel said that ISPs have targeted P2P in particular for throttling. They pointed out that many of the big companies that also sell television subscriptions are the ones that target P2P. They claimed that the smaller networks - which are not threatened by downloadable television programs - can control their networks and do not complain about congestion.
Katz went on to ask Roks about the evolution of P2P. Roks pointed out that there are many different ways to use P2P and that sharing data is the point of the Internet. He also claimed that many of Bell's throttling practices are unclear. He argued that Bell is being selective and targeting certain P2P applications that perhaps pose a risk to their business. Roks also said that if Bell is going to be allowed to throttle, there should be better disclosure.
Molnar brought up the consumption model, where users pay for their bandwidth use. She asked if the panel opposed the consumption model. In response, the panel said that it depended on how it was implemented. They said that the Internet is a utility and that it is fair for users to pay for what they use. However, they pointed out that users are already limited by speed, because there is only so much bandwidth a user can access at a given speed.
Molnar wondered if the consumption model was only useful during "steady state" periods but no longer useful during times of unforeseen peak congestion. She asked if there were more reasonable technological solutions for managing traffic. In response, the panel said that there are technologies that can manage usage, e.g. the Montreal VIF model. They pointed out that the Internet was designed to handle congestion and that the majority of users do not notice congestion on a daily basis. They said there should be a balance and that it is reasonable to expect a small amount of congestion.
Von Finckenstein asked about Bell's peering policies. Roks explained that ISPs who share their networks can increase speeds. Bell, in comparison, refuses to peer with anyone which makes the experience less efficient.