Orange fights back… intelligently!

23 Nov

Orange (CC) avhell

Over the years I’ve been fairly critical of Orange, both for the way they captured the NGA market and contributed to its slowing down in France, and for their quasi-systematic anti-net-neutrality stance, even against their own best interests. Their latest service announcements, however, show a change of track, and one that I find interesting and mostly compliant with net neutrality. That’s not to say they’re not pushing the envelope, but at least they seem to have embraced the paradigm that traffic can stay on your network or it can go elsewhere, and that much of that depends on your own choices as a service provider.

Le Monde published an article today (in French) entitled L’internet “ouvert” d’Orange ferme les portes (Orange’s “open” internet closes doors) which I think misses the point but at least lists the main evolutions in services offered. In a nutshell, here are the big announcements:

  • a new super duper set-top box which… catches up with the latest evolutions in Free’s and Numéricable’s own set-top boxes. So not much there.
  • a million FTTH customers as a target for 2014. The article states this as ambitious considering Orange only has 150 000 customers today, but it’s actually fairly conservative considering there’s no advertising for FTTH of any significance today and the deployment targets by 2014 are at least 5 times that.
  • a new Orange Cloud service, which is effectively a Dropbox by Orange. I’m surprised this is coming so late, and I think it’s both smart and commercially viable. It’s a natural extension of your broadband service and – provided the T&Cs are intelligently done and the sync clients work well – should be successful. It’s not clear to me whether the 50GB offer is free and the 100GB offer paid, or whether both are optional; I’ll have to investigate that. Orange mobile customers won’t have that traffic count against their caps (which is net-neutrality compliant since it doesn’t leave their network).
  • Orange wants to integrate more over-the-top services within its offerings (and not just Deezer and Dailymotion, which they own) and has announced a big partnership with Akamai to maximise the hosting of said services in their network. Again, this is smart and totally net-neutrality compliant. It’s a way to enhance the quality of the service they offer to end-users and minimize the amount of peering/transit required to do so.

All of this, in my opinion, is smart. What goes untold, of course, is what (if anything) happens to genuine internet traffic? If something isn’t cached in Orange’s network, will you still access it with the same quality of service as you do today? If the answer is yes, then I applaud Orange and would even consider (as a customer) switching to their network for that reason alone. If the unsaid flip side is that they increase contention on the internet access part of the service, then I’d find that appalling.

Hopefully we’ll know more in the coming days and weeks.

  • Comments 1 Comment
  • Categories Broadband, Contents, Regulation
  • Author Benoît Felten

A bit of good news on the ETNO/ITU take-over attempt front…

22 Nov

European Parliament Stairs (cc) photogreuhphies

This just in: the European Parliament posted a resolution with a large majority against ITU’s attempt at regulatory control of the internet. Here are the salient articles:

5. [The European Parliament] believes that the ITU, or any other single, centralised international institution (e.g. ICANN), is not the appropriate body to assert regulatory authority over the internet;

6. [The European Parliament] calls on the Member States to prevent any changes to the International Telecommunication Regulations which would be harmful to the openness of the internet, net neutrality, access to creative content online and the participatory governance entrusted to multiple actors such as governments, supranational institutions, non-governmental organisations, large and small private operators and the “internet public” consisting of users and consumers

I suspect political pressure will be a lot more effective in this regard and I hope the collective weight of the EU 27 and its member states can steer things in the right direction.

  • Comments 0 Comments
  • Categories Regulation
  • Author Benoît Felten

AT&T’s New Paradigm

20 Nov

Copper Strands (CC) Blyzz

A couple of weeks ago, AT&T unveiled its investment plans going forward and clarified the way it envisages wireline and wireless broadband in the coming years. Teresa Mastrangelo has a blog post that clarifies the salient elements of the announcement (AT&T Details its $14 Billion Project Velocity-IP). Another must-read (though longer and a bit more on the rambling side) is Dave Burstein’s comments over at DSL Prime (AT&T: Turning Off Copper To More Than Half Territory, 99% POPs LTE In Territory, 90% Out, Fiber To Businesses). Dave looks not only at what’s announced but at what’s not, and what it means from a policy perspective.

I think both of these are worth reading if you want to understand what is potentially a big shift in US Telecom regulation (provided the FCC approves the plans). My comments are more on the implications. And since the rest of the world often turns to the US for examples, this may give ideas to others…

So, in a nutshell:

  • AT&T didn’t manage to divest their copper operations, so they’ve finally decided to upgrade a broader territory than originally planned, with both U-Verse and better DSL through DSLAM upgrades. But this is by no means the whole of their territory; by Dave Burstein’s estimates, 20-30% of the US territory will be left with copper lines that only deliver voice.
  • Heavy investment in LTE is designed to cover territories where wireline broadband will not be available. I personally doubt it will deliver a decent alternative, not to mention that just a year ago AT&T was saying it was technically unfeasible without T-Mobile’s spectrum…
  • This effectively spells the end of universal service as we know it. This is significant not only for the US, but for the rest of the world also. Regulators have yet to address the looming issue of universal service compensation mechanisms falling apart as voice line revenues dwindle and operators focus on investing in dense urban areas (if at all).

This is certainly going to be the must-follow telecom story of the quarter…

  • Comments 0 Comments
  • Categories Access, Broadband, Regulation
  • Author Benoît Felten

Fixing broadband as a service

19 Nov

Flowing Water (CC) bcimet

Martin Geddes is one of the smartest brains in the business, and a good friend. That doesn’t stop us from disagreeing on a number of business-related things, chief among them the notion that the internet in general, and IP routing specifically, is broken and needs fixing. That being said, and while I’m open to hearing any well-constructed argument – even those I expect to disagree with – I never understood the technical reasoning behind that assertion of his, and that made it hard for me to translate it into business reasoning.

The excellent video interview of Martin over at Br0kenTeleph0n3 has changed this. It is rather overdramatically entitled How to avoid the coming broadband catastrophe but is well worth 10 minutes of your time nonetheless.

The core of the argument as I understand it is that bandwidth needs are fluid, and bandwidth itself is only one (and maybe not the most important) component of the broadband experience. Therefore by offering speed as the only service differentiator, and the same speed to one customer all the time, broadband access providers are shooting themselves in the foot and missing out on the real business opportunity.

I find myself, therefore, in agreement with the core tenet if that is indeed what it is. I’m assuming there’s a depth of technical reasoning below that that I have no chance in hell of grasping, but that hardly matters to me as long as I understand the business consequences. So that leads me to a few points of mild disagreement and a few comments on implications:

  • The title of the article is overdramatic, and indeed Martin’s own speech overdramatises, because service providers are not dying. I’m often guilty myself, in my own writing, of treating service providers as companies in distress: because the goggles through which they view the world are so antiquated, because they repeatedly do stupid things that are clearly not in their best interests, because the innovators in the market are no longer the service providers, and so on. But that doesn’t make them a dying breed, and in fact the notion that they’re in dire straits is not supported by their financial performance. So I think we should be careful not to paint proposed solutions as rescues, just as radical improvements.
  • Martin is right that the issue is one of mindset. It’s not necessarily that people within these organisations don’t understand the argument – the technical people probably will, and the marketing people will struggle (though if I can get it, surely they can). It’s that they consider the way the internet market works to be beyond their grasp. In his interview, Martin doesn’t touch much on the ecosystem implications of the changes he’s proposing, and implicitly suggests that service providers have full end-to-end control of the traffic they receive and send. That’s not the case (and I know he doesn’t think it is), but it’s going to be one of the main objections to changing the way things are done.
  • The final point I’d like to make is that while there are no doubt ways to improve customer experience (and monetise that improvement), it’s easy – perhaps too easy – to read that as “there’s no issue in the access network”. In fact, at one point in the interview Ian Grant tries to get Martin to comment on BT’s FTTC investment, and to his credit Martin sidesteps the question by stating that that’s an infrastructure consideration. The question ultimately is: will the proposed changes in traffic management and monetisation improve performance enough to meet the demand challenges of the next few years? I have a hard time believing that, so I don’t think it’s an either/or proposition – but here my own biases may be showing.


  • Comments 2 Comments
  • Categories Access, Broadband, Expert Opinions
  • Author Benoît Felten

Interview on Stokab over at Community Broadband Bits

14 Nov


Gamla Stan (cc) *Kicki*

Community Broadband Bits is a great podcast addressing broadband issues for communities. It’s largely US centric but very open to understanding how other players worldwide have evolved. I was interviewed for this month’s edition, talking about Stokab and how the Stockholm model could provide inspiration for communities elsewhere.

You can find the podcast here and details about the white paper we published on Stokab here.

  • Comments 0 Comments
  • Categories Broadband, Muni Fiber
  • Author Benoît Felten

Latency certainly matters for video viewing…

13 Nov


Two scientists from Akamai recently published a paper entitled Video Stream Quality Impacts Viewer Behavior: Inferring Causality Using Quasi-Experimental Designs. Akamai being in the business of optimising internet traffic flows, and particularly internet video flows, you’d expect the paper to conclude that low quality in video streams causes users to disconnect, and indeed it does. Still, if you accept the inherent bias that Akamai can only measure videos it delivers (and therefore presumably not YouTube or Dailymotion), the methodology seems sound. Here are the main results, as quoted from the abstract:

We study the impact of video stream quality on viewer behavior in a scientific data-driven manner by using extensive traces from Akamai’s streaming network that include 23 million views from 6.7 million unique viewers. We show that viewers start to abandon a video if it takes more than 2 seconds to start up, with each incremental delay of 1 second resulting in a 5.8% increase in the abandonment rate. Further, we show that a moderate amount of interruptions can decrease the average play time of a viewer by a significant amount. A viewer who experiences a rebuffer delay equal to 1% of the video duration plays 5% less of the video in comparison to a similar viewer who experienced no rebuffering. Finally, we show that a viewer who experienced failure is 2.32% less likely to revisit the same site within a week than a similar viewer who did not experience a failure.

Emphasis mine. None of this is new; I think everyone is aware of it on some level. What astounded me was how little tolerance users have for non-delivery of a video. The graph provided in the paper (see above) is quite telling too: it shows not only how quickly viewers drop off, but also that the better the connectivity they’re using, the less tolerant they are of latency in getting the content they’re trying to view. Fiber users are the least patient of all.
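As a back-of-the-envelope illustration, the headline figures quoted above can be turned into a toy linear extrapolation. To be clear, this is my own sketch of the quoted numbers, not the model actually used in the paper:

```python
# Toy linear extrapolation of the Akamai paper's headline figures.
# These formulas are my own illustration, not the paper's model.

def abandonment_increase(startup_seconds: float) -> float:
    """Extra abandonment (in percentage points) from startup delay:
    viewers start leaving after 2 seconds, at roughly 5.8 points
    per additional second of delay."""
    return max(0.0, (startup_seconds - 2.0) * 5.8)

def play_time_reduction(rebuffer_ratio_pct: float) -> float:
    """Reduction in play time (%): roughly 5% less video watched per
    1% of the video's duration spent rebuffering."""
    return rebuffer_ratio_pct * 5.0

if __name__ == "__main__":
    for s in (2, 3, 5, 10):
        print(f"{s}s startup -> +{abandonment_increase(s):.1f} pts abandonment")
    print(f"1% rebuffering -> -{play_time_reduction(1):.0f}% play time")
```

On those assumptions, a 10-second startup delay would push abandonment up by roughly 46 percentage points over a 2-second one, which is consistent with the steep drop-off curve the paper shows.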

This leads me to three quick (and hopefully interesting) points:

  • first, as I’ve mentioned before, speed is addictive and users who get used to the comfort of low latency clearly lack patience when content delivery won’t follow,
  • secondly, as highlighted in Numerama yesterday (quoting the same study, in French), service providers who deliberately throttle video and/or specific video content providers – as Free currently does in France with YouTube – are damaging the business of these content players, but also irritating their own customers. There are only two ways this can go: either customers don’t care enough to churn, and ultimately the YouTubes of this world will have to cave in and fork out for quality delivery, or customers care enough and leave visibly enough to force the service provider to provision the service properly. I know I’m in the latter camp: I’ll switch to the first service provider that makes me a good offer. I’m a heavy YouTube user, and getting decent service there matters more to me than a reputedly lame Blu-ray player in my set-top box that I’ve never used,
  • finally, I find it very interesting that tolerance is so much higher on mobile. Clearly, there’s no alternative if you’re viewing on a mobile connection (if you were in a wifi-enabled area you’d be using wifi, and it would count as whatever the wireline technology is in the statistics), but I still find it amazing that customers are willing to wait so long for a video on mobile. I wonder how long that tolerance will last…
  • Comments 0 Comments
  • Categories Access, Contents
  • Author Benoît Felten

Externalities by the Bucketload

9 Nov


Lightning (cc) ergates

Just a short post to point you to this longer post by David Isenberg on how the FTTH / Smart Grid combination helped Chattanooga recover, energy-wise, from last summer’s major storms in 3.5 days instead of 5, saving an estimated 1.4 million dollars. It’s entitled Storm Recovery — Chattanooga Style versus Sandy and Athena, and it’s proof once more that we cannot keep assessing the whole NGA financing issue solely on its impact on the incumbents’ bottom line. Any local authority looking into broadband development should read this.

  • Comments 0 Comments
  • Categories Broadband, Muni Fiber
  • Author Benoît Felten

A Masterful Response to ETNO’s Lobbying for ‘Sender Pays’ Internet

7 Nov

Scales of Justice Brisbane Courts – (CC) Sheba_Also

 

The decisions coming in December on possible changes to internet regulation by the ITU, at the heavy-handed request of the European incumbent operators’ lobby ETNO, have been a growing concern of mine. I meant to write about the many ways in which ETNO’s proposals are dangerous to the digital economy ecosystem in general, value-destructive for all players in that ecosystem (including those lobbying for them), and harmful to the countries seduced by the idea of a return to the old telephony models of compensation. I couldn’t find the time in recent weeks and, thanks to Dean Bubley, I won’t have to.

Limited though my authority in these fields is, I enjoin you to read Dean’s long, articulate blog post entitled Why ETNO’s proposals to ITU for Internet regulation & Non-Neutrality are flawed & duplicitous. It explains with great clarity why the ITU must not follow ETNO’s recommendations, and even suggests a number of ways the waters could be clarified to avoid the kind of sneaky dissimulation ETNO has used here.

The need to unbundle internet access from managed services is something I wrote about over a year ago, as a clear and understandable way to solve this once and for all. If customers perceived the clear difference between internet access and managed services, you can bet they would care about net neutrality. To be clear, I’m not advocating a ban on bundled sales, only that the performance metrics of the internet access part of the service be made explicit.

As an additional piece of reading in line with Dean’s opinions on the laughable AT Kearney report published a couple of years ago and advocating for similar one-sided financial mechanisms (which, as I’ve shown here woul
