Archive

Posts Tagged ‘network’

Infrastructure Edge: Awaiting Development

August 24, 2018 Leave a comment

Editor’s Note: This article originally appeared on the State of the Edge blog. State of the Edge is a collaborative research and educational organization focused on edge computing. They are creators of the State of the Edge report (download for free) and the Open Glossary of Edge Computing (now an official project of The Linux Foundation).

When I began looking into edge computing just over 24 months ago, weeks would go by with hardly a whimper on the topic, apart from sporadic briefs about local on-premises deployments. Back then, there was no State of the Edge Report and certainly no Open Glossary of Edge Computing. Today, an hour barely passes before my RSS feed buzzes with the “next big announcement” around edge. Edge computing has clearly arrived. When Gartner releases their 2018 Gartner Hype Cycle later this month, I expect edge computing to be at the steepest point in the hype cycle.

Coming from a mobile operator heritage, I have developed a unique perspective on edge computing, and would like to double-click on one particular aspect of this phenomenon: the infrastructure edge and its implications for the broader ecosystem.

The Centralized Data Center and the Wireless Edge


So many of today’s discussions about edge computing ascribe magical qualities to the cloud, suggesting that it’s amorphous, ubiquitous and everywhere. But this is a misconception. Ninety percent of what we think of as cloud is concentrated in a small handful of centralized data centers, often thousands of miles and dozens of network hops away. When experts talk about connecting edge devices to the cloud, it’s common to oversimplify and emphasize the two endpoints: the device edge and the centralized data center, skipping over the critical infrastructure that connects these two extremes—namely, the cell towers, RF radios, routers, interconnection points, network hops, fiber backbones, and other critical communications systems that liaise between edge devices and the central cloud.

In the wireless world, this is not a single point; rather, it is distributed among the cell towers, DAS hubs, central offices and fiber routes that make up the infrastructure side of the last mile. This is the wireless edge, with assets currently owned and/or operated by network operators and, in some cases, tower companies.

The Edge Computing Land Grab


The wireless edge will play a profound and essential role in connecting devices to the cloud. Let me use an analogy of a coastline to illustrate my point.

Imagine a coastline stretching from the ocean to the hills. The intertidal zone, where the waves lap upon the shore, is like the device edge: full of exciting activities and a robust ecosystem, but too ephemeral and subject to change for building a permanent structure. Many large players, including Microsoft, Google, Amazon, and Apple, are vying to win this prized spot closest to the water’s edge (and the end-user) with on-premises gateways and devices. This is the domain of AWS Greengrass and Microsoft IoT Edge. It’s also the battleground for consumers, with products like Alexa, Android, and iOS devices. In this area of the beach, the battle is primarily between the internet giants.

On the other side of the coastline, opposite the water, you have the ridgeline and cliffs, from where you have an eagle’s-eye view of the entire surroundings. This “inland” side of the coastline is the domain of regional data centers, such as those owned by Equinix and Digital Realty. These data centers provide an important aggregation point for connecting back to the centralized cloud and, in fact, most of the major cloud providers have equipment in these co-location facilities.

And in the middle — yes, on the beach itself — lies the infrastructure edge, possibly the ideal location for a beachfront property. This space is ripe for development. It has never been extensively monetized, yet one would be foolhardy to believe that it has no value.

In the past, the wireless operators who caretake this premier beachfront space haven’t been successful in building platforms that developers want to use. Developers have always desired global reach along with a unified, developer-friendly experience, both of which are offered by the large cloud providers. Operators, in contrast, have largely failed on both fronts—they are primarily national, maybe regional, but not global, and their area of expertise is in complex architectures rather than ease of use.

This does not imply that the operator is sitting idle here. On the contrary, every major wireless operator is actively re-engineering their networks to roll out Network Function Virtualization (NFV) and Software Defined Networking (SDN) along the path to 5G. These software-driven network enhancements will demand large amounts of compute capacity at the edge, which will often mean micro data centers at the base of cell towers or in local antenna hubs. However, these are primarily inward-looking use cases, driven more by cost optimization than by revenue generation. In our beach example, it is more akin to building a hotel’s call center on the beachfront rather than opening the property up to guests. It may satisfy your internal needs, but it does not generate top-line growth.

Developing the Beachfront


Operators are not oblivious to the opportunities that may emerge from integrating edge computing into their networks; however, there is a great lack of clarity about how to go about doing this. Powerful standards are emerging from the telco world, most notably Multi-Access Edge Computing (MEC), which provides API access to the RAN. Yet there is still no obvious mechanism for stitching these together into a global platform, one that offers a developer-centric user experience.

All is not lost for the operator; there are a few firms, such as Vapor IO and MobiledgeX, with close ties to the infrastructure and operator communities that are tackling the problems of deploying shared compute infrastructure and building a global platform for developers, respectively. Success is predicated on operators joining forces, rather than going it alone or adopting divergent and incompatible approaches.

In the end, just like a developed shoreline caters to the needs of visitors and vacationers, every part of the edge ecosystem will rightly focus on attracting today’s developer with tools and amenities that provide universal reach and ease-of-use. Operators have a lot to lose by not making the right bets on programmable infrastructure at the edge that developers clamor to use. Hesitate, and they may very well find themselves eroded and sidelined by other players, including the major cloud providers, in what is shaping up to be one of the more exciting evolutions to come out of the cloud and edge computing space.

À la carte versus All-you-can-eat – the rise of the virtual cable operator

October 30, 2014 Leave a comment

Recent announcements by the FCC proposing to change the interpretation of the term “Multi-channel Video Programming Distributor (MVPD)” to a technology-neutral one have thrown open a lifeline to providers such as Aereo, who had been teetering on the brink of bankruptcy. Although it is an open debate whether this last-minute reprieve will serve Aereo in the long run, it serves as a good inflection point to examine the cable business as a whole.

“Cord cutting” seems to be the “in-thing” these days, with more people moving away from cable and satellite towards an on-demand approach – whether via Apple TV, Roku or now Amazon’s devices. This comes from a growing culture of “NOW”: rather than wait for an episode at a predefined hour, the preference is to watch a favorite series at a time, place – and now device – of one’s choosing. Estimates suggest up to 6.5% of users have gone this route, with a large number of new users never signing up for cable TV in the first place. The only players who seem to have weathered this to date are premium services such as HBO, who have ventured into original content creation, and providers of content such as ESPN – the hallmark of live sports. Those who want to cut the cord end up dealing with numerous content providers, each offering their own services, billing solutions and so on. Putting together all the services a user likes ends up being a tedious – and right now, expensive – proposition.

This opens the door for what can only be called the Virtual Cable Operator – one which would get the blessing of the FCC proposal. Such an aggregator (it could be Aereo) could bundle and offer such channels without investing in the underlying network infrastructure – offering a cost advantage of as much as 20% compared to current offerings. This trend is a familiar one in the telco business – and cable companies had better be ready for it. Right now they may be safe as long as ESPN doesn’t move that route – but with HBO, CBS and others all announcing their own services, I believe it is only a matter of time. While the primary impacts on cable have been covered extensively, there are a few other consequences, and opportunities, that I would like to address.

The smaller channels (those that charge <30c to the cable operators) will experience a dramatically reduced audience. Today there is a chance that someone will stop by while channel flipping; with an à la carte service this pretty much disappears, and along with it advertising revenue. An easy analogy is that of an app in an app store – app discovery (in this case channel discovery) becomes highly relevant. Another result is that each channel will be jockeying for space on the “screen” – whether a TV, tablet or phone. I can very well imagine a clean-slate design like Google on a TV, where based upon your personal interests you would be “recommended” programs to watch – the question is who would control this experience…

Who could this be – a TV vendor (a.k.a. a Samsung, packaging channels with the TV), an OEM (Roku, Amazon etc.), a telco or cable provider (if you can’t beat them – join ’em), or someone like Aereo? The field is wide open and the jury has yet to make a decision. One thing however is clear – the first with a winning proposition, including channels, pricing and an excellent UI, would be a very interesting company to invest in…

Net Neutrality – let the “Blame Games” begin

[Image: Netflix is shaming Verizon for its slow internet]

The picture stared right back at me – it was hard to miss, with a bright red background and letters clearly spelling out “Netflix is slow, because the Verizon Network is congested“. Hardly a few days pass without yet another salvo thrown in the latest war over net neutrality. No sooner does one side claim victory than the other party files a counter-claim. This is not only a US phenomenon, but has also spread to Europe; regulators now have their backs firmly against the wall as they figure out what to do.

A pet peeve of mine is that although there are good, solid arguments on both sides, among the majority of comments there seems to be a clear lack of understanding of the situation as it stands. I am all for an open and fair market, and an ardent supporter of innovation – but all this needs to be considered in the context of the infrastructure and costs needed to support that innovation.

A few facts are certainly in order

  • People are consuming a whole lot more data these days. To give you an idea of what this means, here are some average numbers for different data services
    • Downloading an average length music track – 4 MB
    • 40 hours of general web surfing – 0.3 GB
    • 200 emails – 0.8 GB, depending upon attachments
    • Online radio for 80 hours per month – 5.2 GB
    • Downloading one entire film – 2.1 GB
    • Watching HD films via Netflix – 2.8 GB per hour
  • Now consider the fact that more and more content is going the video route – and this video is increasingly ubiquitous and increasingly available in HD… this means
    • Total data consumption is fast rising AND
    • Older legacy networks are no longer capable or equipped to meet this challenge
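To make the figures above concrete, here is a minimal back-of-the-envelope sketch of what a single cord-cutting household might consume in a month. The per-activity footprints come from the list above; the monthly habits (track downloads, surfing hours, streaming hours) are invented for illustration:

```python
# Per-activity data footprints from the figures above (GB)
MUSIC_TRACK_GB = 0.004           # ~4 MB per average-length track
WEB_SURFING_GB_PER_40H = 0.3     # 40 hours of general surfing
NETFLIX_HD_GB_PER_HOUR = 2.8     # HD streaming

# Illustrative monthly habits for one household (assumptions, not data)
tracks_downloaded = 50
web_surfing_hours = 40
netflix_hd_hours = 2 * 30        # two hours of HD streaming per day

monthly_gb = (
    tracks_downloaded * MUSIC_TRACK_GB
    + (web_surfing_hours / 40) * WEB_SURFING_GB_PER_40H
    + netflix_hd_hours * NETFLIX_HD_GB_PER_HOUR
)
print(round(monthly_gb, 1))  # HD video dwarfs every other category
```

Even with modest habits, HD video accounts for well over 90% of the total, which is exactly why legacy networks sized for browsing and email struggle.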

Rather than dive into the elements of peering and the like, for which there are numerous well-written articles, I would like to ponder three aspects.

  1. These networks are very expensive to build out – truck-rolls to get fiber to your house are costly… and
  2. once built out, someone has to pay; and
  3. finally, firms are under increasing pressure from shareholders and the like to increase revenues and profits…

This simply means that if the burden fell entirely on the consumer, he would end up with a lighter (I dare say a much lighter) wallet. It would be similar to a utility: the more you consume, the more you pay. It would be goodbye to the era of unlimited data and bandwidth (even for fixed lines).
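The utility comparison can be made concrete with a toy metered tariff. All prices and volumes below are invented purely for illustration:

```python
def metered_monthly_bill(gb_used, base_fee=20.0, included_gb=50, per_gb=0.50):
    """Toy utility-style tariff: flat base fee plus a per-GB overage charge.

    All prices here are invented for illustration only.
    """
    overage = max(0.0, gb_used - included_gb)
    return base_fee + overage * per_gb

light_user = metered_monthly_bill(30)       # stays within the included volume
heavy_streamer = metered_monthly_bill(250)  # pays for every extra gigabyte
print(light_user, heavy_streamer)           # 20.0 vs 120.0
```

Under such a scheme the heavy video streamer pays several times what the light user does – precisely the “lighter wallet” outcome described above.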

Now, I totally get the argument that when there is only one provider (a literal monopoly), the customer is left with no choice. This is definitely a case which needs some level of regulation to protect the customer from “monopoly extortion”. Even when regulators promote “bit-stream access” – i.e. allowing other parties to use the infrastructure – there is a cost associated with it. Hence this is more of a pseudo-competition on elements such as branding, customer service and so on, rather than on the infrastructure itself. The competitor may discount, but at the expense of margins; there always exists a lower price threshold which is agreed with the regulator. The other losers in such a game are consumers who live in far-flung areas – the economics of providing such connections eliminate them from any planning (unless forced by the government). In that case it becomes a cross-subsidy, with the denser urban populace subsidizing the rural community. However, these areas can be served by dedicated high-speed wireless connections and in my opinion do not present such a pressing concern.

As everyone agrees, the availability of and access to high-speed data at a reasonable cost has a direct and clear impact on the overall economy of a country. If we do not want to keep raising prices for consumers in order to keep investing in infrastructure upgrades, who then should shoulder this burden? Even though it would undoubtedly be unfair to burden small and emerging companies by throttling their services, given how skewed data traffic is, with a few providers (e.g. Netflix) generating the bulk of the traffic, would it be fair for them to bear a portion of this load? After all, these firms are making healthy profits while bearing none of the cost of the infrastructure.

These aren’t easy questions to answer, but they need to be considered in the broader context. One extreme may lie in national infrastructure networks – but this is easier said than done. A better compromise may be to get both sides to the negotiating table and involve the consumer as well, with each side recognizing that it is better to bury the hatchet and work out a reasonable plan than to pursue endless lawsuits.

Once this is accomplished – the Blame Games would end; and hopefully the “Innovation” celebrations would commence!

Network ‘superiority’ or NOT

March 8, 2013 Leave a comment

The last couple of weeks in the telecom sector have been what I can only describe as ‘circus’ weeks, with both the Mobile World Congress in Barcelona and CeBIT in Hanover occurring in quick succession. Companies spend weeks if not months prepping for these events, with several key milestones built around the dates. My thoughts this week are primarily towards upending one of the assumptions made by the leading carriers: that network quality will be the chief differentiator – especially among the European telcos. This is not limited to the Europeans: “We Let the Network do the Talking“, said Verizon Wireless! The question is – is this assumption right, or more succinctly put, how much longer will the ‘network superiority’ claim hold?

The underlying reason has to do with two distinct factors – the cost of new technologies and increasingly active regulators. Let us examine each one independently.

The Cost: we now have LTE (call it 4G, or whatever the marketing department cooks up!) with which you can seamlessly watch videos and other high-data-rate content. Users seem to love it, and the number of devices is simply exploding, which means that networks have to be designed for even higher demand. Since bandwidth is limited (there is only so much of it usable for mobile communications) and LTE already squeezes out every bit of spectral efficiency, the only option, in layman’s terms, to add capacity is to place base stations closer and closer together (you can then reuse spectrum further away). That means more base stations per sq. km, which translates to a quick escalation in cost – not to mention the obvious fact that in a given area there are only a limited number of places where you can put up these sites! Operators have long realized this, and in many places they already share assets (towers, sites and, in many cases, even capacity). If you add the cost of ‘winning spectrum’ in an auction, the total is steep – one that many operators are not in a position to recoup, especially in countries where competition keeps driving prices down.
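The trade-off in that paragraph can be sketched with a back-of-the-envelope formula: area capacity scales roughly with spectral efficiency × bandwidth × site density. All figures below are illustrative assumptions, not operator data, and the model deliberately ignores sectorization, interference and load:

```python
def area_capacity_mbps_per_km2(spectral_eff_bps_per_hz, bandwidth_mhz, sites_per_km2):
    """Back-of-the-envelope downlink capacity per square km.

    capacity ~ spectral efficiency (bit/s/Hz) x bandwidth (MHz) x site density.
    Purely illustrative: ignores sectorization, interference and load.
    """
    return spectral_eff_bps_per_hz * bandwidth_mhz * sites_per_km2

# Illustrative LTE-like figures (assumptions)
eff = 1.5   # bit/s/Hz average spectral efficiency
bw = 20     # MHz of licensed spectrum

base = area_capacity_mbps_per_km2(eff, bw, sites_per_km2=2)
denser = area_capacity_mbps_per_km2(eff, bw, sites_per_km2=4)

# With spectrum and efficiency fixed, capacity only grows with site count...
assert denser == 2 * base
# ...and so, roughly, does the cost of towers, backhaul and site rental.
```

Once spectral efficiency and spectrum holdings are maxed out, the only dial left is site density – which is why build-out cost escalates almost linearly with capacity.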

Key Takeaway – Increasing cost of network build-outs will make it prohibitively expensive for an operator to build out a network by itself.

The Regulator: I have been interacting with quite a few regulators in the recent past, and even from this sample I see that many are moving from a reactive to a pro-active mode. This may be for several reasons, but many authorities have enacted and enforced legislation that has led to price reductions (directly increasing consumption) and, more critically, levelled the playing field to introduce ‘innovation’ from other, primarily OTT, players. Although operators may not like it, this is a trend that is likely to continue. The result is that even if the operator builds his ‘super-star’ network, he is forced to open it to others – thus directly disincentivizing him from building the network in the first place. He becomes a ‘super dumb pipe’. Other players are then able to offer attractive solutions and monetize them directly.

Key Takeaway – Active regulators make it difficult for operators to extract the best value from the network.

Where does this leave us? Operators need to think beyond the ‘network’ to determine how to survive. A leaf can be taken out of the MVNO playbook, where players (some as small as two people in one room!) are able to identify target segments and provide the tailor-made solutions customers want. Future networks will allow different levels of QoS – hence this segmentation could offer differing levels of service. The future could herald the introduction of the pure ‘Net-Telco’ offering only aggregate services – similar to LightSquared, a valiant effort that failed for technical reasons (their spectrum was interfering with GPS users). Operators will definitely fight this, but the writing is on the wall. They need to revamp their large, inefficient organizational structures into a leaner, more market-oriented approach. Then, when the time to give up the ‘network’ arrives, the organization that has planned for it will come out victorious.

Categories: Innovation, OTT, Uncategorized Tags: ,