The Sunday Brief

Connecting technology, telecommunications, and the internet

The Cloud Needs Better Code – Now

Jan 9, 2014 | TSB

Happy Holidays from Raleigh, Charlotte, Greensboro, Dallas, and Fraser (CO).  The snow is terrific this year, and, as a result of three consecutive powder days, this week’s Sunday Brief is being published on Tuesday.  This will also serve as the last Sunday Brief of 2013. 

Before diving into our last big trend, it’s worth mentioning the sale of AT&T’s Connecticut operations (a.k.a., the Southern New England Telephone or SNET transaction) for $2 billion to Frontier Communications, which was announced on Thursday.  For those of you who do not remember the predecessor company, SBC Communications bought SNET for $4.4 billion in 1998.

There has been a lot of speculation about whether the former SNET properties would be sold, and what would happen to the 2,700 employees who will come with the sale.  By my calculation (and validated with a reliable Sunday Brief reader), AT&T Connecticut’s $1.2 billion in revenues were generating slightly more than $400 million in EBITDA.  Using this figure, Frontier paid a 4.9x multiple of current-year expected earnings.  While an “island” property, the Connecticut sale will slightly erode AT&T’s remaining wireline margin picture (AT&T had a total wireline EBITDA margin of 29% in 3Q 2013).

The question is “How does Frontier grow and eventually revive CT?”  Given continued earnings pressure for all wireline providers, does this purchase look more like a 6-7x earnings acquisition a few years out?  The cable competitive environment is fractured (Cox, Cablevision, and Charter all have franchises in CT).  Time will tell, but it seems as if AT&T received a decent valuation.  It will now be up to Frontier to beat cable.
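
To put some rough numbers behind that question, here is a quick back-of-envelope sketch.  The 8% annual EBITDA decline is purely a hypothetical assumption for illustration – not a Frontier or AT&T forecast – but it shows how a 4.9x multiple on today’s earnings drifts toward 6-7x within a few years if wireline erosion continues:

  # Back-of-envelope: a ~4.9x multiple on today's EBITDA drifts toward 6-7x
  # if EBITDA keeps shrinking. The 8% annual decline is a hypothetical
  # assumption for illustration only.
  purchase_price = 2.0e9    # $2 billion purchase price
  ebitda_today = 0.41e9     # ~$410 million estimated current-year EBITDA
  annual_decline = 0.08     # hypothetical 8% annual EBITDA erosion

  for year in range(6):
      ebitda = ebitda_today * (1 - annual_decline) ** year
      print(f"Year {year}: EBITDA ${ebitda / 1e6:.0f}M, "
            f"implied multiple {purchase_price / ebitda:.1f}x")

By year three or four under that assumption, the $2 billion price tag is already 6-7x the remaining EBITDA.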

Over the last four weeks, we have explored several trends that will have meaningful impact on the telecommunications landscape in 2014 and beyond:

  1. The rise of third party paid data, which may include carrier-engineered bit prioritization
  2. The expansion of data “packing” and other compression techniques, in some cases driven by applications companies such as Facebook
  3. The transformation/metamorphosis of the set-top box as cloud-stored content commands more value and as gaming systems gain in popularity
  4. The continued disproportionate value creation from live or real-time broadcasts
  5. (This week) Increased adoption of cloud-optimized software

Our last trend is broadly called “cloud-optimized software.”  This is based on the premise that client- or mainframe-based applications will continue to move to the cloud and be used by many different types of devices.

I get asked a lot by VCs and entrepreneurs, “What’s the biggest problem no one has solved?”  I fumbled for an answer for many months, but now I respond concisely:  “Applications software is not efficient.  Virtualization of inefficient code does not help the case for cloud, and in fact could weaken it.  We need more efficient code.”

That’s right – most of the code written for consumption on desktops or laptops is useless in a virtualized world.  The “apps on tap” concept sounded so good a decade ago, and many were persuaded to believe that if we could just have 10, 20, or 50Mbps to the home or office desktop, the problem would be solved.  Bandwidth improved (and for many small and medium businesses, 100Mbps is easily available), but code remained large and unwieldy.

Let’s put things into perspective – a full download of the latest edition of Microsoft Office 2013 requires 3 gigabytes (GB) of hard drive space, up from 2 GB in 2008.  Most users touch 10% or less of the total features of Microsoft Office (e.g., the Trace Dependents function in Microsoft Excel is an advanced feature for active spreadsheet editors, not your everyday spreadsheet reader).

Microsoft is not the only company guilty of super-sized software.  Adobe Acrobat Reader (280MB), QuickBooks Pro (428MB), and even iTunes on Mac OS X (221MB) are way too heavy for the cloud.  Packaged software needs to go on a diet – fast.

For applications to make the transition to the cloud, they are going to have to be personalized for each individual user.  How the CEO or CFO uses Microsoft Access or Excel is different from how a senior marketing analyst or a data scientist uses them.  If applications virtualization could be customized so that the executive office received certain functions and analysts received others, performance and the customer experience would improve dramatically.
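
One way to picture that kind of customization is a simple role-to-feature map that a virtualization layer could consult before streaming an application.  The sketch below is purely illustrative – the role names and feature lists are my own hypothetical examples, not any vendor’s actual catalog:

  # Hypothetical sketch of role-based feature provisioning for a virtualized app.
  # Role names and feature sets are illustrative only.
  CORE = {"open", "save", "print", "basic_formulas"}

  ROLE_FEATURES = {
      "executive":      CORE | {"pivot_tables", "charting"},
      "marketing":      CORE | {"pivot_tables", "charting", "filters"},
      "data_scientist": CORE | {"macros", "trace_dependents", "pivot_tables"},
  }

  def provision(role):
      """Return only the feature modules this role actually needs streamed."""
      return ROLE_FEATURES.get(role, CORE)

  print(provision("executive"))  # a small, role-specific subset of the full suite

The executive gets a lean bundle; the data scientist gets the heavier analytical modules; nobody downloads the whole suite.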

There are several companies, including Microsoft and Citrix, modifying their current software to make it more efficient.  In fact, the entire premise of Office 365 is to eliminate the need to install applications locally on devices that only need temporary access to a Microsoft application.  While their moves have been good, one smaller company has attracted a lot of attention over the past several quarters.

Numecent was hatched out of a 1999 DARPA project run out of the University of California, Irvine (noted computer scientist Arthur Hitomi led the research).  At that time, its primary purpose was to develop an applications delivery support system for connected computers (and you thought cloud computing was a brand-new development).  After several forays into the gaming market, the predecessor company went into receivership in 2008 and was purchased by Osman Kent, the mastermind behind 3Dlabs.

What makes Numecent interesting is that, through their study of thousands of software applications (and hundreds of thousands of users), they can predict which parts of an application (they call these “pages”) you will use and which ones you won’t.  When you want to use an application that would typically require 600MB of disk storage, Numecent will “right-size” it to 50-60MB.  If you happen to need a particular function not included in the basic package (e.g., the Trace Dependents Excel function described earlier), it will be delivered separately.
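
To make the concept concrete, here is a rough sketch of predictive paging in the spirit of what Numecent describes – my own simplified illustration, not their actual algorithm.  It picks the application “pages” that cover most of the observed usage for up-front streaming and leaves the rest (like Trace Dependents) to be fetched on demand:

  # Simplified illustration of predictive application paging: stream the pages
  # that cover most observed usage up front, fetch the rest on demand.
  # This is a conceptual sketch, not Numecent's actual algorithm.
  from collections import Counter

  def likely_pages(usage_log, coverage=0.9):
      """Return the most-used pages covering `coverage` of observed accesses."""
      counts = Counter(usage_log)
      total = sum(counts.values())
      selected, covered = [], 0
      for page, hits in counts.most_common():
          selected.append(page)
          covered += hits
          if covered / total >= coverage:
              break
      return selected

  # Hypothetical usage log for one user profile
  log = ["ui_core"] * 50 + ["file_io"] * 30 + ["charts"] * 15 + ["trace_dependents"] * 5
  prefetch = likely_pages(log)             # streamed up front (the 50-60MB slice)
  on_demand = set(log) - set(prefetch)     # delivered separately when requested
  print(prefetch, on_demand)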

The best part of this is that Numecent performs their “cloud paging” process with the full support of the applications providers.  In fact, Numecent has “cloudified” more than 10,000 applications.  Two terrific use cases for their technology – a university deployment and a CAD application – are described on their website.

This is not a pitch for Numecent; it simply highlights the massive effort required to transition from premises-based to cloud-based services.  OnLive, Citrix, Microsoft, and a few other companies are jumping into the fray (OnLive has made terrific strides recently in gaming).  Contrary to popular belief, virtualizing applications preserves the traditional license model and does not put it at risk.

Bottom line:  Software libraries are full of crap code.  Inefficient interface languages cost processing seconds and ruin the customer experience.  Poorly written database sequences waste tablet memory and time and stifle adoption of next-generation hardware.  The cloud needs better code – now.

We will take a Holiday break next Sunday and kick off 2014 with a CES preview (I will be at CES on Tuesday (1/7) and Wednesday (1/8) but will not be speaking at CES this year).  Until then, if you have friends who would like to be added to The Sunday Brief, please have them drop a quick note to sundaybrief@gmail.com and we’ll subscribe them as soon as we can.  Have a Happy Holiday!
