Virtually everything: How the physical world is being consumed by software services

February 18, 2015

Marc Andreessen famously said in 2011 that “software is eating the world.” I believe that’s only the case because of the underlying changes in hardware that have facilitated it.

By John Best

John works in the ISP industry as a Network Security and Abuse consultant, and is based in and around Manchester, UK. He tweets in a personal capacity @bestjohnd and occasionally writes about startups, business, and technology.

The creation of that world-eating software, the coding jobs that it creates and the wealth it generates would all be imaginary if the means to execute the software didn’t exist.

Recently I’ve spent some time looking at the trends in technology and computing and what they might mean to a business. One thread has become apparent across a number of applications and services, and it seems constant and recurring: dedicated physical devices are supplanted by virtualized devices, which are supplanted in turn by services.

World-eating software is just a symptom

The move from dedicated device to eventual software service running in a virtual machine has only been possible because of the capabilities of the hardware on which that machine is running (and the bandwidth of connections available to that hardware). The symptom of the world being eaten by software is only that – a symptom. It is simply a reflection of the changes in delivery method in line with a common trend. Underlying that symptom is a wider process.

By way of explanation: if you can, cast your mind back some 10-15 years. Back then, a device was a device. If you wanted a modem, you bought a box (probably from a bricks-and-mortar store) and inside was a modem. It sat squealing on a phone line while it modulated and demodulated, and that was the extent of it. The same was true of routers, firewalls, content filtering systems and so on.

When a device was needed to perform a particular task, that task was performed by that device. The physical hardware of that device, its firmware and any associated templates or configurations were dedicated solely to that task. Slotting that device into the network was simple. It had a role to play, a space in which to fit, and a series of actions to execute.

The march of time

With the progress of time, and the advances in technology it brought, we saw a shift: a device wasn’t just a machine any more, or rather a machine was sometimes more than one device. Often prompted by the open-source, brew-it-yourself movement, the dedication of all of a device’s resources to a single function gave way to a device’s excess capacity being used to achieve other functions.

What that meant was that rather than necessarily buying a router AND a firewall, both tasks could be achieved by purchasing a single device and running virtual versions of those devices.
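As a loose illustration of that idea, here is a toy Python sketch of one “box” hosting two logical devices: a firewall function and a routing function that each packet passes through in turn. The rules, prefixes and interface names here are invented for illustration; this is not real networking code.

```python
# Toy sketch: a single "device" hosting both a firewall and a router.
# Ports, prefixes and interface names below are illustrative only.

def firewall(packet, blocked_ports=frozenset({23, 135})):
    """Drop packets destined for blocked ports; pass the rest through."""
    return None if packet["dst_port"] in blocked_ports else packet

def router(packet, routes={"10.0.0.": "eth0", "192.168.": "eth1"}):
    """Pick an outgoing interface by the longest matching address prefix."""
    matches = [prefix for prefix in routes if packet["dst_ip"].startswith(prefix)]
    return routes[max(matches, key=len)] if matches else "default"

def box(packet):
    """The single physical device: filter first, then route what survives."""
    allowed = firewall(packet)
    return router(allowed) if allowed else "dropped"

print(box({"dst_ip": "10.0.0.5", "dst_port": 80}))   # eth0
print(box({"dst_ip": "10.0.0.5", "dst_port": 23}))   # dropped
```

The point of the sketch is simply that nothing forces the two functions onto two chassis: one machine with spare capacity can chain them in software, which is exactly the combined router/firewall products that appeared on the shelf.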

Physical devices in the core could now house multiple logical functions. This became reflected commercially – now you could buy router/modem/firewalls off the shelf. “Unified Threat Management” systems became available with a range of Web filtering / spam catching / intrusion prevention capabilities. Indeed, carrier-class devices became capable of simulating entire networks within themselves, creating a multitude of virtual router instances, all with their own traffic management and virtual firewalling.

The next jump saw the abstraction of the local physical device: taking infrastructure and functions and placing them in the hands of service providers. Entire networks were hived off, becoming IP VPNs, using MPLS to push packets around at layer 2.5 in an Internet version of the shell game.

Businesses could effectively (by paying a subscription) rent a network with a minimal outlay on physical infrastructure. That network could exist remotely from the customer site, hosted on equipment that also hosted other networks while not touching any of them.

Indeed, remote intranets with no breakout to the Internet became a readily available product option. Routers weren’t devices any more; often they were no more than lines of code running in a box running many more such lines of code. This is the basis of “Cloud” computing – a service that offers to replace infrastructure and devices with simulations and services. That’s a business that was worth $131 billion in 2013, according to Gartner – which at least seems to validate the demand for virtual devices.

This process happened (and is happening) with a variety of other platforms. Consider the humble telephone and corporate switchboard. Advances saw them giving way to VoIP phones and PBXs while those in turn eventually gave way to hosted virtualized telephony exchanges.

With consumer media, vinyl/tape/CD moved to locally-stored MP3s and then to streaming content platforms. Even gaming (an industry worth more than $90 billion a year according to Gartner) has shifted from physical media/bundles to streaming and subscription.

The mobile phone is perhaps another obvious consumer example. It’s vacuumed up functionality from calculators, computers, calendars and so on. Now it frequently operates as something akin to a window to our virtual lives, with the real parts of the various services we access through it being hosted elsewhere entirely.

A change in our expectations

What’s interesting in a wider context is what’s happening almost as a by-product of the change towards virtualization and services – namely the change in expectations that it brings. The mere fact that the shift has happened leads us to expect more from those technologies it’s happened to.

Naturally, if a product or service is capable of offering multiple functions then that forms a basis for product comparison. Where those “value adds” aren’t present, we make a judgment as to the relative price. There’s no doubt that “core” functions are the key, but cross-functional competition can only benefit the consumer as suppliers try to squeeze market share from related products.

After all, unless you really need an in-depth fully-featured add-on, you’ll buy the one that does as much as you need for less outlay. If you’re not expecting to need a top-end firewall in addition to your router, just something that does some simple packet filtering, then why not buy a router with that basic filtering function built in?

This trend isn’t just limited to physical devices, either. Consider television/broadband/landline/video rentals. Combined packages for all are now being offered by major service providers. While we as consumers still have the option of selecting a basic package consisting of one component from the list, we can equally subscribe and get all of the functions for a single subscription (likely cheaper than assembling a similar package from disparate parts and suppliers).

Indeed, we’d expect that to be the case, the centralization and concentration of supply bringing the preconception of a bundled saving. That should mean that disruption of those industries is incredibly difficult.

The startup approaching a space filled with incumbents offering bundled packages can’t hope to compete on an equal footing. Instead, the successful ones emulate the evolution, choosing one aspect to focus on and using their agility as the basis for competitive advantage. Of course, as they grow, they begin to offer their own bundles, and so on.

A repeating process

Almost unknowingly we’ve repeated and replicated this process across commonplace aspects of our daily existence.

Think about the daily commute. We’ve gone through a stage where a physical device (be it a horse, bike or car) is replaced by public transport (although not always successfully, mind you). This then has given way to on-demand services like taxis. We “share” in the consumption of the taxi, and it does perform other functions – it can be a delivery device, an ambulance and so on (imagine a world where a taxi refused to take your briefcase/presentation/luggage!).

We should look to this as an example because this process happened many years ago. It therefore allows us to assess the possible next steps for industries still in the throes of change, and for those trying to derive, in a much shorter span, the same efficiency gains pure technology has afforded elsewhere. I’d argue that one thing it has enabled is the concept of what some call “the sharing economy” (although I’d also argue that “sharing” is a misnomer).

You might suggest not much has changed, since ridesharing services are taxis by any other name, and since we’re still physically moving people and things (which also applies to buses, charter airplanes, ferries, etc.). There has been the same shift we’ve seen elsewhere, however – we’ve now moved at least the ordering part of these services into an application, a service in itself. Further, we now have expectations about the nature of the service: we expect a quick pickup, can view the positions of nearby cars, and can even see how well or poorly our potential driver is rated.

The fact that this pattern seems to be repeated across multiple technological platforms, across different industries and in both the consumer and enterprise spaces suggests it’s an almost natural progression. Having established this is a common phenomenon, what does this tell us, and more importantly, what does it mean for products, services and industries where such a progression has yet to happen? Could this process occur in other aspects of our lives? Could it even happen to us?

I believe we’re still a long way from replacing aspects of ourselves, but it is happening by degrees. Now, before you get excited about transporters, pattern buffers and jaunting around space in Lycra onesies, I don’t necessarily mean teleportation (although yes, that would be cool); rather, I mean a diminishing need for a physical presence.

Teleworking, working from home, and virtual teams spread across the globe are all the next step on from needing to be physically transported to a place of work. If I were to go full sci-fi, I might suggest that the advance after that moves from telepresence to digital presence: virtual software agents programmed to act and react as their creator would desire in set situations and circumstances (well, at least until we crack that teleportation thing).

Business opportunities

Bringing it back down to Earth and more practical terms, the mere fact that this pattern repeats presents its own opportunities. For each stage of the process (physical/virtual/service) businesses can be created to take advantage of the cycle.

By identifying industries that are lagging or stuck in a given phase, it should be possible to determine how best to move them through the cycle and, with sufficient resources, present them with an easy path to make the transition. Although some functions of those industries may have to remain entrenched in the purely physical device stage, the delivery of those functions doesn’t have to.

The smart entrepreneur can cater to providing the physical exchange at a higher layer (as ridesharing apps do). If the pattern continues, and software does indeed eat the world, I have to hope it doesn’t lead to a point of stagnation where the infrastructure supporting that software is merely “good enough” – rather, that our expectations of bundled software and services drive investment and innovation in the methods of supporting them.

This article originally appeared on The Next Web.
