[Slide: earlyproducts]

The chaotic early days of a new computing era are an extended period of product innovation and experimentation. But both the form and function of new products are still strongly influenced by the norms and transitional technologies of the waning era. New technologies are applied to new problems, but often those technologies are not yet mature enough to support early expectations. The optimal form factors, conceptual metaphors, and usage idioms of the new era have yet to be fully explored and solidified. Looking back from the latter stages of a computing era, early-era products appear crude and naive.

This is a great time to be a product innovator or an enthusiastic early adopter. But don’t get too comfortable with the present. These are still the early days of the Ambient Computing Era and the big changes are likely still to come.

[Slide: grassroots]

How do we know when we are entering a new computing era? One signal is a reemergence of grassroots innovation. Early in a computing era most technical development resources are still focused on sustaining the mature applications and use cases from the waning era or on exploiting attractive transitional technologies.

The first explorers of the technologies of a new era are rebels and visionaries operating at the fringes. These explorers naturally form grassroots organizations for sharing and socializing their ideas and accomplishments. Such grassroots organizations serve as incubators for the technologies and leaders of the next era.

The Homebrew Computer Club was a grassroots group out of which emerged many leaders of the Personal Computing Era. Now, as the Ambient Computing Era progresses, we see grassroots organizations such as the Nodebots movement and numerous collaborative GitHub projects serving a similar role.

[Slide: fromchaos]

At the beginning of a new computing era, it’s fairly easy to sketch a long-term vision of the era. All it takes is knowledge of current technical trajectories and a bit of imagination. But it’s impossible to predict any of the essential details of how it will actually play out.

Technical, business, and social innovation is rampant in the early years of a new era. Chaotic interactions drive the churn of innovation. The winners that will emerge from this churn are unpredictable. Serendipity is as much a factor as merit. But eventually, the stable pillars of the new era will emerge from the chaos. There are no guarantees of success, but for innovators, right now is the best opportunity to shape the ultimate form of the Ambient Computing Era.

[Slide: AmbientEduardo]

In the Ambient Computing Era humans live in a rich environment of communicating computer-enhanced devices interoperating with a ubiquitous cloud of computer-mediated information and services. We don’t even perceive most of the computers we interact with. They are an invisible and indispensable part of our everyday life.

[Slide: transitionalquickly]

A transitional technology is one that emerges as a computing era settles into maturity and serves as a precursor to the successor era. Transitional technologies are firmly rooted in the “old” era but also contain important elements of the “new” era. It’s easy to think that what we experience using transitional technologies is what the emerging era is going to be like. Not likely! Transitional technologies carry too much baggage from the waning era. For a new computing era to fully emerge, we need to move “quickly through” the transition period and get on with the business of inventing the key technologies of the new era.

[Slide: nothardware]

Computing “generations” used to be defined by changing computer hardware. Not anymore. The evolution of computing hardware (and software) technologies may enable the transition to a new era of computing. But it isn’t the hardware that really defines such an era. Instead, a new computing era emerges when hardware and software innovations result in fundamental changes to the way that computing impacts people and society. A new computing era is about completely rethinking what we do with computers.

Over the last several years, a lot of my ideas about the future of computing have emerged as I prepared talks and presentations for various venues. For such talks, I usually try to illustrate each key idea with an evocative slide. I’ve been reviewing some of these presentations for material that I should blog about. But one thing I noticed is that some of these slides really capture the essence of an idea. They’re worth sharing and shouldn’t be buried deep within a presentation deck where few people are likely to find them.

So, I’m going to experiment with a series of short blog posts, each consisting of an image of one of my slides and at most a paragraph or two of supplementary text. But the slide is the essence of the post. One nice thing about this form is that the core message can be captured in a single tweet. A lot of reading time isn’t required. And if it isn’t obvious, “slide bite” is a play on “sound bite”.

Let me know (tweet me at @awbjs) what you think about these slide bites. I’m still going to write longer form pieces but for some ideas I may start with a slide bite and then expand it into a longer prose piece.

In 2011 I wrote a blog post where I presented the big-picture model I use for thinking about what some people were calling the “post-PC computing era”. Since then I’ve written other related posts, given talks, and had conversations with many people. Time appears to be validating my model, so it seems like a good time to do an updated version of the original post.

There seems to be broad agreement that the Personal Computing Era has ended. But what does that really mean? What happens when one era ends and another begins? What is a “computing era”?

e·ra /ˈirə,ˈerə/

google.com

noun

  1. a long and distinct period of history with a particular feature or characteristic.

Digital computing emerged in the years immediately following World War II, and by the early 1950s computers started to become commercially available. So, the first era of computing must have started about 1950. But how many other eras have passed since then? There are many ways that people slice up the history of modern computing. By the late 1960s the electronics foundation of computers had reached its “3rd generation” (vacuum tubes, transistors, integrated circuits). Some people consider that the emergence of software technologies such as time-sharing, relational databases, or the web corresponds to distinct eras.

I don’t think any of those ways of slicing up computing history represent periods that are long enough or distinctive enough to match the dictionary definition of “era” given above. I think we’ve only passed through two computing eras and have just recently entered the third. This picture summarizes my perspective:

[Image: computingeras]

The most important idea from this picture is that, in my view, there have only been three major “eras” of computing. Each of these eras spans thirty or more years and represents a major difference in the primary role computers play in human life and society. The three eras also correspond to major shifts in the dominant form of computing devices and software. What is pictured is a conceptual timeline, not a graph of any actual data. The y-axis is intended to represent something like the overall impact of computing upon average individuals, but it can also be seen as an abstraction of other relevant factors such as the socioeconomic impact of computing technologies.

The first era was the Corporate Computing Era. It was focused on using computers to enhance and empower large organizations such as commercial enterprises and governments. Its applications were largely about collecting and processing large amounts of schematized data. Databases and transaction processing were key technologies.

During this era, if you “used a computer” it would have been in the context of such an organization. However, the concept of “using a computer” is anachronistic to that era. Very few individuals had any direct contact with computing, and for most of those who did, the contact was only via corporate information systems that supported some aspects of their jobs.

The Corporate Computing Era started with the earliest days of computing in the 1950s, and obviously corporate computing still is and will continue to be an important sector of computing. This is an important aspect of my model of computing eras. When a new era emerges, the computing applications and technologies of the previous eras don’t disappear. They continue and probably even grow. However, the overall societal impact of those previous forms of computing becomes relatively small in comparison with the scope of impact of computing in the new era.

Around 1980 the primary focus of computing started to rapidly shift away from corporate computing. This was the beginning of the Personal Computing Era. The Personal Computing Era was about using computers to enhance and empower individuals. Its applications were largely task-centric and focused on enabling individuals to create, display, manipulate, and communicate relatively unstructured information. Software applications such as word processors, spreadsheets, graphic editors, email, games, and web browsers were key technologies.

We are currently still in the early days of the third era. A change to the dominant form of computing is occurring that will be at least as dramatic as the transition from the Corporate Computing Era to the Personal Computing Era. This new era of computing is about using computers to augment the environment within which humans live and work. It is an era of smart devices, perpetual connectivity, ubiquitous information access, and computer-augmented human intelligence.

We still don’t have a universally accepted name for this new era. Some common names are post-PC, pervasive, or ubiquitous computing. Others focus on specific technical aspects of the new era and call it cloud, mobile, or web computing. But none of these terms seem to capture the breadth and essence of the new era. They are either too focused on a specific technology or on something that is happening today rather than something that characterizes a thirty-year span of time. The name that I prefer, and which seems to be gaining some traction, is “ambient computing”:

am·bi·ent /am-bee-uhnt/

dictionary.com

adjective

  1. of the surrounding area or environment
  2. completely surrounding; encompassing

In the Ambient Computing Era humans live in a rich environment of communicating computer-enhanced devices interoperating with a ubiquitous cloud of computer-mediated information and services. We don’t even perceive most of the computers we interact with. They are an invisible part of our everyday things and activities. In the Ambient Computing Era we still have corporate computing and task-oriented personal computing applications. But the defining characteristic of this era is the fact that computing is shaping the actual environment within which we live and work.

The early years of a new era are an exciting time to be involved in computing. We all have our immediate goals, and much of the excitement and opportunity is focused on shorter-term objectives. But while we work to create the next great app, machine learning model, smart IoT device, or commercially successful site or service, we should occasionally step back and think about something bigger: What sort of ambient computing environment do we want to live within, and is our current work helping or hindering its emergence?

I’ve written before about a transition period to a new era of computing. Earlier this month I gave a keynote talk at the Front-Trends conference in Warsaw. In preparing this talk I discovered a very interesting graphic created by Asymco for an article about the Rise and Fall of Personal Computing. It was so interesting that I used it to frame my talk. Here is what my first slide looked like, incorporating the Asymco visualization:

[Image: rise-and-fall-pc]

This graph shows the market share of various computing platforms since the very first emergence of what can be characterized as a personal computer. I urge you to read the Asymco article if you are interested in the details of this visualization. Keep in mind that it is showing percentage share of a rapidly expanding market. Over on the left edge we are talking about a total worldwide computer population that could be measured in the low hundreds of thousands. On the right we are talking about a market size in the high hundreds of millions of computers.

For my talk, I used the graph as an abstraction of the entire personal computing era. The important thing was that there was a period of around ten years before the Windows/Intel PC platform really began to dominate. I remember those days. I was a newly graduated software engineer and those were exciting times. We knew something big was happening; we just didn’t know for sure what it was and how it was all going to shake out. Each year there were one or more new technologies and companies that seemed to be establishing themselves as the dominant platform. But then something changed, and within a year or two somebody else seemed to be winning. It wasn’t until the latter part of the 1980s that the Wintel platform could be identified as the clear winner. That was the beginning of a 20+ year period that, based upon this graph, I’m calling the blue bubble.

While many interesting things (for example, the Web) happened during the period of the blue bubble, overall it was a much less exciting time to be working in the software industry. For most of us, there was no option other than to work within the confines of the Wintel platform. There were good aspects to this, as a fixed and relatively stable platform provided a foundation for the evolution of PC-based applications, and ultimately the applications were what was most important from a user perspective. But as a software developer, it just wasn’t the same as that earlier period before the bubble formed. To those of us who were around for the first decade of the PC era there were just too many constraints inside the bubble. There were still plenty of technical challenges, but there wasn’t the broad sense that we were all collectively changing the world. But then, the blue bubble became normal. Until very recently, most active software developers have never experienced a professional life outside that bubble.

The most important thing for today is what is happening on the right-hand side of this graph. Clearly, the big blue bubble is coming to an end. This coincides with what I call the transition from the Personal Computing Era to the Ambient Computing Era. Many people think we are already inside the next blue bubble. That Apple, or Google, or maybe even “the Web” has already won platform dominance for the next computing era. Maybe so, but I doubt it. Here is a slide I used at the end of my recent talk:

[Image: rise-and-fall-ambient]

It’s the same graphic. I only removed the platform legend and changed the title and timeline. The key point is that we probably aren’t yet inside the next blue bubble. Instead, we are most likely in a period that is more similar to the first ten years of the PC Era. It’s a time of chaotic transition. We don’t know for sure which companies and technologies map to the colors in the graph. We also don’t know the exact time scale; 2013 isn’t necessarily equivalent to 1983. It’s probably the case that the dominant platform of the Ambient Computing Era is not yet established. The ultimate winner may already be out there along with several other contenders. We just don’t know with certainty how it’s all going to come out.

Things are really exciting again. Times of chaos are times of opportunity. The constraints of the last blue bubble are gone and the next blue bubble isn’t set yet. We all need to drop our blue bubble habits and seize the opportunity to shape the new computing era. It’s a time to be aggressive and to take risks. It’s a time for new thinking and new perspectives.  This is the best of times to be a software developer. Don’t get trapped by blue bubble thinking and don’t wait too long. The window of opportunity will probably only last a few years before the next blue bubble is firmly set. After that it will be decades  until the next such opportunity.

We’re all collectively creating a new era of computing.  Own it and enjoy the experience!

My plan is for this to be the first in a series of posts that talk about specific medium term challenges facing technologists as we move forward in the Ambient Computing Era.  The challenges will concern things that I think are inevitable but which may not be getting enough attention right now. But with attention, we should see significant progress towards solutions over the next five years.

Here’s the first challenge. I have too many loosely coordinated digital devices and digital services. Every day, I spend hours using my mobile phone, my tablet, and my desktop Mac PC. I also regularly use a laptop, a FirefoxOS test phone, and my DirecTV set-top box/DVR. Less regularly, I use the household iPad, an Xbox/Kinect in our family room, and a couple of Denon receivers with network access. Then, of course, there are various other active digital devices like cameras, a FitBit, runner’s watches, an iPod shuffle, etc. My car is too old to have much user-facing intelligence, but I’m sure that won’t be the case with the next one.

Each of these devices is connected (at least indirectly) to the Internet and most of them have some sort of web browser. Each of them locally holds some of my digital possessions. I try to configure and use services like Dropbox and Evernote to make sure that my most commonly used possessions are readily available on all my general-purpose devices, but sometimes I still resort to emailing things to myself.

I also try to similarly configure all my MacOS devices and all my Android devices. But even so, everything I need isn’t always available on the device I’m using at any given moment, even in cases where the device is perfectly capable of hosting it.

Even worse, each device is different in non-essential but impossible-to-ignore ways. I’m never just posting a tweet or reading my favorite news streams. I’m always doing it on my tablet, or at my desk, or with my phone, and the experience is different on each of them in some way. In every case, I have to focus as much attention on the device I’m physically using and how it differs from my other devices as I do on the actual task I’m interested in accomplishing. And it’s getting worse. Each new device I acquire may give me some new capability, but it also adds to the chaos.

Now, I have the technical skills that enable me to deal with this chaos and get a net positive benefit from most of these devices. But it isn’t where I really want to be investing my valuable time.

I simply want to think about all my “digital stuff” as things that are always there and always available, no matter where I am or which device I’m using. When I get a new device, I don’t want to spend a day installing apps and configuring it. I just want to identify myself and have all my stuff immediately available. I want my stuff to look and operate familiarly. The only differences should be those that are fundamental to the specific device and its primary purpose. My attention should always be on my stuff. Different devices and different services should fade into the background. “Digital footprint” was the term I used in my Cloud on Your Ceiling post to refer to all this digital stuff.

Is any progress being made towards achieving this? Cloud hosted services from major industry players such as Google and Apple may feel like they are addressing some of these needs. But, they generally force you to commit all your digital assets to a single corporate caretaker and whatever limitations they choose to impose upon you.  Sometimes such services are characterized as “digital lockers”.  That’s not really what I’m looking for. I don’t want to have to go to a locker to get my stuff; I just want it to appear to always be with me and under my complete control.

The Locker Project is something that I discovered while researching this post that sounded like relevant work, but it appears to be moribund. However, it led me to discover an inspirational short talk by one of its developers, Jeremie Miller, who paints a very similar vision to mine. The Locker Project appears to have morphed into the Singly AppFabric product, which seems to be a cloud service for integrating social media data into mobile apps. This is perhaps a step in the right direction, but not really the same vision. I suspect there is a tension between achieving the full vision and the short-term business realities of a startup.

So, that’s my first Ambient Computing challenge. Create the technology infrastructure and usage metaphors that make individual devices and services fade into the background and allow us all to focus our attention on actually living our digitally enhanced lives.

I’m interested in hearing about other relevant projects that readers may know about and other challenges you think are important.

(Photo by “IndyDina with Mr. Wonderful”, Creative Commons Attribution License. Sculpture by Tom Otterness)