Today we celebrate the birthday of Mark David Weiser (23 Jul 1952 – 27 Apr 1999), the visionary American computer scientist who coined the term ‘Ubiquitous Computing’.

Weiser, who worked as Chief Technologist at Xerox PARC, came up with the term in 1988, describing a future scenario where personal computers would be largely replaced by a distributed network of interconnected “tiny computers” embedded in everyday items like toasters, fridges, photocopiers, phones and couches, turning these into “smart” objects. Sound familiar?

While Weiser’s scenario has not yet come to full fruition, things are definitely moving in that direction. Smartphones are already a common sight, smart TVs are popping up all over the place, and connectivity and interconnected devices are becoming the norm… It certainly no longer requires a stretch of the imagination to visualise a world of ubiquitous computing – or ‘pervasive computing’, ‘ambient intelligence’ or ‘everyware’, as the paradigm has also been described.

The common sight of a shopping list stuck up on the fridge may soon be a thing of the past, with your future fridge likely to interact with the rest of the kitchen, checking your supplies and auto-ordering any depleted groceries.
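Conceptually, that replenishment logic is simple enough to sketch in a few lines of Python. This is just an illustrative sketch – all the names here (place_order, reorder_depleted, the threshold) are hypothetical, and a real smart fridge would read from inventory sensors and call out to an actual grocery service:

```python
# A minimal sketch of the auto-replenishment idea described above.
# All names are hypothetical stand-ins, not a real smart-fridge API.

REORDER_THRESHOLD = 1  # reorder once one unit or less remains

def place_order(item: str, quantity: int) -> None:
    """Stand-in for a call to an online grocery service."""
    print(f"Ordering {quantity} x {item}")

def reorder_depleted(inventory: dict[str, int],
                     usual_stock: dict[str, int]) -> None:
    """Compare current stock against the usual level and top up anything low."""
    for item, usual in usual_stock.items():
        on_hand = inventory.get(item, 0)
        if on_hand <= REORDER_THRESHOLD:
            place_order(item, usual - on_hand)

if __name__ == "__main__":
    inventory = {"milk": 0, "eggs": 6, "butter": 1}
    usual_stock = {"milk": 2, "eggs": 12, "butter": 2}
    reorder_depleted(inventory, usual_stock)  # orders milk and butter
```

The interesting part, of course, is not the logic but the fact that it would run quietly in the background – which is exactly where Weiser’s thinking comes in.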

While the concept sounds daunting – computers everywhere, no getting away from it, etc – Weiser actually described it as the era of “calm technology”, where technology recedes into the background of our lives. He defined it as “machines that fit the human environment instead of forcing humans to enter theirs”. So the idea is that while you will continually engage with numerous computing devices, this will happen in a largely unobtrusive manner, allowing you to go on with your life. The fully connected environment also implies a greater degree of location independence, so you won’t necessarily be stuck at a desk behind a computer screen – this is already happening, with the shift from desktops to laptops to tablets and cloud computing.

Of course the idea of computers fitting in with, rather than changing, the human environment is a bit of a false utopia. While smartphones definitely adapt more to the human environment than, say, a laptop computer, they do fundamentally change the way humans act and operate – simply look at a group of school children with their smartphones, and compare that to the pre-mobile-phone scenario.

Like it or not, the pervasiveness of computers and computing devices is unlikely to disappear any time soon. The question is in which direction the pervasive-invasive balance will tip, and how things will progress along the man-serving-machine-serving-man continuum.

Where do you see us heading?
