Douglas Engelbart and the evolution of the computer mouse

We’re back to computers today, as we celebrate the birthday of Douglas Engelbart (born 30 Jan 1925), the American electrical engineer and human-computer interface specialist who developed the first practically usable prototype of the computer mouse.

The computer mouse has become such a ubiquitous part of a home computer setup that it’s quite difficult to think back to the time when computers didn’t come stock standard with a mouse. Of course, early command-line computers had no real need for a mouse: without a graphical user interface, there was nothing on the screen to point at or select.

The classic Apple mouse – a masterpiece of user-friendly industrial design.
(© All Rights Reserved)

Engelbart’s computer interfacing device, which he developed with his colleague Bill English at the Stanford Research Institute, consisted of a handheld ‘box’ with two wheels protruding at the bottom, mounted perpendicular to each other so that, when the device was moved along a flat surface, the rotation of the wheels translated into motion along the horizontal and vertical axes on the screen. The device came to be called a mouse because of its size, and because the electric cable trailing out behind it resembled a mouse’s tail.
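As a rough sketch of the idea (with hypothetical tick counts and scaling, not Engelbart’s actual electronics), the rotations of the two perpendicular wheels map directly onto cursor movement along the two screen axes:

```python
# A minimal sketch of the two-wheel mouse idea: each wheel reports how far it
# has rolled, and those readings become horizontal and vertical cursor motion.
# The tick counts and scaling factor below are invented for illustration.

def wheels_to_cursor(x_wheel_ticks, y_wheel_ticks, ticks_per_pixel=4):
    """Convert raw rotation counts from the two perpendicular wheels
    into a cursor movement in pixels (hypothetical scaling)."""
    dx = x_wheel_ticks // ticks_per_pixel   # wheel aligned with the x axis
    dy = y_wheel_ticks // ticks_per_pixel   # wheel aligned with the y axis
    return dx, dy

# Example: sliding the mouse diagonally turns both wheels a little.
cursor_x, cursor_y = 100, 100
dx, dy = wheels_to_cursor(x_wheel_ticks=20, y_wheel_ticks=-8)
cursor_x += dx
cursor_y += dy
print(cursor_x, cursor_y)   # -> 105 98
```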

Even though Engelbart patented his computer mouse (on Nov 17, 1970), this was a case where the invention was so far ahead of its time that the patent ran out before the device found widespread application in personal computers. Hence he never received any royalties for his groundbreaking invention.

The mouse was actually only one of several devices that Engelbart experimented with to let humans interact more easily with computers, including a joystick-type device, as well as head-mounted devices attached to the chin or nose. Personally I am quite relieved that the hand-held mouse won out – imagine if we all sat around staring at our computer screens with pointing devices attached to our noses. Then again, we may not have thought it funny – if you consider how absurd ear-mounted Bluetooth mobile phone headsets look (a personal pet-hate of mine!), perhaps a nose-mounted computer pointer wouldn’t have been that odd…

Of course, the computer mouse has by now become a complex, highly sophisticated device, with variants ranging from multi-functional gaming mice that look like something out of a science fiction fantasy, to Apple’s classic smooth and simple design masterpieces.

And all this thanks to Doug Engelbart’s visionary work more than 40 years ago.

John Backus and the development of high-level computer programming languages

Today we’re celebrating the birthday of John Backus (3 Dec 1924 – 17 Mar 2007), American computer scientist and the leader of the team that invented the Fortran programming language (at the time written FORTRAN) while working at IBM in the mid-1950s.

Fortran was the first so-called ‘high-level computer language’, meaning that programs could be written using familiar mathematical formulas and English-like expressions, which a compiler then translated into the binary code actually executed by the computer. The language is particularly suited to scientific computing and numeric computation. Over the years, many improvements were made to the original Fortran language, with versions known by a sometimes strange series of numeric identifiers – FORTRAN, FORTRAN II, FORTRAN III, FORTRAN IV, FORTRAN 66, FORTRAN 77, Fortran 90, Fortran 95, Fortran 2003 and Fortran 2008.
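To get a rough feel for what ‘high-level’ means in practice, here is a small illustration – in Python rather than Fortran, purely as a stand-in – of a formula written much as you would on paper, alongside the lower-level instructions it gets translated into (Python bytecode here, where a Fortran compiler would go all the way down to binary machine code):

```python
import dis

# A formula written roughly the way you'd write it on paper --
# this readability is what a 'high-level' language buys you.
def kinetic_energy(mass, velocity):
    return 0.5 * mass * velocity ** 2

print(kinetic_energy(2.0, 3.0))   # -> 9.0

# Peek at the lower-level instructions the human-readable formula is
# translated into (Python bytecode; a Fortran compiler would produce
# binary machine code instead).
dis.dis(kinetic_energy)
```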

FORTRAN was the first widely used high-level computer language, providing an interface between equations and expressions understandable to humans, and binary code used by computers.
(© All Rights Reserved)

Despite being one of the oldest computer languages, it has been one of the most enduring, and after more than half a century it is still a preferred language for computationally intensive applications such as weather prediction, computational fluid dynamics and finite element analysis. One of the reasons for Fortran’s longevity is that modern Fortran compilers in particular are capable of generating very fast and efficient code, which can make a big difference when solving large, complex mathematical computations. It is still a primary language on many supercomputers, and many of the floating-point benchmarks used to test the performance of new processors are still written in Fortran.
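Part of that staying power is indirect, too: the reference LAPACK and BLAS linear-algebra libraries that many modern numerical tools build on were themselves written in Fortran. As a small illustration (assuming NumPy is installed), solving a linear system from Python typically ends up dispatching to those Fortran-derived LAPACK routines:

```python
# A small illustration (assuming NumPy is installed): np.linalg.solve calls
# the LAPACK routine dgesv under the hood, and LAPACK originated as Fortran
# code -- one reason Fortran still sits underneath modern scientific computing.
import numpy as np

# Solve the 2x2 system:  3x + y = 9,  x + 2y = 8
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)
print(x)   # -> [2. 3.]
```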

As a high-level language, Fortran has also provided an impetus for the development of numerous subsequent computer languages such as ALGOL, COBOL and BASIC.

The IEEE awarded John Backus the W.W. McDowell Award in 1967 for the development of FORTRAN. He also received the National Medal of Science in 1975 and the ACM Turing Award in 1977 for his contributions to the design of high-level computer programming systems.

It’s World Computer Literacy Day

A couple of days ago I commented on Computer Security Day. Today we’re back to computers, but this time the issue is way more fundamental – today is World Computer Literacy Day.

Celebrated for the first time in 2001 in India, the day has since expanded to an international initiative. Computer literacy relates to the ability to comfortably use computers and related information and communications technologies (ICTs). Some of the key issues impacting computer literacy include basic access to ICT, and the ability to use these technologies in your own language.

Promoting computer literacy and connectivity in the developing world is critical in creating economic opportunities for all.
(© All Rights Reserved)

In an attempt to raise awareness about the plight of those who are not privileged enough to have access to computers, Irish charity organisation Camara Education has launched a challenge to those of us for whom ICT is a part of everyday life, to go without technology for 24 hours. Through this initiative, known as ‘Techfast’, they hope to highlight the digital divide that still exists in the world today.

When we’re connected always and everywhere, it is easy to forget that the global digital village we are part of really isn’t that global at all, with ICT and computer literacy very much concentrated in developed countries. While we get treated to high-speed, low-cost Internet, the developing world continues to lag further and further behind.

There are positive examples in the developing world where the digital divide is actively being addressed. While countries like Ethiopia and Zambia still have less than 2% of the population connected to the Internet, the situation in Kenya, for example, looks very different – from 2009 to 2010 the percentage of Internet users there increased from 10% to 26%. A massive digital boom indeed, and one which is reported to also be providing an economic boost to the country.

While I often wonder whether 24/7 connectivity is a blessing or a curse, the fact of the matter is that, to participate in the global economy, connectivity and computer literacy are of paramount importance.

While you’re comfortably browsing through your blog roll on your high-speed internet connection, spare a thought on World Computer Literacy Day for those who are not as technologically privileged.

Techies Day and the growing need for skilled high-tech workers

According to the Merriam-Webster dictionary, a ‘techie’ is defined as “a person who is very knowledgeable or enthusiastic about technology and especially high technology”. And today, I am told, is Techies Day, launched in 1999 by Techies.com. Yes indeed, when no-one else bothered to create a day for appreciating the techies, they just did what any good techie would do and created it themselves. Gotta love a techie!

All jokes aside, this is the day to take some time to acknowledge and appreciate all the ways in which your life is made easier thanks to a baffling array of techies – the guys and gals who keep the telecommunications systems communicating; who ensure the computing systems keep computing; who keep our ever-increasing collection of digital devices up and running; who enable the blogging platforms to keep on supporting the 433,743 bloggers, 1,058,607 new posts, 1,283,513 comments, and 246,669,831 words posted every day (and that’s just on our favourite platform).

Take time today to show some love and appreciation for the techies in your life.
(© All Rights Reserved)

Internationally there’s an ever-growing demand for qualified technology workers, and a growing recognition of the need for initiatives aimed at drawing more bright young people into technology domains. In a ComputerWeekly report from April this year, the lack of IT talent is described as a ‘global issue’ by recruitment group Hays, which has pinpointed IT as “one of the top ‘hard skills’ in demand” in its list of top ten skills that are globally lacking. The article further points out that, while international outsourcing is still a popular option for many companies looking to address their shortages, there is a growing trend to instead attract the skills needed to develop projects in-house.

The situation is no different down here in New Zealand. As reported in the NZ Herald, Minister of Economic Development Steven Joyce, while addressing the Nethui Internet Conference, said “There is a worldwide shortage of ICT skills currently and it’s not getting any better and New Zealand is part of that. One of the challenges for all of us, particularly those of you who are evangelists for the digital revolution, is actually to get schools, people, students, families to get more focused on ICT careers because there is a danger that the focus on the skills, that will be required, lags [behind] the opportunities.”

The ICT domain keeps expanding, requiring more and more techies to keep it up and running.
(© All Rights Reserved)

So, the next time you interact with a techie and he/she looks a tad stressed, have some sympathy – they’re probably overstretched and could do with some appreciation. Too often these days we consider the IT systems and connectivity supporting our lives a right and not a privilege, and we get righteously peeved off when things go wrong and take it out on the first line of support we hit.

Today, instead of fighting, show some love for the techies in your life.

Happy Techies Day, everyone.

The birth of Linux, giant killer of the Open Source world

A while ago, I published a post on the start of the open source operating system revolution. As mentioned there, Linus Torvalds did not ‘invent’ the open source operating system with Linux, but there’s no denying that he is one of the true superstars of the open source world, and that Linux is, without a doubt, one of the few open source operating systems that have managed to make the big commercial players sit up and take notice.

From cellphones to supercomputers – Linux is a popular operating system across a wide range of platforms.
(© All Rights Reserved)

There is some debate around the date that should be considered the ‘official’ birthday of Linux – there are three early emails from Torvalds making reference to his operating system – but the general consensus seems to be that his email of 25 August 1991 best represents Linux’s inception:

From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID: <1991Aug25.205708.9541@klaava.Helsinki.FI>
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki.

Hello everybody out there using minix -

I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I’d like any feedback on things people like/dislike in minix; as my OS resembles it somewhat (same physical layout of the file-system due to practical reasons) among other things.

I’ve currently ported bash (1.08) and gcc (1.40), and things seem to work. This implies that I’ll get something practical within a few months, and I’d like to know what features most people want. Any suggestions are welcome, but I won’t promise I’ll implement them 🙂

Linus Torvalds torvalds@kruuna.helsinki.fi

Originally developed for Intel x86 personal computers, the Linux operating system has since been ported to a wider range of platforms than any other operating system, ranging from servers to supercomputers to embedded systems. The Android operating system, used by a wide range of mobile devices, is built on a Linux kernel. Quite amazing for a system that its creator described as “just a hobby, won’t be big and professional like gnu”.
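As a small aside (a minimal sketch, not part of the original story): because so many different devices run a Linux kernel, a program can simply ask which kernel it is sitting on. Python’s standard platform module, for instance, will typically report ‘Linux’ and a kernel version on anything from a server to an Android-based device running a Python port:

```python
# A small illustration: Python's standard platform module reports the
# underlying kernel, so the same script run on a Linux server, a desktop
# distribution or an Android device (via a Python port) will typically
# report 'Linux' along with the kernel version.
import platform

print(platform.system())    # e.g. 'Linux', 'Darwin' or 'Windows'
print(platform.release())   # kernel version string, e.g. '5.15.0-91-generic'
```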

The Linux story really is a feel-good tale of how a non-commercial product, based on a free and open community-based development model, can match and exceed its multi-million dollar commercial competition.

Happy birthday, Linux, and power to you, Linus Torvalds – may you long continue to steer the ship, and take others along on your quest for the open and the free.

Celebrating IBM PC Day

This day marks the release, 31 years ago in 1981, of the very first IBM Personal Computer (PC) model 5150.

The original IBM PC. (R de Rijcke, Wikimedia Commons)

Developed in less than a year, using existing off-the-shelf components, it proved a runaway success in the small business market, and launched the era of the personal computer. The IBM PC used an operating system developed by Microsoft, helping to establish Microsoft’s dominance in the PC market.

Specifications of the original IBM PC included an Intel 8088 processor running at 4.77 MHz, 16–64 KB of memory, and data storage consisting of 5.25″ floppy drives, cassette tape and (later on) a hard disk.

Even though the term “personal computer” wasn’t first coined by IBM (it was used as early as 1972 in reference to the Xerox PARC Alto), the success and prevalence of the IBM product resulted in the term PC referring specifically to computers and components compatible with the IBM PC. This led to peripherals and components being advertised as ‘IBM format’, further cementing IBM as the industry standard.

The IBM Blue Gene/P system (2008), capable of 14 trillion individual calculations per second. Yep, it’s a bit faster than the IBM PC model 5150!
(© All Rights Reserved)

As a result of its amazing longevity (many IBM PCs remained in use well into the 21st century), and the fact that it represents the first true personal computer, the original IBM PC has become popular among collectors of vintage PCs.

So, if you happen to still have an old model 5150 sitting in a cupboard somewhere, treasure it – depending on its condition it can be worth almost $5000, and unlike just about all other electronic equipment in your house, its value may actually increase!

Pervasive or Invasive: the Birth of Ubiquitous Computing

Today we celebrate the birthday of Mark David Weiser (23 Jul 1952 – 27 Apr 1999), the visionary American computer scientist who first coined the term ‘Ubiquitous Computing’.

Weiser, who worked as Chief Technologist at Xerox PARC, came up with the term in 1988, describing a future scenario where personal computers would largely be replaced by a distributed network of interconnected “tiny computers” embedded in everyday items like toasters, fridges, photocopiers, phones and couches, turning these into “smart” objects. Sound familiar?

While Weiser’s scenario has not come to full fruition yet, things are definitely moving in that direction. Smart phones are already a common sight, smart TVs are popping up all over the place, and connectivity and interconnected devices are becoming the norm… It certainly no longer requires a stretch of the imagination to visualise a world of ubiquitous computing – or ‘pervasive computing’, ‘ambient intelligence’ or ‘everyware’, as the paradigm has also been described.

The common sight of a shopping list stuck up on the fridge may soon be a thing of the past, with your future fridge likely to interact with the rest of the kitchen, checking your supplies and auto-ordering any depleted groceries.
(© All Rights Reserved)
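Purely as a hypothetical sketch of the scenario in the caption above – the item names, stock levels and the order_groceries() helper are all invented for illustration, not a real appliance API – the fridge-side logic could be as simple as:

```python
# A purely hypothetical sketch of the 'smart fridge' idea: the item names,
# stock levels and the order_groceries() helper are invented for illustration.

def order_groceries(items):
    # Stand-in for a call to an online grocery service.
    print("Ordering:", ", ".join(items))

stock = {"milk": 0.2, "eggs": 2, "butter": 1}        # current quantities
minimum = {"milk": 1.0, "eggs": 6, "butter": 1}      # desired minimum levels

depleted = [item for item, qty in stock.items() if qty < minimum[item]]
if depleted:
    order_groceries(depleted)    # -> Ordering: milk, eggs
```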

While the concept sounds daunting – computers everywhere, no getting away from it, etc – Weiser actually described it as the era of “calm technology”, where technology recedes into the background of our lives. He defined it as “machines that fit the human environment instead of forcing humans to enter theirs”. So the idea is that while you will continually engage with numerous computing devices, this will happen in a largely unobtrusive manner, allowing you to go on with your life. The fully connected environment also implies a greater degree of location independence, so you won’t necessarily be stuck at a desk behind a computer screen – this is already happening, with the shift from desktops to laptops to tablets and cloud computing.

Of course the idea of computers fitting in with, rather than changing, the human environment is a bit of a false utopia. While smart phones definitely adapt more to the human environment than, say, a laptop computer, they do fundamentally change the way humans act and operate – simply look at a group of school children with their smart phones, and compare that to the pre-mobile-phone scenario.

Like it or not, the pervasiveness of computers and computing devices is unlikely to disappear any time soon. The question is in which direction the pervasive-invasive balance will tip, and how things will progress along the man-serving-machine-serving-man continuum.

Where do you see us heading?