
Celebrating robots and robotics – useful and seriously cool!

Today we celebrate the birthday of Joseph F. Engelberger (born in New York City, July 26, 1925), physicist, engineer and entrepreneur, and the man often called the “Father of Robotics”.

Engelberger, together with inventor George Devol, was responsible for the development of the first industrial robot in the US, in the late 1950s. The robot, called the Unimate, worked on a General Motors assembly line at the Inland Fisher Guide Plant in New Jersey in 1961. It picked up die castings from an assembly line and welded these to the auto bodies – a potentially dangerous task for humans.

The Unimate was inducted into the Robot Hall of Fame in 2003.

Engelberger and Devol also started Unimation, the world’s first robot manufacturing company. Engelberger was a strong advocate for robotic technology beyond the manufacturing plant, and promoted the use of robotics in fields as diverse as health care and space exploration.

Robots – not only are they useful in fields as diverse as manufacturing, transport, space exploration and surgery, but they make seriously cool toys!
(© All Rights Reserved)

The field of robotics deals with automated machines that can take the place of humans, performing various activities in potentially hazardous or tedious processes in fields ranging from manufacturing to research to exploration. While Engelberger was responsible for the first industrial robot, the robotics concept dates back much further, to the start of the 20th century. The word “robot” was first coined by the Czech writer Karel Čapek in his 1920 play R.U.R. In 1942, science fiction writer Isaac Asimov published his “Three Laws of Robotics”, which constituted one of the earliest uses of the term “robotics”.

A lot of effort and investment has gone into research and development in the field of human-machine interaction, covering areas such as voice synthesis, gesture recognition, and facial expressions.

I’m not sure if it’s thanks to the fact that robots are so popular in science fiction – often depicted as an intelligent, cunning and efficient super-race – but I find it difficult not to feel awed, and even a little threatened, when facing one of these amazing inventions.

Dōmo arigatō, Mr. Roboto!

Celebrating the birth of the first ‘test tube’ baby

Today we celebrate a special birthday – Louise Joy Brown, the world’s first ‘test tube’ baby, was born on this day back in 1978 in Oldham, England.

Louise was conceived in a petri dish (so technically she was a ‘petri dish baby’ rather than a ‘test tube baby’), via the process of in vitro fertilisation (IVF). Her parents, Lesley and John Brown, had been trying to conceive for nine years, but Lesley’s blocked fallopian tubes had prevented a natural pregnancy.

The process was a great success, and amazingly, by the time Louise turned 21 in 1999, more than 300 000 babies had been born using similar IVF techniques.

Louise’s IVF was performed by Dr Robert Edwards of Cambridge, who had previously successfully performed similar procedures with animals. He was assisted by gynaecologist Patrick Steptoe, who was already the Browns’ doctor. Edwards was awarded the 2010 Nobel Prize in Medicine for his contributions in the field of reproductive medicine.

The Latin term ‘in vitro’ is used for any biological process that occurs outside the organism it would normally be occurring in.
(© All Rights Reserved)

In vitro fertilisation is a procedure in which an egg cell is fertilised by sperm outside the body. After successful fertilisation, the fertilised egg (zygote) is transferred to the patient’s uterus, where it continues to develop as in a normal pregnancy.

The term in vitro (Latin: ‘in glass’) came about to describe a procedure that specifically took place in a glass container (such as a test tube or petri dish), but its use has been extended to refer to any biological procedure that occurs outside the organism in which it would normally occur.

Louise Brown got married in 2004, and her own son, conceived naturally, was born in late 2006. Happy 34th birthday, Louise!

Have you heard the one about…?

Today is ‘Tell an Old Joke Day’.

It is also the day we commemorate the death of Sir James Chadwick (20 Oct 1891 – 24 Jul 1974), who was awarded the 1935 Nobel Prize for Physics for the discovery of the neutron.

So I guess my old joke for this special day kind of selects itself…

A neutron walks into a bar…
(© All Rights Reserved)

A neutron walks into a bar and orders a whisky.
The bartender pours him a stiff one.
“How much do I owe you?”, the neutron asks.
“For you?” replies the bartender, “no charge!”

Pervasive or Invasive: the Birth of Ubiquitous Computing

Today we celebrate the birthday of Mark David Weiser (23 Jul 1952 – 27 Apr 1999), the visionary American computer scientist who first coined the term ‘Ubiquitous Computing’.

Weiser, who worked as Chief Technologist at Xerox PARC, came up with the term in 1988, describing a future scenario in which personal computers would be largely replaced by a distributed network of interconnected “tiny computers” embedded in everyday items like toasters, fridges, photocopiers, phones and couches, turning these into “smart” objects. Sound familiar?

While Weiser’s scenario has not yet come to full fruition, things are definitely moving in that direction. Smart phones are already a common sight, smart TVs are popping up all over the place, and connectivity and interconnected devices are becoming the norm… It certainly no longer requires a stretch of the imagination to visualise a world of ubiquitous computing, or ‘pervasive computing’, ‘ambient intelligence’, or ‘everyware’, as the paradigm has also been described.

The common sight of a shopping list stuck up on the fridge may soon be a thing of the past, with your future fridge likely to interact with the rest of the kitchen, checking your supplies and auto-ordering any depleted groceries.
(© All Rights Reserved)

While the concept sounds daunting – computers everywhere, no getting away from it, etc – Weiser actually described it as the era of “calm technology”, where technology recedes into the background of our lives. He defined it as “machines that fit the human environment instead of forcing humans to enter theirs”. So the idea is that while you will continually engage with numerous computing devices, this will happen in a largely unobtrusive manner, allowing you to go on with your life. The fully connected environment also implies a greater degree of location independence, so you won’t necessarily be stuck at a desk behind a computer screen – this is already happening, with the shift from desktops to laptops to tablets and cloud computing.

Of course the idea of computers fitting in with, rather than changing, the human environment is a bit of a false utopia. While smart phones definitely adapt more to the human environment than, say, a laptop computer, they do fundamentally change the way humans act and operate – simply look at a group of school children with their smart phones, and compare that to the pre-mobile-phone scenario.

Like it or not, the pervasiveness of computers and computing devices is unlikely to disappear any time soon. The question is in which direction the pervasive-invasive balance will tip, and how things will progress along the man-serving-machine-serving-man continuum.

Where do you see us heading?

You can have your pi and eat it, on Pi Approximation Day (22/7)!

Today is 22/7. No prizes for guessing what that means – yes, it’s Pi Approximation Day! March 14th (3.14) is also celebrated as Pi Day, but I kind of prefer the 22/7 version.

Pi, that curious little number that seems to pop up every time we start going in circles. A number so important that it even got its own name – not many numbers can claim that distinction!

Instead of going in circles trying to figure out what to give the kids for lunch, take your cue from the date and bake them a pi!
(© All Rights Reserved)

Pi, or π, is a mathematical constant that represents the ratio between a circle’s circumference and its diameter, or π = C/d. It is what’s known as an irrational number – a number that cannot be expressed as a ratio between two integers – so its decimal representation never ends and never settles into a repeating sequence of digits. It is also a transcendental number – a number that is not the root of any non-zero polynomial with rational coefficients.

In addition to its application in geometry and trigonometry, the constant π is found in many formulae, in a variety of sciences, including physics, number theory, thermodynamics, statistics, electromagnetism and mechanics.

The value of π (to 5 decimal places) is 3.14159, which is also approximately the value of 22 divided by 7. Calculating the value of π to higher and higher degrees of accuracy has been a challenge to mathematicians and computer scientists through the ages. Utilising the latest computing technology, the decimal representation of π has now been calculated to more than 10 trillion digits. Memorising π to a large number of digits (a practice called piphilology) is another challenge taken up by many pi-fanatics, and the current record stands at an astounding 67 890 digits, recited in 2005 in China by Lu Chao over a period of more than 24 hours. (Wow, he probably doesn’t get out much!)
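
If you’re wondering just how good the 22/7 approximation actually is, it’s easy to check for yourself. Here’s a minimal Python sketch (purely illustrative, using nothing beyond the standard library) comparing 22/7 with the built-in value of π:

```python
from fractions import Fraction
import math

# 22/7 as an exact rational number, compared with Python's built-in value of pi
approx = Fraction(22, 7)

print(f"22/7  = {float(approx):.10f}")            # 3.1428571429
print(f"pi    = {math.pi:.10f}")                  # 3.1415926536
print(f"error = {float(approx) - math.pi:.10f}")  # about 0.0013 - 22/7 overshoots slightly
```

So the birthday approximation agrees with π to two decimal places – good enough for a celebration, if not for engineering.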

A nice trick to remember the first few digits of pi is to use a poem or sentence where the lengths of the words correspond to the digits in pi. A well-known example, courtesy of English scientist James Jeans, is “How I want a drink, alcoholic of course, after the heavy lectures involving quantum mechanics”, cleverly representing pi’s first 15 digits.
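
For the programmers among us, the trick is easy to verify – a short, purely illustrative Python sketch that strips the punctuation and counts the letters in each word recovers the digits:

```python
import string

# James Jeans' mnemonic: the length of each word gives the next digit of pi
# (by convention a 10-letter word would stand for the digit 0; none occurs here)
mnemonic = ("How I want a drink, alcoholic of course, "
            "after the heavy lectures involving quantum mechanics")

digits = "".join(str(len(word.strip(string.punctuation)) % 10)
                 for word in mnemonic.split())

print(digits)  # 314159265358979 - the first 15 digits of pi
```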

Such is the pervasiveness of the number π that it can even boast numerous appearances in modern popular culture, from TV series (The Simpsons, Twin Peaks) to novels (Carl Sagan’s “Contact”) to pop music (Kate Bush’s “Pi”).

Brrrr! Celebrating the coldest day ever

Today we celebrate an event that may be somewhat unthinkable on this day to our Northern Hemisphere friends, especially everyone suffering in the US heat. On this day, back in 1983, the coldest temperature ever recorded on Earth was measured at Vostok Station, Antarctica.

So how cold was it? Well, believe it or not, exactly 29 years ago the poor folk at Vostok Station recorded an icy -89.2°C (-128.6°F).

Ice crystals on a frozen stream.
(© All Rights Reserved)

That’s pretty darn chilly…  Certainly not a temperature you want to be exposed to for any length of time.  Prolonged exposure to very cold temperatures has some interesting effects on the body.

Goose pimples and shivers

When the temperature falls below 8°C, touch sensitivity starts to be compromised. Goose pimples appear, lifting hair follicles as the body tries to protect itself from the cold. Unfortunately this does not help us humans much, because we don’t have enough body hair for it to make a significant difference, but you can imagine how useful it can be to an animal with a dense fur coat.

The next step is shivering, as the body starts to increase its heat production by working the muscles – shivering is said to increase the body’s heat production five-fold.

Skin discolouration

Your skin also starts doing strange things. From below 10°C, the surface blood vessels start to dilate (your skin becomes red). As it gets colder, the blood vessels start constricting again, to avoid heat loss through your extremities. This is followed by alternating periods of dilation and constriction, as the body tries to balance the supply of oxygen and nutrients to the skin against protection from heat loss. So you may start sporting an interesting blend of red and white skin tones.

Frostbite

During extended exposure to cold, the body has to start making decisions about how its available heat is best applied. In order to keep vital organs warm and avoid hypothermia, our extremities – fingers, feet, ears, nose – are allowed to cool down, and blood flow to them is reduced (to avoid blood cooling down as it circulates to the extremities). If this situation persists, it can lead to frostbite, where the cells close to the skin surface freeze and die. When heat returns to these cells, it results in swelling and blisters, forming a hardened black layer.

In extreme conditions, the frostbite can reach deeper layers of muscle and bone, resulting in permanent tissue damage, and ultimately amputation of body parts – a fate that has befallen many polar explorers and extreme mountaineers.

Hypothermia

Even though the body will do its best to maintain its core temperature, even sacrificing body parts in the process, it cannot keep up the heat if exposure to extreme cold continues.  Next the body will slow its metabolism to minimize blood flow and limit energy loss. At some point, however, the body core starts to cool, and hypothermia sets in. Not much of a core drop is needed for this – clinically, hypothermia sets in when the core temperature drops below 35°C.

First symptoms of hypothermia include reduced motor skills and slowed reaction times. Judgment also becomes impaired, with the dangerous result that the hypothermia sufferer may lose the ability to recognize the condition.

As the core temperature drops below 35°C, the body starts shivering more violently in an attempt to reverse the situation. You get more sluggish and tired, with a strong need to give up and go to sleep. Below 32°C the shivering stops, as there is no energy to keep it going, resulting in even quicker heat loss.

Unconsciousness sets in when the body core drops below 30°C. In a final primal attempt to avoid death, the heart rate and breathing slow down severely, to the point where the metabolism is so slow that the sufferer basically appears to be dead.

Below 28°C cardiac arrhythmias become more common. If the sufferer has not yet died, the heart finally stops beating at a core temperature of about 20°C.

Gender and age matter

Interestingly, women can survive extreme cold better than men. The temperature gradient from skin to body core is greater in women – women’s bodies will more readily allow the skin surface and extremities to cool down, while better protecting core temperature. So while a woman may suffer frostbite sooner, her warm core is likely to keep her alive longer. Women also tend to have a higher percentage of subcutaneous fat, further helping to protect core temperature.

Age also plays a role, with people losing their ability to handle extreme cold as they age. Children are more likely to recover from the effects of extreme hypothermia – their organs appear less likely to be affected by physical stresses that may be fatal to older organs.

(Source: Science of the Cold)

Brrrrrr! Chilling stuff!  Suddenly the chilly New Zealand mornings seem decidedly mild. Enjoy the weather – whether you’re basking in the northern summer heat or shivering in the southern winter cold.  It could have been worse!

Get moonstruck on Moon Day

July 20th is Moon Day, commemorating the day in 1969 when man first walked on the moon.

The magic of the full moon remains an inspirational sight.
(© All Rights Reserved)

As part of the Apollo Space Program, initiated by President John F Kennedy, Apollo 11 was the mission that fulfilled the dream of putting man on the moon. Apollo 11, launched on 16 July 1969 with a Saturn V rocket, carried three astronauts, Neil Armstrong, Michael Collins, and Edwin Aldrin, to their historic date with destiny.

On 20 July, lunar module “Eagle” landed on the moon, prompting the first of Neil Armstrong’s famous quotes, “The Eagle has landed”. After touchdown, Armstrong became the first man to set foot on the surface of the moon, and millions of people the world over, listening breathlessly, were treated to his second immortal sound bite, “That’s one small step for a man, one giant leap for mankind”.  After Armstrong’s pioneering step, Aldrin also got an opportunity to walk on the moon (with much less fanfare), while poor Michael Collins never got the chance, remaining alone in lunar orbit while the Eagle touched down.

The Apollo Space Program, and especially the week of the moon landing, remains one of the most momentous events in modern human history – a time when man felt truly immortal, and capable of anything. Since the historic first landing, five more landings took place between 1969 and 1972, with a total of 12 men experiencing the privilege of landing on the moon. Of course after the thrill of the initial landing, public interest dwindled, and I bet very few people will be able to name the 10 men who landed on the moon after Armstrong and Aldrin.

Since the golden age of moon exploration, from the late Sixties to early Seventies, numerous unmanned moon landings have occurred, including missions from the USA, the Soviet Union, Japan, the European Space Agency, China and India. Of these, most have been planned crash landings, with only the USA and Soviet Union achieving unmanned “soft landings”.

The Google Lunar X Prize competition, aimed at promoting the state of the art in private space exploration, offers a $20 million award for the first privately funded team to land a robotic probe on the Moon.

Of course the moon landing has also become a very popular subject for some elaborate conspiracy theories, with many groups and individuals presenting compelling ‘evidence’ that the landing never happened, and that it was all an elaborately staged hoax by NASA.

But that’s another story…

To celebrate Moon Day, why don’t you kick back and watch your favourite space movie?  Or make a playlist of songs about the moon to be the soundtrack to your day.

I’ve got the Cowboy Junkies’ “Blue Moon Revisited” playing in the background, and I think The Waterboys’ “The Whole of the Moon” and Neko Case’s “I Wish I was the Moon” are next – what ‘moon songs’ would you recommend?

Samuel Colt, creator of an American icon

Today we celebrate the birthday of Samuel Colt (July 19, 1814 – January 10, 1862). He did not live to a great age, but in his lifetime he established an American icon, Colt’s Patent Fire-Arms Manufacturing Company (now known as Colt’s Manufacturing Company). Through his company, he developed the first commercially viable mass-produced revolver.

After a number of unsuccessful attempts at getting a gun-making business off the ground, Colt got his break when the Texas Rangers ordered 1000 of his revolvers in 1847, during the Mexican–American War. His guns were also used by both the North and the South during the American Civil War. The 1873 Colt Single Action Army revolver (also known as the Model P, the Peacemaker and the Colt 45) has become one of the best-known sidearms in history.

Colt’s Manufacturing Company – still going strong 150 years after the death of its founder.
(© All Rights Reserved)

Even though he did not invent the revolver, he did contribute meaningful practical adaptations to the design. Samuel Colt’s real innovation, however, lay in his use of an assembly line approach to manufacturing and using interchangeable parts in the construction of his guns. This approach, enabling him to be more efficient and cost-effective than his competition, placed him at the forefront of the Industrial Revolution.  In Colt’s words, “The first workman would receive two or three of the most important parts and would affix these and pass them on to the next who add a part and pass the growing article on to another who would do the same, and so on until the complete arm is put together.”

Colt was also an advertising and marketing pioneer, employing techniques like celebrity endorsement and corporate gifts to promote his wares. He may at times have gone a bit too far in terms of ‘marketing’, however, having often been accused of promoting his weapons through bribery, threats and monopoly.

Reading up on the man, it’s clear that Colt was a larger-than-life character who thought big, lived extravagantly, and didn’t shy away from conflict and controversy.

In 2006, Samuel Colt was inducted into the National Inventors Hall of Fame.

‘Intel Inside’ and the personal computing boom

This day in 1968 marks a very important moment in the history of personal computing – it is the day that semiconductor giant Intel was founded.

Intel was founded by Gordon Moore and Robert Noyce. They initially wanted to name the company “Moore Noyce”, but that sounded too much like “more noise”, so they settled on their initials for the name NM Electronics. The name Intel, derived from Integrated Electronics, was adopted later the same year.  Intel produced its first product, a RAM chip, in 1969, and memory chips represented the majority of its business for the first decade. In the meantime the company also produced microprocessors – for example the 8080, released in 1974, which was deployed in a vast array of products, from cash registers and traffic lights to computers.

The success of the IBM personal computer in the early 1980s prompted Intel to increase its efforts to gain dominance in the microprocessor market. Its subsequent x86 series of microprocessors, followed by the Pentium series, became staples in most personal computers from the 1990s onwards. Initially famous only among engineers and computer scientists, Intel was turned into a household name by the ‘Intel Inside’ marketing campaign.

The “Intel Inside” campaign remains one of the most famous and successful advertising and marketing campaigns in IT history.
(© All Rights Reserved)

The 1990s represented an era of unprecedented growth for the company as the primary hardware supplier for the personal computer industry. After 2000, changes in market dynamics and increased competition slowed the company’s growth, but Intel has been able to adapt sufficiently to remain relevant in the fast-changing IT sector. In June 2005, Apple CEO Steve Jobs announced Apple’s transition from its PowerPC architecture to Intel-based architecture, and by mid 2006 the entire Apple Mac consumer line was sporting Intel processors.

Intel currently remains, by revenue, the world’s largest semiconductor chip maker. If you’re reading this on a desktop computer or laptop, you are in all likelihood doing so on an “Intel Inside” system.

Party like a mathematician on Yellow Pig Day!

17 July is not just another ordinary day; just like a yellow pig is not just another ordinary pig. Today is Yellow Pig Day, the day to take a moment to honour the magical, mathematic pig that has inspired mathematicians for years.

The yellow pig was invented in the early 1960s by two Princeton maths students, Michael Spivak and David C. Kelly, while they were working on an assignment to identify unique properties of the number 17. After some intense mental gymnastics (and possibly a few pints at the local pub), when they finally ran out of ideas, they thought up the yellow pig, a mythical 17-eyelashed creature (that’s eight lashes on one eye and nine on the other, of course).

OK, so Winnie the Pooh’s friend Piglet is generally considered to be pink, but look closely, and you may notice him turning yellow for just one day of the year…
(© All Rights Reserved)

Spivak has since written a number of mathematics textbooks, where he regularly includes hidden references to yellow pigs, while David Kelly presents an annual mathematics summer school to high school students, where he introduces them to the “Cult of the Yellow Pig”.  He is rumoured to be the proud owner of an impressive collection of between 289 (17 squared) and 4913 (17 cubed) yellow pigs. When asked about the significance of a yellow pig, he responds, “If you have to ask, you just won’t understand.”

Through Spivak and Kelly’s efforts, yellow pigs have become popular toys among mathematicians. Yellow Pig Day is also celebrated at various (mainly US) University Maths Departments, with the singing of yellow pig carols and the eating of yellow pig cake.

By the way, if you’re unsure about the significance of the number 17, look no further than this list. There’s no doubt the number is as special and magical as the yellow pig itself…