Daffodil Day and the ongoing fight against cancer

It’s Daffodil Day today, August 31st. Well, it’s Daffodil Day in New Zealand, to be exact – Australia’s Daffodil Day already happened on the 24th of this month. The US, bless them, seem to have a whole bunch of different Daffodil Days across different states. (With daffodils being a spring flower, it obviously makes sense that most US Daffodil Days happen earlier in the year, around February, and not in August/September, as they do down here in the South.)

Daffodil Day is all about cancer – raising awareness of the disease, raising funds for cancer-related research, and creating a support network for individuals suffering from the disease.

The daffodil is used internationally by Cancer Societies as the global symbol of hope for people living with cancer because it is one of the first, and one of the strongest, flowers of spring – a symbol of hope and renewal, new life, new beginnings and new possibilities.
(© All Rights Reserved)

Cancer is an incredibly pervasive, prevalent disease – here in New Zealand it is the leading cause of death in the country – and I’m sure there are very few people who are not in some way fairly directly affected by it. My dad died of cancer in his liver and colon; my mother-in-law is a breast-cancer survivor; just about everyone I know has someone close to them who has either died from, or is living with, the disease.

In a nutshell, cancer occurs when cells in the body accumulate genetic changes (due to various factors), resulting in a loss of growth control. Normal cells grow, divide and die in an orderly manner, in response to signals from the body and the environment. When cells become cancerous, however, they fail to respond to the normal signals, and start growing and dividing in an uncontrolled manner. These out-of-control cells can spread through the body via the bloodstream or lymph vessels (a process called metastasis) and continue to grow and replace normal tissue. It is the fact that it’s the body’s own cells that go crazy and effectively turn against their host, that makes it such a complex disease to treat.

As mentioned, one of the critical focus areas of Daffodil Day is raising money to support research into finding cures for the disease.

Over the years, literally billions of dollars have been spent on cancer research, and it’s quite a sobering thought when you realise that, in spite of all this, the death rate from the disease has changed little over the past 50 or so years. As new therapies are developed, cancer also adapts and evolves, finding new ways to kill.

Now this does not mean all is in vain – millions of people have been saved from the therapies that have been developed. All it means is that there is no room for complacency, and new and more effective cancer therapies are continually needed to stay ahead of, or at least keep up with, the disease.

In my job as a science photographer, I interact with a wide range of research and technology organisations, and one of the most inspiring of these is the Malaghan Institute of Medical Research – New Zealand’s leading medical research institute, and a registered charity based in Wellington, NZ. The reason I mention this fact is that one of their main fields of research is cancer (they also research cures for asthma, arthritis, multiple sclerosis and infectious diseases) and they are one of the organisations supported through the proceeds of fundraising events like Daffodil Day.

One of the main fields of cancer research that the Malaghan Institute focuses on is Immunotherapy, which basically involves using the immune system and its unique properties to complement existing cancer treatments. As they explain, “Immune cells are specific and have the capacity to discriminate between normal and cancer cells, they have powerful effector capacity and can recruit inflammatory cells to destroy neoplastic tissue, and they can migrate to different tissues and eliminate residual metastatic disease.” So, similar techniques to those used in helping the immune system recognise and fight contagious diseases (such as vaccination) can also be used to help the immune system recognise cancer cells, and to strengthen its ability to destroy them.

Another more recent research subject at the Institute is cancer stem cell research. Cancer stem cells are cancer’s evil root – these tumor initiating cells are highly resistant to drug and radiation treatment – and the focus of the research is on finding safe and effective ways to eradicate them.

Researchers at the Malaghan Institute of Medical Research are conducting research into Immunotherapy, unleashing the full cancer-fighting potential of the immune systems of cancer patients to fight the disease.
(© All Rights Reserved)

Organisations like the Malaghan Institute, and many others like them across the world, are doing incredible work to address the continually evolving threat of cancer, and really need all the support they can get. It’s a scary, scary topic, and it’s good to know there are talented, committed scientists and researchers out there facing the challenge head on.

Celebrating Ernest Rutherford, the father of nuclear physics

Today we celebrate the life and work of New Zealand’s greatest scientist, Lord Ernest Rutherford – the father of nuclear physics. In the words of the author John Campbell, “He is to the atom what Darwin is to evolution, Newton to mechanics, Faraday to electricity and Einstein to relativity.”

Rutherford was responsible for three fundamental contributions to the field: (1) he explained radioactivity as the spontaneous disintegration of the atom; (2) he determined the structure of the atom; and (3) he was the first to split the atom.

One of New Zealand’s proudest sons, ‘Lord Rutherford of Nelson’, graces the front of the country’s highest value bank note, the $100 note. Appearing with him is his Nobel Prize medal and a graph plotting the results from his investigations into naturally occurring radioactivity.
(© All Rights Reserved)

Ernest Rutherford was born on 30 August 1871 in the South Island town of Nelson, New Zealand. His father James Rutherford, the son of a Scottish immigrant, came to New Zealand at the age of four, while his mother, Martha Rutherford (née Thompson) emigrated with her widowed mother from England when she was thirteen. The Rutherfords, in the words of dad James, wanted “to raise a little flax and a lot of children”. Not sure how they managed on the flax, but they certainly lived up to their aspirations in the children department – young Ernest was the second son, and fourth child, of no less than twelve Rutherford children.

Rutherford excelled academically, winning a scholarship to the Canterbury College of the University of New Zealand. After completing his basic university studies through the University of New Zealand, he successfully applied for another scholarship, which enabled him to go to the UK to complete his postgraduate studies at the Cavendish Laboratory, University of Cambridge.

While working with Professor J.J. Thomson, Rutherford discovered that radioactive uranium gave off two separate types of emissions – he named these alpha and beta rays. Beta rays were subsequently identified as high-speed electrons.

Radioactivity and the spontaneous disintegration of the atom

In 1898 Rutherford accepted a professorship at McGill University in Montreal, Canada. It was here, with the help of a young chemist, Frederick Soddy, that he conducted the research that gained him the 1908 Nobel Prize in Chemistry, investigating “the disintegration of the elements and the chemistry of radioactive substances”. (Soddy himself later received the 1921 Nobel Prize in Chemistry.)

Determining the structure of the atom

A subsequent move to Manchester, England, to be nearer to what he considered the main centres of science, saw Rutherford taking on a professorship at the University of Manchester. With his research assistant, Ernest Marsden, he investigated the scattering of alpha rays (something he first noticed while still at McGill). They noticed some alpha rays would ‘bounce back’ when hitting even a thin gold film – this was a most surprising result, with Rutherford likening it to firing a large naval shell at a sheet of tissue paper and seeing it bounce back. This led to him developing his concept of the ‘nucleus’, his greatest contribution to physics. According to this concept the whole mass of the atom, and all its positive charge, is concentrated in a minuscule point at its centre, which he termed the nucleus.

The Danish physicist Niels Bohr began working with Rutherford, and he adapted Rutherford’s nuclear structure to include electrons in stable formation around the nucleus. The Rutherford-Bohr model of the atom, with some improvements from Heisenberg, remains valid to this day.

Splitting the atom

In 1919, during his last year at Manchester, Rutherford noted that the nuclei of certain light elements, like nitrogen, would disintegrate when bombarded by alpha particles coming from a radioactive source, and that during this process fast protons were emitted. By doing this, Rutherford became the first person to split the atom. Patrick Blackett (winner of the 1948 Nobel Prize in Physics) later proved that splitting the nitrogen atom actually transformed it into an oxygen isotope, so Rutherford effectively became the first to deliberately transmute one element into another.

Rutherford was knighted in 1914; he was appointed to the Order of Merit in 1925, and in 1931 he was raised to the peerage as Lord Rutherford of Nelson. A proud New Zealander despite living and working abroad for most of his academic career, he chose to include in his coat of arms a Kiwi, a Maori Warrior and Hermes Trismegistus, the patron saint of knowledge and alchemists.

He died in Cambridge on October 19, 1937, leaving his wife Mary Newton, and only child Eileen.

A great scientist, Rutherford’s contribution is perhaps best summarised in his eulogy in the New York Times:
“It is given to but few men to achieve immortality, still less to achieve Olympian rank, during their own lifetime. Lord Rutherford achieved both. In a generation that witnessed one of the greatest revolutions in the entire history of science he was universally acknowledged as the leading explorer of the vast infinitely complex universe within the atom, a universe that he was first to penetrate.”

It’s ‘More Herbs, Less Salt’ Day – time to give your heart a breather

Today, according to those in the know, is ‘More Herbs, Less Salt’ Day. Another of those days that has been thought up to try and nudge us towards a slightly healthier lifestyle (much like ‘Independence from Meat’ Day, which I blogged about earlier).

Indeed, leaning towards herbs, rather than heaps of salt, to season your food is not a bad idea at all. I’m sure anyone who has opened a general lifestyle magazine in the last 10 years will know that salt isn’t all that great for our overly stressed 21st century bodies – our poor hearts already have enough to deal with. Giving the heart a further knock by subjecting it to a high salt diet really isn’t a winning idea.

Using more herbs and less salt not only makes your food healthier, but tastier and prettier too.
(© All Rights Reserved)

There’s a significant body of research linking high sodium diets to high blood pressure, which in turn is linked to heart attacks, strokes, kidney disease and other nasties. Proving that a decrease in salt actually reduces the risk of heart disease has been more difficult, but a long-term research project conducted a few years ago aimed to do exactly that. In an article entitled “Long term effects of dietary sodium reduction on cardiovascular disease outcomes: observational follow-up of the trials of hypertension prevention (TOHP)”, the research team from Harvard Medical School presented their results from a long-term follow-up assessment related to a sodium-reduction, hypertension prevention study done 15 years earlier. In the original intervention, a group of adults followed a sodium-reduced diet for between 18 and 48 months. From the long-term follow-up research it was found that, compared to the general population, “Risk of a cardiovascular event was 25% lower among those in the intervention group (relative risk 0.75, 95% confidence interval 0.57 to 0.99, P=0.04), adjusted for trial, clinic, age, race, and sex, and 30% lower after further adjustment for baseline sodium excretion and weight (0.70, 0.53 to 0.94), with similar results in each trial.”

This led them to the conclusion that “Sodium reduction, previously shown to lower blood pressure, may also reduce long term risk of cardiovascular events.”
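For the curious, a ‘relative risk’ like the 0.75 quoted above is, at its simplest, just the ratio of event rates between the two groups (the published figure is also adjusted for various factors, which is beyond a blog post). A minimal sketch, using made-up counts rather than the actual TOHP data, with the standard log-method confidence interval:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Relative risk of group A vs group B, with a 95% CI (log method)."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of ln(RR) for cumulative incidence data
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 30 events among 400 people in the intervention
# group vs 40 events among 400 in the comparison group.
rr, lo, hi = relative_risk(30, 400, 40, 400)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 0.75 0.48 1.18
```

Note how the confidence interval here straddles 1.0 – with these (made-up) small numbers the result wouldn’t be statistically significant, which is why studies like TOHP need large groups and long follow-up periods.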

To really put you off a high salt diet, a visit to World Action on Salt and Health, a website dedicated to “improve the health of populations throughout the world by achieving a gradual reduction in salt intake”, should do the trick. Just note, however, that this day (and most scientific research) calls for ‘less salt’, not ‘no salt’. As one of the primary electrolytes in the body, salt is essential for the body to function – just not at the levels that we’re consuming it.

Herbs on the other hand don’t just taste good – they’re like a veritable medicine cabinet in your garden (or pantry, if you don’t grow your own). Besides often being rich in vitamins and trace elements the body needs, specific herbs have long been known for their medicinal effects.

Herbs like chamomile and lavender are known to have a calming effect; parsley, oregano and echinacea can boost the immune system; garlic contains selenium, which can help reduce blood pressure (now there’s a good one to fight the effects of a high sodium diet!); mint and feverfew have been reported to reduce headaches; basil and bergamot fight colds and flu; lemon balm and rosemary are good for concentration and memory… The list goes on.

Of course, as with everything in life, the key is moderation – ‘more herbs’ should not be seen as a licence to go overboard on every herb you can lay your hands on. Reckless and injudicious use of herbal supplements can be very detrimental to your health, to say the least. Colorado State University hosts a nice site, Herbals for Health?, which is worth a read – it gives a balanced overview of the pros and cons of a few popular herbal supplements.

Despite the cautionary notes above, culinary herbs, especially freshly home-grown, generally speaking should not cause health risks when used in moderation as an alternative to salt in daily cooking, and that, after all, is what this day is all about. Using herbs in cooking can be a very exciting way to improve your health and well-being, so have fun experimenting with all those new tastes and flavours!

Weekly Photo Challenge: Urban

When I think ‘urban’, I think ‘people’. Which leads my thoughts to what those people do in their urban environment, which immediately screams ‘shopping!’ And then of course, one starts thinking about ‘the people who serve the people – the shopkeepers’.

So here, in response to the ‘urban’ theme, some local shopkeepers. 🙂

Shopkeeper series 1: Barista
(© All Rights Reserved)
Shopkeeper series 2: Antiques dealer
(© All Rights Reserved)
Shopkeeper series 3: Collectables dealer in the week; viking over the weekend
(© All Rights Reserved)
Shopkeeper series 4: Streetside green-grocer
(© All Rights Reserved)

Celebrating sound science communication with Scientific American

Today we celebrate a veritable institution in the international popular science communication landscape – the magazine Scientific American celebrates its incredible 167th birthday today, making it the oldest continuously published monthly in the US.

Scientific American – a staple on the news stands and magazine racks of good bookshops around the world.
(© All Rights Reserved)

The first issue of the magazine, then a four page weekly newspaper, appeared on this day back in 1845.  It was published by Rufus Porter, a very interesting character who, besides being a magazine publisher, was also a painter, inventor, schoolmaster and editor. In line with Porter’s personal interests, the magazine reported on happenings in the US Patent Office, as well as having popular articles on inventions of the time.

Porter’s interest in the magazine didn’t last long – after 10 months he sold it to Alfred Beach and Orson Munn I (for a whopping $800). It remained under the ownership of Munn & Company, which, in the century between 1846 and 1948, grew it from its humble beginnings into a large and influential periodical. In the late 1940s it was put up for sale again, and this time the magazine was sold to three partners, Gerard Piel, Dennis Flanagan, and Donald Miller Jr. They reportedly planned on starting their own new science magazine, but finding that Scientific American was for sale, they opted to rather buy that and work their ideas into the existing title. They made significant changes to the magazine, updating and broadening its appeal. Ownership remained stable from 1948 to 1986, when it was sold to the German Holtzbrinck group, which has owned it since. The current Editor in Chief is Mariette DiChristina – an experienced science journalist and the first woman in the magazine’s history to hold the position.

What has kept the magazine alive and relevant for so many years, is the fact that it has consistently focused on an educated, but not necessarily scientific public, clearly explaining the scientific concepts it reported on and maintaining strong editorial quality control. It has also, since its inception, focused on clear, explanatory visual illustrations to accompany its articles. In its long lifetime, the magazine has published contributions from many famous scientists, including more than 140 Nobel laureates. Albert Einstein contributed an article called “On the Generalized Theory of Gravitation” in 1950.

In 1996, the Scientific American website was launched. A mobile site, as well as the Scientific American Blog Network, followed in 2011. Since 2002, the magazine has hosted its own annual awards, the Scientific American 50, recognising important science and technology contributions of the previous year, across a wide range of categories from agriculture to defence to medicine.

Here’s looking forward to many more years of quality science communication, and a big double-century celebration in 2045!

Getting solarised on Man Ray’s birthday

Today we celebrate the birthday of one of the great avant-garde photographers of the modern era – the enigmatic Man Ray. Born Emmanuel Radnitzky (27 Aug 1890 – 18 Nov 1976) in Pennsylvania, US, he was the oldest child of Russian Jewish immigrants.

He changed his name to Man Ray in his early 20s – ‘Ray’, a shortened form of Radnitzky, was something his brother came up with in reaction to the anti-Semitism prevalent at the time, while ‘Man’ came from his childhood nickname ‘Manny’.

Interested in art from an early age, Man Ray pursued a career as an artist after leaving school. Starting with painting as his medium of choice, he soon developed an interest in the avant-garde movement and became involved with the Dadaists in New York. He started investigating alternative image-making methods, including photography, as well as experimenting with various new artistic forms and techniques, including readymades (influenced by his friend Marcel Duchamp) and kinetic art.

In 1921 he relocated to the Montparnasse quarter in Paris, France, an area favoured by artists of the time. Over the next 20 years, he focused on photography, becoming an influential photographic artist and photographing many of the key figures in the art world, from James Joyce to Jean Cocteau.

While he made a notable contribution as painter, he is perhaps best remembered for his photography – he is responsible for some of the most iconic photographic images of the 20th century. Together with his assistant and lover Lee Miller, herself a surrealist photographer of note, he ‘reinvented’ the technique of solarisation when Lee accidentally over-exposed an image in his darkroom.

A digitally ‘solarised’ image. Given my science photography focus, I’ve opted for a scientific image to subject to the solarisation treatment.
(© All Rights Reserved)

Solarisation is the partial reversal of an image which occurs when a film or print is subjected to a brief period of extreme over-exposure. The effect was first discovered in the early 19th century and was already identified by photographic pioneers such as Daguerre and Draper, so Man Ray definitely didn’t invent the concept. He did, however, recognise the creative potential of this ‘accidental technique’, which usually occurs when a film or print is accidentally exposed to a brief flash of light (like briefly switching on a light in the darkroom). He spent a lot of time and effort perfecting the technique, and produced some of the classic examples in this style.

While the solarisation technique is a physical, chemical process achieved during the development of a piece of photographic film or print, various digital processing techniques have been developed to mimic the solarisation effect – Adobe Photoshop even has a readymade ‘Solarize’ filter. A decent digital approximation of the solarisation effect can be achieved using tools like Photoshop, but it’s not quite the same as the real thing. It is definitely less exciting in the sense that you are almost in too much control of the effect – you can precisely control the levels of ‘digital solarisation’, unlike the physical situation where you are partially at the mercy of the chemistry of your medium, and the element of chance becomes an integral part of the artistic process.
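For the technically curious, the digital version of the effect boils down to a very simple rule: pixel values above a chosen threshold get inverted, while the rest stay put. A minimal Python sketch of the idea, applied to 8-bit greyscale values (Pillow’s `ImageOps.solarize` applies the same thresholded inversion to actual image files):

```python
def solarize(pixels, threshold=128):
    """Digitally 'solarise' 8-bit greyscale pixel values: values at or
    above the threshold are inverted (255 - value), mimicking the
    partial tonal reversal of an over-exposed print."""
    return [255 - p if p >= threshold else p for p in pixels]

# A dark-to-light gradient: the highlights flip to dark, the shadows stay put.
print(solarize([0, 64, 128, 192, 255]))  # → [0, 64, 127, 63, 0]
```

That threshold parameter is exactly the ‘too much control’ I mention above – in the darkroom, the chemistry chooses it for you.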

Flowers make excellent subjects for solarisation.
(© All Rights Reserved)

Of course art is not just about the tools – whether you use a flash of light or a Photoshop tweak to achieve a creative result, the technique will always be subservient to the artistic inspiration. In the words of Man Ray himself:

“… there will always be those who look only at technique, who ask ‘how’, while others of a more curious nature will ask ‘why’. Personally, I have always preferred inspiration to information.”

Celebrating the invention of toilet paper

Here’s an amusing story – today is the birthday of toilet paper! On this day back in the year 580 AD, the Chinese invented toilet paper (well, at least according to historyorb.com they did). I doubt the accuracy of this fact, as various sources give widely differing historic accounts of this rather personal product. It is, however, too good a topic to let pass, so I will accept it as true for now.

To make things more interesting, I have also found a site claiming that today is the day back in 1871 when toilet paper was first sold on a roll in the US, and that today is, in fact, National Toilet Paper Day in the States.

So whichever way you look at it, toilet paper’s shadow looms large over this day.

Spotlight on toilet paper – basic commodity or luxury item?
(© All Rights Reserved)

Of course, when you start thinking about “the first use of toilet paper”, the second thought that enters your mind almost immediately is “what did they use before?”. Well, whatever was available, it seems – grass, leaves, moss, corncobs, coconut shells (I cannot quite get my mind around that one!), snow, sheep’s wool… The Romans, fancy buggers that they were, used sponges and salt water.

It does seem to be a generally accepted fact that it was the Chinese who introduced the use of paper for cleaning up after ‘the act’. The earliest recorded reference to the use of toilet paper seems to come from the Chinese scholar Yan Zhitui, who wrote in 589 AD: “Paper on which there are quotations or commentaries from the Five Classics or the names of sages, I dare not use for toilet purposes.” (According to Wikipedia.)

On a roll

Rolled and perforated toilet paper, similar to what we know today, only saw the light of day in the mid 19th century, with American Seth Wheeler taking out a patent for it in 1871. It seems the commercial potential of purpose-made toilet paper was marred in the early days by the fact that people were too embarrassed to ask for it, or to be seen buying it, so Wheeler’s first company, the Rolled Wrapping Paper Company, failed to turn a profit. Things have obviously changed since then, with toilet paper today being a multi-billion dollar industry.

The future

It’s interesting to speculate about the future of bathroom hygiene.  Will toilet paper remain the product of choice in the Western world? A toilet known as the ‘Washlet’ (a toilet equipped with a bidet and air blower) is growing in popularity in Japan, while many countries in the Middle East and Asia prefer water cleaning. As we continue to exhaust the world’s natural resources, and manufacturing costs continue to rise, will a product as humble as the toilet roll become too much of a luxury item for many people to afford?

Interesting thought… Considering that the average American reportedly uses almost 60 squares of toilet paper a day, and the market for the product is booming in developing countries, it really is a huge volume of wood pulp that simply goes down the toilet – thousands upon thousands of trees are consumed daily by the toilet paper industry.

Over or under?

OK, time for a quick amusing fact:  In brand new research published in the US, a survey was done to find out whether Americans prefer their toilet paper to hang over or under the roll. The result? A staggering 75% of respondents preferred the paper hanging over the roll. Women appear to be even more adamant about this, as do people over the age of 60. Nevada turned out to be the ‘over-hanging’ capital of the US, with almost 100% preferring the over-the-roll option. For more have-to-know information, you can read more on the survey results here.

So how do you roll?

The birth of Linux, giant killer of the Open Source world

A while ago, I published a post on the start of the open source operating system revolution. As mentioned there, Linus Torvalds did not ‘invent’ the open source operating system with Linux, but there’s no denying that he is one of the true superstars of the open source world, and that Linux is, without a doubt, one of the few open source operating systems that have managed to make the big commercial players sit up and take notice.

From cellphones to supercomputers – Linux is a popular operating system across a wide range of platforms.
(© All Rights Reserved)

There is some debate around the date that should be considered the ‘official’ birthday of Linux – there are three early emails from Torvalds making reference to his operating system – but the general consensus seems to be that his email of 25 August 1991 best represents Linux’s inception:

From:torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroup: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID: 1991Aug25.205708.9541@klaava.Helsinki.FI
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki.

Hello everybody out there using minix-

I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I’d like any feedback on things people like/dislike in minix; as my OS resembles it somewhat (same physical layout of the file-system due to practical reasons) among other things.

I’ve currently ported bash (1.08) and gcc (1.40), and things seem to work. This implies that I’ll get something practical within a few months, and I’d like to know what features most people want. Any suggestions are welcome, but I won’t promise I’ll implement them 🙂

Linus Torvalds torvalds@kruuna.helsinki.fi

Originally developed for Intel x86 personal computers, the Linux operating system has since been ported to a wider range of platforms than any other operating system, ranging from servers to supercomputers to embedded systems. The Android operating system, used by a wide range of mobile devices, is built on a Linux kernel. Quite amazing for a system that its creator described as “just a hobby, won’t be big and professional like gnu”.

The Linux story really is a feel-good tale of how a non-commercial product, based on a free and open community-based development model, can match and exceed its multi-million dollar commercial competition.

Happy birthday, Linux, and power to you, Linus Torvalds – may you long continue to steer the ship, and take others along on your quest for the open and the free.

Celebrating George Crum and the birth of the potato chip

I should start today’s post with a bit of a disclaimer – while this tale is told as the truth, the exact date details are difficult to confirm. However, most references I could find stated the date as 24 August 1853, so here goes.

On the above date, railroad magnate Commodore Cornelius Vanderbilt went dining at the Moon Lake House, a restaurant in Saratoga Springs, New York. He ordered french fries, but found the fries he received too thick, bland and soggy, so he sent them back to the kitchen. George Crum, the chef at the Moon Lake House, wasn’t impressed by what he considered to be an overly fussy customer, so he went overboard to address his concerns – he sliced the fries paper-thin, fried them to a crisp and seasoned them with a generous helping of salt. Much to his amazement, Vanderbilt loved the crispy chips, so much so that the restaurant decided to add them as a regular menu item, under the name ‘Saratoga Chips’.

A few years later, in 1860, chef Crum opened his own restaurant, and he took pride in serving his ‘signature dish’, placing potato chips in baskets on every table.

Crispy, crunchy potato chips – not the healthiest snack around, but we cannot seem to get enough of them.
(© All Rights Reserved)

Despite the popularity of Crum’s invention, no-one recognised its potential as a mass-produced, off-the-shelf snack – it remained a restaurant delicacy until 1926, when Mrs Scudder began mass-producing potato chips packaged in wax paper bags. In 1938, Herman Lay started producing Lay’s Potato Chips, the first successful national brand in the US.

The rest, as they say, is history – chips (or crisps, as the Brits like to call them) have taken over the world, with the global chip market in 2005 generating total revenues of more than US$16 billion. That’s more than a third of the total savoury snack market for the year.

Of course, being deep-fried and doused in salt, chips aren’t exactly a health snack. They have been identified as one of the leading contributors to long-term weight gain, as well as being linked to heart disease. In response to these issues, potato chips companies are investing huge amounts in research and development of new, more health-conscious products. Frito-Lay, for example, have reportedly invested more than $400 million in new product development, including techniques to reduce the salt content in Lay’s potato chips without compromising taste.

Now flavour is one thing, but did you know that the crunch produced when we bite into a chip, also plays a significant role in our perception of the snack? According to a New York Times article, a team of psychologists at Oxford University conducted an experiment where they equipped test subjects with sound-blocking headphones, and made them bite into potato chips in front of a microphone. In different test runs, using the exact same chips, the sound of the crunch was processed in different ways and passed back to the testers via the earphones. Taking their perception of the unaltered sound as the benchmark, they found that when the crunchy sound was amplified, testers considered the chips to taste fresher and crispier, while muting the crunch resulted in the same chips being rated as less crispy and stale.

Hmmm, all this talk about crunchy chips is making me hungry – I can definitely do with a bag of good old Salt & Vinegar chips right about now!

Joining hands on Black Ribbon Day

Today is International Black Ribbon Day; also celebrated as the European Day of Remembrance for Victims of Stalinism and Nazism in Europe. While it is a day highlighting a dark part of history, more than anything else, today is a celebration of the human spirit, about unity and about how amazing things can be achieved by joining hands and standing together (quite literally, in this case).

Joining hands to overcome hardship (and to solve mathematical problems!).
(© All Rights Reserved)

Black Ribbon Day originated in the 1980s, as an annual series of demonstrations held on 23 August in various western countries to highlight crimes and human rights violations in the former Soviet Union. The date marks the anniversary of the signing of the Molotov-Ribbentrop pact between the Nazi and Soviet Communist regimes – an event described by President Jerzy Buzek of the European Parliament as “the collusion of the two worst forms of totalitarianism in the history of humanity.”

Starting with initial participation of western countries only, it spread to the Baltic states in 1987, and in 1989 culminated in a historic event known as the Baltic Way. The Baltic Way, also referred to as the Baltic Chain, the Chain of Freedom and the Singing Revolution, was a peaceful demonstration involving almost two million people joining hands to form a 600km long human chain across the three Baltic states (Estonian SSR, Latvian SSR, and Lithuanian SSR), to protest against continued Soviet occupation.

The Baltic Way was meant to highlight the Baltic states’ desire for independence and to show the solidarity between the 3 nations. It proved an effective, emotionally captivating event. Within 6 months of the protest, Lithuania became the first Republic of the Soviet Union to declare independence, with Estonia and Latvia following in 1991.

Now you may be wondering why I’m discussing International Black Ribbon Day and the Baltic Way on this blog. Well, besides it being an opportunity to celebrate the strength of the human spirit in overcoming adversity, what caught my attention was something small and (almost) unrelated that grew out of it – the Baltic Way Mathematical Contest.

This maths contest has been organised annually since 1990, in commemoration of the Baltic Way human chain demonstrations. It differs from most other international mathematical competitions in that it is a true team contest. Teams, consisting of five secondary school students each, are presented with 20 problems, and they have four and a half hours to collaboratively solve them.

Initial participation was limited to the three Baltic states, but the competition has grown to include all countries around the Baltic Sea. Germany participates with a northern regions team, and Russia with a team from St Petersburg. Iceland has a special invitation for being the first state to recognise the independence of the Baltic States, and guest countries (including Israel, Belarus, Belgium and South Africa) have been invited in particular years, at the discretion of the organisers.

From people joining hands to overcome political hardship to students teaming up to solve complex mathematical problems, today truly is a day to celebrate strength in unity.