Celebrating the legacy of Carl Zeiss

Today we celebrate one of the great names in optics – it’s the birthday of Carl Zeiss, born on 11 September 1816.

Zeiss studied mathematics, physics and optics, among other subjects, at the University of Jena, before he started experimenting with making lenses. In 1847 he founded the company that became Carl Zeiss AG and began manufacturing microscopes full time.

Carl Zeiss built the Zeiss empire through the manufacture of innovative, high quality optics for use in microscopy.
(© All Rights Reserved)

Zeiss' real contribution came from his realisation that, to differentiate himself from other manufacturers in the optics industry, he had to significantly up the ante in terms of quality and innovation. He first teamed up with the physicist Dr Ernst Abbe, who calculated that the optical quality of lenses at the time left much room for improvement, but also found that the optical glass available was not up to his manufacturing requirements. Zeiss then brought on board the glass chemist Dr Otto Schott, who established a glassworks at Jena where he produced new, better quality glass that was able to meet and exceed Abbe's requirements.

While the lenses produced by Zeiss were initially used primarily in the manufacture of microscopes, the glass produced at Jena also opened up possibilities for the creation of much improved photographic lenses, for use in still and motion picture cameras. Zeiss' early innovations in photographic lenses came mostly through the contributions of Dr Paul Rudolph, who was responsible for many classic Zeiss lenses around the end of the 19th century, including the famous Planar® in 1896. Later famous Zeiss lenses included the Tessar® (1902) and the Sonnar® (1931). In 1935, Alexander Smakula developed an innovative anti-reflective coating for camera lenses, known as the Carl Zeiss T-coating, which opened up totally new possibilities in lens design and remains a key element of modern photographic lenses.

Even though many of the photographic contributions made by the Carl Zeiss AG company only happened after the death of its founder (Carl Zeiss died on 3 December 1888), his name will always be inextricably linked to top quality photographic optics. Zeiss lenses were used extensively in the cameras manufactured by Zeiss Ikon, one of the companies in the Zeiss group, which started producing the classic Contax cameras in the 1930s. The Contax rangefinder was the first 35mm camera to pose a serious challenge to Leica's iconic rangefinders of the time.

Zeiss lenses have been used by many of the great camera brands, including Voigtlander, Hasselblad, Rollei and Sony.

Even in the 21st century, the name Carl Zeiss remains synonymous with quality optics, and brands sporting Zeiss lenses proudly flaunt the fact.
(© All Rights Reserved)

Aside from microscopy and photography, the optical innovations created by Carl Zeiss and his company have found use in a wide range of applications, from medical solutions to sports optics to industrial metrology.

Commemorating the discovery of DNA fingerprinting

Some time ago, I did a post about fingerprinting and personal identification. Now while fingerprinting has been around for more than 150 years, a new breakthrough in personal identification happened much more recently – on this day in 1984, DNA fingerprinting was discovered in Leicester, England.

The man who discovered this revolutionary new technique was Sir Alec Jeffreys of the University of Leicester. He was the first to realise that each person's DNA has a unique pattern, almost like a bar code, and that this could be used as a biological identification method. Over the past 25+ years, the technique has proved an invaluable tool in forensics, crime investigation and the identification of genetic relationships.

Geneticist studying a DNA profile.
(© All Rights Reserved)

Of course no technique is perfect, and in the case of DNA fingerprinting there are also rare occasions where the technique cannot be used. Identical twins, for example, have matching DNA, and so their DNA fingerprints are the same.  A much rarer, and much stranger, occurrence is when a single person has more than one DNA fingerprint.

Strange as this may seem, having a single person with two distinct genetic identities is possible. This condition is known as chimerism, named after the chimera, a mythological creature with features from more than one distinct animal, for example a lion’s head and a serpent’s tail.

A human chimera is formed during pregnancy. Normally the male gamete (sperm) fuses with the female gamete (ovum) to form a zygote, the cell that becomes the embryo. This embryo has a new genetic identity, formed from a unique combination of the DNA of the mother and the father. On rare occasions, two male gametes fuse with two female gametes to form two zygotes, which develop into two (non-identical) twin embryos. These embryos each have a different, unique DNA fingerprint, inherited from the father and mother.

In extremely rare cases, these two embryos can fuse, growing into a single child formed from four gametes and thus having two distinct sets of DNA. A chimera can grow up without anyone ever knowing about their double identity, yet may in fact have organs or body parts with completely different genetic fingerprints. Even when a DNA identity test is done on a chimera, DNA will usually only be taken from a single source, such as a blood sample or cheek swab, and the second identity may never be discovered.

Chimerism may, on rare occasions, visibly manifest itself, for example in people having both male and female reproductive organs, or two differently coloured eyes. (However, differing eye colours, or heterochromia, can have various causes, and are, as far as I know, not necessarily an indication of chimerism.)

(© All Rights Reserved)

The most famous example of a chimera confounding DNA profiling came from a case in 2003, when a mother of three was told, after DNA tests were done on her and her three children, that two of her three sons were not hers, even though she maintained that she had conceived them with her husband and delivered them naturally.

After more extensive testing, it was discovered that she was a chimera, and that the two sons thought not to be hers did in fact match her ‘second identity’.

Definitely a case where truth is, in fact, stranger than fiction.

Salami – good when it’s meat, less so when it’s science

Today is a celebration of that greatest of cured meats – it’s Salami Day.

Salami is a cured, fermented and air-dried sausage-style meat, usually made from pork and/or beef, but sometimes also from a range of other meats including venison and turkey (and even, apparently, shark and swordfish in Japan). The meat is minced together with a range of spices, garlic, fat, herbs and wine or vinegar, and left to ferment for a day or so before being stuffed into a (usually edible) casing and hung out to cure. The casing is sometimes treated with an edible mould culture, which adds flavour and helps protect the salami from spoilage.

It first became popular among southern European peasants, thanks to the fact that it doesn't require refrigeration and can last at room temperature for a month or longer. (It is this feature that also makes it one of my personal favourite foods to take on multi-day hikes – few things beat a couple of slices of salami on some cracker-bread over lunch, somewhere out in the middle of nowhere.)

A traditional aged, peppered Hungarian salami – finger-licking good.
(© All Rights Reserved)

Of course, in science, salami has a very different connotation. The phrase 'salami science' refers to a scientific publishing tactic where the same body of research is published in more than one journal or, more commonly, where the results from a single research project are sliced up into multiple smaller research results (spread over time, for example) and published separately. The second option is also referred to as 'salami slicing', because you are effectively slicing your single research salami into a whole bunch of smaller slices, spread across different publications.

This is an unfortunate practice because it can skew research data, and it makes it more difficult to get the 'big picture' with regard to a specific body of research. It is, however, a result of the way the value of a scientist is measured in the scientific community – the more you publish, the better you are rated, and the more funding you can attract. This 'publish or perish' phenomenon is well known in science, where the size of an individual's or group's scientific output is overemphasised, rewarding quantity over quality.

Nature magazine has gone so far as to say that salami science "threatens the sustainability of scientific publishing as we know it". Fighting the practice means journals have to spend more time and effort ensuring that the same results have not been published elsewhere, increasing the workload on already stretched staff and peer reviewers.

Of course quantity is not the only criterion used to judge a scientist's research output – references and citations also play an important role. However, formulae for quantifying research output are often oversimplified and skewed towards quantity. To again quote Nature magazine, "The challenge then is not only to establish more sophisticated means to assess the worth of a researcher's scientific contribution, but for bodies making such assessments to make it plain that it is scientific rigour and not merely numerical output that will lead to success".

It definitely seems slicing your salami thin is better when you’re talking meat than when you’re talking science. In fact, referring to the meaty version, it’s probably a very good idea to slice it thin – when it comes to processed meat (including salami), moderation is definitely a good thing. In a report in the Guardian, the World Cancer Research Fund (WCRF) has warned that excessive intake of processed meat can increase your risk of developing cancer.

According to the WCRF, “If everyone ate no more than 70g of processed meat – the equivalent of three rashers of bacon – a week, about 3,700 fewer people a year in Britain would be diagnosed with bowel cancer”.

So, in celebration of Salami Day, get yourself a good quality salami (paying a bit more really is worth it when it comes to enjoying a good salami) and enjoy a taste of meat-heaven.

Just don’t overdo it.

And don’t cheat with your research. 🙂

Celebrating World Chocolate Day (while we still can!)

We all love chocolate, don't we? So much so, that there's a whole host of chocolate-y celebrations out there – National Chocolate Day, International Chocolate Day, days for different flavours of chocolate… Forced to pick one date, I've decided to dedicate a post to this particular chocolate day – today, 4 September, we celebrate World Chocolate Day.

Chocolate has long been associated with love and attraction, despite there being no conclusive proof of its aphrodisiac properties. It does, however, contain theobromine, a stimulant that has also been reported to assist physical and mental relaxation, as well as tryptophan, a chemical used by the brain to produce serotonin, which stimulates endorphins, resulting in feelings of elation. Moreover it contains phenylethylamine, a neurotransmitter that helps promote feelings of excitement and giddiness. All together, that's close enough to an aphrodisiac in my book.
(© All Rights Reserved)

Looking at some of the latest chocolate-related science news that I’ve come across, however, the chocoholics among us may in future have less and less reason for celebration, as the future of chocolate looks ominously bleak.

Firstly, it appears that worldwide chocolate consumption is exceeding production, which means that chocolate will increasingly become a luxury commodity, fetching higher and higher prices. Apparently cacao trees can only be grown naturally in a narrow band within 10 degrees of the equator, and more and more farmers in this band are turning to more lucrative alternative crops such as genetically modified maize, soybeans and palm oil. Geneticists are trying to develop higher-yielding cacao crops, but there is no guarantee yet that this will remedy the situation.

If that is not enough, there is a fear that diseases may devastate what is left of the global cocoa supply. Fungal diseases such as witches' broom and frosty pod have already destroyed most cacao crops in Central America, and the concern is that if these diseases spread to Africa, the majority of global cocoa production may be at risk. Again, the best defence lies in bioscience – if scientists can succeed in sequencing the cacao tree genome, it will help them develop genetically modified plants that are resistant to infection.

If you think the above challenges make the future of chocolate look a bit suspect, here’s the cherry on top – apparently, climate change may result in West Africa (the source of most of the world’s chocolate supply) becoming too hot to sustain cacao growing in the region.

According to a report in Scientific American, it is estimated that by 2060 more than 50% of the West African cocoa-producing countries may be too hot to continue growing the crop, which will also contribute to chocolate prices spiralling out of control. The slack in the market may be picked up by regions that were previously too cool for growing cacao, but that would require these regions to switch from other crops that may be considered more lucrative. Thus there's yet another challenge for the genetic scientists – developing a drought-resistant cacao tree capable of handling the effects of global warming.

So, in a nutshell, to save chocolate from becoming an unaffordable luxury commodity, scientists are in a mad race to develop new, genetically modified strains of cacao tree that are higher yielding, infection resistant and able to withstand heat and drought.

Sorry to leave you with such a depressing message on World Chocolate Day – I guess we can only hope that science will step up to the plate and save the day, enabling us to continue enjoying the wonderful product of the cacao tree for many years to come!

John Macarthur and the birth of the Australian wool industry (not just another Aussie sheep joke!)

Today we celebrate the birthday of John Macarthur (3 Sep 1767 – 11 Apr 1834), the English-born Aussie who is recognised as the pioneer of the wool industry that boomed in Australia in the early 19th century and has since been one of the country's agricultural trademarks.

Sheep – serious business Down Under.
(© All Rights Reserved)

Macarthur was born in Plymouth, Devon in the UK. He began his career in the army and, after various assignments and activities, joined the New South Wales Corps in 1789 and was posted to faraway Sydney, Australia. A fiery character, his life story reads like a historical romance novel, with way too many saucy details (battles with authorities, involvement in a military coup, land battles and much more) to get into on this forum.

Suffice to say, after settling in Australia, Macarthur got involved in rearing sheep for mutton, purchasing his first flock in 1795. In 1797 he also purchased a small flock of Spanish Merinos, imported from the Cape Colony (part of what later became South Africa). The merino is an excellent wool breed, and it didn't take long for Macarthur to recognise the economic potential of wool production for export, as opposed to simply rearing sheep for the local meat market. What made wool a potential export hit was the fact that it was a non-perishable commodity (a necessary feature, given Australia's distance from the markets of the UK and Europe) and offered a high value per unit of weight.

On a trip back to London he lobbied for more land, and succeeded in being granted 5000 acres of the best pasture land in New South Wales. He became the largest sheep rearer in the colony, and made a fortune exporting merino wool to the UK, which was at the time cut off from its traditional wool supplier, Spain, as a result of the Napoleonic Wars. He also gained recognition for producing wool of the finest quality, which further raised the prices at which he was able to sell his produce.

Macarthur's ventures opened the door for others to follow, and Australia's wool export market started to boom in the early 19th century. Wool remains a key export commodity, with Australia still the world's largest producer, mainly from merino sheep. New Zealand is in second place, and China in third. The wool produced in Australia and New Zealand is considered to be of the finest international quality – the best Aussie and Kiwi merino wool is known as grade 1PP, and is the industry benchmark of excellence for merino wool.

Natural wool is one of nature's super-products. It is technically superior to synthetic materials in various ways – it has better heat insulation and superior hydrophilic properties, it is naturally flame-retardant, resistant to static electricity, and hypoallergenic. Researchers at the Royal Melbourne Institute of Technology have developed a material blending wool and Kevlar (the material often used in body armour); the blend was found to be lighter and cheaper, and to outperform Kevlar in damp conditions.

What's more, wool is also environmentally preferable to materials like nylon or polypropylene. According to the latest research on the role of wool in the natural carbon cycle, it has been suggested that, under the right circumstances, wool production can potentially be carbon neutral.

So while the Aussies and Kiwis may suffer endless jokes relating to their sheep, the product being produced is something very special.  And John Macarthur deserves a tip of the hat as the bloke who kicked it all off more than 200 years ago.

Getting fired up on Redhead Day

The 1st of September is generally considered the first day of Spring in the Southern Hemisphere. However, dedicating the day to celebrating the coming of spring and the end of winter seems a little cruel to all the Northernites out there who are just entering their long cold winter (especially considering that I’ve already pretty much done exactly that a couple of days ago).

So, rather than discussing the seasons again, let's consider another special reason to celebrate this day – today is International Redhead Day!

Legend has it that blondes have more fun, but I’m not convinced!
(© All Rights Reserved)

People born with red hair are, in a way, similar to those born left-handed – a genetic minority group with a fierce pride in the unique feature that makes them special and part of an exclusive 'club'. In fact, the 'natural redhead club' is even more exclusive than the lefthanders club, with less than 1% of the world population having naturally red hair. The exceptions are Scotland and Ireland, where more than 1 in 10 people have red hair. Former colonies of the UK are also blessed with a significantly higher than average sprinkling of redheads.

Quite a few redheads may count themselves as part of the super-exclusive intersection of the redhead club and the lefthanders club  – apparently, since red hair is a recessive trait, and recessive traits often come in pairs, redheads are more likely than others to be lefthanded!

Redhead Day started in the Netherlands, as a festival called Roodharigendag that takes place every first weekend of September in the city of Breda. Taking its cue from this event, the celebration of red hair has spread around the world, with the first Saturday of September becoming an international celebration of the fiery top.

The Dutch Roodharigendag festival is itself a pretty global affair, attracting attendance from over 50 countries.  In addition to being a gathering of thousands of people with natural red hair, the festival also celebrates art featuring the colour red, and includes lectures, workshops and demonstrations aimed at red-haired people. And of course many, many photo shoots.

It's quite interesting that the festival is held in the Netherlands, where less than 2% of the population have red hair. However, it is exactly this fact that resulted in the festival happening in the first place. It was started in 2005, when the Dutch painter Bart Rouwenhorst decided to do 15 paintings of redheads. Knowing how hard it is to find redheads in the Netherlands, he placed an advert in the newspaper, and to his surprise 150 people volunteered. Rather than turning most away, he chose 14, organised a group photo of the others, and used a chance lottery to select the 15th and final model. This get-together of 150 redhead would-be models became the first redhead festival. It made headline news in the Dutch national press, and the rest, as they say, is history, with the number of attendees growing rapidly each year.

Red hair is caused by a gene mutation – a variant of MC1R, the melanocortin-1 receptor. The variant is what's known as a recessive gene, which means that, for a child to have red hair, (s)he has to inherit a copy of the mutated MC1R from each parent. While a reasonable number of people carry a copy of the mutated gene, the chance of two carriers having children together is quite small. However, the claim that redheads may become extinct because of this is unfounded. While recessive genes can become rare, they are unlikely to disappear completely, unless some natural disaster causes everyone carrying the gene to die. So even if redheads become rarer, there should always be people around who carry the gene, and redheads should continue to pop up from time to time. As stated in National Geographic, "while redheads may decline, the potential for red isn't going away".
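To make the inheritance arithmetic concrete, here's a minimal sketch in Python (my own illustration, not from the original post) simulating how the recessive allele plays out when two carriers have children:

```python
import random

def child_is_redhead(parent1, parent2):
    """Each parent passes on one of their two MC1R alleles at random;
    red hair requires inheriting the mutated copy ('r') from both."""
    return random.choice(parent1) == 'r' and random.choice(parent2) == 'r'

carrier = ('R', 'r')  # one normal allele, one mutated (recessive) allele
trials = 100_000
redheads = sum(child_is_redhead(carrier, carrier) for _ in range(trials))
print(f"Red-haired children from two carrier parents: {redheads / trials:.1%}")
# Expect roughly 25% -- the classic one-in-four recessive ratio.
```

And since most potential partners aren't carriers at all, the overall odds of a red-haired child are far lower than that one-in-four figure – which is exactly why redheads stay rare without the allele itself ever going away.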

Which I think is a good thing – the world would definitely be a worse place without redheads around to spice things up.

Perhaps we should leave the last word to Mark Twain (himself a redhead), who famously quipped that “while the rest of the human race are descended from monkeys, redheads derive from cats”.

Happy Redhead Day, everyone!

Daffodil Day and the ongoing fight against cancer

It's Daffodil Day today, August 31st. Well, it's Daffodil Day in New Zealand, to be exact – Australian Daffodil Day already happened on the 24th of this month. The US, bless them, seem to have a whole bunch of different Daffodil Days across different states. (With daffodils being a spring flower, it obviously makes sense that most US Daffodil Days happen earlier in the year, around February, and not in August/September as they do down here in the South.)

Daffodil Day is all about cancer – raising awareness of the disease, raising funds for cancer related research, and creating a support network for individuals suffering from the disease.

The daffodil is used internationally by cancer societies as the global symbol of hope for people living with cancer because it is one of the first, and one of the strongest, flowers of spring, and as such a symbol of hope and renewal, new life, new beginnings and new possibilities.
(© All Rights Reserved)

Cancer is an incredibly pervasive, prevalent disease – here in New Zealand it is the leading cause of death – and I'm sure there are very few people who are not in some way directly affected by it. My dad died of cancer of the liver and colon; my mother-in-law is a breast cancer survivor; just about everyone I know has someone close to them who has either died from, or is living with, the disease.

In a nutshell, cancer occurs when cells in the body accumulate genetic changes (due to various factors), resulting in a loss of growth control. Normal cells grow, divide and die in an orderly manner, in response to signals from the body and the environment. When cells become cancerous, however, they fail to respond to the normal signals, and start growing and dividing in an uncontrolled manner. These out-of-control cells can spread through the body via the bloodstream or lymph vessels (a process called metastasis) and continue to grow and replace normal tissue. It is the fact that the body's own cells effectively turn against their host that makes cancer such a complex disease to treat.

As mentioned, one of the critical focus areas of Daffodil Day is raising money to support research into finding cures for the disease.

Over the years, literally billions of dollars have been spent on cancer research, and it’s quite a sobering thought when you realise that, in spite of all this, the death rate from the disease has changed little over the past 50 or so years. As new therapies are developed, cancer also adapts and evolves, finding new ways to kill.

Now this does not mean all is in vain – millions of people have been saved from the therapies that have been developed. All it means is that there is no room for complacency, and new and more effective cancer therapies are continually needed to stay ahead of, or at least keep up with, the disease.

In my job as a science photographer, I interact with a wide range of research and technology organisations, and one of the most inspiring of these is the Malaghan Institute of Medical Research – New Zealand’s leading medical research institute, and a registered charity based in Wellington, NZ. The reason I mention this fact is that one of their main fields of research is cancer (they also research cures for asthma, arthritis, multiple sclerosis and infectious diseases) and they are one of the organisations supported through the proceeds of fundraising events like Daffodil Day.

One of the main fields of cancer research that the Malaghan Institute focuses on is immunotherapy, which basically involves using the immune system and its unique properties to complement existing cancer treatments. As they explain, "Immune cells are specific and have the capacity to discriminate between normal and cancer cells, they have powerful effector capacity and can recruit inflammatory cells to destroy neoplastic tissue, and they can migrate to different tissues and eliminate residual metastatic disease." So, techniques similar to those used in helping the immune system recognise and fight contagious diseases (such as vaccination) can also be used to help the immune system recognise cancer cells and strengthen its ability to destroy them.

Another more recent research subject at the Institute is cancer stem cell research. Cancer stem cells are cancer's evil root – these tumour-initiating cells are highly resistant to drug and radiation treatment – and the focus of the research is on finding safe and effective ways to eradicate them.

Researchers at the Malaghan Institute of Medical Research are conducting research into immunotherapy, harnessing the full potential of cancer patients' immune systems to fight the disease.
(© All Rights Reserved)

Organisations like the Malaghan Institute, and many others like them across the world, are doing incredible work to address the continually evolving threat of cancer, and really need all the support they can get. It’s a scary, scary topic, and it’s good to know there are talented, committed scientists and researchers out there facing the challenge head on.

Celebrating Ernest Rutherford, the father of nuclear physics

Today we celebrate the life and work of New Zealand’s greatest scientist, Lord Ernest Rutherford – the father of nuclear physics. In the words of the author John Campbell, “He is to the atom what Darwin is to evolution, Newton to mechanics, Faraday to electricity and Einstein to relativity.”

Rutherford was responsible for three fundamental contributions to the field: (1) he explained radioactivity as the spontaneous disintegration of the atom; (2) he determined the structure of the atom; and (3) he was the first to split the atom.

One of New Zealand’s proudest sons, ‘Lord Rutherford of Nelson’, graces the front of the country’s highest value bank note, the $100 note. Appearing with him is his Nobel Prize medal and a graph plotting the results from his investigations into naturally occurring radioactivity.
(© All Rights Reserved)

Ernest Rutherford was born on 30 August 1871 in the South Island town of Nelson, New Zealand. His father James Rutherford, the son of a Scottish immigrant, came to New Zealand at the age of four, while his mother, Martha Rutherford (née Thompson) emigrated with her widowed mother from England when she was thirteen. The Rutherfords, in the words of dad James, wanted “to raise a little flax and a lot of children”. Not sure how they managed on the flax, but they certainly lived up to their aspirations in the children department – young Ernest was the second son, and fourth child, of no less than twelve Rutherford children.

Rutherford excelled academically, winning a scholarship to Canterbury College of the University of New Zealand. After completing his basic university studies there, he successfully applied for another scholarship, which enabled him to go to the UK to complete his postgraduate studies at the Cavendish Laboratory, University of Cambridge.

While working with Professor J.J. Thomson, Rutherford discovered that radioactive uranium gave off two separate types of emissions – he named these alpha and beta rays. Beta rays were subsequently identified as high speed electrons.

Radioactivity and the spontaneous disintegration of the atom

In 1898 Rutherford accepted a professorship at McGill University in Montreal, Canada. It was here, with the help of a young chemist, Frederick Soddy, that he conducted the research that gained him the 1908 Nobel Prize in Chemistry, investigating "the disintegration of the elements and the chemistry of radioactive substances". (Soddy himself later received the 1921 Nobel Prize in Chemistry.)

Determining the structure of the atom

A subsequent move to Manchester, England, to be nearer to what he considered the main centres of science, saw Rutherford take on a professorship at Manchester University. With his research assistant, Ernest Marsden, he investigated the scattering of alpha rays (something he had first noticed while still at McGill). They noticed that some alpha rays would 'bounce back' when hitting even a thin gold film – a most surprising result, which Rutherford likened to firing a large naval shell at a piece of tissue paper and seeing it bounce back. This led him to develop his concept of the 'nucleus', his greatest contribution to physics. According to this concept, the whole mass of the atom, and all its positive charge, is concentrated in a minuscule point at its centre, which he termed the nucleus.
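For the mathematically inclined: Rutherford backed this picture with his famous scattering formula, which is worth seeing even though the post above doesn't derive it. In its standard textbook form (Gaussian units; my addition), for an incoming particle of charge $Z_1 e$ and kinetic energy $E$ scattering off a nucleus of charge $Z_2 e$, the differential cross-section is

$$\frac{d\sigma}{d\Omega} = \left(\frac{Z_1 Z_2 e^2}{4E}\right)^2 \frac{1}{\sin^4(\theta/2)}$$

The $1/\sin^4(\theta/2)$ dependence means most alpha particles are barely deflected, while a tiny fraction bounce back at large angles – exactly what the Manchester experiments observed, and exactly what a point-like concentration of positive charge predicts.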

The Danish physicist Niels Bohr began working with Rutherford, and he adapted Rutherford’s nuclear structure to include electrons in stable formation around the nucleus. The Rutherford-Bohr model of the atom, with some improvements from Heisenberg, remains valid to this day.

Splitting the atom

In 1919, during his last year at Manchester, Rutherford noted that the nuclei of certain light elements, like nitrogen, would disintegrate when bombarded by alpha particles coming from a radioactive source, and that during this process fast protons were emitted. By doing this, Rutherford became the first person to split the atom. Patrick Blackett (winner of the 1948 Nobel Prize in Physics) later proved that splitting the nitrogen atom actually transformed it into an oxygen isotope, so Rutherford effectively became the first to deliberately transmute one element into another.
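In modern notation (my addition; the post doesn't spell it out), the reaction Blackett confirmed can be written as

$$^{14}_{7}\mathrm{N} + \ ^{4}_{2}\mathrm{He} \rightarrow \ ^{17}_{8}\mathrm{O} + \ ^{1}_{1}\mathrm{H}$$

Note the bookkeeping: the mass numbers (14 + 4 = 17 + 1) and charges (7 + 2 = 8 + 1) balance on both sides, as they must in any nuclear reaction.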

Rutherford was knighted in 1914; he was appointed to the Order of Merit in 1925, and in 1931 he was raised to the peerage as Lord Rutherford of Nelson. A proud New Zealander despite living and working abroad for most of his academic career, he chose to include in his coat of arms a kiwi, a Maori warrior and Hermes Trismegistus, the legendary patron of knowledge and alchemy.

He died in Cambridge on 19 October 1937, survived by his wife, Mary (née Newton); their only child, Eileen, had died a few years earlier.

A great scientist, Rutherford’s contribution is perhaps best summarised in his eulogy in the New York Times:
“It is given to but few men to achieve immortality, still less to achieve Olympian rank, during their own lifetime. Lord Rutherford achieved both. In a generation that witnessed one of the greatest revolutions in the entire history of science he was universally acknowledged as the leading explorer of the vast infinitely complex universe within the atom, a universe that he was first to penetrate.”

It’s ‘More Herbs, Less Salt’ Day – time to give your heart a breather

Today, according to those in the know, is 'More Herbs, Less Salt' Day. Another of those days that has been thought up to try and nudge us towards a slightly healthier lifestyle (much like 'Independence from Meat' Day, which I blogged about earlier).

Indeed, leaning towards herbs, rather than heaps of salt, to season your food is not a bad idea at all. I’m sure anyone who has opened a general lifestyle magazine in the last 10 years will know that salt isn’t all that great for our overly stressed 21st century bodies – our poor hearts already have enough to deal with. Giving the heart a further knock by subjecting it to a high salt diet really isn’t a winning idea.

Using more herbs and less salt not only makes your food healthier, but tastier and prettier too.
(© All Rights Reserved)

There's a significant body of research linking high sodium diets to high blood pressure, which in turn is linked to heart attacks, strokes, kidney disease and other nasties. Proving that a decrease in salt actually reduces the risk of heart disease has been more difficult, but a long-term research project conducted a few years ago aimed to do exactly that. In an article entitled "Long term effects of dietary sodium reduction on cardiovascular disease outcomes: observational follow-up of the trials of hypertension prevention (TOHP)", a research team from Harvard Medical School presented their results from a long-term follow-up assessment of a sodium-reduction, hypertension prevention study done 15 years earlier. In the original intervention, a group of adults followed a sodium-reduced diet for between 18 and 48 months. The long-term follow-up found that, compared to the control group, "Risk of a cardiovascular event was 25% lower among those in the intervention group (relative risk 0.75, 95% confidence interval 0.57 to 0.99, P=0.04), adjusted for trial, clinic, age, race, and sex, and 30% lower after further adjustment for baseline sodium excretion and weight (0.70, 0.53 to 0.94), with similar results in each trial."

This led them to the conclusion that “Sodium reduction, previously shown to lower blood pressure, may also reduce long term risk of cardiovascular events.”
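If you're wondering how figures like 'relative risk 0.75, 95% confidence interval 0.57 to 0.99' are arrived at, here's a rough Python sketch, using made-up counts (not the actual TOHP data), of the standard calculation for a relative risk and its confidence interval:

```python
import math

# Hypothetical counts for illustration only -- NOT the TOHP study data.
events_treat, n_treat = 75, 1000    # cardiovascular events / group size
events_ctrl, n_ctrl = 100, 1000

risk_treat = events_treat / n_treat
risk_ctrl = events_ctrl / n_ctrl
rr = risk_treat / risk_ctrl         # 0.75 here means 25% lower risk

# 95% confidence interval via the usual log-transform of the relative risk
se_log_rr = math.sqrt((1 - risk_treat) / events_treat +
                      (1 - risk_ctrl) / events_ctrl)
lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI ({lower:.2f}, {upper:.2f})")
```

A confidence interval that stays below 1.0 (as in the study's 0.57 to 0.99) is what lets the researchers call the risk reduction statistically significant. (The published figures were also adjusted for trial, clinic, age, race and sex, which a simple calculation like this ignores.)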

To really put you off a high salt diet, a visit to World Action on Salt and Health, a website dedicated to 'improv[ing] the health of populations throughout the world by achieving a gradual reduction in salt intake', should do the trick. Just note, however, that this day (and most scientific research) calls for 'less salt', not 'no salt'. As a source of sodium, one of the body's primary electrolytes, salt is essential for the body to function – just not at the levels at which we're consuming it.

Herbs on the other hand don’t just taste good – they’re like a veritable medicine cabinet in your garden (or pantry, if you don’t grow your own). Besides often being rich in vitamins and trace elements the body needs, specific herbs have long been known for their medicinal effects.

Herbs like chamomile and lavender are known to have a calming effect; parsley, oregano and echinacea can boost the immune system; garlic contains selenium, which can help reduce blood pressure (now there's a good one to fight the effects of a high sodium diet!); mint and feverfew have been reported to reduce headaches; basil and bergamot fight colds and flu; lemon balm and rosemary are good for concentration and memory… The list goes on.

Of course, as with everything in life, the key is moderation – 'more herbs' should not be seen as a licence to go overboard on every herb you can lay your hands on. Reckless and injudicious use of herbal supplements can be very detrimental to your health, to say the least. Colorado State University hosts a nice site, Herbals for Health?, which is worth a read – it gives a balanced overview of the pros and cons of a few popular herbal supplements.

Despite the cautionary notes above, culinary herbs, especially freshly home-grown ones, should generally not pose health risks when used in moderation as an alternative to salt in daily cooking – and that, after all, is what this day is all about. Using herbs in cooking can be a very exciting way to improve your health and well-being, so have fun experimenting with all those new tastes and flavours!

Celebrating sound science communication with Scientific American

Today we celebrate a veritable institution in the international popular science communication landscape – the magazine Scientific American celebrates its incredible 167th birthday today, making it the oldest continuously published monthly magazine in the US.

Scientific American – a staple on the news stands and magazine racks of good bookshops around the world.
(© All Rights Reserved)

The first issue of the magazine, then a four-page weekly newspaper, appeared on this day back in 1845. It was published by Rufus Porter, a very interesting character who, besides being a magazine publisher, was also a painter, inventor, schoolmaster and editor. In line with Porter's personal interests, the magazine reported on happenings in the US Patent Office, as well as carrying popular articles on inventions of the time.

Porter's interest in the magazine didn't last long – after 10 months he sold it to Alfred Beach and Orson Munn I (for a whopping $800). It remained under the ownership of Munn & Company, which, in the century between 1846 and 1948, grew it from its humble beginnings into a large and influential periodical. In the late 1940s it was put up for sale again, and this time the magazine was sold to three partners, Gerard Piel, Dennis Flanagan, and Donald Miller Jr. They had reportedly planned to start their own new science magazine, but finding that Scientific American was for sale, they opted instead to buy it and work their ideas into the existing title. They made significant changes to the magazine, updating and broadening its appeal. Ownership remained stable from 1948 to 1986, when it was sold to the German Holtzbrinck group, which has owned it since. The current Editor in Chief is Mariette DiChristina – an experienced science journalist and the first woman in the magazine's history to hold the position.

What has kept the magazine alive and relevant for so many years is the fact that it has consistently focused on an educated, but not necessarily scientific, readership, clearly explaining the scientific concepts it reports on and maintaining strong editorial quality control. It has also, since its inception, used clear, explanatory visual illustrations to accompany its articles. In its long lifetime, the magazine has published contributions from many famous scientists, including more than 140 Nobel laureates. Albert Einstein contributed an article called "On the Generalized Theory of Gravitation" in 1950.

In 1996, the Scientific American website was launched. A mobile site, as well as the Scientific American Blog Network, followed in 2011. Since 2002, the magazine has also hosted its own annual awards, the Scientific American 50, recognising important science and technology contributions of the previous year, across a wide range of categories from agriculture to defence to medicine.

Here’s looking forward to many more years of quality science communication, and a big double-century celebration in 2045!