
Commemorating the discovery of DNA fingerprinting

Some time ago, I did a post about fingerprinting and personal identification. Now while fingerprinting has been around for more than 150 years, a new breakthrough in personal identification happened much more recently – on this day in 1984, DNA fingerprinting was discovered in Leicester, England.

The man who discovered this revolutionary new technique was Sir Alec Jeffreys of the University of Leicester. He was the first to realise that each person’s DNA has a unique pattern, almost like a bar code, and that this could be used as a biological identification method. The technique has, over the past 25+ years, proved an invaluable tool in forensics, crime investigations and the identification of genetic relationships.

Geneticist studying a DNA profile.
(© All Rights Reserved)

Of course no technique is perfect, and in the case of DNA fingerprinting there are also rare occasions where the technique cannot be used. Identical twins, for example, have matching DNA, and so their DNA fingerprints are the same.  A much rarer, and much stranger, occurrence is when a single person has more than one DNA fingerprint.

Strange as this may seem, having a single person with two distinct genetic identities is possible. This condition is known as chimerism, named after the chimera, a mythological creature with features from more than one distinct animal, for example a lion’s head and a serpent’s tail.

A human chimera is formed during pregnancy. Normally a male gamete (sperm) fuses with a female gamete (ovum) to form a zygote, the cell that becomes the embryo. This embryo has a new genetic identity, formed from a unique combination of the DNA of the mother and the father. On rare occasions, two male gametes fuse with two female gametes to form two zygotes, which develop into two (non-identical) twin embryos. These embryos will each have a different, unique DNA fingerprint, inherited from the father and mother.

In extremely rare cases, these two embryos can fuse, growing into a single child, but formed from four gametes, and thus having two distinct sets of DNA. The chimera child can grow up without anyone ever knowing about his double identity, but may in fact have different organs or body parts that have completely different genetic fingerprints. Even when a DNA identity test is done on a chimera, DNA will usually only be taken from a single source, such as a blood sample or cheek swab, and the second identity may never be known.

Chimerism may, on rare occasions, visibly manifest itself, for example in people having both male and female reproductive organs, or two differently coloured eyes. (However, having two different eye colours, or heterochromia, can have various causes, and is, as far as I know, not necessarily an indication of chimerism.)

(© All Rights Reserved)

The most famous example of a chimera confounding DNA profiling came from a case in 2003, when a mother of three was told, after DNA tests were done on her and her three children, that two of her three sons were not hers, even though she maintained that she had conceived them with her husband and delivered them naturally.

After more extensive testing, it was discovered that she was a chimera, and that the two sons thought not to be hers did in fact match her ‘second identity’.

Definitely a case where truth is, in fact, stranger than fiction.

Weekly Photo Challenge: Near and Far

I took these a few years ago while on a photography trip in the Richtersveld, a breathtakingly beautiful and barren landscape in the Northern Cape province of South Africa, right on the border between SA and Namibia. The focus of the shots was very much on texture and shape, and playing with near and distant elements.

(© All Rights Reserved)

Both these shots were taken on a trusty old Nikon F100 (I still love that camera so) and a 28mm prime lens, using Fuji Velvia slide film.  The film was cross-processed using the C41 colour negative development process, which basically ends up giving you very contrasty results, with quite aggressive grain and unexpected colour casts.

(© All Rights Reserved)

Ever since I first discovered cross-processing during my photographic studies, it has always been one of my favourite processing techniques. I love the fact that you’re never quite sure what you’re going to end up with – different types of slide film give widely varying results, and even the age of the film can lead to a different outcome.

While it is possible to simulate cross-processing quite successfully in digital photography during post-processing, it simply does not come close to the magic of getting your roll of cross-processed film back from the lab, and discovering the results for the first time.
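
For the digitally inclined, here is a minimal sketch of one way to fake the look in post-processing, assuming Python with Pillow and numpy installed. The contrast, saturation and per-channel scaling values are just guesses to taste, not a faithful model of the C41-processed Velvia described above.

```python
# A rough, hypothetical approximation of a cross-processed look. Assumes
# Pillow and numpy are installed; the enhancement factors and channel
# scalings below are illustrative guesses, not a recipe for real
# cross-processed Velvia.
import numpy as np
from PIL import Image, ImageEnhance

def fake_xpro(in_path, out_path):
    img = Image.open(in_path).convert("RGB")

    # Cross-processing tends to exaggerate contrast and saturation.
    img = ImageEnhance.Contrast(img).enhance(1.4)
    img = ImageEnhance.Color(img).enhance(1.3)

    # Add a simple colour cast by scaling the channels unevenly.
    arr = np.asarray(img).astype(np.float32)
    arr[..., 0] *= 0.95   # pull red down slightly
    arr[..., 1] *= 1.08   # push green up for the typical green/yellow cast
    arr[..., 2] *= 0.90   # pull blue down
    arr = np.clip(arr, 0, 255).astype(np.uint8)

    Image.fromarray(arr).save(out_path)

# Example (hypothetical filenames): fake_xpro("scan.jpg", "scan_xpro.jpg")
```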

Hmmm….. I need to get out and shoot some film again – digital is great, but you get withdrawal symptoms if you’ve been away from film too long!

The day the first computer bug was discovered

OK, so the legend goes like this:

Back in the late 1940s, the US Navy financed the building of an electromechanical computer at Harvard University, called the Harvard Mark II. It was basically a super-fast (for the time) calculating machine, made unique because several calculations such as the reciprocal, square root, logarithm and exponential, were built into the hardware, making execution much faster than on other similar machines of the time. Unlike modern computers, the Mark II was not a stored-program computer. Instead, program instructions were read sequentially from a tape, and then executed.
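
To illustrate the idea of tape-driven rather than stored-program execution, here is a toy sketch in Python. It is purely illustrative and not an emulation of the Mark II’s actual instruction set: instructions are simply read one after another from a “tape” and executed on the spot, with standard maths functions standing in for the hardware reciprocal, square root, logarithm and exponential.

```python
# Toy illustration only - NOT an emulation of the real Harvard Mark II.
# Instructions are read sequentially from a "tape" and executed immediately,
# rather than being stored in memory as a program.
import math

def run_tape(tape, registers):
    for op, dest, src in tape:              # read the tape sequentially
        x = registers[src]
        if op == "RECIP":
            registers[dest] = 1.0 / x       # stand-in for built-in reciprocal
        elif op == "SQRT":
            registers[dest] = math.sqrt(x)
        elif op == "LOG":
            registers[dest] = math.log(x)
        elif op == "EXP":
            registers[dest] = math.exp(x)
        else:
            raise ValueError(f"unknown operation: {op}")
    return registers

regs = {"A": 2.0, "B": 0.0, "C": 0.0}
tape = [("SQRT", "B", "A"), ("RECIP", "C", "B")]
print(run_tape(tape, regs))  # B = sqrt(2) ~ 1.414, C = 1/sqrt(2) ~ 0.707
```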

Anyway, back to the legend…  On this day, back in 1947, while the Harvard Mark II was doing its thing, humming away (as I presume they did), a technician noted an unusual object trapped in one of the computer’s relays. On closer inspection, he found it was a moth. The moth was removed and taped into the computer’s log book. Grace Hopper, computer scientist and US Navy Rear Admiral, saw the moth entry in the logbook and added the caption, “First actual case of bug being found”. This reference to a computer problem or glitch as a ‘bug’ caught on with other computer scientists, and has been used ever since, together with terms like debugging, etc.

I’ve discovered that I have a computer screen bug – hope it won’t cause serious problems!
(© All Rights Reserved)

Much of the above story is true – there was a moth found in the Harvard Mark II, on 9 September 1947 at 15:45. And it was indeed taped into the log book, with the above-noted caption. However, this was far from the first use of the word ‘bug’ to refer to a technical error – small machine glitches have been called ‘bugs’ for many years, with the first known reference coming from a letter written by Thomas Edison in 1878:
“Bugs – as such little faults and difficulties are called – show themselves and months of intense watching, study and labour are requisite before commercial success or failure is certainly reached.”

So, while it would have been cool if this was the real origin of the term computer bug, it sadly wasn’t. What is probably true about the story of Grace Hopper and the Harvard Mark II is that this may indeed be the first known case of an actual computer bug – or computer moth, to be more exact. Which is still kind of amusing. 🙂

Happy Sunday, everyone – hope you’re not being bugged by bugs of any kind todayyy.y..yy…yyyyy.yy. Bugger…

Luv or h8 it, txting is gr8 4 literacy

Today, 8 September, is International Literacy Day – the day the world’s attention is focused on literacy as one of the fundamental human rights, and the foundation of all learning. In the words of UNESCO Director General Irina Bokova, “Education brings sustainability to all the development goals, and literacy is the foundation of all learning. It provides individuals with the skills to understand the world and shape it, to participate in democratic processes and have a voice, and also to strengthen their cultural identity.”

In the information age, literacy is a more critical basic requirement than ever. The literacy landscape is also rapidly changing – children’s reading and writing experience is changing from a paper-based to a digital context. Many kids’ primary exposure to the written word is through texting – SMS, instant messaging and Twitter – thanks to the global proliferation of mobile phones and internet connectivity.

Texting teens may have a literacy edge over their non-texting peers.
(© All Rights Reserved)

Texting has long been blamed as one of the main causes of declining linguistic savvy among children and teenagers, with parents and teachers fearing that texting shorthand (incorporating linguistic shortcuts, weak grammar and little or no punctuation) was destroying their ability to write ‘properly’.

While it’s true that these teenage ‘textisms’ drive most people over thirty up the proverbial wall, it may in fact not be quite the scourge it was thought to be at the turn of the century. New research is showing that, while it may not promote perfect grammar, text messaging may in fact have a positive impact on basic literacy. For one thing, there is no arguing that it is increasing young people’s level of interaction with the written word. Instead of speaking, kids are very likely to communicate via text messages, even when they are in the same physical location.

As reported in an article in the Telegraph, researchers are suggesting that using a mobile phone can boost children’s spelling abilities. In a research project at Coventry University in the UK, 114 children aged 9-10 who were not already mobile phone users were split into two groups. Half were given handsets and encouraged to text often, while the control group remained without mobile phones. After 10 weeks, both groups were subjected to a series of reading, spelling and phonological awareness tests, and the researchers claimed they found that texting made a significant positive contribution to children’s spelling development during the study. According to Professor Clare Wood of the university’s Psychology Department, they also found “no evidence that children’s language play when using mobile phones is damaging literacy development.”
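
For the statistically curious, a comparison like this essentially boils down to testing whether the texting group’s scores differ meaningfully from the control group’s. The sketch below is purely illustrative – the scores are made-up placeholders, not data from the Coventry study, and scipy’s independent-samples t-test is just one common approach, not necessarily the analysis the researchers used.

```python
# Hypothetical two-group comparison of spelling test scores. The numbers
# below are invented placeholders, NOT data from the Coventry study.
from scipy import stats

texting_group = [18, 21, 17, 20, 19, 22, 18, 20]  # hypothetical scores
control_group = [16, 18, 17, 15, 19, 16, 17, 18]  # hypothetical scores

# Independent-samples t-test: do the group means differ more than chance
# would suggest?
t_stat, p_value = stats.ttest_ind(texting_group, control_group)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```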

Similar sentiments have been expressed by Professor David Crystal, honorary professor of linguistics at Bangor University, who says it’s an urban myth that text speak is taking over children’s regular writing. He considers it “merely another way to use language”, and suggests that the use of textisms and shortcuts is exaggerated: “If you collected a huge pile of messages and counted all the whole words and the abbreviations, the fact of the matter is that less than 10% would be shortened.”

So, while language may be changing in the age of texting, the undeniably positive part is that it is exposing children to the written word, in both the traditional and the abbreviated sense.

And that, as they say in the classics, is gr8 4 literacy.

Salami – good when it’s meat, less so when it’s science

Today is a celebration of that greatest of cured meats – it’s Salami Day.

Salami is a cured, fermented and air-dried sausage-style meat, usually made from pork and/or beef, but also sometimes from a range of other meats including venison and turkey (and even, apparently, shark and swordfish in Japan). The meat is minced together with a range of spices, garlic, minced fat, herbs and wine or vinegar, and left to ferment for a day or so before being stuffed into a (usually edible) casing and hung out to cure. The casing is sometimes treated with an edible mold culture which adds flavour and helps protect the salami from spoilage.

It first became popular with southern European peasants, thanks to the fact that it doesn’t require refrigeration and can last at room temperature for a month or longer. (It is this feature that also makes it one of my personal favourite foods to take on multi-day hikes – few things beat a couple of slices of salami on some cracker-bread over lunch, somewhere out in the middle of nowhere.)

A traditional aged, peppered Hungarian salami – finger-licking good.
(© All Rights Reserved)

Of course, in science, salami has a very different connotation. The phrase ‘salami science’ refers to a scientific publishing tactic where the same body of research is published in more than one journal, or, more commonly, the results from a single research project are sliced up into multiple smaller research results (spread over time, for example) and published separately. This second option is also referred to as ‘salami slicing’, because you are effectively slicing your single research salami into a whole bunch of smaller slices, spread across different publications.

This is an unfortunate practice because it can skew research data, and it makes it more difficult to get the ‘big picture’ with regards to a specific body of research. It is, however, the result of the way the value or worth of a scientist is measured in the scientific community – the more you publish, the better you are rated, and the more funding you can attract. This ‘publish or perish’ phenomenon is well-known in science, where the size of an individual or group’s scientific output is overemphasized, rewarding quantity over quality.

Nature magazine has gone so far as to say that salami science “threatens the sustainability of scientific publishing as we know it”. Fighting this practice means more time and effort have to be spent by journals and publications to ensure that the same results have not been published elsewhere, thus increasing the workload on already stretched staff and peer reviewers.

Of course quantity is not the only criterion used to judge or measure a scientist’s research output – references and citations also play an important role. However, formulae for quantifying research output are often oversimplified and skewed towards quantity. To again quote Nature magazine, “The challenge then is not only to establish more sophisticated means to assess the worth of a researcher’s scientific contribution, but for bodies making such assessments to make it plain that it is scientific rigour and not merely numerical output that will lead to success”.
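
As an illustration of how crude such formulae can be, here is a quick sketch of one widely used metric, the h-index (the largest h such that a researcher has at least h papers with at least h citations each). It is offered only as an example of the kind of numerical measure being criticised, not as the specific formula Nature has in mind.

```python
# A minimal sketch of the h-index, assuming citation counts are available as
# a plain list of integers (one entry per paper). Illustrative only.
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Two very different publication records can end up with the same score:
print(h_index([50, 40, 30, 12, 8]))             # 5 - a few substantial papers
print(h_index([6, 6, 5, 5, 5, 5, 5, 4, 3, 3]))  # 5 - many thin slices
```

Note how a record padded with many thin, salami-sliced papers can score the same as one built on a handful of substantial ones.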

It definitely seems slicing your salami thin is better when you’re talking meat than when you’re talking science. In fact, referring to the meaty version, it’s probably a very good idea to slice it thin – when it comes to processed meat (including salami), moderation is definitely a good thing. In a report in the Guardian, the World Cancer Research Fund (WCRF) has warned that excessive intake of processed meat can increase your risk of developing cancer.

According to the WCRF, “If everyone ate no more than 70g of processed meat – the equivalent of three rashers of bacon – a week, about 3,700 fewer people a year in Britain would be diagnosed with bowel cancer”.

So, in celebration of Salami Day, get yourself a good quality salami (paying a bit more really is worth it when it comes to enjoying a good salami) and enjoy a taste of meat-heaven.

Just don’t overdo it.

And don’t cheat with your research. 🙂

Fight Procrastination Day

Note to self…… September 6th – Fight Procrastination Day….. also Read a Book Day….hmmmm, what to do…… Read up on procrastination – causes, statistics, interesting facts, etc…. or other topic. Think of photo ideas to illustrate concept….. Get going on this sooner rather than later…………

Doh…!

I knew I shouldn’t have put it off ’til the last minute…….

Weekly Photo Challenge: Free Spirit

 

OK, so my ‘free spirit’ photo doesn’t really break any photography rules, feature lens glare or anything like that, but when I photographed this gypsy, he just had this most incredible sense of peace and freedom about him.
This, to me, is the embodiment of a ‘free spirit’.
(© All Rights Reserved)

 

On beards, taxes and the laws of attraction

It’s 5 September, the day back in 1698 when the good Tsar Peter the Great of Russia, in all his wisdom, decided that the macho, fully bearded look sported by most of his fellow countrymen was simply too out of touch with the times, and that Russian men really ought to follow the example of their European counterparts and cut their beards.

Rumour has it that, after returning from a trip to Europe where he was most impressed by the forward-thinking, clean-shaven Europeans, he personally cut off the beards of the men in his court. He obviously couldn’t take it upon himself to rid all the bearded Russians of their facial hair, so to make them take his request a little more seriously, he imposed a ‘beard tax’, announced on 5 September 1698, which meant that any man who opted to keep his beard would incur a hefty tax penalty. Luckily for the more rustic farmer-types, the tax was only imposed in the cities, so they could keep their beards while on the farm. If and when they needed to go to the city, however, they too had to shave, or pay a fine to keep their beards.

Bearded blokes actually had to carry with them a token showing that they had paid their beard tax. To further remind them of the silliness of their facial hair, the token was inscribed with the message “A beard is a useless burden”, or something to that effect.

Paying taxes to keep your beard is enough reason to be a bit depressed. And now it turns out the ladies don’t like ’em either…
(© All Rights Reserved)

Turning to modern times, I recently came across the results of a study conducted by a team from Canada and New Zealand, investigating the reactions of men and women to bearded and beardless men. Nineteen men from New Zealand and Samoa were first photographed with six-week-old beards, in two sets of photographs – one where they looked serious, and another where they were asked to make an angry face. Their beards were then shaved off, and they were again photographed in the same poses. According to the feedback from respondents, women were more drawn to the beardless men, while men considered the bearded men to appear more important and imposing. So, it seems you have a choice – do you want to impress the guys, or charm the ladies?

Further on in the same article, however, there’s mention of a study where the reactions of women to bearded men were extended to also include chaps with 5 o’clock stubble. It appears that this may be the magic option from the attraction point of view – as the article notes, it seems women like men who can grow beards, but don’t quite do so. Perhaps these men are seen as suitably masculine, yet not quite out of touch with their feminine sides.

I have also found a report on a recent survey of more than 2,000 men and women conducted by Lynx, which gives some rather conclusive anti-beard statistics – while 63% of the men surveyed believed their facial hair improved their manliness and attractiveness, no less than 92% of the women preferred a clean-shaven man. In fact, 86% went so far as to say they found beards unattractive.

Perhaps that can be taken as some modern form of Peter the Great’s beard tax. In Tsar Peter’s case, men were allowed to keep their beards as long as they were willing to part with their money; nowadays you can keep your money, but you may well have to say goodbye to any romantic possibilities!

Celebrating World Chocolate Day (while we still can!)

We all love chocolate, don’t we? So much so that there’s a whole host of chocolate-y celebrations out there – National Chocolate Day, International Chocolate Day, days for different flavours of chocolate… Forced to pick one date, I’ve decided to dedicate a post to this particular chocolate day – today, 4 September, we celebrate World Chocolate Day.

Chocolate has long been associated with love and attraction, despite there being no conclusive proof of its aphrodisiacal properties. It does, however, contain theobromine, a stimulant that has been shown to assist in physical and mental relaxation, as well as tryptophan, a chemical used by the brain to produce serotonin, which stimulates endorphins, resulting in feelings of elation. Moreover it contains phenylethylamine, a neurotransmitter that helps promote feelings of excitement and giddiness. All together, that’s close enough to an aphrodisiac in my book.
(© All Rights Reserved)

Looking at some of the latest chocolate-related science news that I’ve come across, however, the chocoholics among us may in future have less and less reason for celebration, as the future of chocolate looks ominously bleak.

Firstly, it appears that worldwide chocolate consumption is exceeding production, which means that chocolate will increasingly become a luxury commodity, fetching higher and higher prices. Apparently cacao trees can only be grown naturally in a narrow band within 10 degrees around the equator, and more and more farmers in this band are turning to more lucrative alternative crops such as genetically modified maize, soybeans and palm oil. Geneticists are trying to develop better yielding cacao crops, but there are no guarantees yet that this will remedy the situation.

If that is not enough, there is a fear that diseases may devastate what is left of the global cocoa supply. Fungal diseases such as witch’s broom and frosty pod have already destroyed most cacao crops in Central America, and the concern is that if these diseases spread to Africa, the majority of global cocoa production may be at risk. Again, the best defence lies in bioscience – if scientists can succeed in sequencing the cacao tree genome, it will help them develop genetically modified plants that are resistant to infection.

If you think the above challenges make the future of chocolate look a bit suspect, here’s the cherry on top – apparently, climate change may result in West Africa (the source of most of the world’s chocolate supply) becoming too hot to sustain cacao growing in the region.

According to a report in Scientific American, it is estimated that by 2060 more than 50% of the West African cocoa-producing countries may be too hot to continue growing the crop, which will also contribute to chocolate prices spiralling out of control. The slack in the market may be picked up by regions that were previously too cool for growing cacao, but that would require these regions to switch from other crops that may be considered more lucrative. Thus there’s yet another challenge for the genetic scientists – developing a drought-resistant cacao tree capable of handling the effects of global warming.

So, in a nutshell, to save chocolate from becoming an unaffordable luxury commodity, scientists are in a mad race to develop new, genetically modified strains of cacao tree that are higher yielding, infection resistant and able to withstand heat and drought.

Sorry to leave you with such a depressing message on World Chocolate Day – I guess we can only hope that science will step up to the plate and save the day, enabling us to continue enjoying the wonderful product of the cacao tree for many years to come!

John Macarthur and the birth of the Australian wool industry (not just another Aussie sheep joke!)

Today we celebrate the birthday of John Macarthur (3 Sep 1767 – 11 Apr 1834), the English-born Aussie who is recognised as the pioneer of the wool industry that boomed in Australia in the early 19th century, and has since been one of the country’s agricultural trademarks.

Sheep – serious business Down Under.
(© All Rights Reserved)

Macarthur was born in Plymouth, Devon in the UK. He began his career in the army, and after various assignments and activities became part of the New South Wales corps in 1789 and was posted to faraway Sydney, Australia. A fiery character, his life story reads like a historical romance novel, with way too many saucy details (battles with authorities, involvement in a military coup, land battles and much more) to get into on this forum.

Suffice it to say, after settling in Australia, Macarthur got involved in rearing sheep for mutton, purchasing his first flock in 1795. He also purchased a small flock of Spanish Merinos, imported from the Cape Colony (part of what later became South Africa), in 1797. The merino is an excellent wool breed, and it didn’t take long for Macarthur to recognise the economic potential of wool production for export, as opposed to simply rearing sheep for the local meat market. What made wool a potential export hit was the fact that it was a non-perishable commodity (a necessary feature, given Australia’s distance from the markets of the UK and Europe) and offered a high value per unit of weight.

On a trip back to London he lobbied for more land, and succeeded in being granted 5000 acres of the best pasture land in New South Wales. He became the largest sheep rearer in the colony, and made a fortune exporting merino wool to the UK, which was at the time cut off from its traditional wool supplier, Spain, as a result of the Napoleonic Wars. He also gained recognition for producing wool of the finest quality, which further upped the prices at which he was able to sell his produce.

Macarthur’s ventures opened the door for others to follow, and Australia’s wool export market started to boom in the early 19th century. It remains a key export commodity, with Australia still the world’s largest producer of wool, mainly from merino sheep. New Zealand is in second place, and China in third. The wool produced in Australia and New Zealand is considered to be of the finest international quality – the best Aussie and Kiwi merino wool is known as grade 1PP, and is the industry benchmark of excellence for merino wool.

Natural wool is one of nature’s super-products. It is technically superior to synthetic materials in various ways – it has better heat insulation and superior hydrophilic properties, it is naturally flame-retardant, resistant to static electricity, and hypoallergenic. Researchers at the Royal Melbourne Institute of Technology have developed a material blending wool and Kevlar (the material often used in body armour), and found that the blend was lighter and cheaper, and outperformed Kevlar in damp conditions.

What’s more, wool is also environmentally preferable to materials like nylon or polypropylene. According to the latest research on the role of wool in the natural  carbon cycle, it has been suggested that under the correct circumstances, wool production can potentially be carbon neutral.

So while the Aussies and Kiwis may suffer endless jokes relating to their sheep, the product being produced is something very special.  And John Macarthur deserves a tip of the hat as the bloke who kicked it all off more than 200 years ago.