Tuesday, November 30, 2010

Is asthma 'catching'? Is everything? Who said we'd defeated infectious disease?

The incidence of childhood asthma skyrocketed through the 1980s and 1990s, and no one knows why.  Environmental epidemiologists have proposed a wide variety of causes, including breastfeeding, bottle feeding, absence of helminth infestation, dirt, excessive cleanliness, air quality (good or bad), presence of pets in the household in infancy, the absence of pets, the use of acetaminophen in infancy, and even Facebook -- note that many of these are directly contradictory.  A few studies early on found that kids who grow up on farms tend to be at lower risk of asthma, and this led to the "Hygiene Hypothesis," which posits essentially that an immune system without enough to do is an immune system that goes awry.


A trend in epidemiology in the last few decades, in direct response to the inability to definitively identify environmental causes of complex diseases such as asthma, heart disease, or type 2 diabetes, has been to look inward instead -- to look for genes 'for' these diseases.  This isn't surprising, given the promises made by the Human Genome Project, for example, that knowing the genome would allow us to explain the majority of disease.  And, in fairness, given the great successes of human genetics early on at finding single genes for rare, largely pediatric diseases like cystic fibrosis or Tay-Sachs, the idea that genes could explain other diseases (even most diseases) was highly appealing and seductive.

But this meant that diseases were geneticized that, in our opinion, should have been left to the environmental epidemiologists -- genes can't explain an epidemic that took off as fast as the asthma epidemic did because gene frequencies don't change nearly fast enough.  Even if you posit that those who are affected have a genetic susceptibility to whatever environmental factor is triggering the disease, still it's clear that something rapidly changed in the environment, and, for our money, research dollars should be going toward finding that rather than genes.  (Why epidemiology has such a hard time finding such factors is another story, but it's largely because the tools of the trade are best at finding risk factors with large effects, such as infectious agents.  Though, why that hasn't been true of asthma, which, given the nature of the spike in incidence, seems to have had a main effect cause, is perplexing.)
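
To put a rough number on that claim, here's a quick back-of-envelope calculation -- ours, purely illustrative, with invented parameters -- using the standard one-locus selection recursion:

```python
# Toy illustration: even strong selection changes allele frequencies slowly
# relative to a ~20-year epidemic.  All parameters are invented.

def next_freq(p, s):
    """One generation of selection favoring allele A.
    Additive genotype fitnesses: AA = 1+2s, Aa = 1+s, aa = 1."""
    q = 1 - p
    mean_w = p**2 * (1 + 2*s) + 2*p*q * (1 + s) + q**2
    return (p**2 * (1 + 2*s) + p*q * (1 + s)) / mean_w

p, s = 0.10, 0.10   # starting frequency; a 10% fitness advantage is very strong
for gen in range(3):        # ~60 years at roughly 20 years per human generation
    p = next_freq(p, s)
    print(f"generation {gen + 1}: p = {p:.4f}")
# Three generations move p from 0.10 to only ~0.13 -- nothing like the
# severalfold rise in asthma incidence seen within two decades.
```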

But, largely because of this failure, many genetic studies of asthma have been done in the last several decades, including large genomewide association studies (GWAS).  And, just as with other genetic studies of complex diseases, in what is a rather repetitive drumbeat for trait after trait, nothing major has been found to reliably explain this epidemic.  Sure, some genes have been reported, but they just don't explain enough to be the answer.  Now researchers still interested in the genetics of asthma are suggesting, as with other complex diseases, that there must be numerous interacting genes with small effect.  Which is much more likely than one or two causative genes, yes, but it doesn't answer the question of what caused the epidemic.

There have been many ideas about the environmental component, most of them having to do not with pathogens but with pollutants, a number of candidates being listed above.  But now Science reports in the 26 November issue that there may be a bacterial link with asthma.  Again, the 'more microbes is better' hypothesis:
As odd as this might sound, there's mounting evidence that bacteria matter. Babies born via cesarean section, who experience a more sterile entry into the world than those born vaginally, are more likely to get asthma. So are young children treated with many courses of antibiotics. Along with animal studies, these observations suggest that the balance of bacteria and other microbes help guide immune development—and that when the balance is disrupted, disease may follow.
But it's complex.
All of us play host to bacterial residents. But children who develop asthma, researchers are learning, are home to different bacteria—and sometimes a less diverse mix—than those who stay healthy. “It's really coming down to the bacterial community structure, who's there, and in what numbers, and where,” [University of Michigan immunologist] Huffnagle says. 
So far, the evidence linking asthma and bacteria are associations, not proof that an imbalance of bacteria causes the disease. The big question, says Martinez, is, “Do asthmatics have an immune system that makes them be colonized by different things? … Or is it because they were colonized by different things that caused them to have asthma?” 
Researchers are currently comparing the microbiome (the community of microbes that colonize an individual) of kids with asthma and kids without, kids on farms, and kids in cities, to try to answer this question.  This is another way to ask if it's genes or environment -- which came first, the microbes, or the host environment?  And, of course many kids born by C section don't get asthma, and many kids born vaginally do.  It's complex indeed.

Microbiome analysis has some merits, beyond being yet another genomic Golden Calf for research labs. But we can predict that it is yet again a hunt for needles in a needle-stack of complexity.  We're likely still to be left with the problem that as a complex phenotype, like all other complex phenotypes, asthma is hard to explain, and hard to define.  In fact, an editorial in The Lancet several years ago suggested that every case should be considered unique, causally, physiologically and in how it's treated.

It is interesting, however, that schools of public health abandoned their infectious disease departments in the '70s, declaring effectively that we'd won that battle (cheered on, naturally, by geneticists seeing themselves as the funding beneficiaries).  Now, not only are clearly infectious problems like HIV/AIDS, multiply antibiotic-resistant TB, SARS, and various influenzas still plaguing us, along with resurgent malaria and other neglected tropical diseases, but many GWAS 'hits' are landing in genes involved in the immune system.

Inflammatory bowel disease, Crohn's disease, macular degeneration, and even schizophrenia are in this category.  If these pan out (and the first ones seem clearly to have done so), it means that non-acute, sub-clinical infection of long duration may build into diseases that did not seem either contagious or infectious.  If that's so, and the message sinks in, then GWAS may have done a service.  And epidemiologists like the sometimes-fringy Paul Ewald, who have long touted the role of infection in chronic disease, will have been vindicated.

Monday, November 29, 2010

RAiN gutter, or RNAi down the tubes?


Here's a story that may not seem so but is actually about the kind of extensive cooperation that characterizes life, and is a major theme in our book.

"Drug giants turn their backs on RNAi," reports Nature this week.  The headline writer must have had a hard time resisting the urge to end that sentence with an exclamation point, because RNA interference has been seen for the last decade or so as the latest greatest drug technique on the horizon, and if that's no longer true, that's news.
Not long ago, a technique called RNA interference (RNAi) seemed to be on the fast track to commercial success. Its discovery in 1998 revealed a new way to halt the production of specific proteins using specially designed RNA molecules, and it quickly became a favourite tool of basic research. In 2006, the scientists who made the discovery were awarded the Nobel prize for medicine, and the New Jersey-based pharmaceutical giant Merck paid more than US$1 billion to snatch up Sirna Therapeutics in San Francisco, California — one of the first biotechnology companies aiming to harness RNAi to create new drugs.
As Alnylam Pharmaceuticals, one of the best-endowed RNAi start-ups in the world, describes it:
RNAi is a revolution in biology, representing a breakthrough in understanding how genes are turned on and off in cells, and a completely new approach to drug discovery and development. RNAi offers the opportunity to harness a natural mechanism to develop specific and potent medicines, and has the potential to become the foundation for a whole new class of therapeutic products.
The discovery of RNAi has been heralded as a major scientific breakthrough that happens only once every decade or so, and represents one of the most promising and rapidly advancing frontiers in biology and drug discovery today. 
Well, according to the story in Nature, Alnylam has just laid off 50 workers, more than a fifth of its work force, because the Big Pharma company Novartis declined to extend its partnership with them.  Of course Alnylam says they still believe wholeheartedly in the promise of RNAi, but they would have to say that, wouldn't they?  Is protecting your stock price and nervous investors an excuse for shading honesty, one of our recent themes?

RNAi is a naturally occurring process that interferes with gene expression when the antisense strand of an RNA molecule binds to the sense strand, thus inhibiting its translation into a protein.  The discovery of RNAi was a major discovery worthy of a Nobel prize because it revealed how cells naturally titrate their level of gene expression.  They can start using a gene, but then quickly shut it down if brief or highly controlled timing is important -- in forming organs in an embryo, for example, or in cell differentiation or response to environmental changes.  It shows that nature is like the mushroom in Alice in Wonderland: nibble from one side to get taller, and the other to get shorter.
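
For readers who want to see the base-pairing logic in the simplest possible terms, here's a toy sketch of our own -- the sequence is invented, and real RNAi involves much more cellular machinery than simple hybridization:

```python
# Toy sketch of sense/antisense pairing in RNA (A-U, G-C).
# The sequence is invented for illustration.

PAIR = {"A": "U", "U": "A", "G": "C", "C": "G"}

def antisense(rna):
    """Return the antisense strand: the complement, read in reverse."""
    return "".join(PAIR[base] for base in reversed(rna))

sense = "AUGGCUUACGGA"    # a made-up stretch of messenger RNA
anti = antisense(sense)   # the strand that would silence it
print(anti)               # UCCGUAAGCCAU

# When sense and antisense strands find each other they zip into a
# double-stranded region, and the ribosome can no longer read the message --
# the phenomenon behind both the old antisense idea and RNAi.
```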

RNAi was quickly and widely heralded as a major breakthrough, and its potential in treatment of diseases like Huntington's or Parkinson's and so forth was obvious and exciting to researchers, pharma, and patients alike.

But, as with other forms of gene therapy, the realities of delivering the RNAi molecule to its target within cells are proving to be daunting.
The development of RNAi-based drugs has stalled as companies confront the challenge of delivering RNA molecules, which are notoriously fragile, to target cells in the human body, and then coaxing those cells to take up the RNA. "Getting these molecules exactly where we want them to go is a little more difficult than originally thought," says Michael French, chief executive of Marina Biotech, an RNAi company based in Bothell, Washington.
Of the dozen RNAi-based therapeutics in early clinical testing, most apply the RNA molecules directly to the target tissues, or aim to shut down the production of a protein in the liver, which takes up the RNA as it filters the blood. Several candidates also package the RNA within a lipid nanoparticle, a delivery vehicle that both protects the RNA and allows it to be shuttled across cell membranes. 
Alnylam claims to have a number of possible drugs in the proverbial (just-in-time-for-Christmas?) pipeline, and still enough money, even without Novartis, to do some testing, and other companies are still holding on as well.  So apparently we aren't hearing the actual death knell yet, but it's starting to look a lot like Big Pharma's sobering-up, a decade or so ago, from its dance of enthusiasm for personalized genomic medicine, once the realities of complex disease set in.

There are several things here worthy of comment besides the discouraging news that a hopeful-sounding therapy might not work.  Differentiated organisms rely on internal integrity of their many cooperating parts -- organs, tissues within each organ, and complex interactions among components within each cell.  Stuff from the outside that is not brought in under controlled circumstances is not likely to be usefully incorporated into cellular machinery.  That's why immune systems of various kinds, often quite intricate, exist.  That's why cells highly control what crosses their membranes from the outside in, and get rid of what's no longer needed by kicking it out.

All this involves detection and cooperative action of large numbers of components that must be in the right place, on guard or ready for duty, at the right time.  It is no wonder that trying to sneak in something external, that is supposed to coopt the cell for its own purposes, is difficult!  Especially if this requires shanghaiing many different parts of the cell to get the job done.  Whether the problem can be solved is only for the future to tell, but the hyperbole by companies about how RNAi would quickly lead to a revolution in medicine was, typically, highly overworked.

Years ago, David Stock, a very fine post-doc in our lab (now a prominent faculty member in beautiful Boulder, Colorado) spotted something strange in some work we were doing with embryonic mouse teeth.  We were interested in how teeth are patterned, and were working with a gene we had discovered called Dlx3.  David said he had sequenced some RNA -- purportedly messenger RNA coding for the Dlx3 gene -- but instead he had found the reverse, antisense sequence.  This was, at the time, 'impossible' and we dismissed it as a laboratory artifact.  RNAi had not yet been discovered, and since the finding was so unexpected we just never followed up on it (maybe we should have!).  That's how surprising RNAi seemed to be when it was shown to be real, widespread, ancient, and important in the biology of normal organisms.

But the idea, including the potential for practical application, was not new.  Even in our lab, we had tinkered with experimental uses of anti-sense RNA.  Various people had thought of using introduced anti-sense RNA in cells to experimentally alter gene expression.  The idea that this could have therapeutic application was also expressed by many, and we think some pharma explored this.  If genes are expressed via mRNA, which is translated into protein, then if you could inhibit that translation you could experimentally (or therapeutically) slow or stop the expression of the targeted gene.  Antisense RNA would bind to the corresponding messenger RNA in the cell, so it couldn't be translated into protein.

Unfortunately, the method proved very difficult to use.....that is, it basically didn't work.  Even in cell or organ culture (such as, in our lab, growing embryonic mouse tooth germs in culture), the cells simply did not like incoming RNA, and RNA is quite unstable to begin with.  We and many others tried this approach, but gave up on it.

Thus it was that investigators thought of doing what nature had been doing, systematically and precisely and unbeknownst to science, for many many millions of years.  And unlike Big Pharma, nature has made it work!

The bottom line here (for science, citizens, and investors) is to keep the enthusiasm under control when making promises or pronouncements.  Stay closer to the truth.  And work harder before drawing conclusions.

Sooner or later technology and engineering often do succeed, and it's dangerous to bet against them.   But not everything works, and it's hard to know in advance what will.  Hopefully something as specifically targeted as RNAi will find useful biomedical or other application eventually, but without the hype.

Wednesday, November 24, 2010

Scientists say "Don't worry"....so start worrying!

The latest Scientists Say bulletin is that the x-ray machines in your nearest airport are not-to-worry.  Yes, x-rays cause cancer, but no, our company that makes these machines, and the Government that has signed off on them, free of course of any vested interest, say these are 'safe'.  What 'safe' means is safe for their jobs, because at this stage anyone who tells some truth will lose his or her job.  Today the NYTimes has a story on risks of dental x-ray machines.

The problem is that even when the machines work, and the tech operating them actually knows what s/he's doing, they expose people to one of the world's best-known carcinogens.  When a scan (including a CT scan, not just something called 'x-rays', by the way) is done for therapeutic or truly diagnostic reasons, then the risk-benefit trade-off seems clear.  But dentists and docs often just want to 'take a look', or add to your bill, or play with toys that make them feel important and intelligent.  Or they themselves have little understanding of the risks of radiation carcinogenesis.

Of course, government bureaucrats will always deny that they've allowed us to be exposed to risks.  Remember dismissals of HIV (only our social scum get it, if you recall....plus Africans (who cares about them?)).  Or mad cow disease in the UK.  Or from Canadian beef?  Or the possibility that New Orleans would flood. Or many others like these examples.  Or climate change...ever heard of that?

The point is that scientists and governments cannot automatically be trusted. They (we) have all sorts of clear and covert, known and perhaps unaware, vested interests.  Some do indeed state warnings, and not all of them (us) are perfect in that. But the typical story, it seems, is that over-worry is discouraged because the interests often lie more heavily in the drill-baby-drill contingent than the cautious one. There's usually little profit in being cautious.

There are no easy answers that we know of in regard to airport security. We don't travel by air much because it has become a frequently miserable experience.  And it's clear that if we staunch one vulnerable point, terrorists will search for another. Unfortunately, however, the 'facts' seem to be elusive in just the cases in which these issues are most important.  The problem is, of course, when risks are so small, as in the case of airport x-rays, as to be basically immeasurable, but we have biological reason of decades' standing to know that they are not zero.  And they are greatest to the youngest, to premenopausal women, and so on.

What we're witnessing, unfortunately, is a huge victory for the bin Ladens of this world.  One looney-toon puts some explosive in a shoe, and tens of millions of people have to take their shoes off in airports.  One sad case loads up his undies with something, and we're all going to get a scan or a pat where the sun don't shine.

Meanwhile, we all learned recently that cargo is many times more vulnerable than passengers to being loaded with explosives.   Again, while we have no answers, it is clear that the current x-ray approach is too good for business to be denied.

Tuesday, November 23, 2010

Skullduggery (part ii)

On Nov 18, BBC science journalist Quentin Cooper covered the story of the exhumation of Tycho Brahe on Material World on BBC Radio 4 (the photo to the left is by Jacob C Ravn, Aarhus University).  As you may remember, we covered the same story here last week, and were a little less than enthusiastic about the science.  We called it the latest from the Desecration Newsroom.

Tycho Brahe was a 16th century Danish astronomer -- indeed, Oxford historian of astronomy and medicine Allan Chapman, commenting on this story on Material World, called him 'the greatest founder of modern astronomy', whose 'influence was simply incalculable.'  He showed the world 'how to do modern mathematical astronomy.'


Another great early astronomer, Johannes Kepler, worked with Brahe for the year before Brahe died, and was at the banquet at which Brahe suffered the urinary suppression that soon preceded his death -- or his murder, depending on your point of view.  Kepler's account of the event had it that Brahe had a full bladder at dinner, but was too polite to excuse himself to go to the loo, thus leading to the cascade of events that killed him.

Or not.  Hair samples analyzed in the 1990s showed evidence of mercury, which suggested to some that he'd been murdered, as described here.
A new theory by Danish scholars claims that Brahe was poisoned with mercury on the orders of Christian IV, the King of Denmark, because the astronomer had an affair with his mother. It is even suggested that Shakespeare used the alleged liaison as an inspiration for Hamlet.
Peter Andersen, a Danish scholar at the University of Strasbourg, told The Times that the astronomer was poisoned by his cousin Count Eric Brahe, a Swedish diplomat in the service of the Danish Crown.
Last year Professor Andersen found the diary of the alleged murderer, in which he records many meetings with Hans, the brother of Christian IV, on whose orders he is believed to have gone to Prague to murder his cousin. 

Nonsense, scoffed Professor Chapman.  If you had urinary suppression in the 1500s, you'd have had a catheter inserted into your urinary tract; it would have been filthy, you'd have lapsed into fever a few days later from a generalized infection that began in the urinary system, and you'd soon be dead.  Mercury was a standard part of 16th century pharmacy, so of course you'd have been treated with it, and of course it would be found in your body later.  Even much later.  Further, as Chapman said, mercury was very slow-acting, so if you wanted to kill someone you'd have used fast-acting arsenic, not mercury.

Well, will the exhumation resolve the question?  No, not even according to the people doing the exhuming.  But photos of the great event can be seen here.  


Said Svend Aage Morgensen, "The main reason for exhuming Tycho Brahe is not to find out cause of death, because we could never know for sure....but Tycho Brahe is one of the great Danes... and we are interested in knowing everything about these persons; how they are living and what they are thinking and so on."

But now we're really confused.  CT scanning him will tell us what he was thinking or how he was living?  We've just heard that it wasn't done to determine cause of death, as that can't be determined.  

Cooper asked Morgensen what it was like to open the casket.  He described a scene with 100 journalists and many tv cameras, and said it was 'like seeing a Hollywood star entering a movie theatre' -- right, a spectacle.  They found, as expected, two jars, one containing the remains of Brahe's skull and the other the remains of his brain.  They took hair from Brahe's beard, eyebrows and head and samples from his bones, and his remains were CT scanned at a hospital in Prague before he was reburied on November 18.

Cooper asked if Morgensen was at all disturbed by disturbing Brahe.  No, he said, "We are quite confident that TB as a great scientist would have agreed to let science exhume him."  

Really.  You'd hope that Brahe would have wanted science to have a real purpose for doing so, not just the prurient desire to create a spectacle. 

Monday, November 22, 2010

Cheating your way to the top. Shadow scientists?

Is it relevant to Mermaid's Tale to refer to this article in the Chronicle of Higher Education on how much professional and pervasive cheating goes on among students?  Posted on Facebook by a co-blogger of ours.  It's pretty disgusting, in our supposedly leading--Christian even!--society.  Universities know it goes on, but why do they/we tolerate it?  Could it be because to police the system would jeopardize our flow of tuition money?  That is perhaps cynical, but not as far-fetched as it may sound.

This cheater's honor roll of topics includes, among many other things, fraudulent papers for students of ethics--amazingly!--but he claims nothing in science.  Maybe we scientists are so technical that a hack writer can't fake it for us. We need lab notes, DNA clones, and so on.  Maybe we can't cheat......or maybe just not in the same way.  Or maybe it takes a scientist to do sham science well.

The people who hired this guy were students, supposedly getting an education in their chosen fields.  But presumably they'll graduate one day, thanks to their hired gun.  What do they, or we, do when they (we) become professionals?

Is there a fine line between them and us?  In science, have we not trained students to exaggerate, stretch, rationalize, omit citation of relevant literature or precedent, construct artifactual 'significance' and over-estimate study 'power'? Do we not train students in 'grantsmanship', massaging the truth in whatever way it takes to get a grant, because "you have to, to get funded"?  Do we accept the habit of running to the news media with exaggerated claims, and pious statements that our dramatic discoveries will of course require further funding?  Or dividing what should be one, careful, measured paper into countless fragmented paperettes, to make sure our CVs look spectacular?


Shadow scientists, too?
These may be comments about society, but they may also be relevant to science.  At what point is dishonor what it is--dishonor--rather than just how things are?  How much of this--how much fad, momentum, and bureaucrats' portfolio protection--goes on?  Could it be detrimental to science, by pushing lots of clever people to game the system rather than to do more careful science?  Or by wasting funds on whatever expensive gear or huge studies money can buy, while real questions go unanswered or inadequately addressed?

Or would a more honorable system not distract attention in wasteful, inefficient directions, pushing people instead to think hard, take their time, and try to develop better answers to nature's questions?

Each of us has to answer these questions for him/herself. For many, perhaps things are fine.  The System as it now is, is bountiful for those who are good at it.  For others, it's a source of worry, or even real sorrow.  In any case, for those who need to game the system, we know a nice ghost writer to recommend.

We think that striving for honor is worth the effort, even if it seems to be in vain much of the time.

Friday, November 19, 2010

But seriously, what do finger bones have to do with sex?

A recent paper on fossil hominin hand morphology inspired journalists to write some of the wackiest headlines about human evolution in recent memory.

Here's a good one:

Neanderthals Were More Promiscuous Than Modern Humans, Fossil Finger Bones Suggest.

What could you possibly take away from this headline besides,

"Seriously? Your finger bones indicate how sexually active you are or whether you're prone to cheat on your partner?"

While Googling for a way to evaluate your own fingers, or your partner's, you may also think,

"Stupid Neanderthals! That's not how you make babies! No wonder they went extinct."

Maybe Neanderthal baseball only had three bases?

Other headlines weren't so accidentally confusing... like this outrageous one in The Telegraph:

Neanderthals really were sex-obsessed thugs.

The subtitle reads: "Neanderthals really did act like Neanderthals, new research suggests, as our early relatives were found to be more aggressive, competitive and promiscuous than modern man."

So what did the study actually find?

Lucky for you and me, the article is open access so anyone with internet access can see for themselves.



P.S. On a related note, here's a funny little question-and-answer about finger size and use.

Wednesday, November 17, 2010

Sniffing out the truth

There was some discussion here last week about the integration of multiple senses -- in that case, hearing and sight -- and now there's news of a study published in the journal Chemical Senses reporting that a heightened sense of smell affects how much we eat.  This is important because, well, we'll let them explain.
The relationship between hunger state and olfactory sensitivity is important for both olfactory and appetite research. This connection has a more urgent need due to the obesity epidemic and the demonstration that obese adults have reduced olfactory sensitivity.
Of course, this is a science story, so its real headline is the 'urgent need' for more research.  That pressing priority notwithstanding, previous studies have found that people have a heightened sense of smell either before or after they have eaten -- according to the authors of the current study, this discrepancy may be due to methodological differences among the studies.  Previous studies have also found that people with a high body mass index (BMI) have lower olfactory acuity than those with lower BMI -- unless it's the other way around.

So the authors of the present study wanted to determine which was right (Yin or Yang), and further, whether people have a heightened sense of smell when they are hungry or full, and whether it differs by BMI.   Or mood.  Or alertness. Or just a way to get another grant.  Maybe the answer is 'maybe'?  But fortunately for these authors, whatever they find, the answer will agree with some previous study and so can be said to be furthering the field!   

This study looked at a huge total of 64 people, and assessed their sense of smell before and after lunch, which was chosen as described below (the study was in the UK). 
Participants were asked to choose between 2 sandwich options, either chicken and bacon (470 Kcal) or cheese and celery (480 Kcal) (Marks and Spencer). They were also given a packet of Hula Hoops cheese and onion crisps (129 Kcal) and a bowl of chocolate chip cookies (126 Kcal) (Sainsburys). In order to ensure the lunches were acceptable, a pilot study was completed where 6 participants (3 females and 3 males) were presented with a selection of sandwiches with different fillings and a variety of sweet and savory snacks. The savory and sweet snacks with the highest pleasantness ratings were selected for the study. For the sandwiches, in order to cater for vegetarian and meat options, the 2 sandwiches with the highest pleasantness ratings that were also most similar on this dimension were selected.
They did personality measures, mood measures, hunger and alertness measures, the latter as described here:
Ratings of hunger were made using 100 mm unmarked line scales end-anchored “not at all” and “extremely,” with the adjective “hungry” centered above the line. In addition to this adjective, other mood adjectives were also used (alert and drowsy), mainly to divert attention away from the real purpose of the study but also to provide data on how temporary suspension of lunch might affect behavior.
And, they measured olfactory threshold and discrimination (with the Sniffin' Sticks battery), before and after lunch.

And what did they find?  That people's smell acuity doesn't vary for non-food related odors, but that they can smell food odors somewhat better after lunch than before.  Especially if their BMI is high. 
At first glance, this seems counter intuitive because on the basis of evolutionary theory, we might well expect the ability to detect foods that are edible and ripe to be more advantageous in a hungry compared with satiated state. Though it could be theorized that better olfactory acuity following a meal might in fact aid in the regulation of food intake, that is, as it is then easier to detect and reject foods that are no longer required. 
 
One response to this can't be printed in a PG-13 post.  Olfactory sensation is based on combinatorial detection among hundreds of highly variable olfactory receptor genes so, like the immune system, we don't have to be pre-programmed for specific odors or recognize specific pathogens.  Whatever we're able to detect does seem to fall into major chemical families that certainly relate to food.....and predators, and so on.  But humans have largely lost specific pheromone sense, again pointing to the idea that we learn smells, and are born with the ability to detect an open-ended set of odors.
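
The power of that combinatorial scheme is easy to appreciate with a little arithmetic.  A toy sketch of ours (the receptor count is a round figure, and treating each receptor as simply 'on' or 'off' is a crude simplification):

```python
# Toy arithmetic: why combinatorial detection needs no pre-programmed odor list.
# ~400 is the usual count of functional human olfactory receptor genes;
# the on/off treatment of each receptor is a crude simplification.

from math import comb

n_receptors = 400
print(f"{2 ** n_receptors:.2e} possible on/off response patterns")  # ~2.58e+120

# Even restricted to odors that activate exactly 5 receptors:
print(f"{comb(n_receptors, 5):,} five-receptor combinations")  # 83,218,600,080
```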

There may of course be modifiers of acuity, and this study seems to claim that there are, and that they are environmentally contextual.   Of course, those who for whatever reason get into McFoods are likely to come to like their aroma, and perhaps be more sensitive to them.  Afterwards, all they need is a good belch and a nap.  Skinnier people need to keep thinking "Food!"  Do we need Darwin and lethal fitness differences to account for that?  To imagine what, in Alley Oop's day, smelled like Big McMax, but only to those who had just tagged along at the hunt and were only thrown a bone or two to gnaw?
The bow to Just-So stories -- er, evolutionary explanations -- aside, here's what the lead author had to say to the BBC about his results:

[Dr Stafford's] team found that people who are overweight - those with a higher body mass index or BMI - have a far heightened sense of smell for food compared to slim people, particularly after they have eaten a full meal.
Dr Stafford believes this keener sense of smell might compel the individual to carry on eating, even when they are full.
He said: "It could be speculated that for those with a propensity to gain weight, their higher sense of smell for food related odours might actually play a more active role in food intake.
Is this a disguised 'thrifty genotype' argument, that those with the better sense of smell ate more, stored more, and starved less?   It doesn't seem to be.

But clearly more research is needed.  But in fact, why?  If slim people stop eating when their sense of smell is more acute than when they are hungry -- or rather, if when they stop eating their sense of smell is slightly heightened -- how does it follow that obese people's more heightened sense of smell after eating makes them eat more?   And why would we think that this correlation (keeping in mind that correlation is not causation) has anything to do with why people eat, or why they eat too much?

Why can't the press smell useless science by now? We report these items as a service to our readers.

Tuesday, November 16, 2010

Latest Bulletin from the Desecration Newsroom!

Well, sports fans, the latest episode in our Science Won't be Slowed Department, or in this case, the Desecration Newsroom is that the famous astronomer Tycho Brahe (1546-1601) will be disinterred.  The reason is the urgent, pressing 'need' for us to know whether the poor guy died because his bladder exploded or whether it was mercury poisoning.  (Note the fake nose.  That's not a scratch on the etching, but a prosthesis because Brahe had his shortie amputated in a duel; hence the frown)


Whether or not next of kin (or perhaps Inspector Lestrade) were consulted for permission to dig into this story, we can't say.  But this seems to be turning into a habit, because the Honorable Dr Brahe has been dug up before!  That also seems to have been on another slow news day, way back in 1901, when a hot medical bulletin was needed to fill the pages (the investigators at the time, being interested in the sustainability of this disinterment project, only took a mustache snippet).

Here a distinguished team of 'scientists' somehow have funds and permission to go digging (is this what skull-duggery refers to?) for something to justify their promotions or tenure, or to show how deeply insightful they are.

Whether finding (or not finding) whatever they are looking for will tell us whether Tycho's untimely death was due to his dabbling in the metallics of alchemy, to assassination ordered by the King of Denmark, or to murder by his rival Johannes Kepler is not clear. Given what they can find in clothing or bones, there seems to be no bones about the inevitably murky results.  Even the authors agree:
Professor Jens Vellev, from Aarhus University, is leading the team of scientists and archaeologists which opened the tomb in Tyn Church on Monday.
He says he hopes to get better samples of hair and bones than were taken in 1901.
The use of the latest technology to test the samples may also help shed more light on the mystery of the astronomer's death, although Professor Vellev is not promising anything.
"Perhaps, we will be able to come close to an answer, but I don't think we will get a final answer to that question," he said.

We're certainly relieved that the latest technology will be used, as well as to see that the Sustainability spirit is being maintained (i.e., 'we won't find a final answer, so we can justify another grant to dig the poor star-gazer up again next time we need a publication'). Apparently, grave-digging is OK by the Catholic Church, which we guess can find a way to rationalize anything attention-getting (pass the plate!), since a special mass is being held in 'honor' of the event  (Man helping God to understand why Brahe passed to wherever he went.  If it turns out it was just a burst bladder after all, is that a ticket to heaven, since it rules out alchemy?).

We thought MT readers would want to know about this hot new bulletin from the Your Research Dollars at Honorable Work department -- that is, science doing its duty to the society that pays it.

Monday, November 15, 2010

Vitamin in Deeed!

According to many investigators, and perhaps people seeking tidy hypotheses to explain evolutionary change, humans were black in the tropics where our species originated, to protect from the dangers of too much sun.  Even with dark skin, they got so many rays that they were able to make enough vitamin D.  Vitamin D is obtained only minimally from the diet; it comes primarily from a vitamin-D-producing reaction in the skin enabled by UV (solar) radiation.

So we moved north, where there was less sunlight, and (as the story goes) were selected for lighter skin so we would not suffer disease due to vitamin D deficiency.  Such diseases supposedly include osteoporosis or other bone disease -- though if you believe the current hype, vitamin D deficiency (and how that's determined is another story; current thresholds would suggest that almost everyone is deficient) causes almost anything, from diabetes to cancer.  Whites get more UV light because, though there's less exposure in the high latitudes, they have less UV-filtering melanin in their skin.

So far so good.  But the lore on the relevant bone disease is 'fat, fair, female, and forty'.  Blacks as a group (in Europe and the US) do not have as many vitamin D deficiency-related problems, including osteoporosis, as whites.  They do not get lots of rays in the north (or the deep southern hemisphere), yet they don't need vitamin D supplements to reduce their risk of fractures.

Now, a new study shows that whites who have vitamin D deficiency suffer strokes at a higher rate.  Blacks are known to have higher rates of stroke in general, but it's whites who apparently have higher stroke risk if they're vitamin D deficient.  This seems backward from the long-standing evolutionary story (or is it a Just-So story?), and it's not clear why.  But what it does suggest is that lighter skin may not have evolved by natural selection in relation to vitamin D per se.

Strokes are serious, often crippling or fatal.  Usually, and probably much more so in pre-industrial times, strokes are rare before the post-reproductive years, so they have little impact on evolutionary fitness (natural selection).  Blacks in western environments have a higher risk of stroke, but this is generally thought to be due to the effects of lifestyle.  Many have argued that this (and the associated high blood pressure) is genetic--and all sorts of largely fanciful explanations have been offered for it (for example, that slaves whose bodies did not conserve salt died on the Middle Passage from Africa to the Americas).  But careful work by investigators including Charles Rotimi of NIH and Richard Cooper at Loyola Medical School in Chicago, both prominent and capable, has cast doubt on genetic explanations for this health difference.

Lighter skin has evolved at least twice (in Europe and Asia) as people expanded north away from the tropics, and darker skin has evolved as Native Americans expanded from the Arctic into the American tropics.  A recent and very good PhD student of our Department, Ellen Quillen, has shown in her dissertation some evidence for selection in at least a few genes related to pigment production.   There are always questions to be asked about this kind of research, and some important points have not yet been pinned down completely.  But at least it shows that evolutionary hypotheses related to skin color make considerable plausible sense.  But then, what was the selective factor, if not health related to vitamin D?

We don't have the answer.  Except that ready acceptance of selective scenarios, rather than more careful and cautious approaches, is too often a part of evolutionary biology, and perhaps especially when it comes to humans.  We hunger to understand, but perhaps we also hunger too much to proffer hypotheses that are not fully baked.

Friday, November 12, 2010

The complexity of schizophrenia and how to understand it

An excellent, measured and thoughtful paper about the causes of schizophrenia appears in this week's Nature, a special issue on current knowledge about the disease.  Much recent research into this devastating disease has been gene-based, including of course genomewide association studies, but, as with all other complex traits, no simple genetic basis has been identified.  In this paper ("The environment and schizophrenia"), Jim van Os et al. discuss reasons for this, and suggest ways to move the research forward.

GWAS have identified hundreds of genes for schizophrenia, or more, but these currently account for only a small percent of the variation in disease presence or test scores.  Depending on some assumptions and on what data one considers, estimates are that hundreds or even thousands of genes (including regulatory and other functional regions) contribute.  Few seem to make strong individual contributions; one region that does is in the HLA part of the immune system, a strange kind of finding.  Even with optimistic assumptions, predictive power will vary from sample to sample and population to population.  And GWAS, as currently designed, will not assess epigenetic changes due to DNA modification.  And then there's the little trivial thing called the 'environment.'
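
To see why 'hundreds of genes' can still leave most of the variation unexplained, here's a back-of-envelope sketch under a simple additive model; all the numbers are invented, but they're in the range GWAS typically report:

```python
# Back-of-envelope: variance explained by many small-effect SNPs under a
# simple additive model.  All numbers are invented but GWAS-flavored.

n_snps = 200      # 'hundreds of genes'
maf = 0.30        # minor allele frequency, the same for all (a simplification)
beta = 0.02       # per-allele effect, in phenotypic standard deviations

per_snp = 2 * maf * (1 - maf) * beta**2   # variance from one biallelic SNP
print(f"one SNP:  {100 * per_snp:.4f}% of the variance")                 # 0.0168%
print(f"all {n_snps}:  {100 * n_snps * per_snp:.1f}% of the variance")   # 3.4%
# Hundreds of 'hits', a few percent explained -- the now-familiar GWAS story.
```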

Briefly, in this new paper van Os et al. argue that schizophrenia, and other 'psychotic syndromes', are the result of the interaction of the developing brain with environmental triggers during sensitive developmental periods, in those with what is presumably a genetic susceptibility to disruptions in normal functioning of the brain.  We condense a long argument into one inadequate sentence here, but even so please note that it's a paean to complexity and to flexibility as a response to the environment during development, in which genetic susceptibility is only one part of the picture.

These researchers believe, based on a body of prior evidence, that the development of normal neuronal connections in the brain requires interaction with a variable environment, as they portray in the figure to the left.   Based on epidemiological data for associations of psychosis with environmental risk factors, they define risk as the stressors of urban environments, belonging to a minority group, developmental trauma, and/or cannabis use.  However, these factors are extremely common, while what they call the 'psychotic syndrome' is not.  This is where the genetic susceptibility comes in.
This suggests that beneath the relatively small marginal risks linking the environment to psychotic syndrome at the population level, vulnerable subgroups exist that are more sensitive to a particular environmental risk factor at a much larger effect size. Thus, the validity of observed associations with urban environment, developmental trauma, cannabis use and minority group position hinges on evidence of vulnerable subgroups. Genetically sensitive studies indicate that differential sensitivity to the psychosis-inducing effects of environmental factors may be mediated by genetic factors. For example, in siblings of patients with a psychotic disorder, who are at increased genetic risk to develop psychotic disorder, the psychotomimetic effect of cannabis is much greater than in controls, as is the risk to develop psychotic disorder when growing up in an urban environment
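
The arithmetic behind that passage -- a large effect confined to a vulnerable subgroup diluting down to a small marginal effect -- is worth seeing.  A toy sketch of ours, with invented risks:

```python
# Toy illustration of the masked-subgroup point: a fivefold risk confined to a
# genetically vulnerable 10% of the population looks weak population-wide.
# All risks are invented.

p_vulnerable = 0.10    # fraction carrying the (hypothetical) susceptibility
baseline = 0.01        # risk of the psychotic syndrome without the exposure
rr_vulnerable = 5.0    # relative risk of, say, cannabis use in that subgroup
rr_others = 1.0        # no effect of the exposure in everyone else

risk_exposed = (p_vulnerable * baseline * rr_vulnerable
                + (1 - p_vulnerable) * baseline * rr_others)
print(f"marginal relative risk: {risk_exposed / baseline:.1f}")  # 1.4
# A 1.4-fold marginal risk is easy to dismiss -- which is exactly why
# 'genetically sensitive' designs are needed to find the 5-fold subgroup.
```
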
It's one thing to state this, and another thing to suggest ways to test this complex interaction of risk factors.  They do just this, however, describing a number of animal studies that can now be done to look at the effect of environmental factors on the developing brain of susceptible and non-susceptible animals.  The authors recognize that this won't be easy -- sensitive periods during development, risk factors, at-risk animals, and evidence of psychosis all must be well-defined and observed.

And then any potentially disastrous effects of methodological shortcomings, such as bias or confounding -- including genetic confounding -- must be ruled out before results can be considered credible.  E.g., are at-risk adolescents self-medicating with cannabis, making it appear that cannabis is the trigger when it's something intrinsic instead?  Does genetic susceptibility predispose to the use of cannabis, again making it look as though cannabis is the risk factor when it's genes?  They in fact include a box devoted to issues related to weighing the evidence.

Whether van Os et al. are correct in the details of which factors are most important to account for, or in the timing of the sensitive periods for specific aspects of normal growth, we certainly can't say, but even so this is a beautifully nuanced and well-reasoned argument for accepting complexity, with suggestions for how to move forward from there, all based on the mountains of data that have come before.  And the authors don't overclaim, or say it will be easy.

Indeed, the existence of many 'sensitive stages' during development means any number of pathways could be affected at any time to lead to disease.  Schizophrenia, like any trait, of course has genetic underpinnings, but the 4-dimensional complexity of the developmental pathways in the brain means that there are many ways things could go awry, which only increases the difficulty with which they can be found, or, if found, be useful for prediction.

The authors conclude:
The human brain has evolved as a highly context-sensitive system, enabling behavioural flexibility in the face of constantly changing environmental challenges. There is evidence that genetic liability for psychotic syndrome is mediated in part by differential sensitivity to environments of victimization, experience of social exclusion and substances affecting brain functioning, having an impact during development. Given the complexity of the phenotype and evidence of dynamic developmental trajectories, with environmentally sensitive periods, longitudinal research on gene–environment interplay driving variation in behavioural expression of liability, that subsequently may give rise to more severe and more ‘co-morbid’ expressions of psychopathology and need for care, is required to identify the causes and trajectories of the psychotic syndrome. Examination of differential sensitivity to the environment requires technology to assess directly situated phenotypes indexing dynamic, within-person environmental reactivity as substrate for molecular genetic studies; parallel multidisciplinary translational research, using novel paradigms, may help identify underlying mechanisms and point the way to possible interventions.
So, this is an unusually sober and realistic treatment of a complex disease.  Is there anything surprising here?  Only that Nature is giving a number of pages to a nuanced treatment of a subject that is so often treated simply.

Thursday, November 11, 2010

Yawning over the latest news about the genes 'for' sleep

A story with headlines like "Gene for whether you sleep a lot or a little", "The Sandman gene", "How much you sleep is determined by genes", or "The gene for Zzzzzzzz" (that last one's on the Science site) would seem to promise news of a significant finding.  But it turns out to be the report of a gene variant whose carriers may sleep 28 minutes less a night than those without it -- 7.5 vs 8 hours.  Wow!  What a difference.  It's rejuvenating just to think about it!

This was apparently a genomewide association study (GWAS, the genetic equivalent of snake oil, to some), of 4,200 Europeans, who had been asked how much they sleep.  As far as we can tell, the results have not yet been published. 

As Science reports,
Sleep duration correlated strongly with a single genetic marker in a gene called ABCC9. When allowed to sleep as long as they want, those who have two copies of one version of this marker sleep on average 6% less than those carrying two copies of the other version, or about 7.5 hours versus 8 hours, says postdoc Karla Allebrandt, who is leading the study at the Centre for Chronobiology headed by Till Roenneberg at the University of Munich in Germany. Allebrandt presented the work last week at the annual meeting of the American Society of Human Genetics in Washington, D.C.
ABCC9, or SUR2, is a gene that encodes an ATP-binding cassette transporter, or part of an ion channel that ferries potassium across cell membranes.
When the researchers knocked down the corresponding gene in two species of fruit flies, the flies slept significantly less at night compared with controls, Allebrandt reported.
Just how significant, we wonder.

Science also mentions, with no apparent irony, that sleep varies with weight -- people with a high body mass index tend to sleep less.  And therefore, the sentence following that one says, researchers are interested in sleep genes.  But this is a bit of a non sequitur.  Though, we hasten to add, on the part of the journalist, not, we assume, the researchers.  The story doesn't seem to be suggesting that the gene for sleep causes diabetes or heart disease, which are associated with obesity.  If it did, this connection might make sense, even if the suggestion did not.  But looking for genes for a trait that responds to largely non-genetic variables like weight?  No wonder the effect they report is so small.  Or should Leptin (a gene 'for' obesity) be renamed NoNap?

Reports like this are more sleep-inducing than any gene could be -- except when one thinks that research money was (predictably) thrown away on it.

Wednesday, November 10, 2010

GWAS, hype and cashing in -- DTC genetic testing

In our opinion, the problems with direct-to-consumer (DTC) genetic testing have never been so accurately described as they are by Murray et al. in the latest issue of Trends in Genetics (well, except maybe when we did it here).
Among the estimated 480 different traits covered in the collective offerings of current DTC genetic-testing companies there are tests for the creative, musical, linguistic, and shyness ‘genes’, as well as for intelligence, athletic aptitude, and bad behavior. One company promotes a testing package for ‘inborn talent’ as the ‘result of the Human Genome Project’ (http://www.mygeneprofile.com/talent-test.html). Other companies use ‘proprietary technology’ to offer personalized nutritional products based on genetic testing (http://www.ilgenetics.com). Often without citing a single DNA variant or gene, companies promote their products under the protective cloak of legitimate science.
The authors provide a number of examples of tests these companies offer for genetic susceptibility to traits like athletic ability or 'avoidance of error' based on data from small studies or results that haven't been replicated.  They do say that the quality of supporting scientific evidence offered varies by company, with some being at least more formally careful than others to tell their customers that the evidence isn't strong.  But Murray et al. rightly point out that to offer the tests at all, under the guise of scientific authority, is misleading and wrong.
Why are these tests being put on the market at all? Their very offering with the accompanying ‘information’ implies that the test results are meaningful, and thus misrepresents how the scientific community comes to accept conclusions as valid.
They go on to say that scientists publishing new genetic findings have the responsibility to publicly and pre-emptively state that their results are not definitive, and are not for commercial use.  Editorial writers for journals should point out the same thing when they highlight these kinds of results.  The distortion (their word) of science in the way that the DTC companies offer should be countered by "active engagement of the public by scientists in a way that both informs and encourages debate over the social consequences of new scientific findings".

We heartily agree.  Even when a company posts some quality evaluation of the studies it cites, as if informing the customer, it knows very well what the customer's impression will be.  If the studies are weak, an honorable company would not offer the test (and would reduce costs and claims accordingly).

The issue of FDA oversight of these companies has gotten a lot more play than the question of whether they are offering anything of actual value to the consumer, or the extent to which this is just a new version of snake-oil salesmanship.  These companies are often, perhaps largely, using results from GWAS and similar kinds of studies that, by now, every candid person knows have not delivered nearly the promised goods in regard to clearly causal, reliable, risk-estimable results.

Of course, some alleles can be tested for and have high predictive power, but those can be, and often if not usually have been, the purview of legitimate genetic counseling services.  And most people who use these services out of the blue won't have such alleles -- if their condition were due to a reliably predictive allele, they or their family would already have it, and they should already be under the care of a specialist clinic.  And recessive traits involve mates, not just individual customers, and are generally much trickier to understand from DNA sequence data.  And a negative DTC result does not mean a clean bill of health, for too many reasons to go into here.
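
One of those reasons is simple arithmetic.  Here's a toy Bayes-style calculation of ours, with invented but GWAS-typical numbers, of what a DTC result actually does to a customer's risk estimate:

```python
# Toy calculation: what a modest DTC 'risk allele' result does to your risk.
# All numbers are invented, but the effect size is GWAS-typical.

lifetime_risk = 0.01     # baseline population risk of some disease
carrier_freq = 0.25      # frequency of the 'risk' genotype
relative_risk = 1.3      # carriers' risk relative to non-carriers

# The two group risks must average back to the population risk:
#   carrier_freq * r_c + (1 - carrier_freq) * r_n = lifetime_risk,  r_c = RR * r_n
r_n = lifetime_risk / (carrier_freq * relative_risk + (1 - carrier_freq))
r_c = relative_risk * r_n

print(f"'positive' result: {100 * r_c:.2f}% lifetime risk")   # 1.21%
print(f"'negative' result: {100 * r_n:.2f}% lifetime risk")   # 0.93%
# From 1% up to 1.2%, or down to 0.9%: neither a diagnosis nor a clean
# bill of health.
```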

This is all if we're talking disease.  If we're talking designer children and whether to make your kid practice his/her football or violin, who knows?  Or deciding how you should vote based on your genotype.  What unconstrained kinds of nonsense are afoot!

Of course the companies want to stay in business (is it fair to say 'cash in' while they can?), but there is probably, for some at least, a more than trivial objective of doing good for customers.  Still, the overall industry doesn't seem much like a responsible, ethical endeavor.  Some of the scientists involved in the best companies know very well what they are doing.

At the very least, as in any technical industry in which consumers have no way at all to defend themselves adequately, we need strong evidence-based regulation of this industry.  Otherwise it will cost the health care system enormous amounts of money chasing down false signals, will provide fodder for all sorts of malpractice suits, and in doing so will show that culpably casual acceptance and promotion of GWAS-like studies promising 'personalized genomic medicine' has been damaging rather than beneficial to public health.  Casual hype is not idle play; it is short-term gain for a few with long-term costs for many.

The recent studies claiming, in a contorted way, that a genetic variant determines how you vote, and some of the 'talent' claims of DTC companies that we mentioned above, have more than a little whiff of similarity to the kinds of claims made from Darwin's time through the eugenics movement up to World War II.   Triggered by the 'liberal gene' study, we have promised to expound on that and other fashionable social-behavior genetics, and we'll do it, but we've been too busy this week and last.

Even if nobody in the DTC companies has a new eugenics or a new round of social Darwinism in mind, the potential for their hyped promotion of genetic determinism to be used for corporate or government policy, with potential societal abuse, is obvious.  Dismissing DTC as a fad is playing with matches.

Tuesday, November 9, 2010

More genetics done right -- GWAS and HIV controllers

About 1 in 300 people infected with HIV don't go on to develop AIDS, or do so much more slowly than expected -- this has been known for some time.  It was once thought that these people would never develop AIDS, so this subset was termed the "long-term non-progressors"; but in fact many eventually do develop AIDS, and the group is now called "HIV controllers."

Now The International HIV Controllers Study reports the results of a genomewide association study (GWAS) undertaken to try to determine what is different about HIV controllers compared with those who develop disease (the ScienceExpress paper is here, and a story in The Independent about the report is here).  They included 900 HIV controllers from around the world and 2600 'progressors', people for whom the natural history of HIV infection was as expected. 

Not surprisingly, the significant genetic differences between controllers and progressors were within the major histocompatibility complex (MHC), the part of the immune system that distinguishes self from non-self, such as viruses and bacteria.  Specifically, they found 5 single nucleotide polymorphisms (SNPs) within a stretch of the MHC that codes for HLA-B, a gene for a protein involved in defending against viruses: it presents fragments of HIV on the surface of infected cells, and this presentation is recognized by CD8+ T-cells.  While there is extensive variation in MHC genes among humans, the HLA-B protein in controllers regularly differed from that of progressors at 5 amino acid positions, which changes the shape of the protein and how it binds HIV peptides.
Altogether, these results link the major genetic impact of host control of HIV-1 to specific amino acids involved in presentation of viral peptides on infected cells. Moreover, they reconcile previously reported SNP and HLA associations with host control and lack of control to specific amino acid positions within the MHC class I peptide binding pocket, where the HIV fragment is 'housed' on presentation. Although variation in the entire HLA protein is involved in the differential response to HIV across HLA allotypes, the major genetic effects are condensed to the positions highlighted in this study, indicating a structural basis for the HLA association with disease progression likely mediated by the conformation of the peptide...
(The illustration is from Wikimedia Commons and shows the backbone structure of HLA-B*5101 complexed with HIV's immunodominant epitope KM2 1e28.  The peptide is shown in yellow in the binding pocket, beta-2 microglobulin in the lower left, and the membrane attachment site in the lower right.  The 3D structure is derived from Maenaka, K. et al. (2000) Nonstandard peptide binding revealed by crystal structures of HLA-B*5101 complexed with HIV immunodominant epitopes. J. Immunol. 165: 3260-3267.)

The authors of the International HIV Controllers Study write that there are other differences between controllers and progressors that may be important in whether or how quickly they progress to disease, but that they believe the differences they found in the HLA-B gene, and what they mean for viral peptide presentation, are the most likely explanation.  They are cautious about overinterpreting, even when talking to the media, which is not always the case.  As quoted in The Independent:
Dr Walker emphasised that the discovery is not like a "light switch" that turns someone into an HIV controller. It is one factor among several that increases the chances of someone being able to survive for many years with HIV and not antiretroviral treatment, he said.
"We've identified a major determinant but there are other factors that will influence the pathway. We've not identified the precise mechanism to explain HIV controllers but we know that of all the genetic influences involved, this is by far the most important," Dr Walker said.
And they can't yet describe the specifics of how the altered protein works, but that is the focus of current studies, and one can assume that they will ultimately be successful now that the target of investigation is clearer.

As regular MT readers know, we aren't great fans of GWAS in general, but this is one example of a successful study, and it's successful because the genetic effect is large enough to be detectable.  The investigators used clever techniques to dissect out (statistically) the effects of particular variants in the HLA-B gene.  The strongest signal was in an allele called HLA-B*5701.  This was replicated, and in fact was the only usefully strong signal in the study.  In one sense that shows that GWAS works -- but that allele was already known to be involved in slow progression.  Indeed, it's already screened for in treating HIV patients, because those bearing it can over-react to a drug called abacavir (which is therefore not used on such patients).  So one's enthusiasm about this finding, and about the usefulness of the GWAS approach, must be tempered.
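
For readers unfamiliar with how such signals are detected: at its core, a GWAS association test is just a comparison of allele counts between cases and controls, repeated at every SNP.  Here's a minimal sketch in Python, with counts invented for illustration (they are not the study's actual numbers):

```python
# Toy allelic association test, the kind a GWAS runs SNP by SNP.
# The counts are invented for illustration; they are NOT the study's data.
from scipy.stats import chi2_contingency

# Rows: controllers, progressors.  Columns: carriers vs non-carriers
# of a hypothetical protective allele (think HLA-B*5701).
table = [[180, 720],     # 900 controllers, 20% carrying the allele
         [130, 2470]]    # 2600 progressors, 5% carrying it

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p:.2e}")

# A genome-wide scan repeats this at roughly a million SNPs, which is
# why only p-values below ~5e-8 are taken seriously.
```

The 'clever techniques' in the paper go further than this, conditioning on the strongest signal and re-testing nearby variants to see whether they add anything independent -- which is how the effect was narrowed down to particular amino acid positions.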

Beyond this, the study found weak signal (semi-imaginary?) in the CCR5-CCR2 region, where variation has been suggested in other studies to affect HIV sensitivity.  Nothing else in the genome generated any signal.  And these two regions were responsible for only about 19 and 5 percent of the resistance, respectively.  So it is a typical GWAS result.  Nonetheless, it focuses attention on the HLA region as 'the' region to think about, at least at this stage.  And it shows that focused, problem-specific GWAS can do their job -- even if there were other ways to find these genes -- precisely because the signal is strong.  And it confirms the general challenge: many genes with minor effect are probably involved here, as with other complex traits.  What to do about them is the next challenge.

A discussion of the paper in Nature concludes:
However, it will be a long time before this work gives rise to treatments or vaccines. "We're a long way from translating this, but the exciting part is that this GWAS led us to an immune response. That has to be good news for vaccines, because they manipulate the immune response," says Walker. "We're cautiously optimistic that this will help us develop ways of inducing better responses, because we now know what it is that we're trying to induce."

Monday, November 8, 2010

The Potter Wasp

We have just learned about a beautiful little insect called the potter wasp (the picture of the pot to the left is from Wikimedia Commons) from the Nov 10 episode of the BBC Radio 4 program, Living World.  There are 6500 species of wasp in the UK, but this is a solitary and rather elusive one that lives on the southern edge of the isles; it can also be found in Europe, east into Asia, and west into North America.  We've come late to this discovery, and to blogging about it -- here are some additional lovely pictures of the pots these wasps construct.

The adult female lives for about 6 weeks in the summer, during which time she constructs 15 or 20 little pots, each maybe 1 centimeter tall and made of clay that she mixes from earth from a carefully chosen "quarry" and her own saliva.  She'll fly 60 meters or more from the quarry to construct her pots, often in a bush, but if she's near a town, she might make them on roofs or other such 'unnatural' places.

Each pot takes a few hours to complete.  When it's finished, the wasp lays a single egg inside, and then proceeds to provision it.  She hunts for tiny caterpillars, which she paralyzes and stuffs into the pot until there's no room for any more. She then seals the pot and flies off to make another.  The larva hatches in a few weeks, consumes the caterpillars, and makes itself a cocoon in which it overwinters.  When the weather is warm enough, the insect eats its way out of the pot and goes on to live the short life of an adult potter wasp.

Now, this is very interesting indeed.  No wasp is taught to make these pots, nor are the larvae taught to make their cocoons, nor to eat their way out of their first homes and out into the world to perpetuate the cycle.  This is clearly genetically driven behavior, and complex behavior at that.

And these wasps aren't automatons.  They apparently leave a distinctive signature on their pots, some building all slanted pots, some all upright, and so on.  And they spend time choosing the spot of the quarry from which they'll construct their pots, assessing the soil and deciding where it's just right, and then choosing where to locate their pots.  And, if they've laid an egg and then the weather turns, or the temperature drops, or they can't find caterpillars and the egg dies, they recognize this and don't then fill the pot with food.  


Clearly complex behavior, and apparently with some decision making going on.  There may -- must -- be a lot of 'hard-wiring' involved.  But how can this work?  How do these wasps know what to do from day one, with no instruction, when decisions must clearly be based on their local experiences?  And why, then, do we here at MT continually criticize what we think are too-easy assertions that human behavior is genetically determined?

Synapses in human brains apparently are laid down in response to experience.  How do insect brains develop?  The standard Darwinian answer is important, and it is not new; the problem (in other examples) was well known to Darwin.

The standard explanation is that the trait evolves by natural selection.  The wasp doesn't have to 'know' in any conscious sense what it's doing.  It suffices that, gradually, wasps who performed earlier, more primitive versions of this behavior reproduced better than wasps who didn't.  Over time, what was once a rudimentary version became today's highly sophisticated one.  The Darwinian explanation is really a statement of assumption that this kind of process is responsible -- and we tend to accept such explanations with the idea that someday neurobiology will work out the mechanisms in today's wasps, from which more genetically specific explanations can be developed.
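
To make that logic concrete, here is a toy simulation -- ours, entirely hypothetical, a cartoon of the Darwinian argument rather than a model of any real wasp.  A quantitative 'behavior' trait is inherited with small mutations, and slightly better performers leave slightly more offspring:

```python
# Toy model of the gradual evolution of a quantitative behavior.
# Entirely hypothetical: a cartoon of the Darwinian argument, not a
# model of real wasp neurobiology.
import random

POP, GENS, MUT_SD = 500, 2000, 0.01

pop = [0.0] * POP  # trait value, e.g. "pot-building skill"; starts rudimentary
for gen in range(GENS + 1):
    if gen % 500 == 0:
        print(f"generation {gen:5d}: mean skill = {sum(pop) / POP:.3f}")
    # Fitness rises weakly with skill: selection is gentle but relentless.
    weights = [max(0.0, 1.0 + 0.1 * t) for t in pop]
    parents = random.choices(pop, weights=weights, k=POP)
    # Offspring inherit the parental trait value with a small mutation.
    pop = [t + random.gauss(0, MUT_SD) for t in parents]
```

Run it and the mean creeps steadily upward; the point is only that tiny reproductive advantages, compounded over thousands upon thousands of generations, can in principle carry a trait a very long way.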

Still, it's a major assumption to make such a blanket application of Darwin's masterful idea.  Of course, this is just the kind of trait that Intelligent Design advocates like to point to as proof that Darwin was wrong.  Evolutionary biologists invoke natural selection because there is so much evidence that the general idea is cogent, and also because we simply have no better kind of explanation (and instant creation is no explanation at all).  But history shows that science will make progress in understanding these wasps' talents.  Perhaps wasp experts will be able to point to 'intermediate' states of this kind of behavior in closely related wasp species.  Even if that shows that gradual evolution of such a trait could occur, it doesn't explain the neural mechanism, which must be very interesting indeed.

Darwin himself marveled at insect behavior.  In Descent of Man, 1871, he wrote:
...the wonderfully diversified instincts, mental powers, and affections of ants are notorious, yet their cerebral ganglia are not so large as the quarter of a small pin’s head. Under this view, the brain of an ant is one of the most marvelous atoms of matter in the world, perhaps more so than the brain of a man.
For all the reasons having to do with the nature of complexity, so easy to observe in so many traits and which we write about a lot, it is a challenge to accept the standard explanations without wondering whether there may be something about these processes that science simply has not yet discovered.  But if that is, or at least should be, a nagging concern even for many biologists, another pervasive and humbling fact to keep in mind is that humans have a difficult time getting our heads around what can happen in millions of generations.  That is to a great extent today, as it was to Darwin, the major challenge to understanding the workings of evolution.

Wednesday, November 3, 2010

Nothing fishy about a cure for Alzheimer's

So B vitamins don't prevent dementia, and now comes the news that omega-3 fatty acids don't either, according to a paper in the Journal of the American Medical Association:
Docosahexaenoic acid (DHA) is an omega-3 fatty acid identified as a potential treatment for Alzheimer disease. Epidemiological studies have shown that omega-3 fatty acid consumption reduces Alzheimer disease risk and DHA modifies the expression of Alzheimer-like brain pathology in mouse models.
Several studies have found that consumption of fish, the primary dietary source of omega-3 fatty acids, is associated with a reduced risk of cognitive decline or dementia. Some studies have found that consumption of DHA, but not other omega-3 fatty acids, is associated with a reduced risk of Alzheimer disease. Studies of plasma fatty acids have confirmed the dietary studies, finding that plasma levels of omega-3 fatty acids, and especially DHA, are associated with a reduced risk of Alzheimer disease. The most abundant long-chain polyunsaturated fatty acid in the brain, DHA is enriched in synaptic fractions and is reduced in the brains of patients with Alzheimer disease. The other major omega-3 fatty acid found in fish, eicosapentaenoic acid, is virtually absent from the brain.
These findings motivated researchers to conduct animal studies that used DHA, rather than mixed omega-3 fatty acids, for intervention studies aimed at reducing Alzheimer disease brain pathology in transgenic mouse models.
This was another randomized, double-blind, placebo-controlled trial of DHA supplementation in a group of people in the early stages of dementia.  They were included in the study only if they were medically stable, and didn't already consume significant quantities of DHA in their diet or as supplements.

Outcome measures were the rate of change in cognitive abilities as measured by standard tests, and scores on variables such as activities of daily living.  A subset of the sample underwent MRI to determine whether the rate of brain atrophy had changed.  In addition, plasma fatty acids were measured before and after the study.  And the researchers assessed the effect of DHA supplementation on progression of dementia in people with the APOE4 risk allele compared with those without.  The study lasted 18 months.

Despite the fact that they enriched their sample for people with low baseline DHA levels, they found no evidence of a benefit from DHA supplementation, either in slowed progression of dementia or of brain atrophy.  They did find some weak indication that carriers of the APOE4 allele who received DHA rather than placebo had somewhat slower progression, but the analysis wasn't adjusted for multiple testing, which could erase the apparent effect -- or, as the authors put it, "weaken the interpretation that this effect is clinically meaningful".
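
To see why the lack of adjustment matters, here's a minimal sketch with invented p-values (not the trial's actual results): a nominally significant subgroup finding can evaporate once you account for how many comparisons were actually made.

```python
# Why adjustment for multiple testing matters.  The p-values are
# invented for illustration; they are not the trial's results.
def bonferroni(pvals, alpha=0.05):
    """Return (p, survives-correction) for each of m tests."""
    m = len(pvals)
    return [(p, p < alpha / m) for p in pvals]

# Suppose the APOE4 subgroup effect came in at a nominal p = 0.03,
# one of, say, 10 subgroup/outcome comparisons actually performed.
pvals = [0.03, 0.21, 0.47, 0.09, 0.62, 0.33, 0.18, 0.55, 0.71, 0.40]
for p, survives in bonferroni(pvals):
    print(f"p = {p:.2f} -> {'significant' if survives else 'not significant'}")
# With m = 10 the per-test threshold drops to 0.005, and p = 0.03 fails.
```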

The authors say their results are strongly negative, but they can't rule out the possibility that supplementation could be preventive if started long before the initial stages of dementia, when the architecture of the brain is beginning to change and well before cognitive effects are noticed.

So, like any complex trait, dementia is a heterogeneous collection of phenotypes with multiple causes.  Tests of things like the ability to carry out activities of daily life can't produce a detailed picture of what's gone wrong structurally in the brain -- nor are they meant to -- and so can't be a way to help sort people into more homogeneous groupings.  Indeed, a subset of the study group carries the Alzheimer's risk allele, APOE4, and these people may have responded differently to DHA supplementation (APOE4 is an allele that confers somewhat increased risk of dementia, but by no means guarantees it -- but see NOTE below).  It's possible that there are other variables that could be used to create more homogeneous groups that might respond better to DHA -- or the B vitamins -- but what they are is not known.  So a mixed grouping of phenotypes is one possible explanation for the negative results of this study, and heterogeneity is always a potential problem with complex traits.  But even so, it's very unlikely that a single dietary component is going to be either the cause or the cure of dementia, or it would have been found by now.  Science is very good at finding causes with large effects.

But we conclude as we did when we wrote about Alzheimer's disease here: Whether we're wishing for something that's not possible in the face of causal complexity, or a stunning treatment will answer our wishes, nobody knows.  That's what keeps us tossing resources down the well.

NOTE:  We were just meeting yesterday with an expert genetic epidemiologist who works intensively on Alzheimer's disease.  He told us that although many studies have implicated the APOE4 allele in Alzheimer's, it is still not clear whether it is that gene, rather than something else nearby on the same chromosome, that is actually responsible -- even though the E4 allele has been institutionalized as a major 'success' in disease mapping and chronic disease genetics.  So even this, largely because of its modest effects and the complexity of variation at the chromosomal level, is very difficult to nail down definitively.

Tuesday, November 2, 2010

Alcohol -- the new heroin

We're working on some posts related to the tidal wave of new genetic determinism.  But we want to do that carefully and properly rather than hastily.  Meanwhile, there are other Hot Items to post about:

There's a story on BBC radio and its website about a paper in The Lancet reporting a new classification of illegal drugs, along with alcohol and tobacco.  The authors find that, according to their classification scheme, alcohol is more destructive than any of the illegal drugs, including heroin and crack cocaine.

The authors of the paper applied "multicriteria decision analysis" (MCDA) to the problem -- sounds scientific, doesn't it? (Indeed, there's a whole journal of MCDA; the idea is to allow decision makers to apply 'scientifically sound' criteria to policy decisions that are otherwise essentially subjective.)
Members of the Independent Scientific Committee on Drugs, including two invited specialists, met in a 1-day interactive workshop to score 20 drugs on 16 criteria: nine related to the harms that a drug produces in the individual and seven to the harms to others. Drugs were scored out of 100 points, and the criteria were weighted to indicate their relative importance.
MCDA modelling showed that heroin, crack cocaine, and metamphetamine were the most harmful drugs to individuals (part scores 34, 37, and 32, respectively), whereas alcohol, heroin, and crack cocaine were the most harmful to others (46, 21, and 17, respectively). Overall, alcohol was the most harmful drug (overall harm score 72), with heroin (55) and crack cocaine (54) in second and third places.
This is an interesting conclusion in its own right, and the issues of prohibition and legalization have been widely discussed on the BBC and elsewhere as a result, because the harmfulness of alcohol is largely due to its legal and thus widespread use compared with illegal substances.  But this also tells us something about the limitations of science as it pertains to policy decisions.  The authors of the study recognize upfront that there is much subjectivity in the assessment of 'harm' and 'danger', and have tried to at least take that into account in their classification scheme using MCDA.  But the measures are still subjective, and that undermines their 'scientific' nature, to say the least.
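
The arithmetic behind MCDA is simple: each drug gets a score on each criterion, each criterion gets a weight, and the overall harm score is the weighted sum.  A toy version (with invented numbers, not the Lancet panel's) shows how sensitive the ranking is to the choice of weights -- which is exactly where the subjectivity lives:

```python
# Toy multicriteria decision analysis (MCDA): a weighted sum of
# criterion scores.  Scores and weights here are invented; the Lancet
# panel scored 20 drugs on 16 criteria, out of 100 points.
scores = {
    "alcohol": {"harm_to_self": 55, "harm_to_others": 90},
    "heroin":  {"harm_to_self": 95, "harm_to_others": 40},
}

def overall(weights):
    """Overall harm = weighted sum of criterion scores."""
    return {drug: sum(weights[c] * s[c] for c in weights)
            for drug, s in scores.items()}

# Weight harm-to-others heavily and alcohol comes out worst...
print(overall({"harm_to_self": 0.4, "harm_to_others": 0.6}))
# ...reverse the weights and heroin does.
print(overall({"harm_to_self": 0.6, "harm_to_others": 0.4}))
```

Nothing in the method itself tells you what the weights should be; that's a value judgment dressed up in numbers.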

More importantly, the results will be ignored.  Policy decisions about drug legalization will still be political.  We learned long ago that we cannot successfully ban alcohol from society.  We are not yet to the stage where even marijuana can be widely legitimized.  No matter what the science says, we won't ban alcohol, you can bet your beer on it.  And despite attempts to stifle it, there is no sign that we have the gumption to ban tobacco (nor, according to a great many people who smoke, sniff, or chew, would that be in any way justified so long as the only harm is self-directed).

Similarly, no matter what the science says, we simply are not going to legalize all recreational drugs.  Why?  This is a cultural view, not one about the facts.  We won't, at least not in the US and Britain.  No matter how much damage the drug lords in Mexico or Afghanistan or Colombia cause to our own population.  No matter that it is our usage here in the US and Europe that is responsible for the trade.  We are not going to legalize it, even if it's as harmless as a baby's smile (it isn't, of course).  Culture will have to lead any such change.

Monday, November 1, 2010

Genes for how we think and what we think

Several people have sent us the latest genetics of political science story that has been getting attention all over the web (e.g., here and here).  The paper referred to in these stories just appeared in The Journal of Politics with the title, "Friendships Moderate an Association between a Dopamine Gene Variant and Political Ideology."  People become liberal, the authors claim, if they've got a particular variant of the "novelty seeker gene", DRD4, and had a lot of friends in high school.


This is a follow-up to a 2007 paper by two of the same authors, Fowler and Dawes, reporting two genes that predicted voter turnout in twin studies -- although these genes' effects are also moderated by environment, they say; turnout depends on the association between genes and exposure to religious social activity.  They conclude that these findings are important to "how we both model and measure political interactions."


And it's a follow-up to a 2009 paper reporting a gene for partisanship and joining political groups (the DRD4 gene again), among other papers.


Now, these are only a few of the authors responsible for the recent large crop of papers reporting genes for voting behavior and political ideology and other behaviors in the realm of political science, but they are particularly visible, and publishing the latest paper just before the November elections -- gee, wonder why -- certainly helps.  The headline in the Fox News story about this work is "Researchers find the liberal gene", and they then go on to say that "liberals can't help themselves, it's in their genes".


OK, let's stick with this for a minute.  Much as we wanted to, we couldn't ignore what's below the headlines, the meat of the piece:

"Ideology is about 40 percent heritable. It's almost half genes and half environment," Fowler told FoxNews.com. What's more, he said, any trait that can be inherited has potentially been with the human race for a long time, meaning political ideology has been a part of us for tens of millions of years.
"If it's really the case that genetic variation is influencing ideology, this isn't something we've been living with for the last ten years," he told FoxNews.com. "These are processes that have been going on for the past million years."
Fowler suggests that it made more sense to be liberal in certain environments at specific points in human history, and in others a conservative ideology was merited. "And this is what made it possible for our species to survive," he said.
"If it made sense for us all to be liberal, natural selection would have made us all liberal."
There's so much wrong here, and if the quotes are accurate, it's Fowler who's wrong, not the journalist.  Fowler may 'know' that DRD4 is a dopamine receptor gene, the receptor for a neurotransmitter, and he may think that political ideology is a result of gene by environment interaction, but the Fox story suggests that he believes he's found a gene 'for' liberalism (in spite of the sentence in the paper itself that rightly cautions that "perhaps the most valuable contribution of this study is not to declare that ‘‘a gene was found’’ for anything, but rather, to provide the first evidence for a possible gene-environment interaction for political ideology").  Emics and etics -- what he says is not what he does.
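
For what it's worth, here is where a figure like "40 percent heritable" typically comes from: the classical twin design compares trait correlations in identical and fraternal twins and, under a stack of assumptions (equal environments, additivity, no gene-environment interaction -- precisely the things at issue here), plugs them into Falconer's formula.  A sketch with made-up correlations, chosen only to reproduce the headline number:

```python
# Falconer's formula for twin-study heritability.  The correlations
# below are invented to reproduce a "40% heritable" headline figure;
# they are not Fowler's data.
r_mz = 0.62  # trait correlation between identical (monozygotic) twins
r_dz = 0.42  # trait correlation between fraternal (dizygotic) twins

h2 = 2 * (r_mz - r_dz)  # "heritability": additive genetic share of variance
c2 = 2 * r_dz - r_mz    # shared (family) environment share
e2 = 1 - r_mz           # unique environment share, plus measurement error

print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")  # h2 = 0.40
```

And note what such an estimate actually is: a description of variation in one population at one time, if the model's assumptions hold.  It says nothing about genes 'for' ideology, and certainly nothing about millions of years.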


And,  "...political ideology has been a part of us for tens of millions of years"?  "These are processes that have been going on for the past million years"?  Let's see, Homo sapiens dates from when?  100-200 thousand years ago?  Were our Homo erectus ancestors liberals or Tories?  Being liberal sometimes and conservatives others is what allowed our species to survive?  Who knows what that could even mean.  And of course Fox News picks up on the idea that natural selection didn't make us all liberal, now did it?  No it didn't, because it didn't make sense.


So, if political ideology is genetic, there must be a gene for conservatism too?  In which case, why do we see the kinds of swings in voting patterns in the US that we have seen in the last several decades, and are likely to see this week?  It can't be that it's primarily those who are genetically programmed to be liberal who vote in some elections, and those who are conservative who vote in others.  People actually do change their minds.  Oh, there must be a gene for that, too.


But much more important than these silly quotes, or even than the details of the paper, this paper is an indication of the use and abuse to which genetic data are currently being put.  In a day or two we'll write about the data set that this study was actually based on, but for now let's think about what it means even to ask whether there are genes for voting behavior.  (And, by extension, if how you vote is due to gene by environment interaction, and in this case the 'environment' is the number of friends a teenager has in high school, couldn't it be that there's a gene for the number of friends we have?  Well, it won't be a surprise to hear that the answer is yes, if you believe studies done with the same National Longitudinal Study of Adolescent Health -- paid for with your tax dollars -- used in the latest Fowler paper.)


It isn't irrelevant to ask who wants to know, who's asking about genes for behavior.  Or why they find these kinds of questions so interesting.  Or whether this kind of science should just be smiled at tolerantly even if it's bogus.  Or what we should think to do, at this stage and before it's too late, if it happens to be correct.


There are many issues here, and they involve a spectrum of things: personal politics (ours, the investigators', and yours), the nature of legitimate science, criteria for scientific inference, vested interests, ideology and tribalism, and much more.


One thing is clear.  Like it or not, this search for genes for traits related to how we behave, for how well we think, and even for what we think, is a wave, perhaps a tidal wave, of a kind of Darwinian determinism sweeping across the political spectrum, led by the United States but (as so often happens) with followers in Europe and Asia (mainly, these days, China).  It is not idle, ivory-tower science.  Whether it is yet another passing fancy, or something more serious, time will tell.  But for many reasons, this, like other aspects of genetic determinism these days, is not going to be changed by the facts: it is a commitment, and taxpayers are going to be paying for it for a long time to come, for better or worse.