Thursday, September 30, 2010

Genetics the right way....er, at least part-way!

A story about a gene for migraine headaches is making the rounds of health-related websites.  A paper published in Nature Medicine on Sept 26 reports that mutations in a gene that encodes the potassium ion channel TRESK K2P are responsible for common migraine with aura (one such aura is depicted in the picture to the left, though the artist says it doesn't do the temporal/spatial experience justice at all).  This ion channel was already known to have a role in pain pathways and as such has been a target for anesthesia, and other ion channel genes have been found to be associated with rare forms of migraine, which is why the authors decided to investigate KCNK18, the gene that codes for this particular ion channel.  So far so good.

The authors found that a mutation in this gene 'segregates perfectly' in a large family -- that is, everyone with migraine in that family had the mutation (which causes a frame-shift that results in a truncated channel), and no one without migraine had it. They then looked for expression of this gene in 'migraine-salient areas', and found it where it was expected. So, this was a good use of a priori knowledge about the biology of a trait to zero in on a gene that may cause migraines, and of course the possible sequel to the story is that it may thus be a therapeutic target.

'Segregates perfectly', however, raises the question of whether all those in the family without migraine were also free of the mutant allele.  As it turned out, they were: no one without migraine in the family carried the mutation, so it did segregate perfectly (we have to assume that no unaffecteds in the family later became affected, which may not be a trivial point).  But then the researchers went on to sequence this gene in unrelated people with and without migraine -- a case-control comparison.  This, it turns out, was problematic.

Unfortunately you have to go to the Supplementary information to find this out -- and we have to say that burying important data in the supplementary tables, without clearly spelling out its implications in the regular text that everybody (including reporters) reads, can be a cynical way to take the focus off serious issues.  We hope that we haven't somehow misunderstood the supplementary data, but among the hundreds of people in whom these authors sequenced KCNK18, they found 14 different mutations in the gene, most of them very rare but, more importantly, none of them statistically more common in those with migraine than in those without.  The frame-shift mutation that may be responsible for migraine in the one family is very rare as well, and certainly comes nowhere close to being 'the' gene, or even a primary gene, for migraine, even if it does explain it in this one pedigree.
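
To see what 'not statistically more common' means in practice, here is a sketch of the kind of comparison involved -- a one-sided Fisher's exact test on carrier counts in cases versus controls.  The counts below are made up for illustration; the real ones are in the paper's supplement.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher's exact test: the probability of seeing a or more
    carriers among cases by chance, with the table margins fixed (the upper
    tail of a hypergeometric distribution).
    Table: a = case carriers, b = case non-carriers,
           c = control carriers, d = control non-carriers."""
    N = a + b + c + d          # everyone sequenced
    K = a + c                  # total carriers of the variant
    n = a + b                  # number of cases
    denom = comb(N, n)
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(a, min(K, n) + 1)) / denom

# Made-up illustration: 5 carriers among 100 cases vs 4 among 100 controls
print(fisher_one_sided(5, 95, 4, 96))   # 0.5 -- nowhere near significant
```

With rare variants and counts like these, even a real effect is indistinguishable from noise, which is the point the supplement quietly makes.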

Now, the fact that migraine generally takes a trigger -- or people with genetic susceptibility would have excruciating headaches all the time -- means that people with a genetic predisposition but who have never met their trigger may still be susceptible.  But it also means that this is one of those disorders that's 100% environmental, and 100% genetic.  And it makes the choice of controls difficult or even uninformative, when anyone with a truly causative mutation might be unaffected because they haven't been exposed to the provocative environment.

But this is a trivial problem with respect to this study, since so few cases or controls have a mutation in this gene.  In this family, too, it could be that the effect is manifest only in some genetic background that was not varying in the family, which would generate a 100% concordance pattern but would implicate the particular gene only in that background.  In other words, suppose a variant at some other gene, X, was shared by all members of this family, and the detected effect at KCNK18 only occurs in people also carrying the gene-X variant.  That is by no means implausible.

We don't want to complain all the time, although resistance to poor practices, vested interests and the like is important, in at least a small way, to help keep science on as close to an optimal track as can be done in the realities of human society.  We don't think people, especially scientists, should wander in dreamland, intentionally or avoidably oblivious to known truths.  There are enough unknown truths to deal with!

But you don't get published in Nature Medicine by being too forthright with caveats.  The current reward system encourages burying caveats in Supplemental material, or minimizing them.

We still say that, in this case, the discovery process seems to have been done the right way -- by putting biology first.  The investigators used the kinds of understanding of basic biology, as being about signaling interactions among cells, that we try to describe in our Mermaid's Tale book.  Neurological traits are about interactions among neurons, and they signal to each other by a limited variety of means, ion channels being a major component of that.  But it would still be wrong to yield to temptation and call this 'the' migraine gene, since it explains so few cases in the population, and is found in controls as well.  Neither can it be called a 'migraine gene', since one would assume that its function and evolution were not to cause migraines.

Searching candidate genes -- in this case, known ion channel genes -- for mutations, rather than doing blind genomewide association studies (GWAS), is a sensible way to find genetic factors.  Indeed, GWAS for common migraine with aura had not worked -- and that's no surprise.  In a family with clearly inherited migraine problems, to search the limited genome space occupied by ion channel genes makes biological sense, and that's what seems to have worked in this one family.  But if there are many causes of this trait, this one might not have been frequent enough, in cases vs controls sampled from the general population, to be statistically detectable.  Or, if the variant is rare, it might not even have been carried by any of the cases analyzed in the GWAS.

While we don't believe these authors have done much for migraine sufferers in general, in spite of the widespread attention this paper got (and this column in The Guardian explains why the attention was so sloppy), at least the approach was a right one.

Wednesday, September 29, 2010

Taking it on faith?

Well, here's a surprise....or is it?  The NYTimes reports results from a survey that tests people's knowledge about religion.  And guess who knows religion best?  Atheists, agnostics, Jews, and Mormons.  Other 'Christians' fared worse, often not knowing even basic things about the tenets of their own faith.  (Here are the results of that survey.)

Well, ignorant and proud of it might be one response: the whole idea of religion to many is that it must be taken on faith. Let the pastor or priest tell you what's what.  Don't ask questions.

This might be OK in the sense that if religion can't be proven by scientific means, one must get it from the experts in costume who must somehow get it more directly and authoritatively.  Details don't matter. 

But this is a difficult position to square with the widespread aggressive assertions of religion against what we actually know about the world, the real one, the one we live in.  These assertions purport to give real-world reasons for the Faith and against the marauding of science.  If science is actually so evil (especially, those nasty evolutionists!), then what are the counter truths on which its opponents' knives are whetted?

We won't say more about this than that many of our strident Christians seem not to have read the fundamentals, like the Sermon on the Mount.  At least, the 'religious' engage in enough greed, hate, and inequality to suggest as much, as we're seeing in the US these days.  (And Christians are not alone: plenty of Muslims cite what is convenient from the Prophet, assuming those strapping bombs on their chests have actually read any of it.)

More relevant is that this shows the symbolic, cultural nature of the science-religion divisions in our society.  This is about cultural power and influence, partly perhaps in terms of access to wealth but probably mainly access to psychological and symbolic 'wealth'.  Don't bother me with the facts!  It's highly tribalistic in that sense, about membership, waving the right banner, and that sort of thing, more than it is about the facts of the world.

The story has some symmetries.  Many in evolutionary biology and genetics do not have a very sophisticated or complete knowledge of what is actually known, and cling to various beliefs of their own--beliefs that life is simpler or more deterministic than we actually know is the truth.  The belief in Darwinian essentialism--you are what your genes are--is as deeply invoked, regardless of the evidence, as theological beliefs.  Or, scientists proclaim as if science somehow could prove the falseness of religion (i.e., that there's no God), showing that they, too, haven't understood the limits of their own field.

So, when we're in a cultural conflict, what role do the facts actually play, and does knowledge of the essentials that are purportedly behind one's expressed point of view matter?  Of course, in a democracy it is perfectly legitimate to vote however you want without having to give a reason -- or even without having to have a reason.  So does a democracy decide relevant questions, such as the legality of stem cell research, based on facts of some sort (religious or scientific), or is this mainly about planting one's flag in the enemy's territory?

Tuesday, September 28, 2010

More on the potential of crowdsourcing

Maybe the Office of Rare Diseases at the NIH could do with some crowdsourcing.  A place for patients to go when all else has failed, this office is a repository for unsolved medical mysteries.  Even with a record of more unsolved than solved cases, this clinic is a source of hope for many, because they're in the business of mining all the information they can on rare conditions.

The patients who go there are desperately hoping for answers, but the clinicians who try desperately to find them are facing a major limitation of science.  Science is about causal regularity in Nature, and does rather poorly with unique observations.  Repetition and prediction are gold-standard criteria for understanding.  Yet these cases are essentially unique!

A story on the CNN website tells of two patients with undiagnosed diseases who spent five days at the NIH, the recipients of exhaustive testing for every possible thing that could be wrong.  The problem with rare diseases, of course, is that they can be so rare that only one case has ever been seen.  Or, at least rare enough that there just isn't enough data in the medical literature, or collective experience with the disease, to facilitate a diagnosis.  

Here's where crowdsourcing could be -- and often is -- useful, as we suggested last week.  The huge numbers of people who populate the web, perhaps most intensely when they have a medical dilemma that concerns them, provide a potentially unique way in which repetition can be ascertained.  People seem to find each other, and relate their stories.  Of course, these are informal and not rigorous by normal clinical standards, but experience shows that sense can be made of the data, at least to some useful extent.

Doctors try as best they can to pigeon-hole their patients into categories for which there are treatments.  Most doctors work with small sets of patients, or in small communities, where they simply can't be expected to see very rare traits more than once or twice, if at all, and then not at the same time.

Taking advantage of huge population samples, assembling rare information to reveal patterns.  Crowdsourcing.

Monday, September 27, 2010

Is science killing God?

So, Stephen Hawking declares it was physics, not God, that set the universe in motion, and now scientists declare it was natural processes that parted the Red Sea (in a paper published in PLoS One).


New computer simulations have shown how the parting of the Red Sea, as described in the Bible, could have been a phenomenon caused by strong winds.
The account in the Book of Exodus describes how the waters of the sea parted, allowing the Israelites to flee their Egyptian pursuers.
Simulations by US scientists show how the movement of wind could have opened up a land bridge at one location.
Sustained winds of 63 mph, it turns out, could have maintained an open land bridge for 4 hours, long enough for the Israelites to cross.  

Is this proof that the parting of the Red Sea wasn't a miracle?  Moses (and therefore God) wasn't needed after all?  

Is science killing God?

We did assume that was the intent of this paper, at first glance.  If what has previously been deemed a miracle can be found to have 'natural causes', then no deity need be invoked, hands wiped, science wins again.   

But apparently we were wrong.  The 'competing interests' section of the PLoS paper says this:
Competing interests: The lead author has a web site, theistic-evolution.com, that addresses Christian faith and biological evolution. The Red Sea crossing is mentioned there briefly. The present study treats the Exodus 14 narrative as an interesting and ancient story of uncertain origin.
The lead author is, it seems, a computer scientist in the Department of Atmospheric and Oceanic Sciences at the University of Colorado.  If you go to his theistic evolution website, you see that he's a Christian who believes in evolution and, he says, in questioning everything.  We can only guess that, as a scientist, he wanted to convince himself that the parting of the Red Sea was scientifically explicable -- and that if he could convince himself of this, this would confirm his belief in the Biblical story.  Indeed, we don't know but we're guessing that, as a scientist, he doesn't want to believe in miracles, but if he can turn them into natural events, he's sold.  


What interests us is that we're surprised that this is the point of this paper since the motivation and interpretation of this simulation could just as easily have gone the other way, evidence that no God was needed, the Israelites escaped on their own, with the aid of sustained high winds.  Of course, we would have then pointed out that this wasn't going to be good enough, and that no amount of evidence will change minds on either side.


Let's assume for the moment that a Big Wind could, in fact, part the waters and make a walkway to the Promised Land.  Why would this mean that God wasn't needed?  No, it could have been God who made the Red Sea partable.  In fact, if we accept the plausibility of the biblical story, this natural occurrence happened just at the time the fleeing Israelites needed it.  How plausible is that?  Its probability is about the same as the probability that God made it happen.

There is simply no 'scientific' way out.  If you're a believer in the Biblical story, you can find a way to explain it, be that natural or supernatural. If you're a skeptic, the partability of the Red Sea has nothing to do with whether it parted at exactly the right time, or whether the Biblical story is historically true. 

Let's have a bit of fun with this.  The old saw "How odd that God should choose the Jews" would now make sense.  So one should comb for other similarly revealing stories.  God seemed to choose the Muslims, when the Patna, abandoned by Lord Jim, did not sink as the abandoning crew thought it would.  

Of course these are fictional examples.  But Saladin really did capture Jerusalem for the Muslims!  And what about the motive for God's anger when a party of Christians, the Donner party, was fatally snowed in as they tried to cross the Sierras for their promised land?  Or the salting of the water and plagues of locusts that beset the early Mormon settlers in theirs?  (They survived, showing God's favoritism yet again -- but to a different group.)  And why did God arrange for the massacre of Jews -- his alleged favorites -- by Christian crusaders, not to mention the Nazis?


Even science cannot account for God's fickleness.

Friday, September 24, 2010

Massaging the news so that it’s shocking, brand new news rather than just neato news

A short piece called “Regimens: Massage benefits are more than skin deep” is the most emailed story on the New York Times website today.

And apparently all kinds of people are completely surprised, since, according to the article, so were the researchers!

At least, that's what we’re led to believe with snippets like,

“Does a good massage do more than just relax your muscles? To find out, researchers at Cedars-Sinai Medical Center in Los Angeles…”

And,

“To their surprise, the researchers, …, found that a single session of massage caused biological changes.”

Listen everybody, this was not shocking to any of these researchers nor to anyone who understands that if you become relaxed, less stressed, or happier, THAT’S YOUR BIOLOGY CHANGING.

Not only that, but the importance of massage and touch has been known for a long time, not just by scientists but by introductory psychology students.

Ever heard of Harry Harlow’s monkey experiments?

Touch is not just love but it's also life.

Now what IS ACTUALLY PROBABLY news here (but I'm not an expert in this field of research so I'm not sure), is that they found differences between the effects of Swedish massage and “light” massage.

Neato!

Massage therapists and massage connoisseurs probably already knew that there were different outcomes to different methods, but these scientists actually measured what those could be (going beyond merely interviewing massage-getters) by sampling hormones and immune system components in the volunteers’ blood.

“Volunteers who received Swedish massage experienced significant decreases in levels of the stress hormone cortisol in blood and saliva, and in arginine vasopressin, a hormone that can lead to increases in cortisol. They also had increases in the number of lymphocytes, white blood cells that are part of the immune system. Volunteers who had the light massage experienced greater increases in oxytocin, a hormone associated with contentment, than the Swedish massage group, and bigger decreases in adrenal corticotropin hormone, which stimulates the adrenal glands to release cortisol.”

Now that the effects can be better sampled and quantified, as made clear by this research, maybe more people will search for and apply effective (e.g. Swedish or light) "alternative and complementary" well-being practices.

All right, now after using all those CAPITAL LETTERS, I need a massage….

"Oh, don't be such a crybaby! Walk it off!" (says the harsh coach. And he's right!)

If this story doesn't make the point about the elusive nature of genetic causation, what will?
About 10,000 cases of breast and bowel cancer could be prevented each year in the UK if people did more brisk walking, claim experts.
Cancer is a disease of misbehaving cells.  But 'walking' can reduce the risk.  How can that be?

A tumor and its metastatic dissemination make up a clone of cells, often if not usually descended from a single misbehaving cell whose misbehavior is inherited by the daughter cells it produces when it divides.  The initiating event may be a mutation in the usual sense (a change in DNA sequence in some body cell, such as in the lung or intestine), or it may be due to some other trigger.  But it is generally a clonal trigger.

So how can an organism-level exposure, such as to walking or other exercise, have anything to do with what happens to one of your billions of cells?  The answer is unclear, but what is clear is that it works through confounding.  That is, it's not exercise itself, but something correlated with it.  In this particular case, if the story is correct, the confounder is obesity.  That leads to chemical agents circulating (related to normal body steroids, perhaps) that can modify cell behavior.  The less obese you are, goes the argument, the fewer of these mutagens are flying around your bod, bumping into DNA here and there and raising the odds that the wrong gene in the wrong cell will be damaged.  Or something like that.

If this explanation is right, obesity is itself a correlate or confounder of the circulating molecules.  Thus walking is correlated with obesity, which is correlated with the actual causative agent.  If so, it is easy to see why it is so damned difficult to find a 'cause' of cancer.  Genetic susceptibility variants that are measured are confounded by these other factors.  If undetected, the variant itself becomes an additional confounder.
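
The causal chain can be made concrete with some invented numbers (ours, not the study's).  In this toy model, walking never acts on the cell directly -- it only shifts the probability of obesity, which shifts the probability of carrying the circulating agent, which is the only thing that touches DNA:

```python
def p_cancer(p_obese,
             p_agent_if_obese=0.8, p_agent_if_lean=0.2,
             p_cancer_if_agent=0.05, p_cancer_if_not=0.005):
    """Cancer risk via the chain: obesity -> circulating agent -> cancer.
    Note that walking never appears here -- it acts only through p_obese."""
    p_agent = p_obese * p_agent_if_obese + (1 - p_obese) * p_agent_if_lean
    return p_agent * p_cancer_if_agent + (1 - p_agent) * p_cancer_if_not

walkers = p_cancer(p_obese=0.2)     # 0.0194: walkers are less often obese
sedentary = p_cancer(p_obese=0.5)   # 0.0275
```

Walking looks protective in the marginal comparison, but the whole difference rides on the correlated variables; condition on the circulating agent and the walking-cancer association vanishes.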

The confounders are not perfectly associated with each other.  Not all who exercise are slim, not all slim people exercise, not all of either need have any given genetic variant, countless variants across the genome could individually contribute, and what about smoking, diet, and other factors, measured and unmeasured?

If there are too many such factors, even if all are measured, sample sizes for detecting their individual or combinational effects may be prohibitively large.
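
To make 'prohibitively large' concrete, here is a back-of-the-envelope calculation using the standard two-proportion sample-size formula.  The risk figures below are invented for illustration, not taken from the study:

```python
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate subjects per group needed to detect a difference
    between two proportions p1 and p2 with a two-sided test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # 0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2

# Risks of 2.75% vs 1.94% (invented) call for thousands of subjects per group
print(round(n_per_group(0.0275, 0.0194)))
```

And that is for a single exposure measured cleanly; slice the sample by combinations of exercise, obesity, diet, smoking, and genotype, and each cell shrinks accordingly.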

Welcome to the world of complexity!

Thursday, September 23, 2010

Crowdsourcing: large-scale, people's epidemiology.

Crowdsourcing is the outsourcing of a task previously performed by a small group or organization, to a much larger community.  As if you didn't already know.  People all over the UK, for example, have been painting garden snails with nail polish, swapping them with snails from their neighbors, and figuring out if the little guys know how to get back home.  (Yep, they do, it turns out, within 100 meters -- we know this because thousands of people have sent their data to the researchers for analysis.)

Of course, the Audubon Society in the US has been enlisting birders in the US to help in their annual Christmas bird count for decades, so the idea isn't new, it's just a lot easier to contact masses of people, and collate their data with computers.

And SETI, the search for extra-terrestrial intelligence, and other astronomical projects, have used many, many people's home computers.  Crowdsourcing is cheaper than outsourcing, and involves the public in general rather than relying on experts who may or may not have blind-spots, vested interests, or rigid institutionalized structures.

So, we wonder, could crowdsourcing replace some clinical trials?

What if you want to know, say, the best possible treatment for acne?  You've taken your dermatologist's advice for years, religiously using that retin A, or taking those antibiotics, but maybe you're worried about contributing to antibiotic resistance problems, or you're just not happy with the results.

So you log onto the web, poke around a bit and eventually come to rest at a site called Acne.org.  Where you find some guy's homegrown regimen for treating acne, and you're tempted to dismiss it as yet another get-rich-quick-on-the-internet scheme, but hold on.  It turns out that this guy has spent years experimenting with and adjusting his use of both prescription and over-the-counter remedies, and based on his own experience, has found one that works.

And it works not just for him but for tens of thousands of other people as well, which we know from the message boards hosted by his website and from the many compelling before and after pictures posted in the gallery. 

This guy may be onto something. 

And in no small part because of the informal testing and testimonials of thousands of others who took the time to contribute their experience with both this guy's own regimen and other treatments they've tried as well.  If you want to solve your acne problem, you could do worse than start here. 

But of course acne isn't the only affliction for which you'll find this kind of coming together of people and information on the web.  There's a website -- at least one website -- for almost any disease or disorder, real or imagined, that you can think up, collecting stories, dispensing advice, selling products, and much more.  Granted, much of it will be the modern-day equivalent of snake-oil, but surely there's at least some gold in them thar hills.

But, how to separate the real from the fool's gold?  After all, what professional scientists are supposed to bring to the research table is training in formal study design and statistical analysis, and if that's missing, what's left? 

Indeed, the most obvious problem with many of these sites is that the contributors who write in tend to be sicker, more unhappy, non-responders, etc. than the general population, or even the population of people with that disorder. People who recovered just fine don't write in.  They've moved on.  Crackpots can participate and may be hard to identify.  People, including companies, could pose as jus' folks (this happens, for example, in Amazon's and various other companies' customer reviews). 

That is, there's rampant selection bias going on, so it is in fact often quite difficult or impossible to learn much that makes sense about cause, effect, or effectiveness of therapeutic regimens.  These kinds of sites would in no sense be replacements for traditional clinical trials.
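
A toy calculation shows how badly this kind of selection bias can distort what a message board seems to say.  All the numbers here are invented: suppose 70% of users of some regimen truly recover, but recovered people post only 20% of the time while the still-suffering post 80% of the time.

```python
def observed_rate(true_rate, p_post_if_recovered, p_post_if_not):
    """Recovery rate you'd infer from a message board, given that recovered
    and non-recovered users post at different rates (self-selection)."""
    posters_recovered = true_rate * p_post_if_recovered
    posters_not = (1 - true_rate) * p_post_if_not
    return posters_recovered / (posters_recovered + posters_not)

# Invented numbers: 70% truly recover, but the recovered rarely bother to post
print(observed_rate(0.70, 0.20, 0.80))   # ~0.37 -- the board looks far grimmer than reality
```

The arithmetic cuts both ways: a site built around glowing testimonials has the posting rates reversed, and the same distortion inflates apparent success instead.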

So, there are serious constraints on the instances when this could work, but we can imagine that, if a disease or disorder is one that can be easily defined, possible treatments are readily available, and effectiveness can be easily documented, this kind of crowdsourcing approach to data collection could be brilliant.  There's no limit to the number of people who can contribute, the experiments are free to conduct and free to access, and we can all make up our own minds.  What's not to like?

In fact, established academia has an area of work called SEDA, for structured exploratory data analysis, which bears some resemblance to public science. The idea is to take less-than-ideal data or crude methods, to see whether there is evidence of an interesting signal.  If there is, then formal studies would be called for.  Or, SEDA can suggest causal hypotheses one might not have thought of.

Given the cost and also great size limitations of Establishment science, and the fact that vested interests of all sorts can have their fingers in the pie, crowdsourcing could become a very important, very inexpensive, very effective way to understand complex causal problems -- and then to exert political pressure to respond to results more quickly than stodgy rules normally allow.

Meanwhile, if you know someone with a stubborn case of acne, send 'em Acne.org's way.

Wednesday, September 22, 2010

The role of XMRV in Chronic Fatigue Syndrome remains unclear

A few weeks ago we posted about the confusion over research into Chronic Fatigue Syndrome (CFS), or myalgic encephalomyelitis (ME), which is or isn't caused by the recently discovered virus, XMRV, which also does or does not cause a form of hereditary prostate cancer.

Science reports that a meeting of several hundred researchers took place two weeks ago to discuss the controversy surrounding the role of this virus in disease.  But, as one of the organizers of the workshop concluded, the field remains "a zone of chaos".  The presence of XMRV has been confirmed in men with prostate cancer by some groups in the US, but not in Europe, and in some people with ME but not others -- by some labs.  Some labs have definitively documented contamination with XMRV, and thus can't confirm its role in disease, while others report finding it in most cases they've tested but few controls, suggesting it's real and not a contaminant.  And all the labs that have reported results, whether positive or negative, have since been tested for their ability to detect the virus; all could.  So there is no easy answer here -- the conflicting results cannot simply be blamed on varying laboratory technique.

The head of the National Institute of Allergy and Infectious Diseases, Anthony Fauci, has asked an epidemiologist at Columbia to test blood samples from 100 CFS patients from four different areas of the US, and 100 controls, for the presence or absence of XMRV.  It's unlikely that throwing the results from yet another lab, whether positive or negative, into the mix will convince anyone on either side of this issue, however.  Meanwhile, we conclude this post as we did our previous post on the subject -- it's possible that the only way the controversy over whether XMRV causes disease will be sorted out is if CFS patients can be successfully treated with antiretroviral medications.

Precedent isn't very helpful.  Theories of disease come and go, and viral theories are no exception.  At one point that was cancer's general cause, but not cervical cancer which was due to promiscuity.  Ulcer was due to stress, not bacteria.  And so on.  It is not clear how we can know with confidence, but in some examples like these there is an answer, and a rather clear one.  In others, the theory simply fades away.  It will be very interesting to see how this one goes.

Tuesday, September 21, 2010

The evolutionary abattoir, or Getting to the meat of the subject

A piece in the NYTimes on Sunday has garnered much comment both on and off the NYT website. In "The Meat Eaters", Jeff McMahan, professor of philosophy at Rutgers University, laments the cruelty of carnivory and says:
If I had been in a position to design and create a world, I would have tried to arrange for all conscious individuals to be able to survive without tormenting and killing other conscious individuals.  I hope most other people would have done the same.
He also says that it's not possible to reconcile the fact that carnivores must kill to live with the idea of a benevolent god.  He muses about the possibility that all carnivores could be allowed (encouraged) to become extinct, or to become herbivores.
Suppose that we could arrange the gradual extinction of carnivorous species, replacing them with new herbivorous ones.  Or suppose that we could intervene genetically, so that currently carnivorous species would gradually evolve into herbivorous ones, thereby fulfilling Isaiah’s prophecy.  If we could bring about the end of predation by one or the other of these means at little cost to ourselves, ought we to do it? 
In stark Darwinian terms, however, life is only about surviving to reproduce -- Darwin's dangerous idea, to quote Daniel Dennett.  It doesn't matter how an organism survives, just that it does, even if that involves murder and mayhem.  Happiness, pleasure, the right to die a natural death -- none of that counts one whit in evolutionary terms.  Nor do good and evil, or selfishness or morality.  These are purely human conceits.

Indeed, in his Autobiography, as recorded by Randal Keynes in Darwin, His Daughter and Human Evolution, Darwin wrestled with whether there was 'more of misery or of happiness' in the life of all sentient beings, 'whether the world as a whole is a good or a bad one'.  His reply, writes Keynes, "was halting and flat, and eloquent in its weakness".  "According to my judgement," Darwin wrote, "happiness decidedly prevails, though this would be very difficult to prove."

His reasoning was that unhappy people would be less likely to reproduce than happy people (he was thinking only of humans, not lions or crocodiles).  This is of course absolutely consistent with his view of life, which was that it is only about survival and reproduction.

Anything else that humans want to lay on to this, imbuing life with purpose or morality or laws or meaning, is a creation of the human mind.  It is we who give life meaning, not something we find in nature.  If Jeff McMahan wants to argue that it is our moral duty to eliminate carnivory, and he can convince science and society to go along with him, more power to him. 

Of course, by that reasoning we should let ourselves, as great a predatory carnivorous species as there is, go extinct.

All organisms die and mostly it's not pleasant.  If teeth and talons don't do the job from the outside, parasites do it from the inside.  The other ways organisms die are at least as nasty as being eaten alive.

But there is a lot of anthropocentric arrogance in McMahan's article, betraying his non-evolutionary view. Why does he restrict his prescription to 'conscious' animals, what does he mean by that.....and how does he think he even knows who's conscious?

Do fish or flies not fear?  Their behavior obviously suggests that they do, even if not in the same way we do. After all, to dismiss their protective/escape behaviors as just instinct or reflex misses the point that our behavior is just neural signaling and similar mechanical processes.  And for that matter, who says carrots like to be pulled up from the ground where they're just motoring along minding their own business, and then skinned and eaten (or boiled) alive.
It seems that Darwin has given everyone a license to invoke whatever arguments they want about the nature of nature. It's a problem for those who like actual science, whether or not we let ourselves believe no animal was killed in the making of that burger we had for lunch.

It won't be any better, in fact, when Monsanto actually does learn to synthesize steer muscle in a dish, because even individual cells struggle to stay alive.  Life is an abattoir, even if we choose to believe that empathy and humane-ness are good, for our own psyches if nothing else.

Monday, September 20, 2010

Clinical trials on trial

So, how do pharmaceutical companies figure out whether a new drug is effective or not?  Or whether it saves lives?  The Sunday New York Times has a front page story about a drug trial of a new treatment for melanoma (skin cancer) and answers these questions, while also asking about the morality of drug testing as it's currently carried out.

Generally, drugs are tested in randomized, double-blind controlled trials.  Some subjects receive the treatment being tested and others, the controls, are given either the existing therapy or a placebo, and in a double-blind study neither the subjects nor the physicians know which group a patient is enrolled in.
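The mechanics of such a design can be sketched in a few lines of Python.  This is only a toy illustration of randomization and blinding, not any actual trial's protocol; the patient IDs and arm names are made up:

```python
import random

def double_blind_assign(patient_ids, seed=42):
    """Randomly split patients half-and-half between the new
    treatment and the control arm.  In a real double-blind trial
    this key, mapping coded IDs to arms, would be held by a third
    party, so neither patients nor physicians know who is in
    which group until the trial is unblinded."""
    rng = random.Random(seed)
    ids = list(patient_ids)
    rng.shuffle(ids)           # randomization step
    half = len(ids) // 2
    return {pid: ("treatment" if i < half else "control")
            for i, pid in enumerate(ids)}

# 20 hypothetical coded patient IDs, assigned 10 to each arm:
key = double_blind_assign([f"P{i:03d}" for i in range(20)])
```

The essential point is that assignment is by chance alone, so the two arms should differ, on average, only in which therapy they receive.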

Two young cousins had the extreme misfortune of being diagnosed with melanoma at about the same time.  Both were enrolled in a drug trial, but one was given the new therapy and the other the state-of-the-art comparison therapy.  The question being asked is apparently not whether the new drug 'cures' melanoma, but whether it lengthens the lives of those who are treated with it.  It was already known that the new drug improves quality of life, but apparently the tumors of those treated with this drug in an early study returned.

As it turned out, the tumors of the cousin on the new drug shrank immediately and dramatically, while those of the cousin in the control group did not.  In fact, he has now died from his cancer, while the first cousin has outlived the average survival time, 8 months, for people diagnosed with the advanced melanoma they both had.  He has now been alive for 9 months.

One man was 24 and the other 22 when diagnosed, one had children and the other, the one in the control group, did not.  When it was clear that the cousin on the new therapy was in much better shape than the one in the control group, his mother begged the researchers to switch him from the old therapy to the new.  They refused, saying that would make their test results uninterpretable.  Indeed, subjects in their control group need to die at a higher rate than those given the new therapy if the new treatment is to be deemed effective.

Two things are troubling about this story, if it's an accurate portrayal of this drug trial.  One, it seems that it was already known that the new drug for melanoma produces dramatically better results than any previous therapy -- at least in the short term.
The standard chemotherapy used in melanoma, dacarbazine, slowed tumor growth in 15 percent of patients for an average of two months. By contrast, PLX4032 had halted tumor growth in 81 percent of patients for an average of eight.
That is, quality of life is greatly improved for those whose tumors are genetically disposed to respond (apparently 80% of people with melanoma) on this drug.  The thing that isn't yet known is whether the new drug allows patients to live longer.
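For a sense of scale, the response-rate gap quoted above is so large that even a modest trial would show it to be statistically overwhelming.  Here is a rough two-proportion z-test sketch in Python; the arm sizes of 50 are hypothetical numbers for illustration, not figures from the article:

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two response proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# roughly 15% response on dacarbazine (8 of 50) vs roughly 81%
# on PLX4032 (40 of 50), with made-up arm sizes of 50 each:
z = two_proportion_z(x1=8, n1=50, x2=40, n2=50)
```

With these assumed numbers z comes out above 6, far beyond any conventional significance threshold -- which is part of why the ethics of continuing the control arm are being questioned.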

Even so, it's ethically troubling to many, including many oncologists, that answering this question requires allowing patients -- the control group -- to suffer unnecessarily.  Should longevity remain the only outcome of interest, when it's clear that a new medication can reduce suffering -- even if it doesn't improve longevity, which it may?  We think not, and there is plenty of precedent for halting clinical trials when it's clear that the drug being tested is either significantly more harmful or more beneficial than the state-of-the-art treatment.  If the NYTimes story is accurately portraying this trial, it should be halted now.

Second and more troubling, according to the story it's already known that patients diagnosed at the stage at which these two young men were diagnosed survive, on average, for eight more months.  Why isn't that a good enough comparison statistic for this new drug?  Why subject more people to the pain and suffering the state-of-the-art therapy entails in order to confirm this statistic?

The stainless steel answer is that without a 'gold standard' double-blind clinical trial, one cannot accurately assess whether, or to what extent, a new therapy improves on the existing approach.  This is a gold standard that often falls short for various reasons, and one of the general standards, we thought, is that as soon as you have good reason to think one approach (new drug) or the other (current treatment) is clearly better, you stop the trial and put everyone on the better treatment.  Why this is not routine, or was not applied in this case, isn't clear (to us).  Granted, the results may be premature, and as the story itself reports,

One of the melanoma field’s senior clinicians, Dr. Chapman had lived through trial after trial of drugs that failed to live up to early promise. Almost every oncologist knew, too, of a case nearly 20 years earlier when bone marrow transplants appeared so effective that breast cancer patients demanded their immediate approval, only to learn through a controlled trial that the transplants were less effective than chemotherapy and in some cases caused death.

So perhaps there is reason to be cautious.  But if 'all' the patients get out of this is improved quality of the last months of their lives, isn't that something?


Of course, all the usual questions about evidence arise here, but so do societal questions.  Clinical trials are very expensive, and results often show no effect after many years and much expense.  How easily could we be convinced that some new treatment really works better, if vested interests learn that with any sort of evidence the costly clinical trial can be abandoned?  In other words, if the standard is relaxed, will it be abused?  Yet, if not, are patients being abused who suffer or die because, so to speak, some professor wants to finish and publish a trial, or some government agency wants to fill out a 'satisfactory' grade on some form?

To cause or not to cause, that is the question.  The answer isn't clear.

Friday, September 17, 2010

Stemming the tide

Do you ever do anything that might destroy some of your cells?  Probably we all do that all the time, when we scratch, take actions that may lead to cuts, and so on.  We shed live and dead cells all the time.  So what, they're just cells!

But then if we're willing to do that, why do we care about stem cells?  If stem cells are humans-in-waiting, why aren't skin cells the same?  For example, if they're alive do they have souls?  If not, is there a reason not to engineer them so they can grow organs like skin, livers, or even whole people?

Does it matter whether a cell made into a general stem cell, and used to grow tissues, organs, or even whole individuals, came from a fertilized egg rather than a skin cell?  Does it matter whether the tissue you destroy is an early embryo (a blastocyst) or comes from a liver biopsy?

These kinds of question show, we think, why the stem cell debate, and its legal back-and-forths, miss the main points that are at issue.  Each of us has a different idea of what constitutes a 'life' that needs some sort of protection.  The courts are simply the arenas in which different views are aired.

Everyone seems to have a reason to support or oppose 'stem cell research', but each may differ as to what that means.  It is difficult to understand why a blastocyst (a days-old early embryo) is different from what would result from de-differentiating a skin or blood cell so that it could form other tissues, or even a whole person (the latter may not currently be possible, but soon will be, we think).

We can't really appeal to evolution, or genetics, or even religion for answers.  This is a new technology.  The decisions really aren't about science (or, by the way, whether Francis Collins is a fundamentalist or not), they are about individual views about the world.  Is it better to argue against stem cells because your sacred text somehow deals with the subject? To our knowledge, no sacred text deals with the subject.  Or is it somehow less relevant to advocate stem cell research because it might help you recover from some disease (does God care about that, if he'll welcome you to Heaven some day?).  Or to clone yourself as a source for organ harvesting if you need a transplant some day?  Or if you support stem cell research because you can make a big profit selling the results?

These are issues that bring science and society into a mix, but decisions about them really have nothing to do with science.  They have to do with ethics and politics.  Indeed, in our kind of society they should, and must, be decided in a sociopolitical way.  Science, which can tell us what kinds of cells have what potential, is simply unequipped to determine whether those potentials should or should not be realized.

Science may have to comment on what is feasible.  But science cannot decide what is right.

Thursday, September 16, 2010

Not BP, Not BurP, but BPA....and no answers. Why?

After years of study and millions (or tens of millions) invested, a report shows that we cannot yet tell whether the plastics chemical BPA (bisphenol-A), to which we are all exposed through bottles and can linings, is safe.  BPA apparently acts like estrogen in experimental animals and cell culture.  But is it harmful to humans?

Some scientists, whose job it is to try to look out for public safety, are concerned and say that we should limit exposure, and they want to prove that there's good reason for that.  Some Republicans, whose job it is to look out for private pocketbooks, say that this is like yet another Commie plot, and they don't need any evidence, thanks very much (and, by the way, perhaps nowadays it's an Evolutionist or Islamic plot).

We don't know the answer.  Sometimes in issues like these, the safe and sane thing to do is to ban the questioned substance and accept the usually minor loss of doing without it (except to the pockets of the few in industry who make whatever-it-is).  In this case, however, the issues are less clear.  Food packaging is important, even central, to our general health and well-being.  We have to have at least some food transported from large distances, and hence preserved in some way to make the trip without perishing.  It's not all junk food that's involved.  Most of us live in cities and know more about growing tired than growing carrots.  Indeed, without our distribution system, would we live any longer in the wild than an escaped laboratory mouse?

So these issues go beyond objection to private self-interest by industry.  But if science is so powerful, why can't we know the truth? 

One response is that people, exposures, and people's responses to exposures are highly variable.  And it's difficult to measure exposure, or even to know how to measure it, and responses can be so variable that it's almost impossible to come to a conclusion about cause and effect.  The evidence is weak because the net effect, the average effect, is small.  However, for some people or some exposures, the effect could be huge.  Maybe they have an unusual combination of exposures, or an unusually susceptible (but rare) genotype.  It's very hard to know.

Alternatively, the effect could be similar for all of us but very, very small.  In that case, what we would need to know is whether any alternative to using BPA would have fewer negative consequences.  For example, some people might have poorer nutrition, and hence higher disease rates, if they were not able to obtain inexpensive foods preserved in part by the use of BPA.  Even if only a small number were helped by a ban, is that number larger than the number who might be harmed by it?

Maybe when the risks appear to be very small, the best policy is not to worry.  The risk may change year to year with changes in exposures or other factors.  It may not be evaluable. On the other hand, what if there is some cumulative or long-delayed effect (a concern with genetically modified plants and the evolution of resistance, for example).  Then we should try to determine this now, and stop the exposure.

We are constantly faced with such issues in evaluating causation.  We have no answers.  Unfortunately, nobody else does, either.

Wednesday, September 15, 2010

What constitutes open and honest debate about scientific issues, and how do we make one happen?

Here's a subject that keeps coming up: Does better science education increase the public's trust of science or scientists? That seems to be a common assumption, but a study done some time ago of countries in the European Union showed that those in which people scored lowest in science understanding were those in which people were most enthusiastic about science.
"We should not be surprised by this finding. A good education in science should lead people to ask questions about the impact of science," according to Lord Sainsbury. [the Former Minister of Science for the UK]' 
Further, Lord Sainsbury suggests that scientists need to engage in more 'grown up' dialogue with the public, to allow them to better understand the risks and benefits of innovations or new technologies.  Science is changing so quickly that government assessment and regulation of risk can't keep up, he says, and people feel they are being "forced to accept changes they had not been consulted over and seemed to offer them no benefit."

An example of a science dialogue done well, according to Lord Sainsbury, is the debate over stem cells.  The people were given the facts and scientists 'engaged the public in an open and honest debate'.

But what does this mean?  And did they really?  A large proportion of the public, at least in the US, thinks that stem cell research -- research using embryonic stem cells, at least -- is unethical, because they believe it involves killing.  What kind of science education has led them to that opinion?

If pressed, presumably many people could tell you that embryonic stem cells are harvested from frozen embryos (what happens to those frozen embryos when they aren't used for IVF or research is another issue not often considered), but could they tell you anything more about the process?  And should they know more before they make up their minds on the ethics?  Can science inform -- never mind decide -- the ethical questions? 

What about climate change?  How much do most of us actually know about the science behind the issue?  We'd venture to say not a whole lot.  And yet, most of us have an opinion on whether or not it's happening, whether or not humans are causing it, and whether or not we as a society should be doing something about it.

How much do we have to know about astrophysics to make up our minds about whether we should spend billions to go to Mars?  

Debates over scientific issues are rarely debates over the science.

And anyway, should scientists be trusted?  One problem is that science leaders are generally the ones consulted by the media about science.  But the media often don't, or can't, really know what motivations, interests, or perspectives such a scientist has (naturally, it's often to promote his/her own area of work).  And who or what decides who is a 'leader'?  History is loaded with leaders who were thoroughly wrong, even about their own field -- usually because our knowledge is always less complete than our conviction that we understand that knowledge correctly.

More of a problem, perhaps, is that government officials, such as those in Lord Sainsbury's Parliament, make policy based on 'white papers', documents drafted by staffers who consult.....the same kinds of 'experts'.  And that process is what moves money, often in large amounts.  However, the number of scientists in legislative bodies is paltry.  That means that those who decide depend on their generalist staffers, who depend on the whimsical aspects of choosing experts or choosing among different advice.  Here, in Britain as in the US, influence and vested interests have a guiding, if often silent, behind-the-scenes hand.  We have seen this first-hand.

If control over knowledge is power, science education should matter.  However, control over what we believe is knowledge can be power too.

Tuesday, September 14, 2010

"Pas ce soir, chérie! J'ai....pas d'intérêt!"

The French seem to speak the universal language of love, but the universal language isn't what one should be thinking of when one thinks of the French, apparently, according to the latest hot, absolutely necessary research study.  In fact, the word for it is also French: ennui.

According to Hollywood, or perhaps popular wishful thinking, French men are great lovers, but aren't to be outdone by French women.  Unfortunately, if the latest study is to be believed, the majority are as not-tonight-dear as any other culture.  Perhaps yet another illusion we have to give up: the envy that people somewhere are living the life we'd live if it weren't for late-nite TV.

Naturally, in a fully civic-minded effort, French Pharma will be advertising for ways to goose up the activity level.  Another chance at maintenance meds, lifelong supplements (to blue movies, we assume).

Science on the march!  Soon, the whispers will be angry rather than defensive:  "What?  Again??!"

Monday, September 13, 2010

SuperSize me...NOT! And kudos (yes!) to Francis Collins

We here at Penn State are like those at other (modestly self-designated) Research One universities.  We're following McDonalds and Supersizing.  We're building buildings like our (modestly named) Millennium research lab, to house the future largesse that will come from tarting up ever more unwary (or insufficiently modest) professors for ever more of their time to be on the streets hustling....hustling grants for life-science research, that is.  The Editor of Science, Bruce Alberts, has written a most appropriate comment about this.   As he says,
Policies that offer incentives for individuals and institutions can unintentionally induce harmful behaviors. One such perverse incentive encourages U.S. universities, medical centers, and other research institutions to expand their research capacities indefinitely through funds derived from National Institutes of Health (NIH) research grants. A reliance on the NIH to pay not only the salaries of scientists but also the overhead (or indirect) costs of building construction and maintenance has become a way of life at many U.S. research institutions, with potential painful consequences. The current trajectory is unsustainable, threatening to produce a glut of laboratory facilities reminiscent of the real estate bust of 2008 and, worse, a host of exhausted scientists with no means of support.
The British Science Minister, David Willetts, echoes much the same sentiment (and he a Tory!). 

Science Minister David Willetts has said the research-teaching balance has "gone wrong" in universities, after defending cuts to science research.
Addressing vice chancellors, he said he was shocked by how little teaching was valued in lecturers' promotions.
Universities that relegated the importance of teaching risked "losing sight" of their mission, he said.
We feel (modestly) that they're simply copy-catting our earlier post on the universities as outsource odd-job hustlers.



We note also that Francis Collins, who we feel free to criticize when he earns it, has made similar statements and also recognizes the problem.  As Alberts writes, "NIH Director Francis Collins has boldly stated that 'it is time for NIH to develop better models to guide decisions about the optimum size and nature of the U.S. workforce for biomedical research.'"


He may be our national Sugar Daddy, but there's only so much sugar in his bowl.  He's doing the right thing by cautioning that endless growth cannot go on.  Willetts points out that teaching (what's that?  we've forgotten) should have at least some tiny role in universities, even if it doesn't generate fancy buildings and funds to travel globally to talk about our stunningly important research.


Well, we doubt Francis or these other authors are actually Mermaid's Tale readers.  Still, the point is that even in good economic times unconstrained growth can't be sustained.  And we're not in good times, and there's no prospect that the main source of sugar, NIH, will keep pace.  With the prospect of a grant actually being funded now around 10% much of the time, that's a lot of street-walking that has to be done to turn enough tricks to make a living.


Universities apparently, and perhaps mindlessly for organizations supposedly housing intelligent people, think they can continue to milk (or bleed) the system for their spending money, by turning their faculty into grant serfs funded by outsourced research grants and contracts.


It has to come crashing down if it's not let down gently.  We used an image of rusting hulks of former research buildings in an earlier post, and Alberts and Willetts invoke some similar thoughts.  The system is in even deeper than this, as there is the surrounding growth of tech-gear and journal-publishing industries that depend on the research mill's products.


Willetts wants university research to be restricted to things with economic impact -- an old Reagan saw that is just the opposite of what should be the case: business should take care of business and stop outsourcing.  Universities should do basic research and teaching to prepare students to go into business and do the research that business needs.  Of course 'basic' research should be relevant -- no support for searching for all 1000 genes that affect the length of your earlobe.


Universities are on a bubble and they don't want to face it.  National student loan debt is now higher than credit card debt.  There will be defaults, and parents and students will stop going in the hole only to be processed like sardines by part-time instructors.  As with most bubbles, everyone on the surface hopes or pretends s/he'll out-smart the system.  But bubbles burst.  We need some slow, orderly, humane, sensible deflation of the bubble.

But until the faculty force change from below, nothing will change without the 'pop!' of personal tragedy among the army of the hopeful (the faculty).  Administrators, far too numerous and high on the drug of large amounts of money they didn't earn and brand-new research buildings, can't do it, or at least won't.  All are afraid to be the first to DownSize rather than SuperSize, even if high quality people would flock to their institutions for a breath of fresh careers.  Change will have to come from hemorrhage, controlled trepanation, or, shall we say, disorderly agitation from the street.

Friday, September 10, 2010

Algae: in your stomach and in your tank!

Craig Venter is probably the biggest, or at least best-known biodreamer of our time.  A recent story in the NYTimes reports his (modest) attempt to create living cells, like algae say, that can be grown in tanks to replace fossil fuel.

Now Dr. Venter is turning from reading the genetic code to an even more audacious goal: writing it. At Synthetic Genomics, he wants to create living creatures — bacteria, algae or even plants — that are designed from the DNA up to carry out industrial tasks and displace the fuels and chemicals that are now made from fossil fuels.
“Designing and building synthetic cells will be the basis of a new industrial revolution,” Dr. Venter says. “The goal is to replace the entire petrochemical industry.”

Exxon has been saying "Put a tiger in your tank" for decades, but will it seem as sexy to say "Burn your bugs!"?   Chemical life, artificially engineered but self-replicating like real life -- it will actually be real life -- would presumably provide a sustainable source of solar energy converted to portable energy.

But wait!  What if we could make it taste like beef, as Venter asks?  You could SuperSize your AlgaeBurger!  Could the mustard and ketchup industry keep up?  If so, that would have an additional, presumably positive ramification:  Vegans could BurgerSize with a completely clear conscience, too!  Probably even Jains, unless killing algae to eat them would be viewed as immoral.

This is science fiction today, but if it comes to pass it would be quite important.  But it would be not a conceptual breakthrough, but simply a feat of bioengineering ('simply' in the sense that, like GM plants or many other kinds of recombinant DNA technology, it's within the realm of normal science). 

The idea of miracles in a tank has appeal to our age, but a certain Thomas Malthus' ideas will sooner or later rear their heads. More food and more fuel will lead to more people, better fed and longer-lived.  Unless Craig can make algae that produce water and apartments, not all of our problems will be solved.

At least that's what we think.  He thinks bigger than anyone, so we're sure he'll have solutions for those problems, too.

Thursday, September 9, 2010

Francis Collins, not PT Barnum, and The Greatest Show on Earth

We have a few more thoughts about the pronouncements of Francis Collins, Director of the NIH.  We've recently commented on the issue of the relevance (or not) of his fundamentalist Christian religious views.  But we are more concerned with his level of advocacy for genetics research as the cure for all that ails us.

We don't mean to keep pounding at Francis, who is a good and honorable person and an effective manager of major scientific enterprises.  But we feel that science is misrepresented by misstatement and overstatement, and that this has policy as well as scientific consequences.  If we say something out of proportion, nobody will care, and no money will shift pockets as a result.  But we don't lead the NIH (or edit a magazine like Nature).  So, when Dr Collins lets loose with his Barnumology, we have to react.  Otherwise, too many people simply accept, then repeat, then build into their work, the same statements.  We've seen this again and again, even in science, in the way pronouncements laden with vested interests become quickly adopted (dare we mention GWAS here?).

Not too many years ago Dr Collins was quoted as predicting 'silver bullets' to cure all sorts of ailments, and that they would be around more or less at this time -- 2010.  That was when he was Director of the Genome Institute, and the Hustler in Chief for their budgets. Now that he's Director of all of NIH, and wanting to steer our Ship of Health basically in the genomics direction, he's hyperbolizing in even more grandiose ways.


In his recent book The Language of Life, he says “There is no other scientific enterprise that humankind has mounted in an organized way that compares to this. I am sure that history will look back on this in a hundred years and say, ‘This was the most significant thing humankind has tried to do scientifically.'" 

Once the genome was sequenced, Collins directed his enthusiasm toward the medical “revolution”—his term—that would result.

We do genetics and genomics every day, using the results of work he sponsored.  It has been and clearly will continue to be important, and remarkable.  But we wonder whether anyone takes even a millisecond to think about such statements, much less to call him on them.  In a few milliseconds of our own, we wondered why and how the following would be judged to be of lesser impact.  So, what about:


1.  A modest development of human knowledge and technology that allowed our species to multiply a million fold in not too many generations?  It’s called ‘agriculture.’

2.  The discovery that the universe is controlled by inferrable laws?  It’s called Copernicus, Galileo, and Newton, and the development of modern physics, with all that it has influenced in our way of life.

3.  The discovery of entirely unknown human-inhabited planetary lands?  It’s called the Age of Sail, and, due to organized policies and navigation technologies, it led to transformative colonization and world trade.  Thank that for the bananas you had for breakfast and the spices you'll use at dinner.

4.  The discovery of ways to nearly entirely wipe out diseases that had affected extremely large numbers, if not the majority, of people for thousands of years?  It’s called microbiology and antibiotics, and Pasteur made the confrontation of infectious disease a major part of government.

5. The discovery of how everything (even including genes!) works.  It’s called chemistry.

6. The discovery of ways to have portable power.  It’s called the age of petroleum.

7.  The discoveries of ways for rapid mass movement.  It’s called internal combustion engines.

8.  The discovery of ways for rapid miniscule movement.  It’s called electronics and has totally transformed life in less than a century, including the internet and computing on which Francis Collins makes his pronouncements.

We could go on.  But the point is to continue to resist self-interested promotion with hyperbole that is not justified, especially when it’s designed to pry resources from your pocket to satisfy someone else’s interests or vision that he can’t simply pay for out of his own pocket.

If you want to rate technologies by how many people they have helped, or how they’ve transformed human society, genetics is pretty far down the list.  High up on that list would be water purification techniques.  Or even window screens that keep out mosquitoes.  

And then, of course, there are other areas of human life that could be helped with a bit of investment.  They’re called the arts. They help cure the spirit, at least as important as curing the body.

Tuesday, September 7, 2010

The Gospel of St Francis....Collins, that is

The Sept 6 New Yorker has a story about Francis Collins, head of the National Institutes of Health -- his life, his religion, and how he thought he had resolved the debate over the use of embryonic stem cells.  He is, as everyone knows, a born-again evangelical Christian, so naturally people wondered where he would come down on this issue. He's often treated as if his word is Gospel when it comes to health; but he's not Catholic so he's not St Francis.

Indeed, people wondered how he'd come down on many issues, given his religious bent, but he assured us that this job was separate from his religion (he was quoted in the New York Times at the time of his nomination as saying, “I have made it clear that I have no religious agenda for the N.I.H.,” he said, “and I think the vast majority of scientists have been reassured by that and have moved on").  And indeed, his friends all assured us when he was appointed that he wouldn't let his religion guide his decisions about science.

Oh, really?  As the story in the New Yorker puts it, the way he made his decision about stem-cell research was to reconcile his views on stem cells with his Christian morality.  He ultimately decided he was in favor of embryonic stem cell research, but not before he wrestled with the decision from his religious point of view.
Before Collins had a direct say in the Administration’s decision on stem cells, he was personally torn by the ethical questions posed by stem-cell research. He has long opposed the creation of embryos for the purpose of research. He sees a human embryo as a potential life, though he thinks that it is not possible scientifically to settle precisely when life begins. But Collins also feels it is morally wasteful not to take advantage of the hundreds of thousands of embryos created for in-vitro fertilization that ultimately are disposed of anyway. These embryos are doomed, but they can help aid disease research.

Although his feelings about embryonic stem cells are indeed pro-research, and satisfied scientists, particularly those doing stem cell research -- he was quoted as being 'stunned' by the legal decision two weeks ago to halt such research -- it can't be denied that he filtered his thinking through his religious beliefs.  Which is just what those who opposed his appointment worried he would do -- and what those who supported him assured us he wouldn't.  If he makes the 'right' decision based on his religious views, in the minds of scientists who want to be free to do whatever they want to do, is that now okay?

We all have a moral filter of some sort, and make decisions based on what we think is right and wrong, but one would hope that the head of the organization that guides the direction of research into fundamental life and death matters would have a more nuanced, less superstition-driven basis for those decisions.  Indeed, in his official position, with his hands on so much of the people's money, his decisions should be driven by what would have the greatest impact on the health of the most people.  We trust him when he says that he doesn't believe he has a religious agenda for the NIH, but when his basis for decisions is his evangelical faith, what else can you call it?

And, it must be said that our own primary concern when he was selected to head the NIH had more to do with his genetic and technological fundamentalism than with his religion -- a concern that the New Yorker story doesn't acknowledge, though surely we weren't the only scientists who felt this way.  Dr Collins is a technophile's technophile, and there are many reasons to challenge the unrestrained investment in hypertechnologies as the main mission of our national health institute.

Friday, September 3, 2010

No God says Authority! Really?

So Stephen Hawking has a new book in which he proclaims that the universe was not created by God but that it created itself, so to speak, spontaneously.  The eminent Dr Hawking says:

Because there is a law such as gravity, the universe can and will create itself from nothing.
Spontaneous creation is the reason there is something rather than nothing, why the universe exists, why we exist. It is not necessary to invoke God to light the blue touch paper and set the universe going.

But why would this constitute news, except to story-dry science journalism?  In what sense does he have any more authority on such a subject than, say, the Pope or the authors of any sacred texts?

Hawking argues, perhaps proves, that our something could have arisen out of somebody else's nothing purely by self-driving physical processes.  We have no reason to challenge that assessment (and haven't read his book yet).  But the fact that he's a famous physicist gives him absolutely no 'authority' on such issues.

For one thing, God could have created the rules of physics and the starting stuff in which this happened.

The point is not to get into the God food-fight, but simply to say that while we might wish to invest authoritative wisdom in someone, to cure our angst for an answer, it is not to be found in this kind of scientific pronouncement (and likewise for those who say that since they can explain everything by evolution, there is therefore no God).

Everyone can have an opinion, and can rest it on whatever facts s/he takes as most cogent.  Scientific arguments may be able to account for a phenomenon.  As scientists we have perhaps a duty to point out when religious arguments about the material world are simply wrong (for example the young-earth or the Intelligent Designer arguments). There, authoritativeness has meaning and it's OK for the news media to report it as such.

It's also fine to cite Dr Hawking's reasons for why what we call the universe could have self-ignited.  But it is a continued misrepresentation of science, especially by journalists, to headline scientists' opinions as if they carry authority, when in fact they don't, and can't.  Manufacturing that kind of hero to sell copy -- which is the bottom line here -- is wrong.

We should have a more proper public understanding of what science actually is, and what it isn't.  Science journalists should be more properly trained.

Even if, in a given case, we would choose to listen to someone who knows a lot about the world, to see why he thinks as he does.

Let's stop spending (wasting) money on science!

The flood of news from our enlightened nation makes it clear that we are wasting nearly all the money we spend on science (except for helping Hollywood develop better ways to do computer animation, and chemists find substitutes for Botox).  In spite of the masses of convincing evidence, somehow, after all the billions spent on collecting that evidence and on teaching it in the schools, a lot of our fellow citizens (if Glenn Beck's rally this past weekend can be our guide) have decided that evolution is not true, and that Darwin is to blame for 'progressivism' and thus for racism and eugenics, and probably for the climate change believers as well.  Apparently, we need our nation to 'return to God' (presumably Mr Beck doesn't mean commit suicide).

And these people actually have votes!

If this is how effective teaching is, let's do away with it.  Anyway, it would be much, much cheaper to distribute a free, taxpayer-subsidized copy of the Bible to every American (but just the real ones, the ones born here) than to fund expensive research studies of evolution whose net effect appears, to many, just to reinforce their conviction that this is a devil-driven mission of secular humanists to send everyone to Hell.

Then, after we've saved all that money currently wasted on researching and teaching evolution, let's move on to another big money waster.  Why all the investment in studies of climate change and global warming?  After all, those who already know better about evolution have also seen through this as just a (that is, another) ploy by universities to garner grant funds to support their soft heathen (that is, left-wing) lifestyles.  Despite all the research we invest in with taxpayer funds, climate change is widely viewed as yet another misinformation plot, by people who want to take away everyone's SUV or extended-cab pickup.  Those lefties want us to eat, rather than burn, corn.  There's already a movement afoot to reform the UN's disinformation body on the subject of climate change.  But, our enlightened, educated citizens say: why stop there?  Why not do the truly right thing, and abandon studies of climate change entirely?  Don't spend another penny on this heathen (that is, left-wing) conspiracy!

It's high time we in science fessed up to what we're actually doing, gave up our hold on resources, stopped confusing people and started telling the truth -- oh, and stopped trying to make people stop eating bluefin tuna.  And, above all, prayed for another major oil strike.

Yes, there are problems with science, especially in controversial areas.  We're fallible and have egos and weaknesses.  That is part of the reason for the UN climate change story.  But caving in on this is also catering to vested-interest Luddites -- or mainly to them!

The news these days doesn't lead one to have much confidence in the value of education to our society.

Thursday, September 2, 2010

The devil's workshop: outsourcing science to universities

In the old days universities were where students were acculturated to be the proper leaders in society.  True, it was the privileged who were taught which finger to crook when tea was drunk, how to quote Homer or Shakespeare at the appropriate moment, and how to lead a church service or a ship into battle.
The rest of us merely had to be satisfied tending our plows.  There was some research done, certainly, but it wasn't like it is now.  But mainly, a professor was an honored individual, a source of knowledge, who took seriously the role of preparing the next generation.

Also once upon a time, if you were a company that needed some research done to improve your product, you turned to your research staff in Lab Building A and presented them with the problem.  If you were lucky, and/or they were good, you made yourself a better mousetrap.

But there was a problem.  Once the mousetraps were selling well, you still had the lab, its buildings and heating system, and the scientists (and their white coats and high salaries) to pay for.  But you didn't need them until mousetrap sales were down and you had to develop a new product.

And then someone had a brilliant idea!  What if we could contract out our research needs, job by job?  Then, we get the research we need, but somebody else is left holding the infrastructure bag.  They may have to look for more work, but it ain't our problem!

The military figured that out a while back, and so KP and other duties no longer fell to misbehaving buck privates, so to speak, but to civilian painters, peelers, and healers.  And in the research case, we had NIH and NSF and universities with professors eager for grants, to do society's out-sourced research.  Over the past 50 years, since we were scared to death by Sputnik and the possibility that 'they' (the bad guys) might be gaining on us, government and industry alike have invested heavily in out-sourced research.  And it's led to a huge outsource industry, eager for the contracts.  That industry is called 'universities.'  It is us.

We have so institutionalized the need for these out-source contracts that universities, which have to pay their staff, will do almost anything to land them, and we have become professionalized at thinking up important projects to do.  Some, of course, really are important.  But when you have a huge industry of people whose livelihood depends on getting these contracts, naturally they will have to find things to do, to justify their jobs, bring in the cash, and seem important in the process.

The growth of universities from teaching institutions to institutions that tolerate students if they have to, has occurred largely in this way.  We now hold the infrastructure that has to be fed by the continual inflow of funds (grants and contracts) and has to justify this by a continued out-sluicing of 'results.'  Otherwise, idle hands will be the devil's workshop and we might get into mischief, or lose our jobs.  Or the idea that we could now and then actually teach a class--a real waste of our precious talents--might occur.

And guess what?  To keep our sales reps (that is, professors) on the job more intensely to bring in these jobs, we've now started outsourcing our own jobs.  We hire instructors to do the teaching so we don't have to, yet the tuition-paying customers will still come buying our particular mousetraps.  Unlike us, the instructors don't produce the burdensome cost of maintenance, because they haven't got job security (unless they entertain our customers and keep the student-trade moving).  And even they're outsourcing some of what they do, by posting courses on online course sites.  And government is outsourcing even its administrative jobs to beltway bandit companies.

Without demeaning the segment of research that really is of some use -- and again, there certainly is a lot of it -- this hunger for out-sourced jobs (we call it "grants and contracts") is a major reason why there are so many science journals, and so much fluff being published and breathlessly reported by another industry hungry for jobs: popular science media and journalism.  We can't put the assembly line of our activities to rest until we really have something useful to do, any more than a company can stop selling mousetraps until it has a better idea.

This represents an interesting situation in which the burden of research has been off-loaded to create another dependent class, the professoriate and our administrators.  This chain of outsourcing and dropping of responsibility can't last forever.  Some day, it will crash, and there will be rusting hulks of buildings that once housed the famous professors who generated hundreds of profound papers each, in a steady stream....that mainly nobody read other than the authors' relatives, pets, and perhaps a journalist hungry for something to write breathlessly about 24/7--because after all, they're an industry, too.

Of course, as part of the system ourselves, we must confess that this steady stream of inane 'research' findings has all been very good indeed for the blogging industry.....