Using placebos as part of treatments

Nowadays, the testing of new drugs often involves comparisons not only with placebos but also with older established drugs in three-way double-blind tests. What is emerging from these trials is that the placebo effect seems to be getting stronger, which means that new drugs in clinical trials are having a harder time showing that they are better than the placebo. Another consequence of stronger placebo responses is that some well-known drugs used in the trials as the older standard (and that had beaten the placebo in earlier tests) seem not to be able to do so now.
[Read more…]

The placebo effect

In the previous post, I described the practice of homeopathy and explained why it should no longer be taken seriously. Now that we know that its originator Samuel Hahnemann was basically treating his patients with water, what made him think his treatment was effective? There is no evidence that he was a fraud or charlatan, foisting on his patients something he knew was bogus in order to take their money. He was probably genuine in his belief in the efficacy of his treatment.

It is likely that he was misled by the placebo effect, where patients recover from an illness due to any number of factors that have nothing to do with the treatment provided by the doctor. People who want to believe seize on these random events and see patterns that don’t exist. For example, since colds get better after a few days, it is possible to get gullible people to believe that practically anything is a cure for the common cold, since if you take it soon after the onset of symptoms, presto, the cold disappears in a couple of days.

Steve Silberman in Wired Magazine describes how the placebo effect was discovered.

The roots of the placebo problem can be traced to a lie told by an Army nurse during World War II as Allied forces stormed the beaches of southern Italy. The nurse was assisting an anesthetist named Henry Beecher, who was tending to US troops under heavy German bombardment. When the morphine supply ran low, the nurse assured a wounded soldier that he was getting a shot of potent painkiller, though her syringe contained only salt water. Amazingly, the bogus injection relieved the soldier’s agony and prevented the onset of shock.

Returning to his post at Harvard after the war, Beecher became one of the nation’s leading medical reformers. Inspired by the nurse’s healing act of deception, he launched a crusade to promote a method of testing new medicines to find out whether they were truly effective.

In a 1955 paper titled “The Powerful Placebo,” published in The Journal of the American Medical Association, Beecher described how the placebo effect had undermined the results of more than a dozen trials by causing improvement that was mistakenly attributed to the drugs being tested. He demonstrated that trial volunteers who got real medication were also subject to placebo effects; the act of taking a pill was itself somehow therapeutic, boosting the curative power of the medicine. Only by subtracting the improvement in a placebo control group could the actual value of the drug be calculated.
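Beecher's subtraction logic can be sketched numerically. The simulation below is purely illustrative: the recovery probabilities, the group sizes, and the `simulate_trial` function are all invented for this example, not drawn from any real trial.

```python
import random

random.seed(0)

def simulate_trial(n, baseline_recovery, drug_boost, placebo_boost):
    """Simulate recovery counts in a placebo-controlled trial.

    All probabilities here are made up for illustration; real trials
    measure outcomes far more carefully than a single recovery rate.
    """
    def recoveries(p):
        # Count how many of n patients recover, each with probability p.
        return sum(random.random() < p for _ in range(n))

    # Both groups enjoy the baseline recovery rate plus the placebo
    # response; only the drug group gets the drug's real effect on top.
    drug_group = recoveries(baseline_recovery + placebo_boost + drug_boost)
    placebo_group = recoveries(baseline_recovery + placebo_boost)

    # Beecher's insight: the drug's actual value is the improvement
    # over and above what the placebo control group shows.
    return (drug_group - placebo_group) / n

effect = simulate_trial(n=10_000, baseline_recovery=0.30,
                        drug_boost=0.15, placebo_boost=0.20)
print(f"estimated net drug effect: {effect:.2f}")  # roughly 0.15
```

Note that without the placebo arm, the drug group's raw 65% recovery rate would wildly overstate the drug's true 15-point contribution.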

The placebo effect explains why so many medical procedures that are now viewed with horror were standard treatments in the past. Bloodletting, bleeding with leeches, attaching maggots, and dousing with cold water were among the treatments once recommended. Charles Darwin suffered from all manner of undiagnosed ailments, including frequent vomiting, and he subjected himself to various uncomfortable water treatments in the belief that they helped him. His beloved daughter Annie died of an unknown illness after receiving similar water treatments.

In my own building on the third floor is a small museum of medical history that contains all manner of gruesome-looking medical devices that no one thinks of using today but once were believed to be effective, even state-of-the-art. As long as the physician and patient had confidence in the treatment, it must have seemed to work.

Because of the repeated discrediting of medical treatments that were once considered effective, it has been suggested that the history of medicine is actually the history of the placebo effect, with new placebos replacing the old, leading to the uncomfortable suggestion that our current treatments, however sophisticated they may seem, are merely the latest placebos.

But there is reason to think that we now have a much better idea of what really works and what is a placebo because Beecher’s work led to the invention of the practice of double-blind experimental testing, where neither the patient nor the researcher collecting the data and doing the analyses knows who is receiving the experimental treatment and who is receiving the placebo.

By 1962, the government had started requiring drug companies to perform clinical tests with placebos in order to get approval, and this has led to the elimination of outright quackery in medicine. Without such precautions, people can, even with the best of intentions, subtly distort the results to get the outcome they want or expect.

As a result of the widespread adoption of double-blind testing, there is good reason to think that our current practices are significantly better than those of the past, and that we are no longer so easily fooled by placebos.

Next: Using placebos as part of treatment.

POST SCRIPT: How double blind tests work

Double-blind tests are useful not only in medicine. Richard Dawkins shows what happens when the method is used to test the claims of people who think they can detect the presence of water by dowsing.

It is interesting that when the tests show the dowsers that the “powers” they thought they had are non-existent, they make up stuff to enable them to continue believing. Does that remind you of anything?

Skyhooks and cranes-8: Alternatives to natural selection

In the half century after Charles Darwin published his On the Origin of Species in 1859, the idea of evolution gained considerable ground, but natural selection was seen as just one of several mechanisms that drove the process, and hence the anti-religious implications of the theory were somewhat muted.

Some of these alternative theories were modified forms of Lamarckism, the idea that characteristics an organism acquired during its lifetime that enabled it to survive better were somehow transmitted to the entities in the body that carry inherited traits, so that its offspring inherited the acquired trait. These changes could come about either because animals needed or desired a change (the famous Lamarckian example of giraffes acquiring longer and longer necks as a result of straining to reach high leaves) or through the ‘use-disuse’ theory, in which body features that an organism used a lot would grow and become more pronounced while those it did not need or use would atrophy and disappear (the examples here being the building of certain muscles in the body or the disappearance of fish-like features once animals moved onto land).
[Read more…]

The stem cell issue-2: The ethics

Yesterday, I discussed the science involved in stem cell research. Today I want to discuss the ethics.

The ethical problems associated with stem cell research arise because the fertilized eggs in question were created not for research but to help infertile couples. Since the method of in vitro fertilization for the treatment of infertility has not been perfected, more fertilized eggs are created than can actually be used to generate pregnancies, and the question of what to do with these extra frozen stored embryos is problematic.
[Read more…]

The stem cell issue-1: The science

The decision by the Obama administration to reverse the Bush-era policy of banning the use of federal funds for stem cell research has created some controversy. The earlier policy had led to some frustration in the scientific community.

Bush’s policy was intended to be a compromise: it banned the use of federal funds for the creation of new embryonic stem-cell lines while allowing scientists to study 21 lines that had already been created. But researchers say those lines aren’t diverse enough and they have been eager to study hundreds of other lines, some of which contain specific genetic mutations for diseases like Parkinson’s. There have been practical challenges as well. The restrictions forced scientists to use different lab equipment for privately funded and government-funded research; some even built entirely separate lab space. One of the most disconcerting aspects, researchers say, has been the negative effect on collaboration, a hallmark of the scientific process. Researchers supported by private money haven’t been able to team up with scientists funded by the government, potentially holding back new insights and advances.

Stem cells are cells that have three properties distinguishing them from most cells, such as muscle or blood or nerve cells: (1) they are capable of replicating themselves for a long period (making them a valuable source for regenerating the body by replacing cells that die), (2) they are unspecialized, and (3) when they reproduce they can either produce more stem cells or become specialized cells like muscle or nerve or bone (a process known as differentiation). The National Institutes of Health has an informative FAQ page on this topic.

The two main kinds of stem cells are the embryonic ones and the non-embryonic ones. The embryonic ones can proliferate for a year or more in the laboratory without differentiating while the non-embryonic ones cannot do so for very long, but the reasons for this difference are not known as yet. The embryonic stem cells are capable of eventually differentiating into any type of specialized cell, and are called pluripotent. Such pluripotent cells are valuable because they can be used to repair tissue in any part of the body as needed. But eventually they need to differentiate into specialized cells in order to perform the functions that those specialized cells carry out in the body. The process by which stem cells differentiate is still not fully understood, but part of it involves interaction with the external environment in which the stem cell finds itself.

Adult stem cells are one form of non-embryonic cells and are found amongst the differentiated cells that make up the tissues of the body, such as the brain and heart and bone marrow, and they are the cells that are used to maintain and repair those tissues by differentiating when needed to produce new tissues. Some adult stem cells seem to have the capacity to differentiate into more than one type of specialized cell though the range is limited, unlike in the case of embryonic stem cells. Such cells are called multipotent.

For example, some multipotent stem cells found in the bone marrow can generate bone, cartilage, fat, and connective tissue. Stem cells taken from umbilical cord blood and the placenta seem to also have multipotent properties and thus in the future it may become routine that a stock of umbilical or placental cells will be taken after every birth and preserved for possible future use. Adult stem cells have some uses but working with them is much more difficult since they are harder to obtain and are less flexible.

To understand the ethical issues involved in using embryonic stem cells, one should be aware that creating embryonic stem cell lines for research requires extraction of cells from the blastocyst. This is the stage reached by a fertilized egg after about three to five days when, after repeated cell division and duplication, there are about 70-100 identical cells in the shape of a hollow ball containing an inner clump of cells. The inner clump becomes the embryo and the outer hollow ball becomes the placenta. In a normal pregnancy, this stage is reached before the collection of cells gets implanted in the uterine wall. Sometimes implantation does not occur, in which case the pregnancy is spontaneously terminated.

This video explains what stem cells are and how they work.

The embryos from which embryonic stem cells are taken are produced during treatment for infertility when a woman’s egg is taken from her body and fertilized and grown to blastocyst stage in a culture outside the woman’s body. In the very early days after the egg is fertilized and the cell starts splitting and reproducing itself, all the cells are identical. Embryonic stem cells are obtained from that inner clump of cells and thus the blastocyst has to be destroyed in the process. The cells from a single blastocyst can be used to generate millions of embryonic stem cells that can be divided among researchers, and these are the stem cell ‘lines’ that are referred to. The cells in a single line are all genetically identical.

While there are promising new ways of creating embryonic stem cells using adult skin cells (called induced pluripotent stem cells), they have their own ethical issues.

Since tissues created from a person’s stem cells have the same genetic information as the host, the host body will not reject the implanted tissues as a foreign body, thus overcoming one of the biggest hurdles in organ transplants. While the possibility of growing tissues and entire organs for transplant purposes is often publicized as the biggest potential benefit of using stem cells, there are other more immediately realizable potential uses for embryonic stem cells.

One is that embryonic stem cells allow the process by which cells differentiate into their specialized forms to be studied. Another is that by creating cells that have a particular disease, say Parkinson’s or Lou Gehrig’s, one can observe under a microscope even the earliest stages of the progression of the disease and thus hope to develop better treatments. A third use is to test the effects of drugs on cells before testing them on a real person. That would enable you to see if they are toxic to a particular individual, creating a level of personalized medicine that we do not currently have.

The potential benefits of embryonic stem cells in research are clear, even though it is very early days yet and there is still a long way to go before we can hope to even begin realizing those benefits. The key question is how to balance the ethical concerns involved in using such cells with the benefits.

This question will be examined in the next post.

POST SCRIPT: The Daily Show on stem cells

The Daily Show with Jon Stewart: “Stem Sell” (comedycentral.com)

Why Darwin scares people

(The text of a talk given at CWRU’s Share the Vision program on Friday, August 22, 2008 at 1:00 pm in Severance Hall. This annual program is to welcome all incoming first year students. My comments centered on this year’s common reading book selection The Reluctant Mr. Darwin by David Quammen.)

Welcome to Case Western Reserve University!

You are fortunate that in your first year here you are going to be part of a big year-long celebration, organized by this university, to mark the 200th anniversary of the birth of Charles Darwin and the 150th anniversary of his groundbreaking book On the Origin of Species.
[Read more…]

Scientific consistency and Conservapedia loopiness

One of the drivers of scientific research is the desire for an ever greater synthesis, the urge to unify the knowledge and theories of many different areas. One of the most severe constraints that scientists face when developing a new theory is the need for consistency with other theories. It is very easy to construct a theory that explains any single phenomenon. It is much, much harder to construct a theory that does not also lead to problems with other well-established results. If a new theory conflicts with existing theories, something has to give in order to eliminate the contradiction.
[Read more…]

Seeing evolution in real time

Evolution opponents tend to dismiss the evidence in its favor, often falling back as a last resort on the argument that no one has actually seen evolution occurring and a new species emerging, with all the intermediate stages clearly identified. One reason for this is, of course, that evolutionary change occurs very slowly and is not visible in the transition from one generation to another. The emergence of a new species is almost always a retrospective judgment, made long after the fact, of a process that often takes thousands, or tens of thousands, of generations. By that time, most of the intermediate forms have become extinct and left no trace, since fossilization is such a rare event.
[Read more…]

The difference between human and other animal communication

In his book The Language Instinct (1994) Steven Pinker pointed out two fundamental facts about human language that were used by linguist Noam Chomsky to develop his theory about how we learn language. The first is that each one of us is capable of producing brand new sentences never before uttered in the history of the universe. This means that:

[A] language cannot be a repertoire of responses; the brain must contain a recipe or program that can build an unlimited set of sentences out of a finite list of words. That program may be called a mental grammar (not to be confused with pedagogical or stylistic “grammars,” which are just guides to the etiquette of written prose.)

The second fundamental fact is that children develop these complex grammars rapidly and without formal instruction and grow up to give consistent interpretations to novel sentence constructions that they have never before encountered. Therefore, [Chomsky] argued, children must be innately equipped with a plan common to the grammars of all languages, a Universal Grammar, that tells them how to distill the syntactic patterns out of the speech of their parents. (Pinker, p. 9)

Children have the ability to produce much greater language output than they receive as input but it is not done idiosyncratically. The language they produce follows the same generalized grammatical rules as others. This leads Chomsky to conclude that (quoted in Pinker, p. 10):

The language each person acquires is a rich and complex construction hopelessly underdetermined by the fragmentary evidence available [to the child]. Nevertheless individuals in a speech community have developed essentially the same language. This fact can be explained only on the assumption that these individuals employ highly restrictive principles that guide the construction of grammar.

The more we understand how human language works, the more we begin to realize how different human speech is from the communication systems of other animals.

Language is obviously as different from other animals’ communication systems as the elephant’s trunk is different from other animals’ nostrils. Nonhuman communication systems are based on one of three designs: a finite repertory of calls (one for warnings of predators, one for claims of territory, and so on), a continuous analog signal that registers the magnitude of some state (the livelier the dance of the bee, the richer the food source that it is telling its hivemates about), or a series of random variations on a theme (a birdsong repeated with a new twist each time: Charlie Parker with feathers). As we have seen, human language has a very different design. The discrete combinatorial system called “grammar” makes human language infinite (there is no limit to the number of complex words or sentences in a language), digital (this infinity is achieved by rearranging discrete elements in particular orders and combinations, not by varying some signal along a continuum like the mercury in a thermometer), and compositional (each of the finite combinations has a different meaning predictable from the meanings of its parts and the rules and principles arranging them). (Pinker, p. 342)
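The “discrete combinatorial system” Pinker describes can be made concrete with a toy sentence generator: a handful of rewrite rules over a finite lexicon yields strictly more sentences every time the allowed recursion depth grows. The grammar, rule names, and word lists below are invented purely for illustration and are not a model of any real mental grammar.

```python
import itertools

# A toy "mental grammar": a few rewrite rules and a finite lexicon.
# All symbols and words here are made up for this example.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive rule
    "VP": [["V", "NP"], ["V"]],
    "N":  [["dog"], ["cat"], ["linguist"]],
    "V":  [["chased"], ["saw"], ["slept"]],
}

def expand(symbol, depth):
    """Yield every word sequence derivable from `symbol` within `depth` rewrites."""
    if symbol not in GRAMMAR:          # terminal word: emit it as-is
        yield [symbol]
        return
    if depth == 0:                     # recursion budget exhausted
        return
    for rule in GRAMMAR[symbol]:
        # Expand each part of the rule, then combine all possibilities.
        parts = [list(expand(s, depth - 1)) for s in rule]
        for combo in itertools.product(*parts):
            yield [w for part in combo for w in part]

shallow = {" ".join(s) for s in expand("S", 4)}
deeper = {" ".join(s) for s in expand("S", 6)}
# Raising the depth bound admits every shallow sentence plus new,
# more deeply nested ones -- the set never stops growing.
print(len(shallow), len(deeper))
```

Each added level of recursion lets noun phrases embed further verb phrases ("the dog that chased the cat slept"), which is exactly the sense in which a finite rule set generates an unlimited set of sentences.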

This difference between human and nonhuman communication is also reflected in the role that different parts of the brain play in language as opposed to other forms of vocalization.

Even the seat of human language in the brain is special. The vocal calls of primates are controlled not by their cerebral cortex but by phylogenetically older neural structures in the brain stem and limbic systems, structures that are heavily involved in emotion. Human vocalizations other than language, like sobbing, laughing, moaning, and shouting in pain, are also controlled subcortically. Subcortical structures even control the swearing that follows the arrival of a hammer on a thumb, that emerges as an involuntary tic in Tourette’s syndrome, and that can survive as a Broca’s aphasic’s only speech. Genuine language . . . is seated in the cerebral cortex, primarily in the left perisylvian region. (Pinker, p. 342)

Rather than view the different forms of communication found in animals as a hierarchy, it is better to view them as adaptations that arose from the necessity to occupy certain evolutionary niches. Chimpanzees did not develop the language ability because they did not need to. Their lifestyles did not require the ability. Humans, on the other hand, even in the hunter-gatherer stage, would have benefited enormously from being able to share the kind of detailed information about plants and animals and the like, and thus there could have been an evolutionary pressure that drove the development of language.

Human language was related to the evolution of the physical apparatus that enabled complex sound production, along with the associated brain adaptations, though the causal links between them are not fully understood. Did the brain increase in size to cope with rising language ability, or did the increasing use of language drive brain development? We really don’t know yet.

The argument against a linguistic hierarchy in animals can be seen in the fact that different aspects of language can be found to be best developed in different animals.

The most receptive trainee for an artificial language with a syntax and semantics has been a parrot; the species with the best claim to recursive structure in its signaling has been the starling; the best vocal imitators are birds and dolphins; and when it comes to reading human intentions, chimps are bested by man’s best friend, Canis familiaris. (Pinker, PS20)

It seems clear that we are unlikely to ever fully communicate with other species the way we do with each other. But the inability of other animals to speak the way we do is no more a sign of their evolutionary backwardness than our nose’s lack of versatility compared to the elephant’s trunk, or our inability to use our hands to fly the way bats can, are signs that we are evolutionarily inferior compared to them.

We just occupy different end points on the evolutionary bush.

POST SCRIPT: But isn’t everyone deeply interested in golf?

If you want yet more reasons why TV news is not worth watching . . .

Can animals talk?

One of the most interesting questions in language is whether animals can talk or at least be taught to talk. Clearly animals can communicate in some rudimentary ways, some more so than others. Some researchers are convinced that animals can talk and have spent considerable effort trying to teach them, but with very limited results. In the comments to an earlier post, Greg referred to the efforts by Sue Savage-Rumbaugh (and Duane Rumbaugh) to train the bonobo Kanzi to speak, and Lenen referred to the development of spontaneous language in children who had been kept in a dungeon. There have been other attempts with chimps and gorillas named Washoe, Koko, Lana, and Sarah.

One thing that is clear is that humans seem to have an instinctive ability to create and use language. By instinctive, I mean that evolution has produced in us the kinds of bodies and brains that make learning language easy, especially at a young age. It is argued that all humans are born possessing the neural wiring that contains the rules for a universal grammar. The five thousand different languages that exist today, although seeming to differ widely, all have an underlying grammatical similarity that is suggestive of this fact. For example, this grammar affects things like the subject-verb-object ordering in sentences. In English, we would say “I went home” (subject-verb-object) while in Tamil it would be “I home went” (subject-object-verb).

What is interesting is that of all the grammars that are theoretically possible, only a very limited set is actually found in existence. We do not find, for example, languages where people say “Home went I” (object-verb-subject). What early exposure to language does is turn certain switches on and off in the universal grammar wiring in our brains, so that we end up using the particular form of grammar of the community we grow up in. This suggests that language structures are restricted and not infinitely flexible, indicating a biological limitation.
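The word-order contrast can be sketched in a trivial way: if you treat the order as a parameter that exposure to a language "sets," the same three constituents come out linearized differently. The function and dictionary below are made up solely to illustrate the point; they are not a linguistic model.

```python
# Hypothetical word-order "parameters" for two languages, using the
# subject (S), verb (V), object (O) labels from the text.
ORDERS = {
    "English (SVO)": ("S", "V", "O"),
    "Tamil (SOV)": ("S", "O", "V"),
}

def linearize(order, subject, verb, obj):
    """Arrange the same three constituents according to a language's order."""
    words = {"S": subject, "V": verb, "O": obj}
    return " ".join(words[slot] for slot in order)

for lang, order in ORDERS.items():
    print(lang, "->", linearize(order, "I", "went", "home"))
# English (SVO) -> I went home
# Tamil (SOV) -> I home went
```

An order like ("O", "V", "S") is just as easy to write down here, which underscores the point in the text: the puzzle is why such orders are essentially absent from real languages.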

The instinctive nature of language can be seen in a natural experiment that occurred in Nicaragua. There used to be no sign language at all in that country because the children were isolated from one another. When the Sandinistas took over in 1979, they created schools for the deaf. Their efforts to formally teach the children lip reading and speech failed dismally. But because the deaf children were now thrown together in the school buses and playgrounds, the children spontaneously developed their own sign language that developed and grew more sophisticated and is now officially a language that follows the same underlying grammatical rules as other spoken and sign languages. (Steven Pinker, The Language Instinct, 1994, p. 24)

What about animals? Many of us, especially those of us who have pets, would love to think that animals can communicate. As a result, we are far more credulous than we should be of claims (reported in the media) by researchers that they have taught animals to speak. But others, like linguist Steven Pinker, are highly skeptical. When looked at closely, the more spectacular elements of the claims disappear, leaving just rudimentary communication using symbols. The idea that some chimps can be taught to identify and use some symbols or follow some simple spoken commands does not imply that they possess underlying language abilities comparable to humans. The suggestion that animals use sign ‘language’ mistakenly conflates the sophisticated and complex grammatical structures of American Sign Language and other sign languages with that of a few suggestive gestures.

The belief that animals can, or should be able to, communicate using language seems to stem from two sources. One lies in a mistaken image of evolution as a linear process in which existing life forms can be arranged from lower to higher and more evolved forms. One sees this in posters in which evolution is shown as a sequence: amoebas→ sponges→ jellyfish→ flatworms→ trout→ frogs→ lizards→ dinosaurs→ anteaters→ monkeys→ chimpanzees→ Homo sapiens. (Pinker, p. 352) In this model, humans are the most evolved and it makes sense to think that perhaps chimpanzees have a slightly less evolved linguistic ability than we do but that it can be nudged along with some human help. Some people are also convinced that to think that animals cannot speak is a sign of a deplorable species superiority on our part.

But that linear model of evolution is wrong. Evolution is a branching theory, more like a spreading bush. Starting from some primitive form, it branched out into other forms, and these in turn branched out into yet more forms and so on, until we had a vast number of branches at the periphery. All the species I listed in the previous paragraph are like the tips of the twigs on the canopy of the bush, except that some (like the dinosaurs) are now extinct. Although all existing species have evolved from some earlier and more primitive forms, none of the existing species is more evolved than any other. All existing species have the same evolutionary status. They are merely different.

In the bush image, it is perfectly reasonable to suppose that one branch (species) may possess a unique feature (speech) that is not possessed by the others, just like the elephant possesses a highly useful organ (the trunk) possessed by no other species. All that this signifies is that that feature evolved after that branch separated from the rest of the bush and hence is not shared by others. The fact that nonhuman animals cannot speak despite extensive efforts at tutoring them is not a sign that they are somehow inferior or less evolved than us.

Some efforts to teach animals language skills seem to stem from a sense of misguided solidarity. It is as if the more features we share with animals, the closer we feel we are to them and the better we are likely to treat them. It is undoubtedly true that the closer we identify with some other living thing, the more empathy we have for it. But the solution to that is to have empathy for all living creatures, and not try to convince ourselves that we are alike in some specific ways.

As Pinker says:

What an irony it is that the supposed attempt to bring Homo sapiens down a few notches in the natural order has taken the form of us humans hectoring another species into emulating our instinctive form of communication, or some artificial form we have invented, as if that were a measure of biological worth. The chimpanzees’ resistance is no shame to them; a human would surely do no better if trained to hoot and shriek like a chimp, a symmetrical project that makes about as much scientific sense. In fact, the idea that some species needs our intervention before its members can display a useful skill, like some bird that could not fly until given a human education, is far from humble! (p. 351)

While any animal lover would dearly love to think that they can talk with animals, we may have to reconcile ourselves to the fact that it just cannot happen, because they lack the physical and perhaps cognitive apparatus to do so.

Next: The differences between animal and human communication.

POST SCRIPT: Superstitions

One of the negative consequences of religious belief is that it leads to more general magical thinking, one form of which is superstition. Steve Benen lists all the superstitions that John McCain believes in.

It bothers me when political leaders are superstitious. Decision-makers should not be influenced by factors that have no bearing whatsoever on events.