The Origins Issue

Eight Brief Origins Stories

By Davide Castelvecchi
From Scientific American, September 2009

Every September, Scientific American comes out with a special issue, or single-topic issue, as the editors call it. The one in 2009 was about origins: the origins of everything, from the universe to life on Earth, from computers to the pill. It included four features written by experts, including Jack W. Szostak, who shortly thereafter went on to win the Nobel Prize in Physiology or Medicine.

Another feature told every origins story you’ve ever wanted to know about, and perhaps some you’ve never even thought of. It was made up of 57 short items written by staff editors and writers as well as by some freelancers.

Below are the eight items I contributed to that feature. To jump straight to any of them, just click a link in the list.

1. Batteries

2. Scotch Tape

3. Economic Thinking

4. Carbon

5. The Placenta

6. The Eye

7. Photosynthesis

8. Chocolate


1. Batteries

Their inventor may not have known how they actually work

A battery’s power comes from the tendency of electric charge to migrate between different substances. It is the power that Italian scientist Alessandro Volta sought to tap into when he built the first battery at the end of 1799.

Although different designs exist, the basic structure has remained the same ever since. Every battery has two electrodes. One, the anode, wants to give electrons (which carry a negative electric charge) to the other, the cathode. Connect the two through a circuit, and electrons will flow and carry out work—say, lighting a bulb or brushing your teeth.

Simply shifting electrons from one material to another, however, would not take you very far: like charges repel, and only so many electrons can accumulate on the cathode before they start to keep more electrons from joining. To keep the juice going, a battery balances the charges within its innards by moving positively charged ions from the anode to the cathode through an electrolyte, which can be solid, liquid or gelatinous. It is the electrolyte that makes the battery work, because it allows ions to flow but not electrons, whereas the external circuit allows electrons to flow but not ions.

For example, a charged lithium-ion battery—the type that powers cell phones and laptop computers—has a graphite anode stuffed with lithium atoms and a cathode made of some lithium-based substance. During operation, the anode’s lithium atoms release electrons into the external circuit, where they reach the more electron-thirsty cathode. Stripped of their electrons, the lithium atoms become positively charged ions and are drawn toward the electrons accumulating in the cathode, which they reach by flowing through the electrolyte. The ions’ motion offsets the imbalance of charges and allows the flow of electricity to continue—at least until the anode runs out of lithium.
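
To put rough numbers on that flow, here is a minimal sketch; the 3.7-volt and 3,000-milliamp-hour figures are assumed, phone-typical values rather than numbers from the article:

```python
# Back-of-the-envelope battery math (assumed, typical values, not from the article).
nominal_voltage_v = 3.7        # typical lithium-ion cell voltage
capacity_mah = 3000            # assumed phone-sized cell capacity, in milliamp-hours

charge_coulombs = capacity_mah / 1000 * 3600      # 1 amp-hour = 3600 coulombs
energy_joules = nominal_voltage_v * charge_coulombs
energy_wh = energy_joules / 3600                  # 1 watt-hour = 3600 joules

bulb_power_w = 1.0             # assumed small LED bulb
hours_of_light = energy_wh / bulb_power_w

print(f"Stored charge:  {charge_coulombs:.0f} C")
print(f"Stored energy:  {energy_wh:.1f} Wh ({energy_joules:.0f} J)")
print(f"Runtime for a {bulb_power_w} W bulb: {hours_of_light:.1f} h")
```

Real cells fall somewhat short of that ideal figure, since some energy is lost as heat and the voltage sags as the battery drains.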

Recharging the battery reverses the process: a voltage applied between the two electrodes makes the electrons (and the lithium ions) move back to the graphite side. This is an uphill struggle, energetically speaking, which is why it amounts to storing energy in the battery.

When he built his first battery, Volta was trying to replicate the organs that produce electricity in torpedoes, the fish also known as electric rays, says Giuliano Pancaldi, a science historian at the University of Bologna in Italy.

Volta probably went by trial and error before settling on using metal electrodes and wet cardboard as an electrolyte. At the time, no one knew about the existence of atoms, ions and electrons. But whatever the nature of the charge carriers, Volta probably was not aware that in his battery, the positive charges moved in opposition to the “electric fluid” moving outside. “It took a century before experts reached a consensus on how the battery works,” Pancaldi says.


2. Scotch Tape

Most new inventions quickly fall into oblivion; some stick

In 1930 food-packing companies were enthralled with the relatively new and improved film called cellophane, a transparent polymer made from cellulose. Cellophane wrappers could help keep packaged food fresh yet would still allow customers a view of the contents. Sealing cellophane packages satisfactorily was a problem, however, until the 3M Company invented and trademarked Scotch tape—a name that the public nonetheless widely uses for all adhesive-backed cellophane tapes. (The analogous product Sellotape, introduced seven years later in Europe, has the same problems with generic use of its name.)

Engineers call the glue in Scotch tape a pressure-sensitive adhesive. It does not stick by forming chemical bonds with the material it is placed on, says Alphonsus Pocius, a scientist at the 3M Corporate Research Materials Laboratory in St. Paul, Minn. Instead applied pressure forces the glue to penetrate the tiniest microscopic irregularities on the material’s surface. Once there, it will resist coming back out, thus keeping the tape stuck in place. The glue “has to be halfway between liquid and solid,” Pocius explains: fluid enough to spread under pressure but viscous enough to resist flowing.

Concocting the right kind of glue is only part of the invention, however. The typical adhesive tape contains not just two materials (glue and backing, which can be cellophane or some other plastic) but four. A layer of primer helps the glue stick to the plastic, while on the other side of the backing a “release agent” makes sure that the glue of the next layer in the roll does not stick to it. Otherwise, Scotch tape would be impossible to unroll.

Adhesive tape recently caught the attention of physicists. Researchers showed that unrolling tape in a vacuum chamber releases x-rays, and they used those x-rays to image the bones in their fingers as a demonstration. The discovery could lead to cheap, portable (and even muscle-powered) radiography machines. The unrolling creates electrostatic charges, and electrons jumping across the gap between tape and roll produce x-rays. In the presence of air the electrons are much slower and produce no x-rays. But try unrolling tape in a completely dark room, and you will notice a faint glow.


3. Economic Thinking

Even apparently irrational human choices can make sense in terms of our inner logic

Much economic thinking rests on the assumption that individuals know what they want and that they make rational decisions to achieve it. Such behavior requires that they be able to rank the possible outcomes of their actions, also known as putting a value on things.

The value of a decision’s outcome is often not the same as its nominal dollar value. Say you are offered a fair bet: you have the same chance of doubling your $1 wager as you have of losing it. Purely rational individuals would be indifferent to the choice between playing and not playing: if they play such a bet every day, on average they will be neither better nor worse off. But as Captain Kirk might tell Mr. Spock, reality often trumps logic.

Or as Swiss mathematician Gabriel Cramer wrote in a 1728 letter to his colleague Nicolas Bernoulli, “The mathematicians estimate money in proportion to its quantity, and men of good sense in proportion to the usage that they may make of it.” Indeed, many people are “risk-averse”: they will forfeit their chance of winning $1 to be sure of keeping the $1 they have, especially if it is their only one. They assign more value to the outcome of not playing than to the outcome of potentially losing. A risk-oriented person, on the other hand, will go for the thrill.

Cramer’s idea was later formalized by Bernoulli’s statistician cousin Daniel into the concept of expected utility, which is an implicit value given to the possible outcomes of a decision, as revealed by comparing them with the outcomes of a bet. Risk-averse and risk-oriented persons are not irrational; rather they make rational decisions based on their own expected utility. Economists generally assume that most people are rational most of the time, meaning that they know which decisions will maximize the expected utility of their choices. (Of course, doing so requires knowing how to evaluate risk wisely, which people do not always do well. AIG, anyone?)
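
To see the gap between expected dollar value and expected utility in miniature, here is a sketch of the fair bet described above; the logarithmic utility curve is the one Daniel Bernoulli himself proposed, and the two-dollar starting wealth is just an assumed number for illustration:

```python
import math

def expected_value(outcomes):
    """Probability-weighted average of the dollar outcomes."""
    return sum(p * w for p, w in outcomes)

def expected_utility(outcomes, utility=math.log):
    """Probability-weighted average of the utility of each outcome."""
    return sum(p * utility(w) for p, w in outcomes)

wealth = 2.0                                        # assumed starting wealth, in dollars
fair_bet = [(0.5, wealth + 1), (0.5, wealth - 1)]   # double the $1 stake or lose it
no_bet = [(1.0, wealth)]                            # keep what you have

# In dollar terms the two choices are identical...
print(expected_value(fair_bet), expected_value(no_bet))        # 2.0 2.0

# ...but with a concave (diminishing-returns) utility, not playing wins.
print(expected_utility(fair_bet) < expected_utility(no_bet))   # True
```

Because the logarithm bends downward, the dollar lost hurts more than the dollar won helps, which is exactly the risk aversion Cramer was describing.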

Some experiments, however, have shown that people are occasionally unable to rank outcomes in a consistent way. In 1953 American mathematician Kenneth May conducted an experiment in which college students were asked to evaluate three hypothetical marriage candidates, each of whom excelled in a different quality. The students picked intelligence over looks, looks over wealth and wealth over intelligence: a circular ranking that no consistent scale of values could reproduce.
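
A minimal sketch makes that inconsistency explicit; it simply checks whether the three reported pairwise choices can be honored by any single ordering of the candidates (they cannot, because they form a cycle):

```python
# Pairwise choices reported in May's experiment, written as (preferred, over).
preferences = [("intelligence", "looks"),
               ("looks", "wealth"),
               ("wealth", "intelligence")]

def has_consistent_ranking(prefs):
    """Return True if the options can be put in one order that honors every
    pairwise preference, i.e., if the preferences contain no cycle."""
    options = {x for pair in prefs for x in pair}
    beaten_by = {x: set() for x in options}      # options preferred over x
    for winner, loser in prefs:
        beaten_by[loser].add(winner)
    while options:
        # Pick an option that no remaining option is preferred over.
        top = [x for x in options if not (beaten_by[x] & options)]
        if not top:      # every remaining option loses to another one: a cycle
            return False
        options.remove(top[0])
    return True

print(has_consistent_ranking(preferences))   # False: the ranking is circular
```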


4. Carbon

Synonymous with life, it was born in the heart of stars

Although carbon has recently acquired a bad rap because of its association with greenhouse gases, it has also long been synonymous with biology. After all, “carbon-based life” is often taken to mean “life as we know it,” and “organic molecule” means “carbon-based molecule” even if no organism is involved.

But the sixth element of the periodic table—and the fourth most abundant in the universe—has not been around since the beginning of time. The big bang created only hydrogen, helium and traces of lithium. All other elements, including carbon, were forged later, mostly by nuclear fusion inside stars and in supernova explosions.

At the humongous temperatures and pressures in a star’s core, atomic nuclei collide and fuse together into heavier ones. In a young star, it is mostly hydrogen fusing into helium. The merger of two helium nuclei, each carrying two protons and two neutrons, forms a beryllium nucleus that carries four of each. That isotope of beryllium, however, is unstable and tends to decay very quickly. So there would seem to be no way to form carbon or heavier elements. But later in a star’s life, the core’s temperature rises above 100 million kelvins. Only then is beryllium produced fast enough for there to be a significant amount around at any time—and some chance that other helium nuclei will bump into those beryllium nuclei and produce carbon. More reactions may then occur, producing many other elements of the periodic table, up to iron.
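
The chain described here is known as the triple-alpha process, and its energy payoff can be checked with textbook atomic masses; the numbers below are standard reference values rather than figures from the article:

```python
# The triple-alpha process: 4He + 4He -> 8Be, then 8Be + 4He -> 12C.
# Atomic masses in unified atomic mass units (standard textbook values).
m_he4 = 4.002602       # helium-4
m_c12 = 12.000000      # carbon-12, which defines the mass unit
u_to_mev = 931.494     # energy equivalent of one mass unit, in MeV

mass_defect = 3 * m_he4 - m_c12        # mass lost when three heliums fuse
energy_mev = mass_defect * u_to_mev

print(f"Mass defect:     {mass_defect:.6f} u")
print(f"Energy released: {energy_mev:.2f} MeV per carbon-12 nucleus")
# Roughly 7.3 MeV: the payoff that keeps an aging star's core hot.
```

The same bookkeeping, applied to heavier and heavier nuclei, pays off only up to iron, which is why the chain described above stops there.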

Once a star’s core runs out of nuclei to fuse, the outward pressure exerted by the fusion reactions subsides, and the core collapses under its own weight. If the star is large enough, it will produce one of the universe’s most spectacular flares: a supernova explosion. Such cataclysms are good, because supernovae are what disperse carbon and the other elements (some of them forged in the explosions themselves) around the galaxy, where they will form new stars but also planets, life … and greenhouse gases.


5. The Placenta

An eggshell membrane evolved into the organ that lets fetuses grow in the womb

More than 120 million years ago, while giant dinosaurs crashed through the forests in fearsome combat, a quieter drama unfolded in the Cretaceous underbrush: some lineage of hairy, diminutive creatures stopped laying eggs and gave birth to live young. They were the progenitors of nearly all modern mammals (the exceptions, platypuses and echidnas, still lay eggs to this day).

What makes mammals’ live birth possible is the unique organ called the placenta, which envelops the growing embryo and mediates the flow of nutrients and gases between it and the mother via the umbilical cord. The placenta seems to have evolved from the chorion, a thin membrane that lines the inside of eggshells and helps embryonic reptiles and birds draw oxygen. Kangaroos and other marsupials have and need only a rudimentary placenta: after a brief gestation, their bean-size babies finish their development while suckling in the mother’s pouch. Humans and most other mammals, however, require a placenta that can draw nutrients appropriately from the mother’s blood throughout an extended pregnancy.

Recent studies have shown that the sophistication of the placenta stems in part from how different genes within it are activated over time. Early in embryonic development, both mouse and human placentas rely on the same set of ancient cell-growth genes. But later in a pregnancy, even though the placenta does not obviously change in appearance, it invokes genes that are much newer and more species-specific. Thus, placentas are fine-tuned for the needs of mammals with different reproductive strategies: witness mice, which gestate for three weeks with a dozen or more pups, versus humans, who deliver one baby after nine months.

To last more than a week or two, the placenta, which is primarily an organ of the fetus, must prevent the mother’s immune system from rejecting it. To do so, the placenta may deploy a mercenary army of endogenous retroviruses—viral genes embedded in the mammal’s DNA. Scientists have observed such viruses budding from the placenta’s cell membranes. Viruses may play crucial roles in pacifying the mother’s immune system into accepting the placenta, just as they do in helping some tumors survive.


6. The Eye

What was half an eye good for? Quite a lot, actually

One of creationists’ favorite arguments is that so intricate a device as the eye—with a light-regulating iris, a focusing lens, a layered retina of photosensitive cells, and so on—could not have arisen from Darwinian evolution. How could random mutations have spontaneously created and assembled parts that would have had no independent purpose? “What good is half an eye?” the creationists sneer, claiming the organ as prima facie proof of the existence of God.

Indeed, even Charles Darwin acknowledged in On the Origin of Species that the eye seemed to pose an objection to his theory. Yet by looking at the fossil record, at the stages of embryonic development and at the diverse types of eyes in existing animals, biologists since Darwin have outlined incremental evolutionary steps that may have led to the eye as we know it.

The basic structure of our eyes is similar in all vertebrates, even lampreys, whose ancestors branched away from ours about 500 million years ago. By that time, therefore, all the basic features of the eye must have existed, says Trevor Lamb of the Australian National University. But vertebrates’ next closest kin, the slippery hagfish—animals with a cartilaginous cranium but no other bones—have only rudimentary eyes. They are conical structures under the skin, with no cornea, no lens and no muscles, whose function is probably just to measure the dim ambient light in the deep, muddy seabeds where hagfish live.

Our eyes are thus likely to have evolved after our lineages diverged from those of hagfish, perhaps 550 million years ago, according to Lamb. Earlier animals might have had patches of light-sensitive cells on their brain to tell light from dark and night from day. If those patches then re-formed into pouchlike structures, as in hagfish, the animals could have distinguished the direction from which light was coming. Further small improvements would have enabled the formation of rough images, as in the pinhole-camera eyes of the nautilus, a mollusk. Lenses could eventually have evolved from thickened layers of transparent skin.

The key is that at every stage, the “incomplete” eye offered survival advantages over its predecessors.

All these changes may have occurred within just 100,000 generations, biologists have calculated, which in geologic terms is the blink of an eye. Such speedy evolution may have been necessary, because many invertebrates were developing their own kinds of eyes. “There was a real arms race,” Lamb says. “As soon as somebody had eyes and started eating you, it became important to escape them.”
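
To see why 100,000 generations counts as a blink, a quick sketch converts generations into years under a few assumed generation times (the one-to-five-year values are illustrative guesses, not figures from the article) and compares the result with the roughly 550 million years since the hagfish split:

```python
# How long is 100,000 generations? Only the 100,000 figure comes from the text;
# the generation times are assumed, illustrative values for small early vertebrates.
generations = 100_000
years_since_hagfish_split = 550e6      # approximate divergence date cited above

for generation_time_years in (1, 2, 5):
    elapsed_years = generations * generation_time_years
    fraction = elapsed_years / years_since_hagfish_split
    print(f"{generation_time_years} yr/generation: {elapsed_years:,.0f} years "
          f"({fraction:.2%} of the time since the hagfish split)")
```

Even under the slowest of those assumptions, the whole transformation fits comfortably inside a single million-year tick of the fossil record.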


7. Photosynthesis

The reaction that makes the world green is just one of many variants

When the sun shines, green plants break down water to get electrons and protons, use those particles to turn carbon dioxide into glucose, and vent out oxygen as a waste product. That process is by far the most complex and widespread of the various known versions of photosynthesis, all of which turn light of particular wavelengths into chemical energy. (Studies have even suggested that certain single-celled fungi can utilize highly energetic gamma rays: colonies of such fungi have been found thriving inside the postmeltdown nuclear reactor at Chernobyl.) Using water as a photosynthetic reactant instead of scarcer substances such as hydrogen sulfide eventually enabled life to grow and thrive pretty much everywhere on the planet.
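
In chemical shorthand, the water-splitting process described above amounts to the textbook net reaction:

$$ 6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} \;\xrightarrow{\ \text{light}\ }\; \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} $$

Carbon dioxide and water go in; glucose and the oxygen “waste” come out, with sunlight footing the energy bill.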

Water-splitting photosynthesis was “invented” by the ancestors of today’s cyanobacteria, also known as blue-green algae. The organisms that now do this type of photosynthesis, including plants, green algae and at least one animal (the sea slug Elysia chlorotica), carry organelles called chloroplasts that appear to be the descendants of what once were symbiotic cyanobacteria. All of them use some form of the pigment chlorophyll, sometimes in combination with other pigments. Photosynthesis starts when arrays of chlorophyll molecules absorb a photon and channel its energy toward splitting water.
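
For a rough sense of the energy each absorbed photon delivers, here is a minimal sketch; the 680-nanometer wavelength is chlorophyll’s familiar red absorption peak, used here as a single representative value:

```python
# Energy carried by one photon at chlorophyll's red absorption peak (~680 nm).
PLANCK_J_S = 6.626e-34       # Planck's constant, joule-seconds
LIGHT_SPEED_M_S = 3.00e8     # speed of light, meters per second
JOULES_PER_EV = 1.602e-19    # joules per electron-volt

wavelength_m = 680e-9        # representative red-light wavelength (assumed)

photon_energy_j = PLANCK_J_S * LIGHT_SPEED_M_S / wavelength_m
print(f"One 680 nm photon: {photon_energy_j:.2e} J "
      f"({photon_energy_j / JOULES_PER_EV:.2f} eV)")
# About 1.8 eV apiece -- energy the antenna pigments funnel toward splitting water.
```

One such photon is not enough to do the whole water-to-glucose job on its own, which is part of the reason, as described below, that the machinery ends up needing two different chlorophyll assemblies working in series.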

But water is an unusually hardy molecule for photosynthesis to break apart. Taking electrons from water and giving them enough energy to produce glucose requires two separate assemblies of slightly different chlorophyll molecules (and an apparatus of more than 100 different types of proteins). Simpler forms of photosynthesis use one or the other version, but not both. The mystery is, Which one appeared first in evolution, and how did the two end up combined? “It’s a question we don’t really know the answer to,” says Robert Blankenship of Washington University in St. Louis.

Scientists also do not know when cyanobacteria learned to split water. Some evidence suggests that it may have been as early as 3.2 billion years ago. It surely must have happened at least 2.4 billion years ago, when oxygen shifted from being a rare gas to being the second most abundant one in the atmosphere—a change without which complex multicellular animals that can formulate scientific questions could never have existed.


8. Chocolate

Mixing the bitter treat with milk was the popular breakthrough

Chocolate was a favorite drink of the Maya, the Aztecs and other Mesoamerican peoples long before the Spaniards “discovered” it and brought it back to Europe. Archaeological evidence suggests that chocolate has been consumed for at least 3,100 years and not just as food: the Maya and other pre-Columbian cultures offered cacao pods to the gods in a variety of rituals, including some that involved human sacrifice.

But it was an Irish Protestant man who had what might be the most pivotal idea in chocolate history. In the 1680s Hans Sloane, a physician and naturalist whose estate—a vast collection of books and natural specimens—kick-started the British Museum, was in service to the British governor of Jamaica, collecting human artifacts and documenting local plants and animals. Sloane found that the bitter local chocolate beverage became much more palatable to his taste when mixed with milk. He later patented his invention. Although many had been enjoying chocolate made with hot water, Sloane’s version quickly became popular back in England and elsewhere in Europe. Milk also became a favorite addition to solid chocolate, and today around two thirds of Americans say they prefer milk chocolate to dark chocolate.

Chocolate’s positive health effects are by now well documented. Antioxidants such as polyphenols and flavonoids make up as much as 8 percent of a cacao bean’s dry weight, says Joe Vinson, a chemist at the University of Scranton. Antioxidants neutralize highly reactive molecules called free radicals that would otherwise damage cells. And it is not a coincidence that the cacao tree (and other antioxidant-rich plants such as coffee and tea) originated at low latitudes. “Things that have high levels of antioxidants tend to grow in places near the equator, with lots of sun,” Vinson says. The sun’s ultraviolet rays break up biological molecules into free radicals, and these plants may produce antioxidants to better endure the stress.

Although eating too much chocolate results in excessive calorie intake, human and animal studies have shown that moderate chocolate consumption can have beneficial effects on blood pressure, slow down atherosclerosis and lower “bad” cholesterol. Chocolate may also be good for the mind: a recent study in Norway found that elderly men consuming chocolate, wine or tea—all flavonoid-rich foods—performed better on cognitive tests.
