During the second half of the twentieth century, remarkable strides were made by geneticists and other researchers. In 1956 Vernon Ingram (1924–), who would soon be recognized as the "father of molecular medicine," identified the single amino acid difference between normal and sickle-cell hemoglobin, the result of a change in a single base of DNA. The implications of his finding that the mutation of a single letter in the DNA genetic code was sufficient to cause a hereditary medical
disorder were far-reaching. This greater insight into the mechanisms of sickle-cell disease suggested directions for research into prevention and treatment. It prompted research that uncovered other diseases with similar causes, such as hemophilia (an inherited blood disease associated with insufficient clotting factors and excessive bleeding) and cystic fibrosis (an inherited disease of the mucous glands that produces problems associated with the lungs and pancreas). Just three years later, the first human chromosome abnormality was identified: people with Down's syndrome were found to have an extra chromosome, demonstrating that it is a genetic disorder that may be diagnosed by direct examination of the chromosomes.
Ingram's work has been the foundation for current research to map genetic variations that affect human health. For example, in 1989, more than thirty years after Ingram's initial work, the gene for cystic fibrosis was identified and a genetic test for the gene mutation was developed.
Using radioactive labeling to track each strand of the DNA in bacteria, American microbiologist Matthew Meselson (1930–) and his colleague Franklin Stahl (1929–) demonstrated with an experiment in 1958 that the replication of DNA in bacteria is semiconservative. Semiconservative replication occurs as the double helix unwinds at several points and knits a new strand along each of the old strands. Meselson and Stahl's experiment revealed that one strand remained intact and combined with a newly synthesized strand when DNA replicated, precisely as Watson and Crick's model predicted. In other words, each of the two new molecules created contains one of the two parent strands and one new strand.
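The logic of Meselson and Stahl's result can be sketched in a few lines of code. The "heavy" and "light" labels below stand in for the isotopic tags they used to distinguish old strands from newly synthesized ones; the labels and generation counts are illustrative, not taken from the text.

```python
# A toy model of semiconservative replication: each duplex unwinds,
# and every old strand templates one newly made ("light") strand.

def replicate(duplexes):
    """Return the daughter duplexes produced from a list of (strand, strand) pairs."""
    offspring = []
    for strand_a, strand_b in duplexes:
        offspring.append((strand_a, "light"))  # old strand paired with a new strand
        offspring.append((strand_b, "light"))
    return offspring

generation_0 = [("heavy", "heavy")]     # fully labeled parent DNA
generation_1 = replicate(generation_0)  # every molecule is a hybrid: one old, one new strand
generation_2 = replicate(generation_1)  # half hybrid, half entirely new

print(generation_1)  # [('heavy', 'light'), ('heavy', 'light')]
```

After one round of replication every molecule contains exactly one parental strand, and after two rounds half the molecules contain no parental material at all, which is the pattern Meselson and Stahl observed.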
In the early 1960s Crick, American biochemist Marshall Nirenberg (1927–), Russian-born American physicist George Gamow (1904–68), and other researchers performed experiments that detected a direct relationship between DNA nucleotide sequences and the sequence of the amino acid building blocks of proteins. They determined that the four nucleotide letters (A, T, C, and G) may be combined into sixty-four different triplets. The triplets are "code" for instructions that determine the amino acid structure of proteins. Ribosomes are cellular structures, built of RNA and protein, that interpret a sequence of genetic code three letters at a time and link together the amino acid building blocks specified by the triplets to construct a specific protein. The sixty-four triplets of nucleotides that can be coded in the DNA—which are copied during cell division, infrequently mutate, and are read by the cell to direct protein synthesis—make up the universal genetic code for all cells and viruses.
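The triplet code described above can be sketched briefly. The codon table here is a small hand-picked excerpt of the standard genetic code (written as DNA coding-strand triplets), chosen to include the single-letter GAG-to-GTG change behind sickle-cell hemoglobin.

```python
from itertools import product

# All possible triplets of the four DNA letters: 4 ** 3 = 64 codons.
codons = ["".join(t) for t in product("ATCG", repeat=3)]

# A small excerpt of the standard genetic code (DNA coding-strand codons).
CODE = {
    "ATG": "Met", "GAG": "Glu", "GTG": "Val",
    "CCT": "Pro", "TAA": "Stop",
}

def translate(dna):
    """Read the sequence three letters at a time, as a ribosome does."""
    return [CODE.get(dna[i:i + 3], "?") for i in range(0, len(dna) - 2, 3)]

# The sickle-cell mutation changes one letter (GAG -> GTG), swapping
# glutamic acid for valine in the hemoglobin chain.
print(len(codons))           # 64
print(translate("ATGGAG"))   # ['Met', 'Glu']  (normal)
print(translate("ATGGTG"))   # ['Met', 'Val']  (sickle-cell)
```

Reading in fixed, non-overlapping triplets is what makes a one-letter substitution alter exactly one amino acid, as in Ingram's sickle-cell finding.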
The Origins of Genetic Engineering
The late 1960s and early 1970s were marked by research that would lay the groundwork for modern genetic engineering technology. In 1966 DNA was found to be present not only in chromosomes, but also in the mitochondria. The first single gene was isolated in 1969, and the following year the first artificial gene was created. In 1972 American biochemist Paul Berg (1926–) developed a technique to splice DNA fragments from different organisms and created the first "recombinant" DNA, or DNA molecules formed by combining segments of DNA, usually from different types of organisms. In 1980 Berg was awarded the Nobel Prize in chemistry for this achievement, now referred to as "recombinant DNA technology."
In 1976 an artificial gene inserted into a bacterium functioned normally. The following year, DNA from a virus was fully decoded, and three researchers, working independently, developed methods to sequence DNA—in other words, to determine how the building blocks of DNA (the nucleotides A, C, G, and T) are ordered along the DNA strand. In 1978 bacteria were engineered to produce insulin, a pancreatic hormone that regulates carbohydrate metabolism by controlling blood glucose levels. Just four years later, the Eli Lilly pharmaceutical company marketed the first genetically engineered drug: a type of human insulin grown in genetically modified bacteria.
A 1980 U.S. Supreme Court decision permitted patents for genetically modified organisms; the first one was awarded to the General Electric Company for bacteria to assist in clearing oil spills. The following year, a gene was transferred from one animal species to another. In 1983 the first artificial chromosome was created and the marker—the usually dominant gene or trait that serves to identify genes or traits linked with it—for Huntington's disease (an inherited disease that affects the functioning of both the body and brain) was identified; in 1993 the disease gene was identified.
In 1984 the observation that some nonfunctioning DNA is different in each individual launched research to refine tools and techniques developed by Alec Jeffreys (1950–) at the University of Leicester in England that perform "genetic fingerprinting." Initially, the technique was used to determine the paternity of children, but it rapidly gained acceptance among forensic medicine specialists, who are often called on to assist in the investigation of crimes and interpret medico-legal issues.
The 1985 invention of the polymerase chain reaction (PCR), which amplifies (or produces many copies of) DNA, enabled geneticists, medical researchers, and forensic specialists to analyze and manipulate DNA from the smallest samples. PCR allowed biochemical analysis of even trace amounts of DNA. In A Short History of Genetics and Genetic Engineering (New York: Cold Spring Harbor Library, 2003), Ricki Lewis and Bernard Possidente described American biochemist Kary Mullis's (1944–) development of PCR as the "genetic equivalent of a printing press," with the potential to revolutionize genetics in the same way that the printing press had revolutionized mass communications.
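PCR's power to amplify trace samples comes from exponential growth: under idealized conditions, each cycle doubles every template molecule. A minimal sketch (the function name and cycle count are illustrative):

```python
# Idealized PCR amplification: every cycle duplicates every template,
# so the copy number grows as initial_copies * 2**cycles.

def pcr_copies(initial_copies, cycles):
    """Number of DNA copies after a given number of ideal doubling cycles."""
    return initial_copies * 2 ** cycles

# Even a single starting molecule yields over a billion copies
# after 30 cycles of amplification.
print(pcr_copies(1, 30))  # 1073741824
```

In practice amplification efficiency falls somewhat below perfect doubling, but the exponential character is what makes even trace amounts of DNA analyzable.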
Five years later, in 1990, the first gene therapy was administered. Gene therapy introduces or alters genetic material to compensate for a genetic mistake that causes disease. The patient was a four-year-old girl with the inherited immune deficiency disorder adenosine deaminase deficiency. If left untreated, the deficiency is fatal. Given along with conventional medical therapy, the gene therapy treatment was considered effective. The 1999 death of another gene therapy patient, as a result of an immune reaction to the treatment, tempered enthusiasm for gene therapy and prompted medical researchers to reconsider its safety and effectiveness.
Cloning—the production of genetically identical organisms—was performed first with carrots. A cell from the root of a carrot plant was used to generate a new plant. By the early 1950s scientists had cloned frogs, and during the 1970s, mice, cows, and sheep were cloned. These clones were created using embryos and many did not produce healthy offspring, offspring with normal life spans, or offspring with the ability to reproduce. In 1993 researchers at George Washington University in Washington, D.C., cloned nearly fifty human embryos but their experiment was terminated after just six days.
In 1996 English embryologist Ian Wilmut (1944–) and his colleagues at the Roslin Institute in Scotland successfully cloned the first adult mammal that was able to reproduce. Dolly the cloned sheep, named for country singer Dolly Parton, focused public attention on the practical and ethical considerations of cloning.
The Human Genome Project and More
"Genetics" refers to the study of a single gene at a time, while "genomics" is the study of all genetic information contained in a cell. The Human Genome Project (HGP) set as one of its goals the determination of the entire nucleotide sequence of the more than three billion bases of DNA contained in the nucleus of a human cell. Initial discussions about the feasibility and value of conducting the HGP began in 1986. The following year, the first automated DNA sequencer was produced commercially. Automated sequencing, which enabled researchers to decode millions, as opposed to thousands, of letters of genetic code per day, was a pivotal technological advance for the HGP, which began in 1987 under the auspices of the U.S. Department of Energy (DOE).
In 1988 the HGP was relocated to the National Institutes of Health (NIH), and James Watson was recruited to direct the project. The following year, the NIH opened the National Center for Genome Research, and a committee composed of professionals from the NIH and DOE was named to consider ethical, social, and legal issues that might arise from the project. In 1990 the project began in earnest, with work on preliminary genetic maps of the human genome and four other organisms believed to share many genes with humans.
During the early 1990s several new technologies were developed that further accelerated progress in analyzing, sequencing, and mapping sections of the genome. The advisability of granting private biotechnology firms the right to patent specific genes and DNA sequences was hotly debated. In 1992 Watson resigned as director of the project to express his vehement disapproval of the NIH decision to patent human gene sequences. Later that year, preliminary physical and genetic maps of the human genome were published.
In 1993 Francis Collins (1950–) of the NIH was named director of the HGP, and international efforts to assist were underway in England, France, Germany, Japan, and other countries. When Stanford researchers released DNA chip technology, which simultaneously analyzes genetic information representing thousands of genes, in 1995, the development promised to speed the project to completion even before the anticipated date of 2005.
In 1995 investigators at the Institute for Genomic Research published the first complete genome sequence for any organism: the bacterium Haemophilus influenzae, with nearly two million genetic letters and 1,000 recognizable genes. The following year the yeast genome, composed of about 6,000 genes, was sequenced, and in 1997 the genome of the bacterium E. coli, which contains approximately 4,600 genes, was sequenced.
In 1998 the genome of the first multicelled animal, the nematode worm Caenorhabditis elegans, was sequenced, containing approximately 18,000 genes. The next year the first complete sequence of a human chromosome was published by the HGP. In 2000 the genome of the fruit fly Drosophila melanogaster, which Thomas Hunt Morgan and his colleagues had used to study genetics nearly a century earlier, was sequenced by the private firm Celera. The fruit fly sequence contains about 13,000 genes, with many sequences matching already identified human genes that are associated with inherited disorders.
In 2000 the first draft of the human genome was announced, and it was published in 2001. Also in 2000 the first plant genome, Arabidopsis thaliana, was sequenced. This feat spurred research in plant biology and agriculture. Although tomatoes that had been genetically engineered for longer shelf life had been marketed during the mid-1990s, agricultural researchers began to see new possibilities to enhance crops and food products. For example, in 2000 plant geneticists developed genetically engineered rice that manufactured its own vitamin A. Many researchers believed the genetically enhanced strain of rice held great promise in terms of preventing vitamin A deficiency in developing countries.
The 2002 publication of the human genome estimated that humans have between 30,000 and 35,000 genes. The Human Genome Project was completed in 2003. The same year, the Cold Spring Harbor Laboratory held educational events to commemorate and celebrate the fiftieth anniversary of the discovery of the double helical structure of DNA. In 2004 human gene count estimates were revised downward from 35,000 to between 20,000 and 25,000. The "postgenomic" era began with a firestorm of controversies about the direction of genetic research, human cloning, stem cell research, and genetically modified food and crops.