Daily Science Journal (Jun. 29, 2007) — Researchers at the J. Craig Venter Institute (JCVI) have announced the results of work on genome transplantation methods that allow them to transform one type of bacteria into another type dictated by the transplanted chromosome. The work, published online in the journal Science by JCVI’s Carole Lartigue, Ph.D., and colleagues, outlines the methods and techniques used to change one bacterial species, Mycoplasma capricolum, into another, Mycoplasma mycoides Large Colony (LC), by replacing one organism’s genome with that of the other.

Colonies of the transformed Mycoplasma mycoides bacterium. (Credit: Image courtesy of J. Craig Venter Institute)


“The successful completion of this research is important because it is one of the key proof of principles in synthetic genomics that will allow us to realize the ultimate goal of creating a synthetic organism,” said J. Craig Venter, Ph.D., president and chairman, JCVI. “We are committed to this research as we believe that synthetic genomics holds great promise in helping to solve issues like climate change and in developing new sources of energy.”

Methods and techniques

The JCVI team devised several key steps to enable the genome transplantation. First, an antibiotic selectable marker gene was added to the M. mycoides LC chromosome to allow for selection of living cells containing the transplanted chromosome. Then the team purified the DNA, or chromosome, from M. mycoides LC so that it was free of proteins (so-called naked DNA). This M. mycoides LC chromosome was then transplanted into M. capricolum cells. After several rounds of cell division, the recipient M. capricolum chromosome disappeared, having been replaced by the donor M. mycoides LC chromosome, and the M. capricolum cells took on all the phenotypic characteristics of M. mycoides LC cells.

As a test of the success of the genome transplantation, the team used two methods, 2D gel electrophoresis and protein sequencing, to prove that all the expressed proteins were now the ones coded for by the M. mycoides LC chromosome. Two sets of antibodies that bound specifically to cell surface proteins from each cell type were reacted with transplant cells to demonstrate that the membrane proteins had switched to those dictated by the transplanted chromosome, not the recipient cell's chromosome. The new, transformed organisms show up as bright blue colonies in images of blots probed with an M. mycoides LC-specific antibody.

The group chose to work with these species of mycoplasmas for several reasons: their small genomes, which make them easier to work with; their lack of cell walls; and the team's experience and expertise with mycoplasmas. The mycoplasmas used in the transplantation experiment are also relatively fast growing, allowing the team to ascertain the success of the transplantation sooner than with other species of mycoplasmas.

According to Dr. Lartigue, “While we are excited by the results of our research, we are continuing to perfect and refine our techniques and methods as we move to the next phases and prepare to develop a fully synthetic chromosome.”

Genome transplantation is an essential enabling step in the field of synthetic genomics, as it is a key mechanism by which chemically synthesized chromosomes can be activated into viable living cells. The ability to transfer the naked DNA isolated from one species into a second microbial species paves the way for the next experiments: transplanting a fully synthetic bacterial chromosome into a living organism and, if successful, "booting up" the new entity. There are many important applications of synthetic genomics research, including the development of new energy sources and as a means to produce pharmaceuticals, chemicals or textiles.

This research was funded by Synthetic Genomics Inc.

Background and Ethical Considerations

The work described by Lartigue et al. has its genesis in research begun by Dr. Venter and colleagues in the mid-1990s, after sequencing Mycoplasma genitalium and beginning work on the minimal genome project. This area of research, trying to understand the minimal genetic components necessary to sustain life, underwent significant ethical review by a panel of experts at the University of Pennsylvania (Cho et al., Science, December 1999, Vol. 286, No. 5447, pp. 2087-2090). The bioethical group's independent deliberations, published at the same time as the scientific minimal genome research, resulted in a unanimous decision that there were no strong ethical reasons why the work should not continue, as long as the scientists involved continued to engage in public discussion.

In 2003, Drs. Venter, Smith and Hutchison made the first significant strides in the development of a synthetic genome with their work assembling the 5,386 base pair bacteriophage φX174 (phi X). They did so using short, single-stranded pieces of synthetically produced, commercially available DNA (known as oligonucleotides) and an adaptation of the polymerase chain reaction (PCR), known as polymerase cycle assembly (PCA), to build the phi X genome. The team produced the synthetic phi X in just 14 days.
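The core idea of such an assembly, stitching one long sequence from short overlapping fragments, can be illustrated in a few lines of code. The sketch below is a toy in-silico greedy merge over hypothetical 20-mers, not JCVI's laboratory protocol, which builds the molecule enzymatically:

```python
# Illustrative only: chain oligonucleotides whose ends overlap into one
# contiguous sequence, the idea underlying assembly from oligos.
# The sequences are made up for the example.

def merge(a: str, b: str, min_overlap: int = 8) -> str | None:
    """Join a and b on the longest suffix of a that matches a prefix of b."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a[-k:] == b[:k]:
            return a + b[k:]
    return None

def assemble(oligos: list[str]) -> str:
    """Greedily extend the first oligo with any fragment that overlaps it."""
    contig, remaining = oligos[0], oligos[1:]
    while remaining:
        for oligo in remaining:
            joined = merge(contig, oligo)
            if joined:
                contig = joined
                remaining.remove(oligo)
                break
        else:
            raise ValueError("no overlapping oligo found")
    return contig

# Three overlapping 20-mers reassemble into a single 44 bp stretch.
oligos = ["ATGGCTAGCTAGGATCCGTA", "GATCCGTACCGGTTAACGTA", "TTAACGTAGCGCGCTTAATT"]
print(assemble(oligos))
```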

Dr. Venter and the team at JCVI continue to be concerned with the societal implications of their work and with the field of synthetic genomics generally. As such, the Institute's policy team, along with the Center for Strategic & International Studies (CSIS) and the Massachusetts Institute of Technology (MIT), was funded by a grant from the Alfred P. Sloan Foundation for a 15-month study to explore the risks and benefits of this emerging technology, as well as possible safeguards to prevent abuse, including bioterrorism. After several workshops and public sessions, the group is set to publish a report in summer 2007 outlining options for the field and its researchers.

About the J. Craig Venter Institute

The J. Craig Venter Institute is a not-for-profit research institute dedicated to the advancement of the science of genomics; the understanding of its implications for society; and communication of those results to the scientific community, the public, and policymakers. Founded by J. Craig Venter, Ph.D., the JCVI is home to approximately 500 scientists and staff with expertise in human and evolutionary biology, genetics, bioinformatics/informatics, information technology, high-throughput DNA sequencing, genomic and environmental policy research, and public education in science and science policy. The legacy organizations of the JCVI are: The Institute for Genomic Research (TIGR), The Center for the Advancement of Genomics (TCAG), the Institute for Biological Energy Alternatives (IBEA), the Joint Technology Center (JTC), and the J. Craig Venter Science Foundation. The JCVI is a 501(c)(3) organization. For additional information, please visit http://www.JCVI.org.

Adapted from materials provided by J. Craig Venter Institute.





Daily Science Journal (Jun. 24, 2007) — Astronomers have discovered the most Earth-like planet outside our Solar System to date: an exoplanet with a radius only 50% larger than the Earth's and capable of having liquid water. Using the ESO 3.6-m telescope, a team of Swiss, French and Portuguese scientists discovered a super-Earth about 5 times the mass of the Earth orbiting a red dwarf already known to harbour a Neptune-mass planet. The astronomers also have strong evidence for the presence of a third planet with a mass of about 8 Earth masses.

Artist's impression of the system of three planets surrounding the red dwarf Gliese 581. One of them is the first rocky planet lying in the habitable zone to have been discovered. (Credit: ESO)

This exoplanet -- as astronomers call planets around a star other than the Sun -- is the smallest found to date [1], and it completes a full orbit in 13 days. It is 14 times closer to its star than the Earth is to the Sun. However, because its host star, the red dwarf Gliese 581 [2], is smaller and colder than the Sun -- and thus less luminous -- the planet nevertheless lies in the habitable zone, the region around a star where water can be liquid!


"We have estimated that the mean temperature of this super-Earth lies between 0 and 40 degrees Celsius, and water would thus be liquid," explains Stéphane Udry, from the Geneva Observatory (Switzerland) and lead-author of the paper reporting the result. "Moreover, its radius should be only 1.5 times the Earth's radius, and models predict that the planet should be either rocky -- like our Earth -- or covered with oceans," he adds.

"Liquid water is critical to life as we know it," avows Xavier Delfosse, a member of the team from Grenoble University (France). "Because of its temperature and relative proximity, this planet will most probably be a very important target of the future space missions dedicated to the search for extra-terrestrial life. On the treasure map of the Universe, one would be tempted to mark this planet with an X."

The host star, Gliese 581, is among the 100 closest stars to us, located only 20.5 light-years away in the constellation Libra ("the Scales"). It has only one third the mass of the Sun. Such red dwarfs are intrinsically at least 50 times fainter than the Sun and are the most common stars in our Galaxy: among the 100 closest stars to the Sun, 80 belong to this class.

"Red dwarfs are ideal targets for the search for low-mass planets where water could be liquid. Because such dwarfs emit less light, the habitable zone is much closer to them than it is around the Sun," emphasizes Xavier Bonfils, a co-worker from Lisbon University. Planets lying in this zone are then more easily detected with the radial-velocity method [3], the most successful in detecting exoplanets.

Two years ago, the same team of astronomers found a planet around Gliese 581 (see ESO 30/05). With a mass of 15 Earth masses, i.e. similar to that of Neptune, it orbits its host star in 5.4 days. At the time, the astronomers had already seen hints of another planet. They therefore obtained a new set of measurements and found the new super-Earth, along with clear indications of yet another planet, an 8 Earth-mass body completing an orbit in 84 days. The planetary system surrounding Gliese 581 thus contains no fewer than three planets of 15 Earth masses or less, making it a quite remarkable system.

The discovery was made thanks to HARPS (High Accuracy Radial velocity Planet Searcher), perhaps the most precise spectrograph in the world. Located on the ESO 3.6-m telescope at La Silla, Chile, HARPS is able to measure velocities with a precision better than one metre per second (or 3.6 km/h)! HARPS is one of the most successful instruments for detecting exoplanets and already holds several recent records, including the discovery of another 'Trio of Neptunes' (ESO 18/06, see also ESO 22/04).

The detected velocity variations are between 2 and 3 metres per second, corresponding to about 9 km/h! That's the speed of a person walking briskly. Such tiny signals could not have been distinguished from 'simple noise' by most of today's available spectrographs.
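The quoted amplitude can be sanity-checked against the standard Doppler formula for a circular orbit, using the round numbers given in the article; this is a consistency check, not the team's analysis:

```python
# Radial-velocity semi-amplitude K of a star of mass M_star induced by a
# planet of mass m_p on a circular orbit of period P:
#   K = (2*pi*G / P)**(1/3) * m_p / (M_star + m_p)**(2/3)
import math

G       = 6.674e-11       # gravitational constant, SI units
M_SUN   = 1.989e30        # kg
M_EARTH = 5.972e24        # kg
DAY     = 86400.0         # s

M_star = 0.31 * M_SUN     # Gliese 581, roughly one third of a solar mass
m_p    = 5 * M_EARTH      # the super-Earth's (minimum) mass
P      = 13 * DAY         # its orbital period

K = (2 * math.pi * G / P) ** (1 / 3) * m_p / (M_star + m_p) ** (2 / 3)
print(f"K = {K:.1f} m/s")  # ~3 m/s, consistent with the measured 2-3 m/s
```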

"HARPS is a unique planet hunting machine," says Michel Mayor, from Geneva Observatory, and HARPS Principal Investigator. "Given the incredible precision of HARPS, we have focused our effort on low-mass planets. And we can say without doubt that HARPS has been very successful: out of the 13 known planets with a mass below 20 Earth masses, 11 were discovered with HARPS!"

HARPS is also very efficient at finding planetary systems, where tiny signals have to be uncovered. The two systems known to have three low-mass planets -- HD 69830 and Gl 581 -- were discovered by HARPS.

"And we are confident that, given the results obtained so far, finding a planet with the mass of the Earth around a red dwarf is within reach," affirms Mayor.

This research is reported in a paper submitted as a Letter to the Editor of Astronomy and Astrophysics ("The HARPS search for southern extra-solar planets: XI. An habitable super-Earth (5 MEarth) in a 3-planet system", by S. Udry et al.).

The team is composed of Stéphane Udry, Michel Mayor, Christophe Lovis, Francesco Pepe, and Didier Queloz (Geneva Observatory, Switzerland), Xavier Bonfils (Lisbon Observatory, Portugal), Xavier Delfosse, Thierry Forveille, and C. Perrier (LAOG, Grenoble, France), François Bouchy (Institut d'Astrophysique de Paris, France), and Jean-Luc Bertaux (Service d'Aéronomie du CNRS, France).

Notes

[1] Using the radial velocity method, astronomers can only obtain a minimum mass for a planet (the measured value is the true mass multiplied by the sine of the inclination of the orbital plane to the line of sight, which is unknown). From a statistical point of view, this minimum mass is, however, often close to the real mass. Two other known planets have masses close to this one's. The icy planet around OGLE-2005-BLG-390L, discovered by microlensing with a network of telescopes including one at La Silla (ESO 03/06), has a (real) mass of 5.7 Earth masses; it orbits much farther from its small host star than the present planet, however, and is hence much colder. The other is one of the planets surrounding the star Gliese 876: it has a minimum mass of 5.89 Earth masses (and a probable real mass of 7.53 Earth masses) and completes an orbit in less than 2 days, making it too hot for liquid water to be present.

[2] Gl 581, or Gliese 581, is the 581st entry in the Gliese Catalogue, which lists all known stars within 25 parsecs (81.5 light-years) of the Sun. It was originally compiled by Gliese and published in 1969, and later updated by Gliese and Jahreiss in 1991.

[3] This fundamental observational method is based on the detection of variations in the velocity of the central star due to the changing direction of the gravitational pull from an (unseen) exoplanet as it orbits the star. Evaluation of the measured velocity variations allows the planet's orbit to be deduced, in particular the period and the distance from the star, as well as a minimum mass.

Adapted from materials provided by European Southern Observatory, via EurekAlert!, a service of AAAS.




New Species From Old Data

Daily Science Journal (Jun. 24, 2007) — Researchers have discovered three previously unknown species of a bacterium by scanning a publicly available data bank, reveals a study published today in the journal Genome Biology. The finding highlights the value of making unanalysed data from large-scale genome sequencing projects openly available online.

Steven Salzberg from The Institute for Genomic Research in Maryland and colleagues identified three new species of the bacterium Wolbachia from the genome sequences of the fruit fly Drosophila that are stored in the Trace Archive.


The Trace Archive is a public repository of raw genomic data from sequencing projects. When the genome of an organism is sequenced, the genome of endosymbiotic bacteria that live inside the organism can get incorporated into the data, contaminating the final genomic sequence of the host organism. Scanning raw sequences can therefore lead to the identification of previously unknown endosymbionts.

Salzberg and colleagues scanned through the newly sequenced genomes of seven different Drosophila species, using the genome of the bacterium Wolbachia pipientis wMel as a probe. From D. ananassae, they retrieved 32,720 sequences that matched the wMel strain. This yielded a new genome of 1,440,650 bp, which they identified as the new species Wolbachia wAna. Using the same technique, they identified Wolbachia wSim in the genome of D. simulans and Wolbachia wMoj in the genome of D. mojavensis.
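The screening step can be pictured with a toy version of this approach. The authors used full sequence-alignment searches against the wMel genome; the sketch below substitutes a much cruder exact 31-mer match, and the file names are hypothetical placeholders:

```python
# Flag trace reads that share an exact 31-mer with the Wolbachia wMel
# reference. Toy stand-in for the alignment-based screen in the paper.
K = 31

def kmers(seq: str, k: int = K) -> set[str]:
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def read_fasta(path: str) -> dict[str, str]:
    """Minimal FASTA parser mapping record name to sequence."""
    records, name, chunks = {}, None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if name is not None:
                    records[name] = "".join(chunks)
                name, chunks = line[1:].split()[0], []
            else:
                chunks.append(line.upper())
    if name is not None:
        records[name] = "".join(chunks)
    return records

# Index every 31-mer of the reference, then screen the fly trace reads.
ref_index: set[str] = set()
for seq in read_fasta("wMel.fasta").values():          # hypothetical file
    ref_index |= kmers(seq)

traces = read_fasta("d_ananassae_traces.fasta")        # hypothetical file
hits = {name for name, seq in traces.items() if kmers(seq) & ref_index}
print(f"{len(hits)} reads look like Wolbachia, not Drosophila")
```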

"The discovery of these three new genomes demonstrates how powerful the public release of raw sequencing data can be" write the authors, who have deposited their findings in Genbank, another open repository of genomic sequences.

The team compared the new Wolbachia genomes with the wMel genome and found a number of new genes – up to 464 new genes in wAna – as well as a sign of extensive rearrangement between wMel and wAna, indicating that the two strains have diverged significantly since they first infected the two Drosophila species. The two most closely related strains are wAna and wSim, which have nearly identical genomes. wMel and wMoj share about 97% of their genomes with wAna and wSim but are a bit more distant from one another.

These findings might help shed light on the evolution of bacterial endosymbionts and on the mechanisms these organisms use to alter the cell cycle of the host in order to reproduce.

This press release is based on the article:

Serendipitous discovery of Wolbachia genomes in multiple Drosophila species. Steven L. Salzberg, Julie C. Dunning Hotopp, Arthur L. Delcher, Mihai Pop, Douglas R. Smith, Michael B. Eisen, and William C. Nelson. Genome Biology 6: R23.

Adapted from materials provided by BioMed Central.




Daily Science Journal (Jun. 22, 2007) — Global warming accounted for around half of the extra hurricane-fueling warmth in the waters of the tropical North Atlantic in 2005, while natural cycles were only a minor factor, according to a new analysis by Kevin Trenberth and Dennis Shea of the National Center for Atmospheric Research (NCAR). The study will appear in the June 27 issue of Geophysical Research Letters, published by the American Geophysical Union.

Hurricanes Ophelia, Nate, and Maria were among 15 hurricanes that raged across the Atlantic, Gulf of Mexico, and Caribbean in 2005. (Image by NASA-GSFC, data from NOAA GOES)


"The global warming influence provides a new background level that increases the risk of future enhancements in hurricane activity," Trenberth says. The research was supported by the National Science Foundation, NCAR's primary sponsor.

The study contradicts recent claims that natural cycles are responsible for the upturn in Atlantic hurricane activity since 1995. It also adds support to the premise that hurricane seasons will become more active as global temperatures rise. The 2005 season produced a record 28 tropical storms and hurricanes in the Atlantic; Hurricanes Katrina, Rita, and Wilma all reached Category 5 strength.

Trenberth and Shea's research focuses on an increase in ocean temperatures. During much of the 2005 hurricane season, sea-surface temperatures across the tropical Atlantic between 10 and 20 degrees north, where many Atlantic hurricanes originate, were a record 1.7 degrees F above the 1901-1970 average. While researchers agree that the warming waters fueled hurricane intensity, they have been uncertain whether Atlantic waters have heated up because of a natural, decades-long cycle or because of global warming.

By analyzing worldwide data on sea-surface temperatures (SSTs) since the early 20th century, Trenberth and Shea were able to calculate the causes of the increased temperatures in the tropical North Atlantic. Their calculations show that global warming explained about 0.8 degrees F of this rise. Aftereffects from the 2004-05 El Nino accounted for about 0.4 degrees F. The Atlantic multidecadal oscillation (AMO), a 60-to-80-year natural cycle in SSTs, explained less than 0.2 degrees F of the rise, according to Trenberth. The remainder is due to year-to-year variability in temperatures.

Previous studies have attributed the warming and cooling patterns of North Atlantic ocean temperatures in the 20th century—and associated hurricane activity—to the AMO. But Trenberth, suspecting that global warming was also playing a role, looked beyond the Atlantic to temperature patterns throughout Earth's tropical and midlatitude waters. He subtracted the global trend from the irregular Atlantic temperatures—in effect, separating global warming from the Atlantic natural cycle. The results show that the AMO is actually much weaker now than it was in the 1950s, when Atlantic hurricanes were also quite active. However, the AMO did contribute to the lull in hurricane activity from about 1970 to 1990 in the Atlantic.
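In code, the decomposition amounts to subtracting one anomaly series from another. The sketch below uses made-up series purely to show the bookkeeping; the actual study worked with gridded SST analyses referenced to a 1901-1970 base period:

```python
# Separate a "global warming" component from a regional SST series by
# removing the global-mean anomaly; the residual tracks the natural
# Atlantic cycle (AMO). All numbers here are illustrative placeholders.
import numpy as np

years = np.arange(1900, 2006)

global_mean_sst = 0.007 * (years - 1900)            # slow global trend (placeholder)
amo_like_cycle  = 0.15 * np.sin(2 * np.pi * (years - 1900) / 72)  # ~72-yr cycle
atlantic_sst    = global_mean_sst + amo_like_cycle  # what is observed in the Atlantic

amo_residual = atlantic_sst - global_mean_sst       # the study's revised AMO signal

print(f"2005 Atlantic anomaly: {atlantic_sst[-1]:.2f}")
print(f"  global-warming part: {global_mean_sst[-1]:.2f}")
print(f"  natural (AMO) part:  {amo_residual[-1]:.2f}")
```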

Global warming does not guarantee that each year will set records for hurricanes, according to Trenberth. He notes that last year's activity was related to very favorable upper-level winds as well as the extremely warm SSTs. Each year will bring ups and downs in tropical Atlantic SSTs due to natural variations, such as the presence or absence of El Nino, says Trenberth. However, he adds, the long-term ocean warming should raise the baseline of hurricane activity.

Adapted from materials provided by National Center for Atmospheric Research.




Daily Science Journal (Jun. 21, 2007) — The world is abuzz with the discovery of an extrasolar, Earth-like planet around the star Gliese 581 that is relatively close to our Earth at 20 light years away in the constellation Libra.

Artist's impression of the five-Earth mass planet, Gliese 581 c, found in the habitable zone around the red dwarf Gliese 581, with the instrument HARPS on the ESO 3.6-m telescope. (Credit: European Southern Observatory)

Bruce Fegley, Jr., Ph.D., professor of earth and planetary sciences in Arts & Sciences at Washington University in St. Louis, has worked on computer models that can provide hints to what comprises the atmosphere of such planets and better-known celestial bodies in our own solar system.


New computer models, constrained by both Earth-based spectroscopy and space mission data, are providing space scientists with compelling evidence for a better understanding of planetary atmospheric chemistry. Recent findings suggest a trend of increasing water content in going from Jupiter (depleted in water), to Saturn (less enriched in water than in other volatiles), to Uranus and Neptune, which have large water enrichments.

"The farther out you go in the solar system, the more water you find," said Fegley.

Fegley provided an overview of comparative planetary atmospheric chemistry at the 233rd American Chemical Society National Meeting, held March 25-29, 2007, in Chicago. Fegley and Katharina Lodders-Fegley, Ph.D., research associate professor of earth and planetary sciences, direct the university's Planetary Chemistry Laboratory.

"The theory about the Gas Giant planets (Jupiter, Saturn, Uranus, and Neptune) is that they have primary atmospheres, which means that their atmospheres were captured directly from the solar nebula during accretion of the planets," Fegley said.

Gas Giants

He said that Jupiter has more hydrogen and helium and less carbon, nitrogen and oxygen than the other Gas Giant planets, making its composition closer to that of the hydrogen- and helium-rich sun. In the atmospheres of the Gas Giant planets, the elements hydrogen, carbon and oxygen are predominantly found as water and the gases molecular hydrogen and methane.

"Spectroscopic observations and interior models show that Saturn, Uranus and Neptune are enriched in heavier elements," he said. "Jupiter, based on observations from the Galileo Probe, is depleted in water. People have thought that Galileo might just have gone into a dry area. But Earth-based observations show that the carbon monoxide abundance in Jupiter's atmosphere is consistent with the observed abundances of methane, hydrogen and water vapor. This pretty much validates the Galileo Probe finding."

The abundances of these four gases are related by the reaction CH4 + H2O = CO + 3 H2. Thus, observations of the methane, hydrogen and CO abundances can be used to calculate the water vapor abundance. Likewise, Earth-based observations of methane, hydrogen and carbon monoxide in Saturn's atmosphere show that water is less enriched than methane.
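At chemical equilibrium this reaction fixes a mass-action relation among the four partial pressures, so three measured abundances pin down the fourth. The sketch below shows the arithmetic; the equilibrium constant and partial pressures are placeholders, not Jupiter's measured values:

```python
# Mass action for CH4 + H2O = CO + 3 H2:
#   K_p = (p_CO * p_H2**3) / (p_CH4 * p_H2O)
# Solving for the hard-to-observe water abundance:

def water_partial_pressure(p_co: float, p_h2: float, p_ch4: float, k_p: float) -> float:
    """p_H2O from the other three partial pressures (bar) and K_p (bar**2)."""
    return (p_co * p_h2 ** 3) / (k_p * p_ch4)

# Hypothetical deep-atmosphere values, for illustration only:
p_h2, p_ch4, p_co, k_p = 0.85, 2e-3, 1e-9, 3e-4
print(f"inferred p_H2O = {water_partial_pressure(p_co, p_h2, p_ch4, k_p):.1e} bar")
```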

In contrast, observations of methane, hydrogen and carbon monoxide in the atmospheres of Uranus and Neptune show that water is greatly enriched in these two planets. Although generally classed with Jupiter and Saturn, Uranus and Neptune are water planets with relatively thin gaseous envelopes.

"On the other hand, the terrestrial planets Venus, Earth and Mars have secondary atmospheres formed afterwards by outgassing — heating up the solid material that was accreted and then releasing the volatile compounds from it," Fegley said. "That then formed the earliest atmosphere."

He said that by combining his models of the outgassing of chondritic materials with photochemical models of the effects of UV sunlight, he and his collaborator Laura Schaefer, a research assistant in the Washington University Department of Earth and Planetary Sciences, can speculate on the atmospheric composition of Earth-like planets in other solar systems.

"With new theoretical models we are able to surmise the outgassing of materials that went into forming the planets, and even make predictions about the atmospheres of extrasolar terrestrial planets," he said.

"Because the composition of the galaxy is relatively uniform, most stars are like the sun — hydrogen-rich with about the same abundances of rocky elements — we can predict what these planetary atmospheres would be like," Fegley said. "I think that the atmospheres of extrasolar Earth-like plants would be more like Mars or Venus than the Earth."

Fegley said that photosynthesis accounts for the oxygen in Earth's atmosphere; without it, the Earth's atmosphere would consist of nitrogen, carbon dioxide and water vapor, with only small amounts of oxygen. Oxygen makes up 21 percent of Earth's atmosphere; Mars, in contrast, has only about one-tenth of one percent, produced by UV sunlight destroying carbon dioxide.

"I see Mars today as a great natural laboratory for photochemistry; Venus is the same for thermochemistry, and Earth for biochemistry," he said. "Mars has such a thin atmosphere compared to Earth or Venus. UV light can penetrate all the way down to the Martian surface before it's absorbed. That same light on Earth is mainly absorbed in the ozone layer in the lower Earth stratosphere. Venus is so dense that light is absorbed by a cloud layer about 45 kilometers or so above the Venusian surface."

Adapted from materials provided by Washington University in St. Louis.




Daily Science Journal (Jun. 20, 2007) — Following Hurricane Katrina and the parade of storms that affected the conterminous United States in 2004-2005, considerable attention has been paid to the apparent recent increase in intense hurricane activity in the Atlantic basin and to the reported increases in recent decades in some hurricane intensity and duration measures in several basins.

Hurricane Katrina, imaged Aug. 28, 2005, as the storm's outer bands lashed the Gulf Coast of the United States a day before the storm made landfall and left a path of destruction in its wake. (Credit: NOAA)


An important ongoing avenue of investigation in the climate and meteorology research communities is to determine the relative roles of anthropogenic forcing (i.e., global warming) and natural variability in producing the observed recent increases in hurricane frequency in the Atlantic, as well as the reported increases of tropical cyclone activity measures in several other ocean basins.

A survey of the existing literature shows that many types of data have been used to describe hurricane intensity, and not all records are of sufficient length to reliably identify historical trends. Additionally, there are concerns among researchers about possible effects of data inhomogeneities on the reported trends.

Much of the current debate has focused on the relative roles of sea-surface temperatures or large-scale potential intensity versus the role of other environmental factors such as vertical wind shear in causing observed changes in hurricane statistics. Significantly more research – from observations, theory, and modeling – is needed to resolve the current debate around global warming and hurricanes.

Adapted from materials provided by Blackwell Publishing Ltd.



Daily Science Journal (Jun. 19, 2007) — Five years ago, Sharon Schafer Bennett suffered from migraines so severe that the headaches disrupted her life, kept her from seeking a job and interfered with participation in her children's daily activities.

Plastic surgeon Dr. Jeffrey Janis marks a site that, using the anti-wrinkle drug Botox, pinpointed a muscle later removed to help relieve Sharon Schafer Bennett's severe migraines. (Credit: Image courtesy of UT Southwestern Medical Center)

Now, thanks to an innovative surgical technique performed by a UT Southwestern Medical Center plastic surgeon who helped pioneer the procedure, the frequency and intensity of Mrs. Bennett's migraines have diminished dramatically -- from two to three per week to an occasional one every few months.


The technique -- performed by a handful of plastic surgeons in the U.S. -- includes using the anti-wrinkle drug Botox to pinpoint which of several specific muscles in the forehead, back of the head or temple areas may be serving as "trigger points" to compress, irritate or entrap nerves that could be causing the migraine. Because Botox temporarily paralyzes muscles, usually for about three months, it can be used as a "litmus test" or "marker" to see if headaches go away or become less intense while the Botox's effects last, said Dr. Jeffrey Janis, assistant professor of plastic surgery.

If the Botox is successful in preventing migraines or lessening their severity, then surgery to remove the targeted muscle is likely to accomplish the same result, but on a more long-term and possibly permanent basis, he said.

For Mrs. Bennett, the surgery proved to be life-altering.

"I can't even begin to tell you what a change this has made in my life," said Mrs. Bennett, 45, a Houston-area resident. "For the first time in years, I can live like a normal human being and do all the normal 'mom' and 'wife' things that the migraines physically prevented me from doing. My family thinks it's great because they don't have to put their lives on hold numerous times a week because of my migraines. I'm also going back to school to get a second degree, something I could never have considered before."

Dr. Janis said: "Many neurologists are using Botox to treat migraines, but they are making the injections in a 'headband-like' circle around the forehead, temple and skull. They are not looking at finding the specific location of the headache's trigger point. While patients may get temporary relief, after the Botox wears off they will have to go back and get more injections or continue medications for migraines.

"It's like a math equation. I will inject the Botox into one trigger point at a time and leave the others alone. The Botox is used as a diagnostic test to determine what trigger point is causing the problem. If patients get a benefit from the Botox, they likely will get a benefit from the surgery. If there's no benefit from the Botox, then there won't be a benefit from the surgery."

Dr. Janis began collaborating more than five years ago with Dr. Bahman Guyuron, a plastic surgeon at Case Western Reserve University and the first to explore using surgery to relieve migraines, following the revelation by several of his patients that their migraines had disappeared after they had cosmetic brow lifts. Dr. Janis has assisted his colleague by performing anatomical studies on cadavers to explore the nerves and pathways that might cause migraines. Together they have identified four specific trigger points and developed a treatment algorithm that includes using Botox prior to deciding whether to perform surgery.
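The one-trigger-point-at-a-time logic can be written out as a simple decision rule. The sketch below is an illustration only: the site labels are assumptions, and the published algorithm involves clinical judgment well beyond a boolean test:

```python
# Inject Botox at one candidate trigger point at a time; relief while the
# Botox lasts predicts benefit from surgery on the corresponding muscle.
TRIGGER_POINTS = ["forehead", "back of head", "left temple", "right temple"]  # assumed labels

def plan_surgery(botox_relieved: dict[str, bool]) -> list[str]:
    """Recommend surgery only at sites where the Botox test gave relief."""
    return [site for site in TRIGGER_POINTS if botox_relieved.get(site)]

responses = {"forehead": True, "left temple": False}  # hypothetical patient
print(plan_surgery(responses))                        # ['forehead']
```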

During the past several years, numerous peer-reviewed articles detailing their research efforts have been published in Plastic & Reconstructive Surgery, and the researchers have presented the technique at professional meetings of plastic surgeons.

Approximately 28 million Americans, 75 percent of those women, suffer from migraines, according to the National Institutes of Health. For employers, that translates into an estimated 157 million lost workdays annually.

"A migraine is something you can't explain to someone who hasn't had one," said Mrs. Bennett, who began suffering monthly migraines as a teenager. As she grew older, the headaches become more frequent and unpredictable. "They were messing up my life. I couldn't make any commitments or plan activities for my kids. This surgery has made a huge difference in my life. It's awesome."

Dr. Janis sees only patients who have been diagnosed with recurring migraines by a neurologist and for whom other treatments have failed.

"Plastic surgeons are not in the business of diagnosing and treating headaches," he said. "This is a novel method of treatment that is proving to be effective and potentially more long lasting than other things used before. But it is still in its infancy."

Adapted from materials provided by UT Southwestern Medical Center.




Say Cheese! Scientists In A Ferment Over Cheese-Starter Genome

Daily Science Journal (Jun. 19, 2007) — Whether sharp Cheddar or nutty Gouda, a fine cheese owes its flavor to milk-fermenting bacteria, such as the historically ancient starter Lactococcus lactis. In next month’s issue of Genome Research, researchers from France report the complete genome sequence of L. lactis, now the most commonly used starter in the cheese industry.


L. lactis is a member of the lactic acid bacteria (LAB) family, which includes not only cheese and yogurt starters but also pathogens like Streptococcus pneumoniae. Until now, none of the LAB genomes had been sequenced. To produce the L. lactis sequence, Alexei Sorokin and colleagues from Génoscope and the Institut National de la Recherche Agronomique used a novel approach that reduces the number of steps needed to obtain a complete bacterial genome sequence.

Now the researchers report the entire L. lactis sequence of 2.4 million nucleotides, which encode 2310 genes (363 specific to lactococci). In their analysis of the genome, the researchers made several surprising discoveries, including genes suggesting that this fermentative bacterium can perform aerobic respiration. This research marks a critical step towards understanding and manipulating the LAB and, in particular, will be useful for improving the flavor, texture, and preservation of the 10 million tons of cheese produced annually. Now that's a lot of cheese.

Adapted from materials provided by Cold Spring Harbor Laboratory.





Daily Science Journal (Jun. 18, 2007) — Climate model simulations for the 21st century indicate a robust increase in wind shear in the tropical Atlantic due to global warming, which may inhibit hurricane development and intensification. Historically, increased wind shear has been associated with reduced hurricane activity and intensity.

The white arrows represent strong cross winds, also known as wind shear. These winds are predicted to become more common in the Atlantic due to global warming. They can disrupt or destroy a hurricane by blowing the top away. (Credit: NOAA)


This new finding is reported in a study by scientists at the Rosenstiel School of Marine and Atmospheric Science at the University of Miami and NOAA's Geophysical Fluid Dynamics Laboratory (GFDL) in Princeton, N.J., and is scheduled to be published April 18 in Geophysical Research Letters.

While other studies have linked global warming to an increase in hurricane intensity, this study is the first to identify changes in wind shear that could counteract these effects. "The environmental changes found here do not suggest a strong increase in tropical Atlantic hurricane activity during the 21st century," said Brian Soden, Rosenstiel School associate professor of meteorology and physical oceanography and the paper's co-author. However, the study does identify other regions, such as the western tropical Pacific, where global warming does cause the environment to become more favorable for hurricanes.

"Wind shear is one of the dominant controls to hurricane activity, and the models project substantial increases in the Atlantic," said Gabriel Vecchi, lead author of the paper and a research oceanographer at GFDL. "Based on historical relationships, the impact on hurricane activity of the projected shear change could be as large -- and in the opposite sense -- as that of the warming oceans."

Examining possible impacts of human-caused greenhouse warming on hurricane activity, the researchers used climate models to assess changes in the environmental factors tied to hurricane formation and intensity. They focused on projected changes in vertical wind shear over the tropical Atlantic and its ties to the Pacific Walker circulation -- a vast loop of winds that influences climate across much of the globe and that varies in concert with El Niño and La Niña oscillations. By examining 18 different models, the authors identified a systematic increase in wind shear over much of the tropical Atlantic due to a slowing of the Pacific Walker circulation. Their research suggests that the increase in wind shear could inhibit both hurricane development and intensification.
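The diagnostic itself is straightforward to express in code. In this sketch, shear is taken as the usual magnitude of the vector wind difference between the 200 hPa and 850 hPa levels, and the projected change is averaged over an 18-member ensemble; the per-model numbers are random stand-ins, not the study's data:

```python
# Ensemble-mean change in tropical Atlantic vertical wind shear, schematically.
import numpy as np

def vertical_shear(u200, v200, u850, v850):
    """Common shear metric: |V(200 hPa) - V(850 hPa)| in m/s."""
    return np.hypot(u200 - u850, v200 - v850)

print(f"example shear: {vertical_shear(12.0, 3.0, 4.0, 1.0):.1f} m/s")

rng = np.random.default_rng(42)
n_models = 18

# Hypothetical late-21st-century-minus-present shear changes (m/s), one per model:
d_shear = rng.normal(loc=0.8, scale=0.5, size=n_models)

print(f"ensemble-mean shear change: {d_shear.mean():+.2f} m/s")
print(f"models projecting an increase: {(d_shear > 0).sum()} of {n_models}")
```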

"This study does not, in any way, undermine the widespread consensus in the scientific community about the reality of global warming," said Soden. "In fact, the wind shear changes are driven by global warming."

The authors also note that additional research will be required to fully understand how the increased wind shear affects hurricane activity more specifically. "This doesn't settle the issue; this is one piece of the puzzle that will contribute to an incredibly active field of research," Vecchi said.

Adapted from materials provided by University of Miami Rosenstiel School of Marine & Atmospheric Science.




Daily Science Journal (Jun. 17, 2007) — Researchers at Delft University of Technology have succeeded in carrying out calculations with two quantum bits, the building blocks of a possible future quantum computer. The Delft researchers are publishing an article about this important step towards a workable quantum computer in this week's issue of Nature.

Superconducting rings on a chip. (Credit: TU Delft)

Quantum computers have superior qualities in comparison to the type of computers currently in use. If they are realised, then quantum computers will be able to carry out tasks that are beyond the abilities of all normal computers.


A quantum computer is based on the amazing properties of quantum systems. In these systems, a quantum bit, also known as a qubit, can exist in two states at the same time, and the information from two qubits can be entangled in a way that has no equivalent whatsoever in the normal world.

It is highly likely that workable quantum computers will need to be produced using existing manufacturing techniques from the chip industry. Working on this basis, scientists at Delft University of Technology are currently studying two types of qubits: one type makes use of tiny superconducting rings, and the other makes use of 'quantum dots'.

Now, for the first time, a 'controlled-NOT' operation between two qubits has been realised with the superconducting rings. This is important because the controlled-NOT gate, combined with single-qubit operations, is universal: any given quantum calculation can be built from it.
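What the gate does is easy to state in matrix form: it flips a target qubit exactly when a control qubit is in state |1>. A minimal pure-state simulation, unrelated to the Delft hardware but showing why the gate matters, turns an unentangled state into an entangled one:

```python
# Controlled-NOT on two qubits, basis ordered |00>, |01>, |10>, |11>.
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],    # |10> -> |11>
                 [0, 0, 1, 0]],   # |11> -> |10>
                dtype=complex)

H  = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
I2 = np.eye(2, dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                    # start in |00>
state = np.kron(H, I2) @ state    # put the control qubit in a superposition
state = CNOT @ state              # entangle: (|00> + |11>) / sqrt(2)
print(np.round(state.real, 3))    # [0.707 0.    0.    0.707]
```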

The result was achieved by the PhD student Jelle Plantenberg in the team led by Kees Harmans and Hans Mooij. The research took place within the FOM (Dutch Foundation for Fundamental Research on Matter) concentration group for Solid State Quantum Information Processing.

Adapted from materials provided by Delft University of Technology, via EurekAlert!, a service of AAAS.




