Daily Science Journal (Jun. 29, 2007) — Researchers at the J. Craig Venter Institute (JCVI) have announced the results of work on genome transplantation methods that allow them to transform one type of bacteria into another type dictated by the transplanted chromosome. The work, published online in the journal Science by JCVI’s Carole Lartigue, Ph.D., and colleagues, outlines the methods and techniques used to change one bacterial species, Mycoplasma capricolum, into another, Mycoplasma mycoides Large Colony (LC), by replacing one organism’s genome with the other’s.

Colonies of the transformed Mycoplasma mycoides bacterium. (Credit: Image courtesy of J. Craig Venter Institute)


“The successful completion of this research is important because it is one of the key proof of principles in synthetic genomics that will allow us to realize the ultimate goal of creating a synthetic organism,” said J. Craig Venter, Ph.D., president and chairman, JCVI. “We are committed to this research as we believe that synthetic genomics holds great promise in helping to solve issues like climate change and in developing new sources of energy.”

Methods and techniques

The JCVI team devised several key steps to enable the genome transplantation. First, an antibiotic selectable marker gene was added to the M. mycoides LC chromosome to allow for selection of living cells containing the transplanted chromosome. The team then purified the chromosomal DNA of M. mycoides LC so that it was free of proteins (so-called naked DNA). This M. mycoides LC chromosome was then transplanted into M. capricolum cells. After several rounds of cell division, the recipient M. capricolum chromosome disappeared, having been replaced by the donor M. mycoides LC chromosome, and the M. capricolum cells took on all the phenotypic characteristics of M. mycoides LC cells.

As a test of the success of the genome transplantation, the team used two methods, 2D gel electrophoresis and protein sequencing, to prove that all the expressed proteins were now the ones coded for by the M. mycoides LC chromosome. Two sets of antibodies that bound specifically to cell surface proteins from each cell type were reacted with transplant cells to demonstrate that the membrane proteins had switched to those dictated by the transplanted chromosome rather than by the recipient cell’s chromosome. The new, transformed organisms show up as bright blue colonies in images of blots probed with an M. mycoides LC-specific antibody.

The group chose to work with these species of mycoplasmas for several reasons: their small genomes, which make them easier to work with; their lack of cell walls; and the team’s experience and expertise with mycoplasmas. The mycoplasmas used in the transplantation experiment are also relatively fast growing, allowing the team to ascertain the success of the transplantation sooner than would be possible with other species of mycoplasmas.

According to Dr. Lartigue, “While we are excited by the results of our research, we are continuing to perfect and refine our techniques and methods as we move to the next phases and prepare to develop a fully synthetic chromosome.”

Genome transplantation is an essential enabling step in the field of synthetic genomics, as it is a key mechanism by which chemically synthesized chromosomes can be activated into viable living cells. The ability to transfer naked DNA isolated from one species into a second microbial species paves the way for the next experiments: transplanting a fully synthetic bacterial chromosome into a living organism and, if successful, “booting up” the new entity. There are many important applications of synthetic genomics research, including the development of new energy sources and new means of producing pharmaceuticals, chemicals or textiles.

This research was funded by Synthetic Genomics Inc.

Background and Ethical Considerations

The work described by Lartigue et al. has its genesis in research begun by Dr. Venter and colleagues in the mid-1990s, after sequencing Mycoplasma genitalium and beginning work on the minimal genome project. This area of research, which seeks to understand the minimal genetic components necessary to sustain life, underwent significant ethical review by a panel of experts at the University of Pennsylvania (Cho et al., Science, December 1999, Vol. 286, No. 5447, pp. 2087–2090). The bioethical group's independent deliberations, published at the same time as the scientific minimal genome research, resulted in a unanimous decision that there were no strong ethical reasons why the work should not continue, as long as the scientists involved continued to engage in public discussion.

In 2003 Drs. Venter, Smith and Hutchison made the first significant strides in the development of a synthetic genome by their work in assembling the 5,386 base pair bacteriophage φX174 (phi X). They did so using short, single strands of synthetically produced, commercially available DNA (known as oligonucleotides) and using an adaptation of polymerase chain reaction (PCR), known as polymerase cycle assembly (PCA), to build the phi X genome. The team produced the synthetic phi X in just 14 days.

Dr. Venter and the team at JCVI continue to be concerned with the societal implications of their work and the field of synthetic genomics generally. As such, the Institute’s policy team, along with the Center for Strategic & International Studies (CSIS), and the Massachusetts Institute of Technology (MIT), were funded by a grant from the Alfred P. Sloan Foundation for a 15-month study to explore the risks and benefits of this emerging technology, as well as possible safeguards to prevent abuse, including bioterrorism. After several workshops and public sessions the group is set to publish a report in summer 2007 outlining options for the field and its researchers.

About the J. Craig Venter Institute

The J. Craig Venter Institute is a not-for-profit research institute dedicated to the advancement of the science of genomics; the understanding of its implications for society; and communication of those results to the scientific community, the public, and policymakers. Founded by J. Craig Venter, Ph.D., the JCVI is home to approximately 500 scientists and staff with expertise in human and evolutionary biology, genetics, bioinformatics/informatics, information technology, high-throughput DNA sequencing, genomic and environmental policy research, and public education in science and science policy. The legacy organizations of the JCVI are: The Institute for Genomic Research (TIGR), The Center for the Advancement of Genomics (TCAG), the Institute for Biological Energy Alternatives (IBEA), the Joint Technology Center (JTC), and the J. Craig Venter Science Foundation. The JCVI is a 501 (c)(3) organization. For additional information, please visit http://www.JCVI.org.

Adapted from materials provided by J. Craig Venter Institute.





Daily Science Journal (Jun. 24, 2007) — Astronomers have discovered the most Earth-like planet outside our Solar System to date, an exoplanet with a radius only 50% larger than the Earth's and capable of having liquid water. Using the ESO 3.6-m telescope, a team of Swiss, French and Portuguese scientists discovered a super-Earth about 5 times the mass of the Earth that orbits a red dwarf already known to harbour a Neptune-mass planet. The astronomers also have strong evidence for the presence of a third planet with a mass of about 8 Earth masses.

Artist's impression of the system of three planets surrounding the red dwarf Gliese 581. One of them is the first rocky planet lying in the habitable zone to have been discovered. (Credit: ESO)

This exoplanet -- as astronomers call planets around a star other than the Sun -- is the smallest found to date [1] and it completes a full orbit in 13 days. It is 14 times closer to its star than the Earth is to the Sun. However, given that its host star, the red dwarf Gliese 581 [2], is smaller and colder than the Sun -- and thus less luminous -- the planet nevertheless lies in the habitable zone, the region around a star where water could be liquid!


"We have estimated that the mean temperature of this super-Earth lies between 0 and 40 degrees Celsius, and water would thus be liquid," explains Stéphane Udry, from the Geneva Observatory (Switzerland) and lead-author of the paper reporting the result. "Moreover, its radius should be only 1.5 times the Earth's radius, and models predict that the planet should be either rocky -- like our Earth -- or covered with oceans," he adds.

"Liquid water is critical to life as we know it," avows Xavier Delfosse, a member of the team from Grenoble University (France). "Because of its temperature and relative proximity, this planet will most probably be a very important target of the future space missions dedicated to the search for extra-terrestrial life. On the treasure map of the Universe, one would be tempted to mark this planet with an X."

The host star, Gliese 581, is among the 100 closest stars to us, located only 20.5 light-years away in the constellation Libra ("the Scales"). It has a mass of only one third the mass of the Sun. Such red dwarfs are intrinsically at least 50 times fainter than the Sun and are the most common stars in our Galaxy: among the 100 closest stars to the Sun, 80 belong to this class.

"Red dwarfs are ideal targets for the search for low-mass planets where water could be liquid. Because such dwarfs emit less light, the habitable zone is much closer to them than it is around the Sun," emphasizes Xavier Bonfils, a co-worker from Lisbon University. Planets lying in this zone are then more easily detected with the radial-velocity method [3], the most successful in detecting exoplanets.

Two years ago, the same team of astronomers had already found a planet around Gliese 581 (see ESO 30/05). With a mass of 15 Earth masses, i.e. similar to that of Neptune, it orbits its host star in 5.4 days. At the time, the astronomers had already seen hints of another planet. They therefore obtained a new set of measurements and found the new super-Earth, as well as clear indications of yet another planet, an 8 Earth-mass planet completing an orbit in 84 days. The planetary system surrounding Gliese 581 thus contains no fewer than three planets of 15 Earth masses or less, and as such is quite a remarkable system.

The discovery was made thanks to HARPS (High Accuracy Radial velocity Planet Searcher), perhaps the most precise spectrograph in the world. Located on the ESO 3.6-m telescope at La Silla, Chile, HARPS is able to measure velocities with a precision better than one metre per second (or 3.6 km/h)! HARPS is one of the most successful instruments for detecting exoplanets and already holds several recent records, including the discovery of another 'Trio of Neptunes' (ESO 18/06, see also ESO 22/04).

The detected velocity variations are between 2 and 3 metres per second, corresponding to about 9 km/h! That's the speed of a person walking briskly. Such tiny signals could not have been distinguished from 'simple noise' by most of today's available spectrographs.

"HARPS is a unique planet hunting machine," says Michel Mayor, from Geneva Observatory, and HARPS Principal Investigator. "Given the incredible precision of HARPS, we have focused our effort on low-mass planets. And we can say without doubt that HARPS has been very successful: out of the 13 known planets with a mass below 20 Earth masses, 11 were discovered with HARPS!"

HARPS is also very efficient in finding planetary systems, where tiny signals have to be uncovered. The two systems known to have three low mass planets -- HD 69830 and Gl 581 -- were discovered by HARPS.

"And we are confident that, given the results obtained so far, finding a planet with the mass of the Earth around a red dwarf is within reach," affirms Mayor.

This research is reported in a paper submitted as a Letter to the Editor of Astronomy and Astrophysics ("The HARPS search for southern extra-solar planets: XI. An habitable super-Earth (5 M_Earth) in a 3-planet system", by S. Udry et al.).

The team is composed of Stéphane Udry, Michel Mayor, Christophe Lovis, Francesco Pepe, and Didier Queloz (Geneva Observatory, Switzerland), Xavier Bonfils (Lisbon Observatory, Portugal), Xavier Delfosse, Thierry Forveille, and C. Perrier (LAOG, Grenoble, France), François Bouchy (Institut d'Astrophysique de Paris, France), and Jean-Luc Bertaux (Service d'Aéronomie du CNRS, France).

Notes

[1] Using the radial velocity method, astronomers can only obtain a minimum mass (the true mass multiplied by the sine of the inclination of the orbital plane to the line of sight, which is unknown). From a statistical point of view, however, this is often close to the real mass. Two other known planets have masses close to this one. The icy planet around OGLE-05-390L, discovered by microlensing with a network of telescopes including one at La Silla (ESO 03/06), has a (real) mass of 5.7 Earth masses. It, however, orbits much farther from its small host star than the present one and is hence much colder. The other is one of the planets surrounding the star Gliese 876. It has a minimum mass of 5.89 Earth masses (and a probable real mass of 7.53 Earth masses) and completes an orbit in less than 2 days, making it too hot for liquid water to be present.

[2] Gl 581, or Gliese 581, is the 581st entry in the Gliese Catalogue, which lists all known stars within 25 parsecs (81.5 light-years) of the Sun. It was originally compiled by Gliese and published in 1969, and later updated by Gliese and Jahreiss in 1991.

[3] This fundamental observational method is based on the detection of variations in the velocity of the central star due to the changing direction of the gravitational pull from an (unseen) exoplanet as it orbits the star. The evaluation of the measured velocity variations allows astronomers to deduce the planet's orbit, in particular the period and the distance from the star, as well as a minimum mass.
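
To see how a roughly 5-Earth-mass figure follows from such measurements, here is a minimal Python sketch of the textbook radial-velocity relation for a circular orbit and a planet much lighter than its star. The 3 m/s semi-amplitude is an illustrative value consistent with the 2-3 m/s variations quoted above, and the stellar mass of about one third of a solar mass comes from the article; neither number is taken from the team's actual fit.

```python
import math

# Standard radial-velocity minimum-mass relation (circular orbit, m_p << M_star).
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
M_EARTH = 5.972e24     # kg

def minimum_planet_mass(K, period_days, stellar_mass_solar):
    """Return m_p * sin(i) in Earth masses from semi-amplitude K (m/s)."""
    P = period_days * 86400.0                 # orbital period in seconds
    m_star = stellar_mass_solar * M_SUN       # stellar mass in kg
    # K = (2*pi*G/P)**(1/3) * m_p*sin(i) / m_star**(2/3)  ->  solve for m_p*sin(i)
    m_p_sin_i = K * (P * m_star**2 / (2.0 * math.pi * G)) ** (1.0 / 3.0)
    return m_p_sin_i / M_EARTH

# Gliese 581 c: ~13-day period around a star of about one third of a solar mass,
# with an assumed illustrative semi-amplitude of 3 m/s.
print(minimum_planet_mass(K=3.0, period_days=13.0, stellar_mass_solar=0.33))
# prints roughly 5, i.e. about five Earth masses
```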

Adapted from materials provided by European Southern Observatory, via EurekAlert!, a service of AAAS.




New Species From Old Data

Daily Science Journal (Jun. 24, 2007) — Researchers have discovered three previously unknown species of a bacterium by scanning a publicly available data bank, reveals a study published today in the journal Genome Biology. The finding highlights the value of making unanalysed data from large-scale genome sequencing projects openly available online.

Steven Salzberg from The Institute for Genomic Research in Maryland and colleagues identified three new species of the bacterium Wolbachia from the genome sequences of the fruit fly Drosophila that are stored in the Trace Archive.


The Trace Archive is a public repository of raw genomic data from sequencing projects. When the genome of an organism is sequenced, the genome of endosymbiotic bacteria that live inside the organism can get incorporated into the data, contaminating the final genomic sequence of the host organism. Scanning raw sequences can therefore lead to the identification of previously unknown endosymbionts.

Salzberg and colleagues scanned through the newly sequenced genomes of seven different Drosophila species, using the genome of the bacterium Wolbachia pipientis wMel as a probe. From D. ananassae, they retrieved 32,720 sequences that matched the wMel strain. This yielded a new genome of 1,440,650 bp, which they identified as the new species Wolbachia wAna. Using the same technique, they identified Wolbachia wSim in the genome of D. simulans and Wolbachia wMoj in the genome of D. mojavensis.
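
To illustrate the general idea of pulling endosymbiont reads out of a host sequencing project, here is a toy Python sketch that flags reads sharing exact k-mers with a reference genome. This is not the authors' pipeline, which aligned the full Trace Archive data against the wMel genome; the sequences, k-mer size and threshold below are invented for demonstration.

```python
# Toy illustration of probing raw reads with a reference endosymbiont genome.
def kmers(seq, k=11):
    """Return the set of all length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def matching_reads(reads, reference, k=11, min_shared=3):
    """Return reads that share at least `min_shared` exact k-mers with the reference."""
    ref_kmers = kmers(reference, k)
    return [r for r in reads if len(kmers(r, k) & ref_kmers) >= min_shared]

# Hypothetical stand-ins: a fragment of a "wMel-like" reference and two raw reads.
reference = "ATGGCGTTAACCGGATTACGATCGATCGGCTAAGCTTAGGCATCG"
reads = [
    "TTAACCGGATTACGATCGATCG",   # overlaps the reference: candidate Wolbachia read
    "GGGGGCCCCCAAAAATTTTTGG",   # unrelated read from the host fly
]
print(matching_reads(reads, reference))   # only the first read is flagged
```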

"The discovery of these three new genomes demonstrates how powerful the public release of raw sequencing data can be" write the authors, who have deposited their findings in Genbank, another open repository of genomic sequences.

The team compared the new Wolbachia genomes with the wMel genome and found a number of new genes – up to 464 new genes in wAna – as well as signs of extensive rearrangement between wMel and wAna, indicating that the two strains have diverged significantly since they first infected the two Drosophila species. The two most closely related strains are wAna and wSim, which have nearly identical genomes. wMel and wMoj each share about 97% of their genomes with wAna and wSim but are a bit more distant from one another.

These findings might help shed light on the evolution of bacterial endosymbionts and on the mechanisms these organisms use to alter the cell cycle of the host in order to reproduce.

This press release is based on the article:

Serendipitous discovery of Wolbachia genomes in multiple Drosophila species. Steven L. Salzberg, Julie C. Dunning Hotopp, Arthur L. Delcher, Mihai Pop, Douglas R. Smith, Michael B. Eisen, and William C. Nelson. Genome Biology 6: R23.

Adapted from materials provided by BioMed Central.




Daily Science Journal (Jun. 22, 2007) — Global warming accounted for around half of the extra hurricane-fueling warmth in the waters of the tropical North Atlantic in 2005, while natural cycles were only a minor factor, according to a new analysis by Kevin Trenberth and Dennis Shea of the National Center for Atmospheric Research (NCAR). The study will appear in the June 27 issue of Geophysical Research Letters, published by the American Geophysical Union.

Hurricanes Ophelia, Nate, and Maria were among 15 hurricanes that raged across the Atlantic, Gulf of Mexico, and Caribbean in 2005. (Image by NASA-GSFC, data from NOAA GOES)


"The global warming influence provides a new background level that increases the risk of future enhancements in hurricane activity," Trenberth says. The research was supported by the National Science Foundation, NCAR's primary sponsor.

The study contradicts recent claims that natural cycles are responsible for the upturn in Atlantic hurricane activity since 1995. It also adds support to the premise that hurricane seasons will become more active as global temperatures rise. Last year produced a record 28 tropical storms and hurricanes in the Atlantic. Hurricanes Katrina, Rita, and Wilma all reached Category 5 strength.

Trenberth and Shea's research focuses on an increase in ocean temperatures. During much of last year's hurricane season, sea-surface temperatures across the tropical Atlantic between 10 and 20 degrees north, which is where many Atlantic hurricanes originate, were a record 1.7 degrees F above the 1901-1970 average. While researchers agree that the warming waters fueled hurricane intensity, they have been uncertain whether Atlantic waters have heated up because of a natural, decades-long cycle, or because of global warming.

By analyzing worldwide data on sea-surface temperatures (SSTs) since the early 20th century, Trenberth and Shea were able to calculate the causes of the increased temperatures in the tropical North Atlantic. Their calculations show that global warming explained about 0.8 degrees F of this rise. Aftereffects from the 2004-05 El Nino accounted for about 0.4 degrees F. The Atlantic multidecadal oscillation (AMO), a 60-to-80-year natural cycle in SSTs, explained less than 0.2 degrees F of the rise, according to Trenberth. The remainder is due to year-to-year variability in temperatures.
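
The attribution above is, at heart, simple bookkeeping, which the short Python sketch below reproduces using the figures quoted in this article; the AMO term is entered as its 0.2-degree upper bound, and whatever is left of the 1.7-degree anomaly is the year-to-year variability.

```python
# Back-of-the-envelope tally of the contributions quoted above (degrees F).
total_anomaly = 1.7
contributions = {
    "global warming": 0.8,
    "2004-05 El Nino": 0.4,
    "AMO (upper bound)": 0.2,   # article gives only "less than 0.2 F"
}
residual = total_anomaly - sum(contributions.values())

for name, value in contributions.items():
    print(f"{name:>18}: {value:.1f} F")
print(f"{'year-to-year':>18}: {residual:.1f} F (residual)")
```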

Previous studies have attributed the warming and cooling patterns of North Atlantic ocean temperatures in the 20th century—and associated hurricane activity—to the AMO. But Trenberth, suspecting that global warming was also playing a role, looked beyond the Atlantic to temperature patterns throughout Earth's tropical and midlatitude waters. He subtracted the global trend from the irregular Atlantic temperatures—in effect, separating global warming from the Atlantic natural cycle. The results show that the AMO is actually much weaker now than it was in the 1950s, when Atlantic hurricanes were also quite active. However, the AMO did contribute to the lull in hurricane activity from about 1970 to 1990 in the Atlantic.

Global warming does not guarantee that each year will set records for hurricanes, according to Trenberth. He notes that last year's activity was related to very favorable upper-level winds as well as the extremely warm SSTs. Each year will bring ups and downs in tropical Atlantic SSTs due to natural variations, such as the presence or absence of El Nino, says Trenberth. However, he adds, the long-term ocean warming should raise the baseline of hurricane activity.

Adapted from materials provided by National Center for Atmospheric Research.




Daily Science Journal (Jun. 21, 2007) — The world is abuzz with the discovery of an extrasolar, Earth-like planet around the star Gliese 581 that is relatively close to our Earth at 20 light years away in the constellation Libra.

Artist's impression of the five-Earth mass planet, Gliese 581 c, found in the habitable zone around the red dwarf Gliese 581, with the instrument HARPS on the ESO 3.6-m telescope. (Credit: European Southern Observatory)

Bruce Fegley, Jr., Ph.D., professor of earth and planetary sciences in Arts & Sciences at Washington University in St. Louis, has worked on computer models that can provide hints to what comprises the atmosphere of such planets and better-known celestial bodies in our own solar system.


New computer models, combined with data from Earth-based spectroscopy and space missions, are providing space scientists with compelling evidence for a better understanding of planetary atmospheric chemistry. Recent findings suggest a trend of increasing water content in going from Jupiter (depleted in water), to Saturn (less enriched in water than in other volatiles), to Uranus and Neptune, which have large water enrichments.

"The farther out you go in the solar system, the more water you find," said Fegley.

Fegley provided an overview of comparative planetary atmospheric chemistry at the 233rd American Chemical Society National Meeting, held March 25-29, 2007, in Chicago. Fegley and Katharina Lodders-Fegley, Ph.D., research associate professor of earth and planetary sciences, direct the university's Planetary Chemistry Laboratory.

"The theory about the Gas Giant planets (Jupiter, Saturn, Uranus, and Neptune) is that they have primary atmospheres, which means that their atmospheres were captured directly from the solar nebula during accretion of the planets," Fegley said.

Gas Giants

He said that Jupiter has more hydrogen and helium and less carbon, nitrogen and oxygen than the other Gas Giant planets, making its composition closer to that of the hydrogen- and helium-rich sun. In the atmospheres of the Gas Giant planets, the elements hydrogen, carbon and oxygen are predominantly found as molecular hydrogen, methane and water.

"Spectroscopic observations and interior models show that Saturn, Uranus and Neptune are enriched in heavier elements," he said. "Jupiter, based on observations from the Galileo Probe, is depleted in water. People have thought that Galileo might just have gone into a dry area. But Earth-based observations show that the carbon monoxide abundance in Jupiter's atmosphere is consistent with the observed abundances of methane, hydrogen and water vapor. This pretty much validates the Galileo Probe finding."

The abundances of these four gases are related by the reaction CH4 + H2O = CO + 3H2. Thus, observations of the methane, hydrogen and CO abundances can be used to calculate the water vapor abundance. Likewise, Earth-based observations of methane, hydrogen and carbon monoxide in Saturn's atmosphere show that water is less enriched than methane.
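
The sketch below shows, in Python, how that bookkeeping works: at chemical equilibrium the reaction fixes a relation between the four partial pressures, so measuring three of them pins down the fourth. The partial pressures and the equilibrium constant used here are placeholders for illustration only, not measured values for Jupiter or Saturn.

```python
# For CH4 + H2O = CO + 3H2 at equilibrium:
#   Kp = (p_CO * p_H2**3) / (p_CH4 * p_H2O)
# so the water-vapour partial pressure follows from the other three gases.
def water_partial_pressure(p_ch4, p_h2, p_co, Kp):
    """Solve Kp = (p_CO * p_H2**3) / (p_CH4 * p_H2O) for p_H2O."""
    return (p_co * p_h2**3) / (Kp * p_ch4)

# Hypothetical inputs (pressures in bar, Kp dimensionless here) for demonstration only.
print(water_partial_pressure(p_ch4=1.8e-3, p_h2=0.86, p_co=1.0e-9, Kp=1.0e-4))
```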

In contrast, observations of methane, hydrogen and carbon monoxide in the atmospheres of Uranus and Neptune show that water is greatly enriched in these two planets. Although generally classed with Jupiter and Saturn, Uranus and Neptune are water planets with relatively thin gaseous envelopes.

"On the other hand, the terrestrial planets Venus, Earth and Mars have secondary atmospheres formed afterwards by outgassing — heating up the solid material that was accreted and then releasing the volatile compounds from it," Fegley said. "That then formed the earliest atmosphere."

He said that by plugging in models he's done on the outgassing of chondritic materials and using photochemical models of the effects of UV sunlight, he and his collaborator Laura Schaefer, a research assistant in the Washington University Department of Earth and Planetary Sciences, can speculate on the atmospheric composition of Earth-like planets in other solar systems.

"With new theoretical models we are able to surmise the outgassing of materials that went into forming the planets, and even make predictions about the atmospheres of extrasolar terrestrial planets," he said.

"Because the composition of the galaxy is relatively uniform, most stars are like the sun — hydrogen-rich with about the same abundances of rocky elements — we can predict what these planetary atmospheres would be like," Fegley said. "I think that the atmospheres of extrasolar Earth-like plants would be more like Mars or Venus than the Earth."

Fegley said that photosynthesis accounts for the oxygen in Earth's atmosphere; without it, the Earth's atmosphere would consist of nitrogen, carbon dioxide and water vapor, with only small amounts of oxygen. Oxygen is 21 percent of Earth's atmosphere; in contrast, Mars has about one-tenth of one percent made by UV sunlight destroying carbon dioxide.

"I see Mars today as a great natural laboratory for photochemistry; Venus is the same for thermochemistry, and Earth for biochemistry," he said. "Mars has such a thin atmosphere compared to Earth or Venus. UV light can penetrate all the way down to the Martian surface before it's absorbed. That same light on Earth is mainly absorbed in the ozone layer in the lower Earth stratosphere. Venus is so dense that light is absorbed by a cloud layer about 45 kilometers or so above the Venusian surface."

Adapted from materials provided by Washington University in St. Louis.




Daily Science Journal (Jun. 20, 2007) — Following Hurricane Katrina and the parade of storms that affected the conterminous United States in 2004–2005, the apparent recent increase in intense hurricane activity in the Atlantic basin, and the reported increases in recent decades in some hurricane intensity and duration measures in several basins have received considerable attention.

Hurricane Katrina, imaged Aug. 28, 2005, as the storm’s outer bands lashed the Gulf Coast of the United States a day before making landfall and leaving a path of destruction in its wake. (Credit: NOAA)


An important ongoing avenue of investigation in the climate and meteorology research communities is to determine the relative roles of anthropogenic forcing (i.e., global warming) and natural variability in producing the observed recent increases in hurricane frequency in the Atlantic, as well as the reported increases of tropical cyclone activity measures in several other ocean basins.

A survey of the existing literature shows that many types of data have been used to describe hurricane intensity, and not all records are of sufficient length to reliably identify historical trends. Additionally, there are concerns among researchers about possible effects of data inhomogeneities on the reported trends.

Much of the current debate has focused on the relative roles of sea-surface temperatures or large-scale potential intensity versus the role of other environmental factors such as vertical wind shear in causing observed changes in hurricane statistics. Significantly more research – from observations, theory, and modeling – is needed to resolve the current debate around global warming and hurricanes.

Adapted from materials provided by Blackwell Publishing Ltd.



Daily Science Journal (Jun. 19, 2007) — Five years ago, Sharon Schafer Bennett suffered from migraines so severe that the headaches disrupted her life, kept her from seeking a job and interfered with participation in her children's daily activities.

Plastic surgeon Dr. Jeffrey Janis marks a site that, using the anti-wrinkle drug Botox, pinpointed a muscle later removed to help relieve Sharon Schafer Bennett's severe migraines. (Credit: Image courtesy of UT Southwestern Medical Center)

Now, thanks to an innovative surgical technique performed by a UT Southwestern Medical Center plastic surgeon who helped pioneer the procedure, the frequency and intensity of Mrs. Bennett's migraines have diminished dramatically -- from two to three per week to an occasional one every few months.


The technique -- performed by a handful of plastic surgeons in the U.S. -- includes using the anti-wrinkle drug Botox to pinpoint which of several specific muscles in the forehead, back of the head or temple areas may be serving as "trigger points" to compress, irritate or entrap nerves that could be causing the migraine. Because Botox temporarily paralyzes muscles, usually for about three months, it can be used as a "litmus test" or "marker" to see if headaches go away or become less intense while the Botox's effects last, said Dr. Jeffrey Janis, assistant professor of plastic surgery.

If the Botox is successful in preventing migraines or lessening their severity, then surgery to remove the targeted muscle is likely to accomplish the same result, but on a more long-term and possibly permanent basis, he said.

For Mrs. Bennett, the surgery proved to be life-altering.

"I can't even begin to tell you what a change this has made in my life," said Mrs. Bennett, 45, a Houston-area resident. "For the first time in years, I can live like a normal human being and do all the normal 'mom' and 'wife' things that the migraines physically prevented me from doing. My family thinks it's great because they don't have to put their lives on hold numerous times a week because of my migraines. I'm also going back to school to get a second degree, something I could never have considered before."

Dr. Janis said: "Many neurologists are using Botox to treat migraines, but they are making the injections in a 'headband-like' circle around the forehead, temple and skull. They are not looking at finding the specific location of the headache's trigger point. While patients may get temporary relief, after the Botox wears off they will have to go back and get more injections or continue medications for migraines.

"It's like a math equation. I will inject the Botox into one trigger point at a time and leave the others alone. The Botox is used as a diagnostic test to determine what trigger point is causing the problem. If patients get a benefit from the Botox, they likely will get a benefit from the surgery. If there's no benefit from the Botox, then there won't be a benefit from the surgery."

Dr. Janis began collaborating more than five years ago with Dr. Bahman Guyuron, a plastic surgeon at Case Western Reserve University and the first to explore using surgery to relieve migraines, following the revelation by several of his patients that their migraines had disappeared after they had cosmetic brow lifts. Dr. Janis has assisted his colleague by performing anatomical studies on cadavers to explore the nerves and pathways that might cause migraines. Together they have identified four specific trigger points and developed a treatment algorithm that includes using Botox prior to deciding whether to perform surgery.

During the past several years, numerous peer-reviewed articles have been published in Plastic & Reconstructive Surgery detailing their research efforts and the researchers have presented the technique at professional meetings of plastic surgeons.

Approximately 28 million Americans, 75 percent of those women, suffer from migraines, according to the National Institutes of Health. For employers, that translates into an estimated 157 million lost workdays annually.

"A migraine is something you can't explain to someone who hasn't had one," said Mrs. Bennett, who began suffering monthly migraines as a teenager. As she grew older, the headaches become more frequent and unpredictable. "They were messing up my life. I couldn't make any commitments or plan activities for my kids. This surgery has made a huge difference in my life. It's awesome."

Dr. Janis only sees patients who have been diagnosed with recurring migraines by a neurologist and have tried other treatments that have failed.

"Plastic surgeons are not in the business of diagnosing and treating headaches," he said. "This is a novel method of treatment that is proving to be effective and potentially more long lasting than other things used before. But it is still in its infancy."

Adapted from materials provided by UT Southwestern Medical Center.




Say Cheese! Scientists In A Ferment Over Cheese-Starter Genome

Daily Science Journal (Jun. 19, 2007) — Whether sharp Cheddar or nutty Gouda, a fine cheese owes its flavor to milk-fermenting bacteria, such as the historically ancient starter Lactococcus lactis. In next month’s issue of Genome Research, researchers from France report the complete genome sequence of L. lactis, now the most commonly used starter in the cheese industry.


L. lactis is a member of the lactic acid bacteria (LAB) family, which includes not only cheese and yogurt starters, but also pathogens like Streptococcus pneumoniae. Until now, none of the LAB genomes had been sequenced. To produce the L. lactis sequence, Alexei Sorokin and colleagues from Génoscope and the Institut National de la Recherche Agronomique used a novel approach that reduces the number of steps needed to obtain a complete bacterial genome sequence.

Now the researchers report the entire L. lactis sequence of 2.4 million nucleotides, which encode 2310 genes (363 specific to lactococci). In their analysis of the genome, the researchers made several surprising discoveries, including genes suggesting this fermentative bacterium can perform aerobic respiration. This research marks a critical step towards understanding and manipulating the LAB and, in particular, will be useful for improving the flavor, texture, and preservation of the 10 million tons of cheese produced annually. Now that's a lot of cheese.

Adapted from materials provided by Cold Spring Harbor Laboratory.





Daily Science Journal (Jun. 18, 2007) — Climate model simulations for the 21st century indicate a robust increase in wind shear in the tropical Atlantic due to global warming, which may inhibit hurricane development and intensification. Historically, increased wind shear has been associated with reduced hurricane activity and intensity.

The white arrows represent strong cross winds, also known as wind shear. These winds are predicted to become more common in the Atlantic due to global warming. They can disrupt or destroy a hurricane by blowing the top away. (Credit: NOAA)


This new finding is reported in a study by scientists at the Rosenstiel School of Marine and Atmospheric Science at the University of Miami and NOAA's Geophysical Fluid Dynamics Laboratory (GFDL) in Princeton, N.J., and is scheduled to be published April 18 in Geophysical Research Letters.

While other studies have linked global warming to an increase in hurricane intensity, this study is the first to identify changes in wind shear that could counteract these effects. "The environmental changes found here do not suggest a strong increase in tropical Atlantic hurricane activity during the 21st century," said Brian Soden, Rosenstiel School associate professor of meteorology and physical oceanography and the paper's co-author. However, the study does identify other regions, such as the western tropical Pacific, where global warming does cause the environment to become more favorable for hurricanes.

"Wind shear is one of the dominant controls to hurricane activity, and the models project substantial increases in the Atlantic," said Gabriel Vecchi, lead author of the paper and a research oceanographer at GFDL. "Based on historical relationships, the impact on hurricane activity of the projected shear change could be as large -- and in the opposite sense -- as that of the warming oceans."

Examining possible impacts of human-caused greenhouse warming on hurricane activity, the researchers used climate models to assess changes in the environmental factors tied to hurricane formation and intensity. They focused on projected changes in vertical wind shear over the tropical Atlantic and its ties to the Pacific Walker circulation -- a vast loop of winds that influences climate across much of the globe and that varies in concert with El Niño and La Niña oscillations. By examining 18 different models, the authors identified a systematic increase in wind shear over much of the tropical Atlantic due to a slowing of the Pacific Walker circulation. Their research suggests that the increase in wind shear could inhibit both hurricane development and intensification.

"This study does not, in any way, undermine the widespread consensus in the scientific community about the reality of global warming," said Soden. "In fact, the wind shear changes are driven by global warming."

The authors also note that additional research will be required to fully understand how the increased wind shear affects hurricane activity more specifically. "This doesn't settle the issue; this is one piece of the puzzle that will contribute to an incredibly active field of research," Vecchi said.

Adapted from materials provided by University of Miami Rosenstiel School of Marine & Atmospheric Science.




Daily Science Journal (Jun. 17, 2007) — Researchers at Delft University of Technology have succeeded in carrying out calculations with two quantum bits, the building blocks of a possible future quantum computer. The Delft researchers are publishing an article about this important step towards a workable quantum computer in this week's issue of Nature.

Superconducting rings on a chip. (Credit: TU Delft)

Quantum computers have superior qualities in comparison to the type of computers currently in use. If they are realised, then quantum computers will be able to carry out tasks that are beyond the abilities of all normal computers.


A quantum computer is based on the amazing properties of quantum systems. In these a quantum bit, also known as a qubit, exists in two states at the same time and the information from two qubits is entangled in a way that has no equivalent whatsoever in the normal world.

It is highly likely that workable quantum computers will need to be produced using existing manufacturing techniques from the chip industry. Working on this basis, scientists at Delft University of Technology are currently studying two types of qubits: one type makes use of tiny superconducting rings, and the other makes use of 'quantum dots'.

Now, for the first time, a 'controlled-NOT' calculation with two qubits has been realised with the superconducting rings. This is important because, together with single-qubit operations, the controlled-NOT allows any given quantum calculation to be realised.
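
As a purely mathematical illustration of what a controlled-NOT does — this is plain linear algebra, not a model of the Delft superconducting circuit — the short Python sketch below applies a Hadamard rotation and then a CNOT to two qubits starting in |00>, producing an entangled Bell state. Combined with single-qubit rotations, the CNOT is enough to compose any quantum computation.

```python
import numpy as np

# Two-qubit basis ordering: |00>, |01>, |10>, |11> (first qubit is the control).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)   # flips the target when the control is 1

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
I = np.eye(2, dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)    # start in |00>
state = np.kron(H, I) @ state                    # put the control qubit in a superposition
state = CNOT @ state                             # entangle control and target

print(np.round(state, 3))
# amplitudes 0.707, 0, 0, 0.707: the entangled Bell state (|00> + |11>)/sqrt(2)
```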

The result was achieved by the PhD student Jelle Plantenberg in the team led by Kees Harmans and Hans Mooij. The research took place within the FOM (Dutch Foundation for Fundamental Research on Matter) concentration group for Solid State Quantum Information Processing.

Adapted from materials provided by Delft University of Technology, via EurekAlert!, a service of AAAS.






Daily Science Journal (Jun. 16, 2007) — NASA has formed an internal review board to look more deeply into why NASA's Mars Global Surveyor went silent in November 2006 and to recommend any processes or procedures that could increase safety for other spacecraft.

Artist's concept of Mars Global Surveyor. (Image credit: NASA/JPL)

Mars Global Surveyor launched in 1996 on a mission designed to study Mars from orbit for two years. It accomplished many important discoveries during nine years in orbit. On Nov. 2, the spacecraft transmitted information that one of its arrays was not pivoting as commanded. Loss of signal from the orbiter began on the following orbit.

Mars Global Surveyor has operated longer at Mars than any other spacecraft in history and for more than four times as long as the prime mission originally planned.


The Jet Propulsion Laboratory, Pasadena, Calif., manages Mars Global Surveyor for the NASA Science Mission Directorate, Washington. JPL is a division of the California Institute of Technology in Pasadena. Lockheed Martin Space Systems, Denver, developed and operates the spacecraft.

Information about the mission is available on the Internet at: http://www.nasa.gov/mission_pages/mgs/index.html.

Adapted from materials provided by NASA/Jet Propulsion Laboratory.

---------------------------------------

Add On Article :

Are You Ready For Mars?

ESA’s Mars Express probe is scheduled to arrive at Mars at Christmas: the Beagle 2 lander is expected to touch down on the surface of the Red Planet on the night of 24 to 25 December. Launched on 2 June 2003 from Baikonur (Kazakhstan) on board a Russian Soyuz launcher operated by Starsem, the European probe – built for ESA by a European team of industrial companies led by Astrium – carries seven scientific instruments that will perform a series of remote-sensing experiments designed to shed new light on the Martian atmosphere, the planet’s structure and its geology. In particular, the British-made Beagle 2 lander, named after the ship on which Charles Darwin explored uncharted areas of the Earth in 1830, will contribute to the search for traces of life on Mars through exobiology experiments and geochemistry research. On Christmas Eve the Mars Express orbiter will be steered onto a course taking it into an elliptical orbit, where it will safely circle the planet for a minimum of almost two Earth years. The Beagle 2 lander – which will have been released from the mother craft a few days earlier (on 19 December) – will instead stay on a collision course with the planet. It too should be safe, being designed for atmospheric entry and geared for a final soft landing thanks to a sophisticated system of parachutes and airbags.

On arrival, the Mars Express mission control team will report on the outcome of the spacecraft's delicate orbital insertion manoeuvre. It will take some time for Mars Express to manoeuvre into position to pick up communications from Beagle 2. Hence, initially, other means will be used to check that Beagle 2 has landed: first signals from the Beagle 2 landing are expected to be available throughout Christmas Day, either through pick-up and relay of Beagle 2 radio signals by NASA’s Mars Odyssey, or by direct pick-up by the Jodrell Bank radio telescope in the UK. Mars Express will then pass over Beagle 2 in early January 2004, relaying data and images back to Earth. The first images from the cameras of Beagle 2 and Mars Express are expected to be available between the end of the year and the beginning of January 2004.

Adapted from materials provided by European Space Agency.




Daily Science Journal (Jun. 15, 2007) — Neutrophil granulocytes, the largest group of white blood cells, kill microorganisms. Neutrophils catch microbes with extracellular structures nicknamed Neutrophil Extracellular Traps (NETs), which are composed of nucleic acid and aggressive enzymes.

Neutrophil granulocytes have trapped Shigella bacteria in NETs. (Image: Dr. Volker Brinkmann, Max Planck Institute for Infection Biology)

A group of scientists led by Arturo Zychlinsky at the Max Planck Institute for Infection Biology in Berlin, Germany, discovered how neutrophils form this snaring network (Journal of Cell Biology, online, January 8, 2007). Once triggered, the cells undergo a novel program leading to their death. While they perish, the cells release the contents of their nuclei. The nucleic acid, mingled with bactericidal enzymes, forms a lethal network outside the cell. Invading bacteria and pathogenic fungi get caught and killed in the NETs.


Every minute, several million neutrophils leave the bone marrow, ready to defend the body against invading germs. They are the immune system’s first line of defence against harmful bacteria and migrate into the tissue at the site of infection to combat pathogens. For more than a hundred years it has been known that neutrophil granulocytes kill bacteria very efficiently by devouring them. After eating the germs, neutrophils kill them with antimicrobial proteins.

The group of scientists led by Arturo Zychlinsky at the Max Planck Institute for Infection Biology discovered a second killing mechanism: neutrophil granulocytes can form web-like structures outside the cells, composed of nucleic acid and enzymes, which catch bacteria and kill them. The scientists were able to generate impressive micrographs of these nets, but it remained a mystery how the granulocytes could mobilise the contents of their nuclei and catapult them out of the cells. Only after lengthy live-cell imaging and biochemical studies did it become clear how neutrophils make NETs. The cells are activated by bacteria and modify the structure of their nuclei and granules, small enzyme deposits in the cytoplasm.

"The nuclear membrane disintegrates, the granules dissolve, and thus the NET components can mingle inside the cells", explains Volker Brinkmann, head of the microscopy group. At the end of this process, the cell contracts until the cell membrane bursts open and quickly releases the highly active melange. Once outside the cell, it unfolds and forms the NETs which then can trap bacteria.

Surprisingly, this process is as effective as devouring bacteria: "NETs formed by dying granulocytes kill as many bacteria as are eaten up by living blood cells", says Arturo Zychlinsky. Thus, neutrophils fulfil their role in the defence battle even after their deaths.

Adapted from materials provided by Max Planck Society.




Daily Science Journal (Jun. 13, 2007) — CHAPEL HILL - The minimum number of protein-producing genes a single-celled organism needs to survive and reproduce in the laboratory is somewhere between 265 and 350, according to new research directed by a top University of North Carolina at Chapel Hill scientist.

Using a technique known as global transposon mutagenesis, Dr. Clyde A. Hutchison III, professor of microbiology at the UNC-CH School of Medicine, and colleagues at The Institute for Genomic Research (TIGR) in Rockville, Md., found that roughly a third of the genes in the disease-causing Mycoplasma genitalium were unnecessary for the bacterium's survival.


The technique -- a process of elimination -- involved randomly inserting bits of unrelated DNA into the middle of genes to disrupt their function and see if the organism thrived anyway.

Such research is a significant step forward in creating minimal, tailor-made life forms that can be further altered for such purposes as making biologically active agents for treating illness, Hutchison said. More immediately, it boosts scientists' basic understanding of the question, "What is life?"

"Cells that grow and divide after this procedure can have such disruptive insertions only in non-essential genes," he said. "Surprisingly, the minimal set of genes we found included about 100 whose function we don't yet understand. This finding calls into question the prevailing assumption that the basic molecular mechanisms underlying cellular life are understood, at least broadly."

Further work will explain those functions and establish a more exact count of the minimal genes required to create life in the laboratory, the scientist said. New organisms bearing only the fewest genes needed to survive could have major commercial, social and ethical implications.

A report on the research appears in the Dec. 10 issue of the journal Science. Besides Hutchison, authors are Drs. Scott Peterson (Hutchison's former student), Steve Gill, Robin Cline, Owen White, Claire Fraser and Hamilton Smith, all of the institute, and Craig Venter, who founded TIGR and now heads Celera Genomics.

"Defining the minimal genome is a very fundamental problem, and no one else seems to be approaching it experimentally," said Nobel Prize winner Hamilton Smith, who was a TIGR investigator when the work began.

A genome is the complete set of genes, or genetic blueprints, an organism contains in each of its cells. The human genome is about 5,000 times larger than that of Mycoplasma genitalium, which causes gonorrhea-like symptoms in humans. Scientists study it in part because it contains only 517 cellular genes, the fewest known in single-celled organisms.

"The prospect of constructing minimal and new genomes does not violate any fundamental moral precepts or boundaries, but does raise questions that are essential to consider before the technology advances further," wrote Dr. Mildred K. Cho of the Stanford University Center for Biomedical Ethics and colleagues in an accompanying Science editorial.

"How does work on minimal genomes and the creation of new free-living organisms change how we frame ideas of life and our relationship to it?" Cho said. "How can the technology be used for the benefit of all, and what can be done in law and social policy to ensure that outcome?

"The temptation to demonize this fundamental research may be irresistible," she said. "However, the scientific community and the public can begin to understand what is at stake if efforts are made now to identify the nature of the science involved and to pinpoint key ethical, religious and metaphysical questions..."

Hutchison was co-inventor of a technique known as site-directed mutagenesis, which is now used by researchers around the world for introducing designed changes into genes. His friend and colleague Dr. Michael Smith of the University of British Columbia won a Nobel Prize for the work in 1993.

TIGR is a non-profit research institute founded in 1992. Its researchers conduct structural, functional and comparative analyses of genomes and gene products in viruses, bacteria, other microorganisms, plants, animals and humans, and the institute has pioneered determining the sequences, or structures, of genomes.

Adapted from materials provided by University Of North Carolina At Chapel Hill.





Daily Science Journal (Jun. 10, 2007) — Scientists in Spain are reporting development of a new process to make cocoa powder with higher amounts of the healthful chemical compounds linked to chocolate's beneficial effects. The study is scheduled for publication in the May 30 issue of ACS' Journal of Agricultural and Food Chemistry.

Cocoa beans -- the source of chocolate -- in a cacao pod. (Credit: Photo by Keith Weller; courtesy of USDA/Agricultural Research Service)

Juan Carlos Espin de Gea and colleagues report that the new cocoa powder contains levels of some flavonoids 8 times higher than conventional cocoa. They achieved the higher flavonoid content by omitting the traditional fermentation and roasting steps used in the processing of cocoa beans. Those steps destroy some flavonoids, which are natural antioxidants.

Researchers used the flavonoid-enriched cocoa powder in a clinical trial to determine whether the compounds were bioavailable — in a form that humans can absorb. In the trial, six healthy volunteers consumed a milk drink made with flavonoid-enriched cocoa. The same volunteers later drank chocolate milk made from traditional cocoa. Blood and urine tests established the bioavailability of flavonoids in the enriched-milk drink, showing that people do absorb higher levels of the compounds.


Based on the results, researchers suggest further clinical trials on the health benefits of flavonoid-enriched cocoa powder.

Adapted from materials provided by American Chemical Society, via EurekAlert!, a service of AAAS.

------------------------------------------------------------------

Add On Article :

Cocoa, But Not Tea, May Lower Blood Pressure

Foods rich in cocoa appear to reduce blood pressure but drinking tea may not, according to an analysis of previously published research in the April 9 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.

Current guidelines advise individuals with hypertension (high blood pressure) to eat more fruits and vegetables, according to background information in the article. Compounds known as polyphenols or flavonoids in fruits and vegetables are thought to contribute to their beneficial effects on blood pressure and cardiovascular risk. "Tea and cocoa products account for the major proportion of total polyphenol intake in Western countries," the authors write. "However, cocoa and tea are currently not implemented in cardioprotective or anti-hypertensive dietary advice, although both have been associated with lower incidences of cardiovascular events."

Dirk Taubert, M.D., Ph.D., and colleagues at the University Hospital of Cologne, Germany, conducted a meta-analysis of 10 previously published trials, five of cocoa's effects on blood pressure and five involving tea. All results were published between 1966 and 2006, involved at least 10 adults and lasted a minimum of seven days. The studies were either randomized trials, in which some participants were randomly assigned to cocoa or tea groups and some to control groups, or used a crossover design, in which participants' blood pressure was assessed before and after consuming cocoa products or tea.

The five cocoa studies involved 173 participants, including 87 assigned to consume cocoa and 86 controls, 34 percent of whom had hypertension (high blood pressure). They were followed for a median (middle) duration of two weeks. Four of the five trials reported a reduction in both systolic (the top number, when the heart contracts) and diastolic (the bottom number, when the heart relaxes) blood pressure. Compared with those who were not consuming cocoa, systolic blood pressure was an average of 4.7 millimeters of mercury lower and diastolic blood pressure was an average of 2.8 millimeters of mercury lower.

The effects are comparable to those achieved with blood pressure-lowering medications, the authors note. "At the population level, a reduction of 4 to 5 millimeters of mercury in systolic blood pressure and 2 to 3 millimeters of mercury in diastolic blood pressure would be expected to substantially reduce the risk of stroke (by about 20 percent), coronary heart disease (by 10 percent) and all-cause mortality (by 8 percent)," they write.

Of the 343 individuals in the five tea studies, 171 drank tea and 172 served as controls, for a median duration of four weeks. Drinking tea was not associated with a reduction in blood pressure in any of the trials.

Tea and cocoa are both rich in polyphenols, but while black and green tea contain more compounds known as flavan-3-ols, cocoa contains more of another type of polyphenol, procyanidins. "This suggests that the different plant phenols must be differentiated with respect to their blood pressure-lowering potential and thus cardiovascular disease prevention, supposing that the tea phenols are less active than cocoa phenols," the authors write.

The findings do not indicate a widespread recommendation for higher cocoa intake to decrease blood pressure, but it appears reasonable to substitute phenol-rich cocoa products such as dark chocolate for other high-calorie or high-fat desserts or dairy products, they continue. "We believe that any dietary advice must account for the high sugar, fat and calorie intake with most cocoa products," the authors conclude. "Rationally applied, cocoa products might be considered part of dietary approaches to lower hypertension risk."

Adapted from materials provided by JAMA and Archives Journals.




Daily Science Journal (Jun. 5, 2007) — University of Colorado at Boulder researchers will scan Venus during a spacecraft flyby this week using an $8.7 million instrument they designed and built for NASA's MESSENGER Mission, launched in 2004 and speeding toward Mercury.

An artist's rendition of NASA's MESSENGER spacecraft, which will make its first flyby of Mercury in 2008. (Credit: Image courtesy of University of Colorado at Boulder)

Built by CU-Boulder's Laboratory for Atmospheric and Space Physics, the instrument will make measurements of the thick clouds and shrouded surface of Venus during the June 5th flyby, said LASP Senior Research Associate William McClintock, a mission co-investigator who led the CU-Boulder instrument development team. Known as the Mercury Atmospheric and Surface Composition Spectrometer, or MASCS, the instrument will compare the atmosphere of Venus with data from other spacecraft that have visited the planet in the past four decades.


"This is our first opportunity for a close flyby of a solar system object with MESSENGER, and we should be able to tell if the atmosphere of Venus has been changing in recent years, " said McClintock. "As importantly, we are using Venus as a test case to learn more about our instrument performance in preparation for the spacecraft's ultimate destination of Mercury."

Carrying seven instruments, MESSENGER will be the first spacecraft ever to orbit Mercury and the first to return data from the hot, rocky planet in more than 30 years. The circuitous, 4.9 billion mile journey to Mercury, which requires more than seven years and 13 loops around the sun, is using the gravity of Venus during its flyby this week to guide it closer to Mercury's orbit.

MESSENGER will make its first flyby of Mercury in January 2008, zipping by it again at a top speed of 141,000 miles per hour in October 2008 before flying by a third time in September 2009 and finally settling into orbit in March 2011. "This is a mission that requires some patience," said Mark Lankton, LASP's program manager for the MASCS instrument. "We are anticipating a brief symphony of action at Venus, and we have a lot of data to take in a hurry."

Dozens of CU-Boulder undergraduate and graduate students will be involved in data analysis from MESSENGER in the coming years, said Lankton.

MASCS's ultraviolet and visible spectrometer will be looking at the cloud composition of Venus. While the surface of Venus is hot enough to melt lead and its atmosphere is filled with noxious carbon dioxide gases and acid rain, Earth and Venus were virtual twins at birth, scientists believe.

The miniaturized MASCS instrument, which took more than three years to develop, weighs less than seven pounds and was built to last, said McClintock. "Many space instruments have a lifetime of only three to four years," he said. "But we knew we had to make this one robust enough to work for more than a decade under harsh conditions."

The MESSENGER spacecraft is about the size of a small economy car and is equipped with a semi-cylindrical thermal shade to protect it from the sun. More than half of the weight of the 1.2-ton spacecraft consists of propellant and helium. "We like to call it the little spacecraft that could," said McClintock.

"This event at Venus will be a very good tune-up for our first flyby of Mercury next January," said LASP Director Daniel Baker, also a co-investigator on the MESSENGER team. "The first encounter with Mercury will be extremely valuable, as it will essentially double the amount of information we now have about the planet."

A space physicist, Baker is interested in the magnetic field of Mercury and its interaction with the solar wind, including "substorms" associated with Mercury's magnetic field that occur in the planet's vicinity. Understanding Mercury's surface, tenuous atmosphere and magnetic field are the keys to understanding the evolution of the inner solar system, he said.

Mercury was visited only once before by a spacecraft, in 1974 and 1975, when NASA's Mariner 10 spacecraft made three flybys and mapped roughly 45 percent of the planet's rocky surface at the time.

MASCS will probe the mineral composition of Mercury's surface, the distribution of gases in its tenuous atmosphere and the workings of a giant, comet-like sodium gas cloud enveloping the planet, said McClintock. The researchers also hope to determine if Mercury ever had volcanoes on its surface and if the permanently shadowed craters at Mercury's poles contain water-ice.

MESSENGER is equipped with a large sunshield and heat-resistant ceramic fabric because Mercury is about two-thirds closer to the sun than Earth is and is bombarded with 10 times the solar radiation. Sandwiched between the sun and Mercury -- which has daytime temperatures of about 800 degrees Fahrenheit -- the spacecraft will "essentially be on a huge rotisserie," said Baker.

Adapted from materials provided by University of Colorado at Boulder.




Daily Science Journal (Jun. 01, 2007) — Researchers at the University of Warwick are examining a way of using bacteria to manufacture a new suite of potential anti-cancer drugs that are difficult to create synthetically on a lab bench.

A colony of S. coelicolor bacteria that could be used to make a prodiginine such as streptorubin (shown in yellow). (Image courtesy of University of Warwick)


The bacterium Streptomyces coelicolor naturally produces antibiotics called prodiginines.

This group of antibiotics has stimulated much recent interest as they can be used to target and kill cancer cells. A synthetic prodiginine analogue called GX15-070 is currently in phase 1 and 2 cancer treatment trials. However, analogues of other prodiginines, such as streptorubin B, could be even more powerful anti-cancer tools, but they cannot currently be easily produced synthetically on a lab bench.

Professor Greg Challis and colleagues in the Chemistry Department of the University of Warwick have examined the enzymes controlling the process that allows the bacterium Streptomyces coelicolor to create streptorubin B, and have gained a clear understanding of which key enzymes act at particular steps of that process. By manipulating the enzyme content of the bacteria, they aim to produce a range of different compounds based closely on the form of streptorubin B normally made by the bacteria. Some of these analogues of streptorubin B could provide the basis for developing useful new anti-cancer drugs.

Professor Challis said: "This approach combines the strengths of conventional organic synthesis with the synthetic power of biology to assemble complex and synthetically difficult structures. It could be particularly valuable for generating analogues of streptorubin B, with all the promise that holds for the development of new anti-cancer drugs."

Adapted from materials provided by University of Warwick.



