Daily Science Journal (Nov. 29, 2007) — It will take the marine environment at least 5 to 10 years to recover from the oil spill that wreaked havoc in early November in the Kerch Strait, which leads to the Black Sea, says WWF.

According to WWF specialists, the 2000-tonne spill has badly affected the local fishing industry. Fish caught in the Kerch Strait are not safe for consumption.

The spill has also threatened birds. About 11 endangered species inhabit the area around the strait, including the Dalmatian pelican and great black-headed gull, and many more migrating birds will be wintering in this area in the coming months.


Thanks to the efforts of clean-up crews, including WWF staff and members, some birds have been rescued. However, these activities can only help save a very small percentage of the thousands of affected birds. Two dolphins have also been found washed up on shore where clean-up operations are being conducted, but their chances of survival are slim. The Black Sea is home to common and bottlenose dolphins.

“Although it is practically impossible to completely eliminate the damage caused by the large oil spill,” said Igor Chestin, CEO of WWF-Russia, “we believe that to avoid such disasters in the future drastic changes need to be made in the oil transportation system; oil pollution laws need to be enacted.”

To avoid such accidents in the future, WWF and other environmental NGOs are developing recommendations for the Russian government, which include:
  • Local volunteers should be trained to respond to oil spills (WWF has already been training clean-up teams on the Russian coast of the Barents Sea for several years).
  • Oil export via the river–sea corridor should be stopped, and river vessels not suited for marine conditions should be ordered into port.
  • Russia should develop a legislative base for oil spills, similar to the US Oil Pollution Act adopted after the Exxon Valdez oil spill in 1989, and should set up an independent agency responsible for environmental protection.
According to Alexey Knizhnikov, head of WWF-Russia’s oil and gas project, draft laws introducing the “polluter pays” principle and environmental insurance have already been prepared. However, they have not yet been approved by the State Duma (Russia’s lower house of parliament).

“If these draft laws are approved, many problems will be solved, as companies will feel more responsible for the risks they take,” says Knizhnikov.

“We hope that this accident will spur the process in adopting these laws and creating such an agency.”

Adapted from materials provided by World Wildlife Fund.




Dogs Can Classify Complex Photos In Categories Like Humans Do

Daily Science Journal (Nov. 29, 2007) — Like us, our canine friends are able to form abstract concepts. Friederike Range and colleagues from the University of Vienna in Austria have shown for the first time that dogs can classify complex color photographs and place them into categories in the same way that humans do. And the dogs successfully demonstrate their learning through the use of computer automated touch-screens, eliminating potential human influence.

Researchers have shown for the first time that dogs can classify complex color photographs and place them into categories in the same way that humans do. (Credit: iStockphoto/Rami Ben Ami)

In order to test whether dogs can visually categorize pictures, and transfer their knowledge to new situations, four dogs were shown landscape and dog photographs, and expected to make a selection on a computer touch-screen.


In the training phase, the dogs were shown both the landscape and dog photographs simultaneously and were rewarded with a food pellet if they selected the dog picture (positive stimulus). The dogs then took part in two tests.

In the first test, the dogs were shown completely different dog and landscape pictures. They continued to reliably select the dog photographs, demonstrating that they could transfer their knowledge gained in the training phase to a new set of visual stimuli, even though they had never seen those particular pictures before.

In the second test, the dogs were shown new dog pictures pasted onto the landscape pictures used in the training phase, confronting them with contradictory information: on the one hand, a new positive stimulus, since the pictures contained dogs, even though they were new dogs; on the other hand, a familiar negative stimulus in the form of the landscape.

When the dogs were faced with a choice between the new dog on the familiar landscape and a completely new landscape with no dog, they reliably selected the option with the dog. These results show that the dogs were able to form a concept, i.e. ‘dog’, although the experiment cannot tell us whether they recognized the dog pictures as actual dogs.

The authors also draw some conclusions on the strength of their methodology: “Using touch-screen computers with dogs opens up a whole world of possibilities on how to test the cognitive abilities of dogs by basically completely controlling any influence from the owner or experimenter.” They add that the method can also be used to test a range of learning strategies and has the potential to allow researchers to compare the cognitive abilities of different species using a single method.

Journal reference: Range F et al (2007). Visual categorization of natural stimuli by domestic dogs (Canis familiaris). Animal Cognition (DOI 10.1007/s10071-007-0123-2).

Adapted from materials provided by Springer.




Daily Science Journal (Nov. 27, 2007) — Using an innovative method called transcranial magnetic stimulation (TMS) to measure brain responsiveness, Yale researchers have found that cocaine addicts’ responses to stimulation are decreased, providing possible evidence that cocaine causes permanent brain damage.

"Contrary to what we expected, the results showed that cocaine-dependent individuals displayed increased resistance to brain stimulation," said Nashaat Boutros, associate professor of psychiatry at Yale and principal investigator on the study. "We expected them to be jumpy or more responsive because of the sensitizing effects of cocaine, but it took much stronger stimulation to get them to respond."

Boutros and his team examined 10 cocaine-dependent subjects (four men and six women) who had not used cocaine for at least three weeks and were addicted to no other drugs. A magnetic stimulus was delivered by TMS using a hand-held magnetic coil over the motor cortex, the part of the brain that moves the hands and fingers. The amount of magnetic stimulus needed to move the fingers is an indication of sensitivity in that part of the brain.


The results, published in the February 15 issue of Biological Psychiatry, showed that those without cocaine addiction need about 35 to 55 percent of the output to move their fingers. Boutros said cocaine-addicted individuals sometimes need as much as 80 percent. "Somehow addicts have increased their resistance to the effect of the stimulus," Boutros said.

Boutros and his team started out with the theory that the longer people use cocaine, the more it causes symptoms like paranoia, panic attacks and seizures. "People have thought that a process called kindling—over time, less stimulus is needed to elicit the same response—would apply to cocaine addicts," Boutros said. "But these results raise other possibilities."

Boutros believes there are two possible theories that could explain the findings. One is that the cocaine caused widespread damage that decreased the brain’s ability to respond to stimulation. "The other possibility," Boutros said, "is that the brain was indeed sensitized via the kindling response, and what we’re seeing is a normal response from the brain, like an overcompensation. The brain feels too sensitive from all the cocaine jitter and cools off the response."

A follow-up study, Boutros said, will confirm this initial finding by examining a different group of cocaine-addicted people using additional TMS techniques. TMS has been used to study neurological disorders for 10 years, but was first used in exploration of cortical function in psychiatric disorders about three years ago.

Adapted from materials provided by Yale University.





Daily Science Journal (Nov. 22, 2007) — Astronomers have discovered white dwarf stars with pure carbon atmospheres. The discovery could offer a unique view into the hearts of dying stars.

Artist's concept of the surface of the white dwarf star H1504+65, believed to have somehow expelled all its hydrogen and all but a very small trace of its helium, leaving an essentially bare stellar nucleus with a surface of 50 percent oxygen and 50 percent carbon. When this star cools, it may have a carbon atmosphere, like the stars newly found by University of Arizona, Canadian and French astronomers. (Credit: Illustration by M.S. Sliwinski and L.I. Sliwinska of Lunarismaar)

These stars possibly evolved in a sequence astronomers didn't know before. They may have evolved from stars that are not quite massive enough to explode as supernovae but are just on the borderline. All but the most massive two or three percent of stars eventually die as white dwarfs rather than explode as supernovae.


When a star burns helium, it leaves "ashes" of carbon and oxygen. When its nuclear fuel is exhausted, the star then dies as a white dwarf, which is an extremely dense object that packs the mass of our sun into an object about the size of Earth. Astronomers believe that most white dwarf stars have a core made of carbon and oxygen which is hidden from view by a surrounding atmosphere of hydrogen or helium.

They didn't expect stars with carbon atmospheres.

"We've found stars with no detectable traces of helium and hydrogen in their atmospheres," said University of Arizona Steward Observatory astronomer Patrick Dufour. "We might actually be observing directly a bare stellar core. We possibly have a window on what used to be the star's nuclear furnace and are seeing the ashes of the nuclear reaction that once took place."

Dufour, UA astronomy Professor James Liebert and their colleagues at the Université de Montréal and Paris Observatory published the results in the Nov. 22 issue of Nature.

The stars were discovered among 10,000 new white dwarf stars found in the Sloan Digital Sky Survey. The survey, known as the SDSS, found about four times as many white dwarf stars as were previously known.

Liebert identified a few dozen of the newfound white dwarfs as "DQ" white dwarfs in 2003. When observed in optical light, DQ stars appear to be mostly helium and carbon. Astronomers believe that convection in the helium zone dredges up carbon from the star's carbon-oxygen core.

Dufour developed a model to analyze the atmospheres of DQ stars as part of his doctoral research at the Université de Montréal. His model simulated cool DQ stars, stars at temperatures between 5,000 degrees and 12,000 degrees Kelvin. For reference, our sun's surface temperature is around 5,780 degrees Kelvin.

When Dufour joined Steward Observatory in January, he updated his code to analyze hotter stars, stars as hot as 24,000 degrees Kelvin.

"When I first started modeling the atmospheres of these hotter DQ stars, my first thought was that these are helium-rich stars with traces of carbon, just like the cooler ones," Dufour said. "But as I started analyzing the stars with the higher-temperature model, I realized that even if I increased the carbon abundance, the model still didn't agree with the SDSS data."

In May 2007, "out of pure desperation, I decided to try modeling a pure-carbon atmosphere. It worked," Dufour said. "I found that if I calculated a pure carbon atmosphere model, it reproduces the spectra exactly as observed. No one had calculated a pure carbon atmosphere model before. No one believed that it existed. We were surprised and excited."

Dufour and his colleagues have identified eight carbon-dominated atmosphere white dwarf stars among about 200 DQ stars they've checked in the Sloan data so far.

The great mystery is why these carbon-atmosphere stars are found only between about 18,000 degrees and 23,000 degrees Kelvin. "These stars are too hot to be explained by the standard convective dredge-up scenario, so there must be another explanation," Dufour said.

Dufour and Liebert say these stars might have evolved from a star like the unique, much hotter star called H1504+65 that Pennsylvania State University astronomer John A. Nousek, Liebert and others reported in 1986. If so, carbon-atmosphere stars represent a previously unknown sequence of stellar evolution.

H1504+65 is a very massive star at 200,000 degrees Kelvin.

Astronomers currently believe this star somehow violently expelled all its hydrogen and all but a very small trace of its helium, leaving an essentially bare stellar nucleus with a surface of 50 percent carbon and 50 percent oxygen.

"We think that when a star like H1504+65 cools, it eventually becomes like the pure-carbon stars," Dufour said. As the massive star cools, gravity separates carbon, oxygen and trace helium. Above 25,000 degrees Kelvin, the trace helium rises to the top, forming a thin layer above the much more massive carbon envelope, effectively disguising the star as a helium-atmosphere white dwarf, Dufour and Liebert said.

But between 18,000 and 23,000 degrees Kelvin, convection in the carbon zone probably dilutes the thin helium layer. At these temperatures, oxygen, which is heavier than carbon, has probably sunk too deep to be dredged to the surface.

Dufour and his colleagues say that models of stars nine to 11 solar masses might explain their peculiar carbon stars.

Astronomers predicted in 1999 that stars nine or 10 times as massive as our sun would become white dwarfs with oxygen-magnesium-neon cores and mostly carbon-oxygen atmospheres. More massive stars explode as supernovae.

But scientists aren't sure where the dividing line is, whether stars eight, nine, 10 or 11 times as massive as our sun are required to create supernovae.

"We don't know if these carbon atmosphere stars are the result of nine-or-10 solar mass star evolution, which is a key question," Liebert said.

The UA astronomers plan to make new observations of the carbon-atmosphere stars at the 6.5-meter MMT Observatory on Mount Hopkins, Ariz., in December to better pinpoint their masses. The observations could help define the mass limit that separates stars dying as white dwarfs from those dying as supernovae, Dufour said.

Adapted from materials provided by University of Arizona.




Daily Science Journal (Nov. 20, 2007) — New research out of the Channing Laboratory at Brigham and Women’s Hospital (BWH) reports that frequent consumption of foods containing the flavonoid kaempferol, including nonherbal tea and broccoli, was associated with a reduced risk of ovarian cancer. The researchers also found a decreased risk in women who consumed large amounts of the flavonoid luteolin, which is found in foods such as carrots, peppers, and cabbage.

Broccoli and carrots are among the foods that are high in flavonoids. (Credit: iStockphoto/Albert Lozano)

“This is good news because there are few lifestyle factors known to reduce a woman’s risk of ovarian cancer,” said first author Margaret Gates, a research fellow at BWH. “Although additional research is needed, these findings suggest that consuming a diet rich in flavonoids may be protective.”


The causes of ovarian cancer are not well understood. What is known is that the earlier the disease is found and treated, the better the chance for recovery; however, the majority of cases are diagnosed at an advanced (metastasized) stage after the cancer has spread beyond the ovaries. According to the National Cancer Institute, the five-year relative survival rate for women diagnosed with localized ovarian cancer is 92.4 percent. Unfortunately, this number drops to 29.8 percent if the cancer has already metastasized.

In this first prospective study to look at the association between these flavonoids and ovarian cancer risk, Gates and colleagues calculated intake of the flavonoids myricetin, kaempferol, quercetin, luteolin, and apigenin among 66,940 women enrolled in the Nurses’ Health Study. In this population, 347 cases of epithelial ovarian cancer were diagnosed between 1984 and 2002.

Although total intake of these five common dietary flavonoids was not clearly beneficial, the researchers found a 40 percent reduction in ovarian cancer risk among the women with the highest kaempferol intake, compared with women with the lowest intake. They also found a 34 percent reduction in the risk of ovarian cancer among women with the highest intake of luteolin, compared with women with the lowest intake.

“In this population of women, consumption of nonherbal tea and broccoli provided the best defense against ovarian cancer,” concluded Gates, who is also a research fellow at the Harvard School of Public Health. “Other flavonoid-rich foods, such as onions, beans, and kale, may also decrease ovarian cancer risk, but the number of women who frequently consumed these foods was not large enough to clearly evaluate these associations. More research is needed.”

These findings appear in the Nov. 15, 2007, issue of the International Journal of Cancer.

Adapted from materials provided by Harvard University.




Daily Science Journal (Nov. 19, 2007) — People with migraines have differences in an area of the brain that helps process sensory information, including pain, according to a new study.

The study found that part of the cortex area of the brain is thicker in people with migraine than in people who do not have the neurological disorder.

Comparing 24 people with migraine to 12 people without migraine, the study found that the somatosensory cortex area of the brain was an average of 21 percent thicker in those with migraine.

"Repeated migraine attacks may lead to, or be the result of, these structural changes in the brain," said study author Nouchine Hadjikhani, MD, of The Martinos Center for Biomedical Imaging at Massachusetts General Hospital in Boston. "Most of these people had been suffering from migraines since childhood, so the long-term overstimulation of the sensory fields in the cortex could explain these changes. It's also possible that people who develop migraines are naturally more sensitive to stimulation."


Hadjikhani said the results indicate that the brain's sensory mechanisms are important components in migraine. "This may explain why people with migraines often also have other pain disorders such as back pain, jaw pain, and other sensory problems such as allodynia, where the skin becomes so sensitive that even a gentle breeze can be painful."

Other studies have shown changes in the cortex. The area becomes thinner in neurological disorders such as multiple sclerosis and Alzheimer's disease. But the area thickens with extensive motor training and learning.

This research is published in the November 20, 2007, issue of Neurology®, the medical journal of the American Academy of Neurology.

The study was supported by grants from the National Institutes of Health, the Swiss Heart Foundation, and the Harvard School of Dental Medicine Dean's Award.

Adapted from materials provided by American Academy of Neurology, via EurekAlert!, a service of AAAS.

------------------------------------------------------------------------

Add-On Article:

Scientists Identify Protein That May Promote Migraines

A University of Iowa study may provide an explanation for why some people get migraine headaches while others do not. The researchers found that too much of a small protein called RAMP1 appears to "turn up the volume" of a nerve cell receptor's response to a neuropeptide thought to cause migraines.

The neuropeptide is called CGRP (calcitonin gene-related peptide) and studies have shown that it plays a key role in migraine headaches. In particular, CGRP levels are elevated in the blood during migraine, and drugs that either reduce the levels of CGRP or block its action significantly reduce the pain of migraine headaches. Also, if CGRP is injected into people who are susceptible to migraines, they get a severe headache or a full migraine.

"We have shown that this RAMP protein is a key regulator for the action of CGRP," said Andrew Russo, Ph.D., UI professor of molecular physiology and biophysics. "Our study suggests that people who get migraines may have higher levels of RAMP1 than people who don't get migraines."

RAMP1 is a normal, required subunit of the CGRP receptor. Russo and his colleagues found that overexpression of RAMP1 protein in nerve cells increased the sensitivity and responsiveness of CGRP receptors to the neuropeptide -- more RAMP1 made CGRP receptors react to much lower concentrations of CGRP than usual and caused the receptors to respond more vigorously to the neuropeptide.

The UI team also engineered mice to express human RAMP1 in their nervous system in addition to the normal mouse version of the protein. These mice had double the amount of inflammation in response to CGRP than did normal mice. Nerve-induced inflammation is one of the effects associated with migraine headache.

Russo explained that his study raises the possibility that people who have migraines may have subtle genetic differences in the RAMP1 gene that result in increased levels of RAMP1 protein.

"There is clearly a genetic difference between people who get migraines and those who do not, and we think that difference could be RAMP1. Our studies provide a reason to look for variations in the DNA that encodes RAMP1 in humans," he said.

The study also suggests that the mice engineered to produce elevated levels of RAMP1 protein may be a good model for studying migraine and specifically trying to understand how the neuropeptide, CGRP, is working.

The UI team investigated CGRP receptors in the trigeminal nerve, which is responsible for relaying almost all sensory perception, including pain and touch, for the front of the head. The UI findings reinforce the emerging view that CGRP receptors in the trigeminal nerve play a key role in migraine headache.

However, there are other CGRP receptors throughout the body, and elevated CGRP levels are implicated in other types of pain, including arthritis. Russo predicts that his group's findings about RAMP1 will have implications for pain research beyond migraine headaches.

The study was funded by the National Institutes of Health and published in the Journal of Neuroscience.

Adapted from materials provided by University of Iowa.




Strength Of Cocaine Cravings Linked To Brain Response

Daily Science Journal (Nov. 19, 2007) — Rats that have a strong craving for cocaine have a different biochemical response to the drug than their less-addicted counterparts, researchers at UT Southwestern Medical Center have found.

Dr. David Self (right), associate professor of psychiatry, and graduate student Scott Edwards have found that rats that are more highly addicted to cocaine develop a different biochemical reaction to the drug than less-addicted ones. The research may help explain why addicts find it so difficult to give up the drug. (Image courtesy of UT Southwestern Medical Center)

The difference lies in the pleasure-seeking area of the brain, according to a study available online and appearing in a future issue of the journal Neuropsychopharmacology.


"This work shows that there are profound alterations in the brain mechanisms that regulate motivated behavior with addiction," said Dr. David Self, associate professor of psychiatry at UT Southwestern and senior author of the paper.

"It really shows that the addicted person is ill-equipped to cope because the brain is now wired to make them crave drugs more and get less satisfaction out of the drug or other life events that may be rewarding, and this study found biological changes that would explain these behavioral changes," said Dr. Self.

The researchers looked at dopamine receptors — molecules on cell surfaces that are activated when dopamine or other molecules bind to them. They focused on two types of receptors called D1 and D2.

Molecules that activate D1 are believed to decrease the craving response, while D2 activators are believed to increase it. Both of the receptors bind to the neurotransmitter dopamine in a part of the brain called the mesolimbic dopamine system.

In the study, rats had tubes surgically implanted that fed into their bloodstream, through which they could give themselves cocaine injections by pressing a lever. Some rats voluntarily gave themselves higher doses of cocaine than others did, an indication that they were more addicted to the cocaine.

The rats then went through three weeks of cocaine withdrawal, during which time they ceased to press the lever. At the late stages of withdrawal, a drug that specifically activated the D2 receptor was given to see if it would prompt the rats to press the lever again in search of cocaine. In another experiment, the rats were given a small dose of cocaine and a drug that activated the D1 receptor to see if the drug would block them from seeking more cocaine.

The strongly addicted rats responded more aggressively to the craving-enhancing D2 activator than the less-addicted rats did, and were not as strongly deterred by the D1 activator.

"It's as if the cocaine-addicted animal is less easily satisfied and more easily induced to seek drugs due to alterations in these receptors," Dr. Self said.

Before the researchers administered cocaine, the rats were tested to see how much they moved around when given D1 or D2 activator drugs. Before getting the cocaine, their responses to each drug were the same. After being trained to take the cocaine, the strongly addicted rats were much more sensitive to the D2 activator but less sensitive to the D1 activator. These tests showed that the difference in sensitivity developed during the addiction process, rather than being already present in the animals from the beginning.

The researchers don't know, however, whether the responses in the rats they studied were due to changes in the numbers of the receptors or to the biochemical actions of the receptors already present. Future research may help clarify those different scenarios, Dr. Self said.

Understanding how receptors control cravings may be applicable to humans, although addiction is a complicated mix of brain biochemistry and learned responses to environmental cues, as well as stress, Dr. Self said.

"If people do become addicted and say they want to quit, their brain system for inhibiting craving is weaker. We want to try to strengthen those systems that help them inhibit their craving," he said.

The lead author in the study was Scott Edwards, a neuroscience graduate student at UT Southwestern. Other UT Southwestern researchers involved in the study were Kimberly Whisler, a research associate in psychiatry, Dwain Fuller, faculty associate in psychiatry, and Dr. Paul Orsulak, professor of psychiatry and pathology.

The work was supported in part by the National Institute on Drug Abuse.

Adapted from materials provided by UT Southwestern Medical Center.





Daily Science Journal (Nov. 15, 2007) — Researchers at the UCLA Neuropsychiatric Institute found significant improvement in verbal recall among a group of people with age-associated memory impairment who took the herbal supplement ginkgo biloba for six months when compared with a group that received a placebo.

The UCLA study, released at the annual meeting of the Society for Neuroscience, held Nov. 8–12 in New Orleans, LA, used positron-emission tomography (PET) and found that for subjects taking ginkgo biloba, improved recall correlated with better brain function in key brain memory centers.

However, actual changes in brain metabolism, measured by PET for the first time, did not differ significantly between the study's two volunteer groups. Researchers noted that although all volunteers taking ginkgo biloba experienced better verbal recall, a larger sample size might be needed to effectively track brain metabolism results.


"Our findings suggest intriguing avenues for future study, including using PET with a larger sample to better measure and understand the impact of ginkgo biloba on brain metabolism," said Dr. Linda Ercoli, lead author of the study and an assistant clinical professor at the UCLA Neuropsychiatric Institute.

Ginkgo biloba is a Chinese herb often used as a dietary supplement to treat memory loss. The UCLA study and previous controlled clinical trials on ginkgo biloba's effects on verbal recall have yielded conflicting results.

"The research also raises questions regarding the significance of supplement quality and treatment duration," said principal investigator Dr. Gary Small, a UCLA professor on aging and director of the Aging and Memory Research Center at the UCLA Neuropsychiatric Institute. "The Food and Drug Administration does not regulate dietary supplements, and the quality of retail supplies varies widely. We used only the highest grade of ginkgo biloba in conducting our research."

Small also noted that the six-month UCLA study is one of the first to measure the effects of ginkgo biloba over a longer period of time. Most previous studies have measured the effect of the supplement over 12 weeks or less.

The study examined the impact of ginkgo biloba, compared to a placebo, in 10 patients, aged 45 to 75, who did not have dementia but complained of mild age-related memory loss. Four subjects received 120 mg of ginkgo biloba twice daily, and six received a placebo or inactive substance such as a sugar pill.

Researchers used cognitive tests to measure verbal recall and PET to measure brain metabolism before and after the treatment regimen. Magnetic resonance imaging was used to determine regions of interest to be examined by PET.

Funding for the study was provided by Dr. Willmar Schwabe GmbH & Co., the John Douglas French Alzheimer's Foundation, the Louis and Harold Price Foundation, the Larry L. Hillblom Foundation and the UCLA Center on Aging.

The UCLA Neuropsychiatric Institute is an interdisciplinary research and education institute devoted to the understanding of complex human behavior, including the genetic, biological, behavioral and sociocultural underpinnings of normal behavior, and the causes and consequences of neuropsychiatric disorders. More information is available online at http://www.npi.ucla.edu.

Adapted from materials provided by University of California, Los Angeles.




Daily Science Journal (Nov. 12, 2007) — Scientists from the Max Planck Institute for Infection Biology in Berlin have discovered why lung, but not skin, anthrax infections are lethal. As reported in the November 2007 issue of PLoS Pathogens, neutrophils, a type of white blood cell, play a key role in anthrax infections.

A human neutrophil takes up Bacillus anthracis. (Image: MPI for Infection Biology)

They can kill Bacillus anthracis by producing a protein called alpha-defensin. This discovery might now pave the way toward the development of new therapies for the fatal lung form of anthrax.


Bacillus anthracis is the causative agent of anthrax. What makes Bacillus anthracis especially dangerous is that these bacteria can form spores. The spores are extremely resistant to environmental stress and can survive for years. Infection with Bacillus anthracis can take place either via the lungs or through the skin. Interestingly, the lung form of anthrax is almost always fatal, whereas skin infections remain localized and are rarely lethal. In contrast to the lung form, the skin form of anthrax can be treated without problems, and most patients recover. During the past few years, Bacillus anthracis has also been used as a weapon for bioterrorism: anthrax spores sent in envelopes and inhaled resulted in the deaths of five people in the USA.

The findings from the lab of Arturo Zychlinsky now help clarify why the skin form is relatively harmless in contrast to the lung form. After a skin infection with Bacillus anthracis, neutrophils are recruited to the site of infection. Neutrophils are white blood cells that can identify and kill microbes. In the skin, neutrophils take up the spores, which germinate inside the neutrophil into a vegetative ("growing") bacterium. This vegetative bacterium is then attacked and killed within the neutrophil. The scientists succeeded in identifying the substance responsible for killing the bacteria: after fractionation of neutrophil components, a single protein proved sufficient to kill Bacillus anthracis: alpha-defensin.

This mechanism is not effective in the lung form of anthrax. There, the number of neutrophils recruited to the site of infection is known to be low and insufficient to kill the bacteria. Thus, inhaled spores can germinate and spread through the organism. The scientists in Berlin now hope that their discovery will help develop new drugs against the lung form of anthrax. One possibility is that inhaled alpha-defensin might kill vegetative bacteria in the lung and prevent dissemination.

Adapted from materials provided by Max Planck Society.





Daily Science Journal (Nov. 12, 2007) — UC Davis wildlife experts are leading the rescue of oiled birds in San Francisco today after a container ship spilled nearly 60,000 gallons of heavy bunker fuel oil into the bay.

Veterinarians assess the health status of oiled birds that are being brought in from beaches and the bay waters. (Credit: UC Davis (archival photo))

Three veterinarians and a veterinary technician arrived at Fort Mason Wednesday to organize the rescue effort and begin treating injured birds.


By 1 p.m. November 8, there were 21 seabirds being treated, all of them surf scoters, according to UC Davis veterinarian Michael Ziccardi, director of the California Oiled Wildlife Care Network.

Jonna Mazet, a UC Davis veterinarian and international authority on the rescue and treatment of oiled wildlife, has said in the past that for every oiled seabird found washed ashore, an estimated 10 to 100 die at sea.

The UC Davis rescue team is working in a custom-built recovery and rehabilitation trailer. There, they assess the health status of oiled birds that are being brought in from beaches and the bay waters.

Then the birds are put in boxes and driven to the San Francisco Bay Oiled Wildlife Care and Education Center in Cordelia (just outside Fairfield), where they will receive the world's most advanced veterinary care for oiled wildlife.

At the center, the first order of business is not to remove oil from the birds. Instead, it is to warm the birds and nourish them. Once stabilized, they will be better able to withstand the stresses of being washed.

The Cordelia center is a 12,000-square-foot, $2.7 million facility capable of caring for up to 1,000 sick birds. It is the major Northern California rescue center in the statewide Oiled Wildlife Care Network, which comprises nine rescue facilities and 25 organizations prepared to care for oiled wildlife on short notice.

At each California rescue center, UC Davis wildlife veterinarians work in partnership with local, trained wildlife rehabilitators. At the Cordelia center, those rehabilitators are staff members of the International Bird Rescue Research Center.

At this time, a standing corps of trained volunteers is being called up to staff the rescue center.

The Oiled Wildlife Care Network is managed statewide by the UC Davis Wildlife Health Center, a unit of the UC Davis School of Veterinary Medicine.

The network is funded by the Office of Spill Prevention and Response of the California Department of Fish and Game. The Fish and Game monies come from interest on the $50 million California Oil Spill Response Trust Fund, built from assessments on the oil industry.

In addition to giving veterinary care, the network funds basic research into the effects of oil on wildlife and applied research into treatments that will improve survival.

Adapted from materials provided by University of California, Davis.




Researchers Find That After Stopping Cocaine Use, Drug Craving Gets Stronger Over Time

Daily Science Journal (Nov. 12, 2007) — Using an animal model of drug craving in laboratory rats, researchers at the Intramural Research Program of the National Institute on Drug Abuse (NIDA) have found that craving for cocaine seems to increase, rather than decrease, in the days and months after drug use has stopped.

"This phenomenon helps explain why addiction is a chronic, relapsing disease," says NIDA Director Dr. Alan I. Leshner. "Craving is a powerful force for cocaine addicts to resist, and the finding that it persists long after last drug use must be considered in tailoring treatment programs."

The research team, which included Drs. Jeff Grimm, Bruce Hope, Roy Wise and Yavin Shaham, published its findings in the July 12, 2001, issue of Nature.


In the study, the scientists found that sensitivity to drug-associated environmental cues that often accompany drug craving and relapse increased over a 60-day withdrawal period. Cocaine craving was inferred from the behavior of rats trained to press a lever to receive intravenous cocaine injections. Once the animals had learned to associate the lever-pressing with receiving cocaine, they were tested under conditions where they could continue to press the lever, but no longer received cocaine.

In humans, drug-associated environmental cues often stimulate cocaine craving and accompany relapse to drug-using behavior. The NIDA investigators wrote in their report to Nature that "the data from this study suggest that an individual is most vulnerable to relapse to cocaine use well beyond the acute drug withdrawal phase."

Adapted from materials provided by NIH/National Institute On Drug Abuse.





Daily Science Journal (Nov. 11, 2007) — A single-rod implantable contraceptive that has been available in other countries since 1998 is now being used in the United States, including in the Cincinnati area.

Implantable birth control is injected underneath the skin of the upper arm during an in-office procedure that takes about one minute. (Image courtesy of University Of Cincinnati)

The implant is injected underneath the skin of the upper arm during an in-office procedure that takes about one minute. The implant, the size of a matchstick, releases a steady stream of the female hormone etonogestrel (Implanon) over a three-year period.


"This is a great option for women who can't take pills or don't easily tolerate other birth control options like IUDs and the patch," says University of Cincinnati (UC) fertility expert and contraceptive researcher Michael Thomas, MD.

Etonogestrel works by thickening the cervical mucus, which prevents sperm from fertilizing an egg and also stops any egg that does get fertilized from implanting itself in the uterine wall. Etonogestrel completely inhibits the release of eggs from the ovaries during the first two years. In the third year, it begins to lose its effectiveness.

"Women who use this form of birth control don't have to worry about taking a pill every day or changing their birth control ring every month," says Thomas. "It's a great long-term option."

Thomas cautions, however, that the implant is not for everyone. "Unfortunately, irregular bleeding is a side effect. Women have to be willing to tolerate this possibility. Also, women who experience heavy bleeding or are significantly overweight may want to consider other birth control options."

Thomas is a physician with UC's Center for Reproductive Health, which has expertise in infertility, menopause and endocrinological disorders. Established in 1988, it's the only comprehensive patient care and research unit focused on women's health in the Cincinnati area. To contact the center, call (513) 585-2355. To learn about birth control studies at UC, visit http://www.researchforwomen.com or call (513) 584-4100.

Thomas has no financial interest in Organon USA, the manufacturer of Implanon.

Adapted from materials provided by University Of Cincinnati.





Daily Science Journal (Nov. 9, 2007) — A lot better than we do, says Paul Miller, clinical professor of comparative ophthalmology at University of Wisconsin-Madison.

Dogs have good night vision, due in large part to the tapetum, a mirror-like structure that reflects light, giving the retina a second chance to register light that has entered the eye. This is also what makes dogs' eyes glow at night. The dog is holding a toy in her mouth. (Credit: Michele Hogan)


“Dogs have evolved to see well in both bright and dim light, whereas humans do best in bright light. No one is quite sure how much better a dog sees in dim light, but I would suspect that dogs are not quite as good as cats,” which can see in light that’s six times dimmer than our lower limit. Dogs, he says, “can probably see in light five times dimmer than a human can see in.”

Dogs have many adaptations for low-light vision, Miller says. A larger pupil lets in more light. The center of the retina has more of the light-sensitive cells (rods), which work better in dim light than the color-detecting cones. The light-sensitive compounds in the retina respond to lower light levels. And the lens is located closer to the retina, making the image on the retina brighter.

But the canine’s biggest advantage is called the tapetum. This mirror-like structure in the back of the eye reflects light, giving the retina a second chance to register light that has entered the eye. “Although the tapetum improves vision in dim light, it also scatters some light, degrading the dog’s vision from the 20/20 that you and I normally see to about 20/80,” Miller says.

The tapetum also causes dog eyes to glow at night.

Adapted from materials provided by University of Wisconsin - Madison.




Daily Science Journal (Nov. 8, 2007) — Amid continuing concerns that anthrax might be used as a bioterrorism weapon, government researchers report development of a faster, more sensitive blood test for detecting the deadly toxins produced by the anthrax bacterium, Bacillus anthracis. The test produces results in only 4 hours and could save lives by allowing earlier detection of infection, they say.

Anthrax spores as photographed under an electron microscope. (Credit: Courtesy of Centers for Disease Control and Prevention)

Standard identification of anthrax (Bacillus anthracis) infection relies on a combination of time-consuming steps, including cell culture and gene amplification, which can take several days to provide a diagnosis and have limitations for detecting early stages of infection. Early diagnosis is critical for effective treatment of pulmonary or inhalation anthrax, the most deadly form.


John R. Barr and colleagues in a multi-center team effort used a form of mass spectrometry to detect the presence of 'lethal factor,' the key toxin produced by the anthrax bug, in the blood of monkeys with inhalation anthrax.

The method took only four hours to identify the toxin and detected it at very low levels, demonstrating its potential for early detection of infection, the researchers say. The new method also shows promise as a research tool for providing a better understanding of the anthrax infection cycle and for evaluating the effectiveness of different therapies and methods to fight infections.

The article "Detection and Quantification of Anthrax Lethal Factor in Serum by Mass Spectrometry" is scheduled for publication in the Nov. 22 issue of ACS' Analytical Chemistry.

Adapted from materials provided by American Chemical Society, via EurekAlert!, a service of AAAS.





Daily Science Journal (Nov. 5, 2007) — A new mathematical program developed in the Department of Computer Sciences at the University of Haifa will enable computers to "know" if the artwork you are looking at is a Leonardo da Vinci original, as the seller claims, or by another less well known artist. "The field of computer vision is very complex and multifaceted. We hope that our new development is another step forward in this field," said Prof. Daniel Keren who developed the program.

Through this innovation, the researchers "taught" the computer to identify the artworks of different artists. The computer learned to identify the artists after the program converted drawings of nature, people, flowers and other scenes into series of mathematical terms: sines and cosines. After the computer "learns" some of the works of each artist, the program enables it to master each artist's individual style and to identify the artist when looking at other works -- works the computer has never seen.


According to Prof. Keren, the program can identify the works of a specific artist even if they depict different scenes. "As soon as the computer learns to recognize the clock drawings of Dali, it will recognize his other paintings, even without clocks. As soon as the computer learns to recognize the swirls of Van Gogh, it will recognize them in pictures it has never seen before."
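The article does not disclose Prof. Keren's actual algorithm, but the idea of recognizing style from an image's sine-and-cosine decomposition can be illustrated with a toy sketch: describe each image by the magnitudes of its low spatial frequencies, then assign new images to the nearest per-artist average. The feature size and nearest-centroid rule here are illustrative assumptions, not the published method.

```python
import numpy as np

def fourier_features(image, k=8):
    # Magnitudes of the lowest k-by-k spatial frequencies: a compact
    # "sines and cosines" summary of the image, with phase discarded.
    spectrum = np.abs(np.fft.fft2(image))
    feats = spectrum[:k, :k].ravel()
    return feats / (np.linalg.norm(feats) + 1e-12)  # brightness-invariant

class ArtistClassifier:
    """Toy nearest-centroid classifier over Fourier features."""
    def fit(self, images, labels):
        # One centroid per artist: the mean feature vector of their works.
        self.centroids = {
            artist: np.mean(
                [fourier_features(im) for im, lab in zip(images, labels)
                 if lab == artist], axis=0)
            for artist in set(labels)
        }
        return self

    def predict(self, image):
        f = fourier_features(image)
        return min(self.centroids,
                   key=lambda a: np.linalg.norm(f - self.centroids[a]))
```

On synthetic data, smooth "painterly" images and high-frequency "sketchy" ones separate cleanly, because their energy concentrates in different frequency bands, which is the intuition behind recognizing Van Gogh's swirls in a picture the computer has never seen.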

This new development is a step forward in the field of computer vision. According to Prof. Keren, this field is still inferior to human vision. "Human vision has undergone millions of years of evolution, and our field is only 30 years old. At this stage, computers still have difficulty doing things that are very simple for people -- for example, recognizing a picture of a human face. A computer has difficulty determining whether a picture is of a human face, or how many faces are in a picture. However, computers are very good at simulating and sketching three-dimensional images like the arteries in the brain or a road network."

At present, the new program can be helpful to someone who appreciates art, but not to a real expert in the field. If you are a novice who paid a hefty price for a picture that the seller claimed is an exact copy of a Da Vinci, the program can tell you if you wasted your money or made a smart purchase.

Adapted from materials provided by University of Haifa.





Daily Science Journal (Nov. 4, 2007) — UC San Diego electrical engineers have developed the world's most complex “phased array” -- or radio frequency integrated circuit. This DARPA-funded advance is expected to find its way into U.S. defense satellite communication and radar systems. In addition, the innovations in this chip design will likely spill over into commercial applications, such as automotive satellite systems for direct broadcast TV, and new methods for high speed wireless data transfer.

The UCSD DARPA Smart Q-Band 4x4 Array Transmitter, the world’s most complex silicon phased array chip. (Credit: Image courtesy of University of California - San Diego)

“This is the first 16-element phased array chip that can send at 30-50 GHz. The uniformity and low coupling between the elements, the low current consumption and the small size – it is just 3.2 by 2.6 millimeters – are all unprecedented. As a whole system, there are many many firsts,” said Gabriel Rebeiz, the electrical engineering professor from the UCSD Jacobs School of Engineering leading the project.

This chip – the UCSD DARPA Smart Q-Band 4x4 Array Transmitter – is strictly a transmitter. “We are working on a chip that can do a transmit and receive function,” said Rebeiz.


“This compact beamforming chip will enable a breakthrough in size, weight, performance and cost in next-generation phased arrays for millimeter-wave military sensor and communication systems,” DARPA officials wrote in a statement.

“DARPA has funded us to try to get everything on a single silicon chip – which would reduce the cost of phased arrays tremendously. In large quantities, this new chip would cost a few dollars to manufacture. Obviously, this is only the transmitter. You still need the receiver, but one can easily build the receiver chip based on the designs available in the transmitter chip. Our work addresses the most costly part of the phased array – the 16:1 divider, phase shifters, amplitude controllers and the uniformity and isolation between channels,” said Rebeiz.

The chip also contains all the CMOS digital circuits necessary for complete digital control of the phased array, and was fabricated using the commercial Jazz SBC18HX process. This is a first and greatly reduces the fabrication complexity of the phased array. The chip has been designed for use at the defense satellite communications frequency, the Q-band, which spans 40 to 50 GHz.

“If you take the same design and move it to the 24 or 60 GHz range, you can use it for commercial terrestrial communications,” said Rebeiz who is also a lead on a separate project, funded by Intel and a UC-Discovery Grant, to create silicon CMOS phased array chips that could be embedded into laptops and serve as high speed data transfer tools.

The Intel project is a collaboration between Rebeiz, Larry Larson and Ian Galton – all electrical engineering professors at the UCSD Jacobs School of Engineering. Larson also serves as the chair of the Department of Electrical and Computer Engineering.

“If you wanted to download a large movie file, a base station could find you, zoom onto you, and direct a beam to your receiver chip. This could enable data transfer of hundreds of gigabytes of information very quickly, and without connecting a cable or adhering to the alignment requirements of wireless optical data transfer,” explained Rebeiz who estimated that this kind of system could be available in as little as three years.

Phased Array Background Information

Phased arrays have been around for more than half a century. They are groups of antennas in which the relative phases of the signals that feed them are varied so that the effective radiation pattern of the array is reinforced in a particular direction and suppressed in undesired directions. This property – combined with the fact that radio waves can pass through clouds and most other materials that stymie optical communication systems – has led engineers to use phased arrays for satellite communications, and for detecting incoming airplanes, ships and missiles.

Some phased arrays are larger than highway billboards and the most powerful – used as sophisticated radar, surveillance and communications systems for military aircraft and ships – can cost hundreds of millions of dollars. The high cost has prevented significant spread beyond military and high-end satellite communication applications. Engineers are now working to miniaturize them and fully integrate them into silicon-based electronic systems for both military and commercial applications.

The new UCSD chip packs 16 channels into a 3.2 mm by 2.6 mm chip. The input signal is divided on-chip into 16 different paths with equal amplitude and phase using an innovative design, and the phase and gain of each of the 16 channels is controlled electronically to direct the antenna pattern (beam) into a specific direction.

By manipulating the phase, you can steer the beam electronically in nanoseconds. With the amplitude, you control the width of the beam, which is critical, for example, when you send information from one satellite to another but don’t want the signal to reach any nearby satellites. And with amplitude and phase control, you can synthesize deep nulls in the antenna pattern so as to greatly reduce the effect of interfering signals from neighboring transmitters.
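The steering idea is simple to model numerically. The sketch below computes the far-field pattern of a generic 16-element linear array with a progressive phase shift across the elements; it is a textbook illustration, not the UCSD chip's actual architecture (the half-wavelength element spacing and uniform amplitudes are assumptions).

```python
import numpy as np

def steering_weights(n_elements, steer_deg, d=0.5):
    # Progressive phase shift that points the main beam at steer_deg.
    # d is the element spacing in wavelengths.
    n = np.arange(n_elements)
    return np.exp(-2j * np.pi * d * n * np.sin(np.radians(steer_deg)))

def array_factor(weights, theta_rad, d=0.5):
    # Far-field pattern: each element's signal picks up a geometric
    # phase proportional to d * sin(theta) before the signals sum.
    n = np.arange(len(weights))
    phases = np.exp(2j * np.pi * d * np.outer(np.sin(theta_rad), n))
    return np.abs(phases @ weights)

theta = np.radians(np.linspace(-90, 90, 721))     # 0.25-degree grid
w = steering_weights(16, 30)                      # steer 16 channels to +30 deg
pattern = array_factor(w, theta)
peak_deg = np.degrees(theta[np.argmax(pattern)])  # main beam lands at ~30 deg
```

Changing only the phase progression in `steering_weights` moves the peak, which is why steering can happen in nanoseconds in electronics, while tapering the element amplitudes (for instance, multiplying `w` by a window function) widens the main lobe and suppresses sidelobes.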

The work was done by two graduate students, Kwang-Jin Koh and Jason May, both at the Electrical and Computer Engineering Department (ECE) at UCSD. Rebeiz presented the new chip at the DARPA TEAM Meeting, August 28-29, 2007, in Chicago, Illinois. Additional details of the chip will be submitted to an academic journal later this year.

Adapted from materials provided by University of California - San Diego.




Daily Science Journal (Nov. 3, 2007) — Cancer cells treated with carbon nanotubes can be destroyed by non-invasive radio waves that heat up the nanotubes while sparing untreated tissue, a research team led by scientists at The University of Texas M. D. Anderson Cancer Center and Rice University has shown in preclinical experiments.

The researchers showed that the technique completely destroyed liver cancer tumors in rabbits, with no side effects noted. However, some healthy liver tissue within 2-5 millimeters of the tumors sustained heat damage due to nanotube leakage from the tumor.

"These are promising, even exciting, preclinical results in this liver cancer model," says senior author Steven Curley, M.D., professor in M. D. Anderson's Department of Surgical Oncology. "Our next step is to look at ways to more precisely target the nanotubes so they attach to, and are taken up by, cancer cells while avoiding normal tissue."

Targeting the nanotubes solely to cancer cells is the major challenge in advancing the therapy, Curley says. Research is under way to bind the nanotubes to antibodies, peptides or other agents that in turn target molecules expressed on cancer cells. To complicate matters, most such molecules also are expressed in normal tissue.


Curley estimates that a clinical trial is at least three to four years away.

Curley conducted the research at M. D. Anderson in collaboration with nanotechnology experts at Rice University and with Erie, Pennsylvania, entrepreneur John Kanzius of ThermMed LLC, who invented the experimental radiofrequency generator used in the experiments. Kanzius is a cancer survivor and former radio station owner whose insights into the potential of targeted radio waves inspired this line of research.

At Rice, the work was begun by Nobel laureate Richard Smalley, several months before his untimely death from cancer in October 2005. Smalley was the founder of Rice's Carbon Nanotechnology Laboratory and one of the world's foremost experts on carbon nanotubes. He shared the Nobel Prize for the 1985 discovery of fullerenes, the family of carbon molecules that includes nanotubes. His research in 2005 was concentrated largely on the radiofrequency cancer research project.

Rice materials science professor Boris Yakobson, Ph.D., a co-author on the paper, recalled meeting with Smalley in his hospital room at M. D. Anderson five days before his death.

"He looked very ill, breathing heavily through the oxygen mask, but all he wanted to do was talk about the physics of this very phenomenon," Yakobson said. "Oblivious of his ebbing health, Rick was focused in the future. He had told Congress in 1999 that nanotechnology would help revolutionize cancer treatment, and he was a scientist wanting to know whether this technology might be one of the things that would make that possible."

In the liver cancer experiment, a solution of single-walled carbon nanotubes was injected directly into the tumors. Four treated rabbits were then exposed to two minutes of radiofrequency treatment, resulting in thermal destruction of their tumors.

Carbon nanotubes are hollow cylinders of pure carbon that measure about a billionth of a meter, or one nanometer, across.

Control group tumors that were treated only by radiofrequency exposure or only by nanotubes were undamaged.

In lab experiments, two lines of liver cancer cells and one pancreatic cancer cell line were destroyed after being incubated with nanotubes and exposed to the radiofrequency field.

"I'm humbled by the results of this research," says Kanzius. "I realize it's early in the race, but Dr. Curley and his team have moved on this carefully with utmost speed. I look forward to continuing to work with them and hopefully to watching the first person be treated with this procedure. The race isn't over but it needs to be taken to the finish line."

Radiofrequency energy fields penetrate deeply into tissue, so it would be possible to deliver heat anywhere in the body if targeted nanotubes or other nanoparticles can be delivered to cancerous cells, Curley says. Without such a target, radio waves will pass harmlessly through the body.

An invasive technique known as radiofrequency ablation is used to treat some malignant tumors, the authors note. It requires insertion of needle electrodes directly into the tumors. Incomplete tumor destruction occurs in 5 to 40 percent of cases, normal tissue can be damaged, and complications arise in 10 percent of patients who suffer such damage. Radiofrequency ablation is limited to liver, kidney, breast, lung and bone cancers.

This research appeared online ahead of December publication in the journal Cancer.

The research was supported by an American Association for Cancer Research Littlefield Grant, NASA and the Houston-based Alliance for NanoHealth, the National Science Foundation, the Center for Biological and Environmental Nanotechnology and the Fulbright Foundation.

Co-authors with Curley, Smalley, Kanzius and Yakobson are first authors Christopher J. Gannon, M.D., also of M. D. Anderson's Department of Surgical Oncology, and Paul Cherukuri, Ph.D. of Rice's Carbon Nanotechnology Laboratory and Department of Chemistry; Carter Kittrell, Ph.D., R. Bruce Weisman, Ph.D., Matteo Pasquali, Ph.D., and Howard K. Schmidt, Ph.D., all of Rice; and Laurent Cognet, Ph.D., of Rice and the Centre de Physique Moléculaire Optique et Hertzienne, Université Bordeaux, France.

Adapted from materials provided by University of Texas M. D. Anderson Cancer Center.




Daily Science Journal (Nov. 02, 2007) — In the first large-scale analysis of proteins in the brains of individuals addicted to cocaine, researchers have uncovered novel proteins and mechanisms that may one day lead to new treatment options to fight addiction.

The results, reported in the current issue of Molecular Psychiatry, released on-line today, show differences in the amounts of 50 proteins and point to profound changes in brain function related to long-term cocaine use, said Scott E. Hemby, Ph.D., of Wake Forest University School of Medicine.

The researchers used technology so advanced it was like looking for differences in brain tissue with "floodlights" rather than a "flashlight," he said. Hemby and his colleagues analyzed thousands of proteins from brain tissue obtained from individuals who died of cocaine overdose and compared these "protein profiles" with those of individuals who died of non-drug-related causes.


"The findings provide new insights into the long-term effects and damage that cocaine has on the human brain and will help guide future animal studies to further delineate the biochemical changes that comprise the addicted brain," said Hemby, associate professor of physiology and pharmacology.

The researchers compared the proteome (the entire complement of proteins expressed at a given time) between the two groups by separating all of the proteins and then using high-throughput mass spectrometry, which allowed the accurate identification of thousands of proteins simultaneously, Hemby said.

The unbiased nature of the technology enables the identification of novel proteins and pathways involved in disease. Using post-mortem brain tissue samples from the Brain Endowment Bank at the University of Miami, the investigators analyzed protein expression in the nucleus accumbens, a part of the brain involved in the addictive effects of drugs, in 10 cocaine-overdose victims and 10 drug-free individuals.

Analysis of thousands of proteins revealed differences between the two groups in the amounts of approximately 50 proteins, most of which correspond to changes in the ability of brain cells to strengthen their connections and communicate with one another.

Understanding the coordinated involvement of multiple proteins in cocaine abuse provides insight into the molecular basis of the disease and offers new targets for pharmaco-therapeutic intervention for drug-abuse-related disorders, he said.

"These studies are an important and significant step to further our understanding of the vast and long-term consequences of cocaine use and may provide insights into novel targets for medication development," Hemby said.

The research was supported by the National Institutes of Health. Co-researchers were Nilesh Tannu, M.B.B.S., M.S., of Wake Forest, and Deborah Mash, Ph.D., of the University of Miami School of Medicine.

Adapted from materials provided by Wake Forest University Baptist Medical Center.


