Saturday 5 November 2011

Digital worlds can help autistic children to develop social skills

ScienceDaily (Oct. 21, 2011) — Virtual worlds can be used to help autistic children develop social skills beyond their anticipated levels, suggest early findings from new research funded by the Economic and Social Research Council (ESRC). Researchers on the Echoes Project have developed an interactive environment that uses multi-touch screen technology, in which virtual characters on the screen demonstrate gestures and show children's actions in real time.

During sessions in the virtual environment, primary school children experiment with different social scenarios, allowing the researchers to compare their reactions with those they display in real-world situations.

"Discussions of the data with teachers suggest a fascinating possibility," said project leader Dr Kaska Porayska-Pomsta."Learning environments such as Echoes may allow some children to exceed their potential, behaving and achieving in ways that even teachers who knew them well could not have anticipated."

"A teacher observing a child interacting in such a virtual environment may gain access to a range of behaviours from individual children that would otherwise be difficult or impossible to observe in a classroom," she added.

Early indications from this research are that, over a number of sessions, some children demonstrate a better quality of interaction within the virtual environment and an increased ability to manage their own behaviour. This enables them to concentrate on following a virtual character's gaze or to focus on a pointing gesture, developing skills that are vital for good communication and effective learning.

The findings could prove particularly useful in helping children with autism to develop skills they normally find difficult. Dr Porayska-Pomsta said: "Since autistic children have a particular affinity with computers, our research shows it may be possible to use digital technology to help develop their social skills."

"The beauty of it is that there are no real-world consequences, so children can afford to experiment with different social scenarios without real-world risks," she added.

Findings and technologies for autism from the Echoes Project will be showcased at an event in Birmingham in November, part of the ESRC Festival of Social Science.

"In the longer term, virtual platforms such as the ones developed in the Echoes project could help young children to realise their potential in new and unexpected ways," concluded Dr Porayska-Pomsta.

Story Source:

The above story is reprinted from materials provided by Economic and Social Research Council (ESRC).

Homicide, suicide outpace traditional causes of death in pregnant, postpartum women

ScienceDaily (Oct. 20, 2011) — Violent deaths are outpacing traditional causes of maternal mortality, such as hemorrhage and preeclampsia, and conflicts with an intimate partner are often a factor, researchers report.

"We found that the mortality rate from homicide and suicide were more common than what we think of as traditional causes of maternal mortality," said Dr. Christie L. Palladino, an obstetrician-gynecologist and educational researcher at Georgia Health Sciences University. "It's not what you want to read, but it's the reality."

The analysis of the Centers for Disease Control and Prevention's National Violent Death Reporting System, a surveillance system from 17 states, found 94 pregnancy-associated suicides and 139 homicides from 2003-07. Overall, 64.4 percent of pregnancy-associated violent deaths -- classified by the CDC as death during pregnancy and the following year -- occurred during pregnancy. The mortality rate was 4.9 per 100,000 live births.
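
As an illustration of how such a rate is expressed (this is back-of-the-envelope arithmetic, not the authors' calculation), the 4.9-per-100,000 figure simply scales the 233 violent deaths by the number of live births in the reporting states; the implied denominator can be recovered from the published numbers.

    # Illustrative arithmetic only -- the denominator below is back-calculated
    # from the published figures, not taken from the CDC dataset itself.
    suicides = 94
    homicides = 139
    violent_deaths = suicides + homicides          # 233 pregnancy-associated violent deaths

    rate_per_100k = 4.9                            # reported rate per 100,000 live births
    implied_live_births = violent_deaths / rate_per_100k * 100_000
    print(f"Implied live births in the 17 reporting states: {implied_live_births:,.0f}")

    # Recomputing the rate from the implied denominator recovers the published value.
    recomputed_rate = violent_deaths / implied_live_births * 100_000
    print(f"Rate per 100,000 live births: {recomputed_rate:.1f}")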

The findings, published in the journal Obstetrics & Gynecology, are a wake-up call for health care providers and families alike about the need for mental health awareness and treatment at a time typically associated with great joy, said Palladino, who is working to enhance training of obstetrician-gynecologists in depression diagnosis and treatment.

"We have a lot of studies looking at the effects of pregnancy on the baby but often we don't focus so much on outcomes for moms, and the most disastrous of those for both mom and baby would be suicide," Palladino said. She noted that homicide and suicide are both potentially preventable. "The more we look into ways to prevent suicide, ways to effectively manage women with mental health diagnoses during pregnancy and postpartum, the more we can take steps to prevent these deaths," she said.

Among the suicides, 45.7 percent occurred during pregnancy and problems with current or former partners appeared to contribute to more than half. Older, Caucasian women were at greatest risk. Among homicides, 77.7 percent occurred during pregnancy and more than half the women were age 24 or younger and unmarried. Nearly half were black, even though black women accounted for less than 20 percent of the live births, and 45 percent were associated with violence from a current or former partner.

In follow-up, Palladino is surveying practicing physicians about their practice patterns in treating depression during pregnancy. She and her colleagues also want to learn more about precipitating circumstances such as substance abuse, stress and mental illness and treatment. Studies already indicate that intervention lowers the recurrence risk of intimate partner violence in pregnancy and postpartum.

It was her early experience as an obstetrician-gynecologist feeling ill-prepared to treat depression in pregnant women that got her interested in the topic. A 2003 report in The British Journal of Psychiatry identifying suicide as the leading cause of maternal death in Great Britain sealed the deal. "Unfortunately what we found paralleled the Great Britain findings," Palladino said.

The good news is that evidence-based guidelines for depression treatment during pregnancy or postpartum have been developed by The American College of Obstetricians and Gynecologists and The American Psychiatric Association, she said. The bad news is that women and their providers might be hesitant to seek or provide care because mental illness and pregnancy seem counterintuitive.

Collateral materials such as police, coroner and medical examiner reports also were examined as part of the CDC database to provide greater context for the cause of death. While the overall pregnancy-associated violent death rate was stable over the four-year study period, those numbers could be underreported because the pregnancy or postpartum status was marked "unknown" in the majority of female deaths in the CDC database, the researchers noted. Pregnancy or postpartum status also could be missed because autopsies might not include a pregnancy exam, might miss postpartum signs or might fail to report pregnancy status on death certificates.

States participating in the CDC violent death database include South Carolina, Georgia, North Carolina, Virginia, New Jersey, Maryland, Alaska, Massachusetts, Oregon, Colorado, Oklahoma, Rhode Island, Wisconsin, California, Kentucky, New Mexico and Utah.

Researchers at the University of Michigan and Johns Hopkins University co-authored the study.

Story Source:

The above story is reprinted from materials provided by Georgia Health Sciences University.

Friday 4 November 2011

Cooling the warming debate: Major new analysis confirms that global warming is real

ScienceDaily (Oct. 21, 2011) — Global warming is real, according to a major study released Oct. 20. Despite issues raised by climate change skeptics, the Berkeley Earth Surface Temperature study finds reliable evidence of a rise in the average world land temperature of approximately 1°C since the mid-1950s.

Analyzing temperature data from 15 sources, in some cases going as far back as 1800, the Berkeley Earth study directly addressed scientific concerns raised by skeptics, including the urban heat island effect, poor station quality, and the risk of data selection bias.

On the basis of its analysis, according to Berkeley Earth's founder and scientific director, Professor Richard A. Muller, the group concluded that earlier studies based on more limited data by teams in the United States and Britain had accurately estimated the extent of land surface warming.

"Our biggest surprise was that the new results agreed so closely with the warming values published previously by other teams in the U.S. and the U.K.," Muller said. "This confirms that these studies were done carefully and that potential biases identified by climate change skeptics did not seriously affect their conclusions."

Previous studies, carried out by NOAA, NASA, and the Hadley Center, also found that land warming was approximately 1°C since the mid-1950s, and that the urban heat island effect and poor station quality did not bias the results. But their findings were criticized by skeptics who worried that they relied on ad-hoc techniques that meant that the findings could not be duplicated. Robert Rohde, lead scientist for Berkeley Earth, noted that "the Berkeley Earth analysis is the first study to address the issue of data selection bias, by using nearly all of the available data, which includes about 5 times as many station locations as were reviewed by prior groups."

Elizabeth Muller, co-founder and Executive Director of Berkeley Earth, said she hopes the Berkeley Earth findings will help "cool the debate over global warming by addressing many of the valid concerns of the skeptics in a clear and rigorous way." This will be especially important in the run-up to the COP 17 meeting in Durban, South Africa, later this year, where participants will discuss targets for reducing Greenhouse Gas (GHG) emissions for the next commitment period as well as issues such as financing, technology transfer and cooperative action.

The Berkeley Earth team includes physicists, climatologists, and statisticians from California, Oregon, and Georgia. Rohde led the development of a new statistical approach and what Richard Muller called "the Herculean labor" of merging the data sets. One member of the group, Saul Perlmutter, was recently announced as a winner of the 2011 Nobel Prize in Physics (for his work in cosmology).

The Berkeley Earth study did not assess temperature changes in the oceans, which according to the Intergovernmental Panel on Climate Change (IPCC) have not warmed as much as land. When averaged in, they reduce the global surface temperature rise over the past 50 years -- the period during which the human effect on temperatures is discernible -- to about two thirds of one degree Centigrade.
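
As a rough check on how the land and ocean figures fit together, an area-weighted average is enough. The sketch below is illustrative arithmetic, not Berkeley Earth's calculation, and assumes land covers roughly 29% of Earth's surface.

    # Illustrative area-weighted average (not Berkeley Earth's calculation).
    land_fraction = 0.29        # assumed share of Earth's surface that is land
    land_warming = 1.0          # reported land warming, degrees C

    # Solving the reported global figure (~2/3 C) for the implied ocean warming:
    global_warming = 2.0 / 3.0
    ocean_warming = (global_warming - land_fraction * land_warming) / (1 - land_fraction)
    print(f"Implied ocean-surface warming: ~{ocean_warming:.2f} C")

    # Recombining land and ocean reproduces the global number.
    check = land_fraction * land_warming + (1 - land_fraction) * ocean_warming
    print(f"Area-weighted global warming:  ~{check:.2f} C")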

Specifically, the Berkeley Earth study concludes that:

The urban heat island effect is locally large and real, but does not contribute significantly to the average land temperature rise. That's because the urban regions of Earth amount to less than 1% of the land area.

About 1/3 of temperature sites around the world reported global cooling over the past 70 years (including much of the United States and northern Europe). But 2/3 of the sites show warming. Individual temperature histories reported from a single location are frequently noisy and/or unreliable, and it is always necessary to compare and combine many records to understand the true pattern of global warming.

"The large number of sites reporting cooling might help explain some of the skepticism of global warming," Rohde commented. "Global warming is too slow for humans to feel directly, and if your local weather man tells you that temperatures are the same or cooler than they were a hundred years ago it is easy to believe him." In fact, it is very hard to measure weather consistently over decades and centuries, and the presence of sites reporting cooling is a symptom of the noise and local variations that can creep in. A good determination of the rise in global land temperatures can't be done with just a few stations: it takes hundreds -- or better, thousands -- of stations to detect and measure the average warming. Only when many nearby thermometers reproduce the same patterns can we know that the measurements were reliably made.

Stations ranked as "poor" in a survey by Anthony Watts and his team of the most important temperature recording stations in the U.S. (known as the USHCN -- the US Historical Climatology Network) showed the same pattern of global warming as stations ranked "OK." Absolute temperatures of poor stations may be higher and less accurate, but the overall global warming trend is the same, and the Berkeley Earth analysis concludes that there is not any undue bias from including poor stations in the survey.
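
The role of station noise described above can be illustrated with a toy simulation, shown in the Python sketch below. It is not the Berkeley Earth method; the warming trend, noise levels, station count and random seed are assumptions chosen purely for illustration.

    import random

    # Toy simulation (not the Berkeley Earth method): a 0.01 C/year warming trend
    # hidden in noisy station records. Many individual stations appear to cool,
    # yet averaging a large number of them recovers the underlying trend.
    random.seed(0)
    years = list(range(1950, 2011))
    true_trend = 0.01                      # assumed warming, degrees C per year

    def station_record():
        bias = random.gauss(0, 0.5)        # station-specific offset
        return [true_trend * (y - years[0]) + bias + random.gauss(0, 2.5)
                for y in years]            # 2.5 C of year-to-year noise (illustrative)

    def fitted_slope(series):
        n = len(years)
        mx = sum(years) / n
        my = sum(series) / n
        cov = sum((x - mx) * (v - my) for x, v in zip(years, series))
        var = sum((x - mx) ** 2 for x in years)
        return cov / var

    stations = [station_record() for _ in range(1000)]
    slopes = [fitted_slope(s) for s in stations]
    print("Stations that appear to cool:", sum(s < 0 for s in slopes), "of", len(slopes))

    averaged = [sum(vals) / len(vals) for vals in zip(*stations)]
    print(f"Trend of the 1000-station average: {fitted_slope(averaged):.4f} C/year")

With noise of this size, a sizeable minority of individual stations show a negative fitted trend even though every record was generated with the same underlying warming, while the averaged record recovers a trend close to 0.01 C per year.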

Four scientific papers setting out these conclusions have been submitted for peer review and will form part of the literature for the next IPCC report on Climate Change. They can be accessed at www.BerkeleyEarth.org. A video animation graphically shows global warming around the world since 1800.

Berkeley Earth is making its preliminary results public, together with its programs and dataset, in order to invite additional scrutiny. Elizabeth Muller said that "one of our goals is to make the science behind global warming readily accessible to the public." Most of the data were previously available on public websites, but in so many different locations and different formats that most people could access only a small subset of the data. The merged database, which combines 1.6 billion records, is now accessible from the Berkeley Earth website: www.BerkeleyEarth.org.

What Berkeley Earth has not done is make an independent assessment of how much of the observed warming is due to human actions, Richard Muller acknowledged. As a next step, Berkeley Earth plans to address the total warming of the oceans, with a view to obtaining a more accurate figure for the total amount of global warming observable.

More information about Berkeley Earth is available at www.BerkeleyEarth.org.

Story Source:

The above story is reprinted from materials provided by Berkeley Earth Surface Temperature.

Fluoride shuttle increases storage capacity: Researchers develop new concept for rechargeable batteries

ScienceDaily (Oct. 21, 2011) — Karlsruhe Institute of Technology (KIT) researchers have developed a new concept for rechargeable batteries. Based on a fluoride shuttle -- the transfer of fluoride anions between the electrodes -- it promises to improve on the storage capacity of lithium-ion batteries by several times. Operational safety is also increased, as the concept can work without lithium.

The fluoride-ion battery is presented for the first time in the Journal of Materials Chemistry by Dr. Maximilian Fichtner and Dr. Munnangi Anji Reddy.

Lithium-ion batteries are applied widely, but their storage capacity is limited. In the future, battery systems of enhanced energy density will be needed for mobile applications in particular. Such batteries can store more energy at reduced weight. For this reason, KIT researchers are also conducting research into alternative systems. A completely new concept for secondary batteries based on metal fluorides was developed by Dr. Maximilian Fichtner, Head of the Energy Storage Systems Group, and Dr. Munnangi Anji Reddy at the KIT Institute of Nanotechnology (INT).

Metal fluorides may be applied as conversion materials in lithium-ion batteries. They also allow for lithium-free batteries with a fluoride-containing electrolyte, a metal anode, and a metal fluoride cathode, which reach a much better storage capacity and possess improved safety properties. Instead of the lithium cation, the fluoride anion takes over charge transfer. At the cathode and anode, a metal fluoride is formed or reduced. "As several electrons per metal atom can be transferred, this concept makes it possible to reach extraordinarily high energy densities -- up to ten times as high as those of conventional lithium-ion batteries," explains Dr. Maximilian Fichtner.
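
To see why transferring several electrons per metal atom raises the numbers, a standard theoretical-capacity estimate, n·F/(3.6·M) in mAh/g, is enough. The sketch below is a back-of-the-envelope comparison: copper(II) fluoride is chosen purely as an illustrative conversion material and is not necessarily the chemistry used in the KIT cells, and the lithium cobalt oxide value is a textbook reference point rather than a figure from the paper.

    # Back-of-the-envelope capacity estimate for a conversion-type metal fluoride
    # cathode. CuF2 is used only as an illustration of a two-electron conversion.
    F = 96485.0                      # Faraday constant, C/mol

    def gravimetric_capacity(n_electrons, molar_mass_g):
        """Theoretical capacity in mAh/g: n*F/(3.6*M)."""
        return n_electrons * F / (3.6 * molar_mass_g)

    cuf2 = gravimetric_capacity(n_electrons=2, molar_mass_g=63.55 + 2 * 19.00)
    licoo2 = gravimetric_capacity(n_electrons=1, molar_mass_g=6.94 + 58.93 + 2 * 16.00)

    print(f"CuF2 conversion cathode (theoretical):      ~{cuf2:.0f} mAh/g")
    print(f"LiCoO2 intercalation cathode (theoretical): ~{licoo2:.0f} mAh/g")

The two-electron conversion roughly doubles the theoretical capacity per gram of cathode material; full-cell energy density additionally depends on cell voltage, the anode and packaging, which this estimate ignores.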

The KIT researchers are now working on the further development of material design and battery architecture in order to improve the initial capacity and cyclic stability of the fluoride-ion battery. Another challenge lies in the electrolyte: the solid electrolyte used so far is suited only to applications at elevated temperatures. The aim, therefore, is to find a liquid electrolyte suited for use at room temperature.

Story Source:

The above story is reprinted from materials provided by Karlsruhe Institute of Technology.

Journal Reference:

M. Anji Reddy, M. Fichtner. Batteries based on fluoride shuttle. Journal of Materials Chemistry, 2011; DOI: 10.1039/C1JM13535J

Simple lifestyle changes can add a decade or more of healthy years to the average lifespan, Canadian study shows

ScienceDaily (Oct. 21, 2011) — Preventive health strategies to help Canadians achieve their optimal health potential could add a decade or more of healthy years to the average lifespan and save the economy billions of dollars as a result of reduced cardiovascular disease, says noted cardiologist Dr. Clyde Yancy.

Dr. Yancy, who will deliver the Heart and Stroke Foundation of Canada Lecture at the opening ceremonies of the Canadian Cardiovascular Congress in Vancouver on October 23, will tell delegates that people who follow seven simple steps to a healthy life can expect to live an additional 40 to 50 years after the age of 50.

"Achieving these seven simple lifestyle factors gives people a 90 per cent chance of living to the age of 90 or 100, free of not only heart disease and stroke but from a number of other chronic illnesses including cancer," says Dr. Yancy, a professor of medicine and chief of cardiology at the Northwestern University's Feinberg School of Medicine. He is also the past-president of the American Heart Association.

"By following these steps, we can compress life-threatening disease into the final stages of life and maintain quality of life for the longest possible time." He predicts that, if we act now, we can reverse the tide by 2020.

According to the Heart and Stroke Foundation, every year in Canada about 250,000 potential years of life are lost due to heart disease and stroke, which are two of the three leading causes of death in Canada.

Canadians can achieve optimal health, says Dr. Yancy, by following these steps:

1. GET ACTIVE: Inactivity can shave almost four years off a person's expected lifespan. People who are physically inactive are twice as likely to be at risk for heart disease or stroke.

2. KNOW AND CONTROL CHOLESTEROL LEVELS: Almost 40 per cent of Canadian adults have high blood cholesterol, which can lead to the build-up of fatty deposits in your arteries, increasing your risk for heart disease and stroke.

3. FOLLOW A HEALTHY DIET: Healthy eating is one of the most important things you can do to improve your health -- yet about half of Canadians don't meet the healthy eating recommendations.

4. KNOW AND CONTROL BLOOD PRESSURE: High blood pressure -- often called a 'silent killer' because it has no warning signs or symptoms -- affects one in five Canadians. By knowing and controlling your blood pressure, you can cut your risk of stroke by up to 40 per cent and the risk of heart attack by up to 25 per cent.

5. ACHIEVE AND MAINTAIN A HEALTHY WEIGHT: Almost 60 per cent of Canadian adults are either overweight or obese -- major risk factors for heart disease and stroke. Being obese can reduce your life span by almost four years.

6. MANAGE DIABETES: By 2016 an estimated 2.4 million Canadians will live with diabetes. Diabetes increases the risk of high blood pressure, atherosclerosis (narrowing of the arteries), coronary artery disease, and stroke, particularly if your blood sugar levels are poorly controlled.

7. BE TOBACCO FREE: More than 37,000 Canadians die prematurely each year due to tobacco use, and thousands of non-smokers die each year from exposure to second-hand smoke. As soon as you become smoke-free, your risk of heart disease and stroke begins to decrease. After 15 years, your risk will be nearly that of a non-smoker.

A call for focused prevention strategies

While this goal of optimal health has been achieved by fewer than 10 per cent of the population, "it demonstrates the striking potential that prevention has if it is broadly embraced," says Dr. Yancy. "We know how to prevent heart disease and stroke -- we now need to build the tools to empower our citizens to manage their risk and prevent heart disease."

Dr. Yancy calls on governments to invest in steady and focused prevention strategies. He says that necessary initiatives include a change in current sodium policies, continued progress in tobacco control initiatives, increased green space, and health education.

"Healthy living is key to preventing heart disease and stroke," says Bobbe Wood, president of the Heart and Stroke Foundation of Canada. "The Foundation is committed to raising awareness about heart health and to promoting public policies that facilitate healthy lifestyles and communities."

She says that the Foundation will continue to build on partnerships and policies that have led to a significant reduction of trans fats in the Canadian food supply; stronger tobacco control initiatives; healthy community design; and a continued reduction in the amount of salt in our food products, which has been achieved in part through Health Check™, the Foundation's flagship food information program.

Dr. Yancy adds that improved access to health care that focuses on prevention and control of important risk factors including high blood pressure, high cholesterol and diabetes is also key.

Raising the alarm over looming costs of treating heart disease

Dr. Yancy will also raise the alarm over the looming cost of treating heart disease now and in the future.

With predictions that the direct medical cost of treating heart disease in the U.S. alone could climb to $818 billion in 2030, he says there is a health and economic imperative for governments and societies around the world to embrace prevention strategies.

Heart disease and stroke cost the Canadian economy more than $20.9 billion every year in physician services, hospital costs, lost wages and decreased productivity.

"The opportunity for prevention is not an unrealistic expectation," says Dr. Yancy. "Over the past 40 years the rates of heart disease and stroke have steadily declined." The rate has declined in Canada by 70 per cent since the mid-1950s. In the last decade alone, the rate has declined by 25 per cent.

Unfortunately, says Dr. Yancy, these benefits may be short-lived if the burden of risk, specifically obesity and diabetes, continues to grow, especially in children. "We need to act now."

Story Source:

The above story is reprinted from materials provided by Heart and Stroke Foundation of Canada.

Seeking answers to treat the fear of childbirth

ScienceDaily (Oct. 21, 2011) — A few women are so afraid of giving birth that they avoid becoming pregnant or seek an abortion, even though they want to have children. This fear is related to several serious conditions such as prolonged labour, a greater need for pain relief during labour and an increased risk of an emergency C-section. In some cases, the fear of childbirth is so serious that it can be classified as a specific phobia, such as a fear of dentists or a fear of heights, and leads to avoidance behaviour.

In Oslo, 5-10 per cent of all pregnant women are treated for fear of childbirth. It is the reason underlying about 20 per cent of all planned C-sections. This trend is on the rise, which is unfortunate since C-sections pose a greater risk to the mother and child than a vaginal delivery.

World's largest study of the fear of childbirth

There is very little research-based knowledge about the actual causes of the fear of childbirth.

"The studies conducted up until now have been limited and have had a low response rate. They have also lacked good psychometric measuring instruments for gauging the mental health of the participants," says Malin Eberhard-Gran, a senior researcher at the Norwegian Institute for Public Health. She is the project manager of an extensive study investigating the causal relationships between the risk factors for fear of childbirth and how this fear affects delivery and the child. The "Fear of childbirth: causes and consequences" project is funded in part by the Research Council of Norway's funding scheme for independent projects (FRIPRO).

"This group of pregnant women costs society considerable sums, but as of today we do not know whether we are giving them the proper treatment and follow-up. In addition to the societal perspective, there is also the personal suffering of the women themselves, their families and not least the child to take into consideration."

"This study will enable us to gain more insight into what we are treating as well as how we are treating it," says Dr Eberhard-Gran, who emphasises that a large percentage of women with a fear of childbirth are well functioning in terms of mental health.

"Currently many women who suffer from a fear of childbirth are treated as if they have a classic phobia, but this is wrong. It is not an illness to feel fear. It is actually completely normal to be afraid of giving birth," explains Dr Eberhard-Gran, who wrote her doctoral thesis on post-partum depression.

Involved mothers

"We got an extremely high response rate to our study," says the researcher. "More than 80 per cent of all of the women who gave birth at one of the large hospitals in the Oslo area in the period from 2009 to 2011 took part in the study -- that is almost 4,000 women."

"We believe so many women decided to participate because they wanted to help to enhance the quality of pregnancy checkups and post-partum follow-up in general."

International publication

The researchers have collected large amounts of valuable data that shed light on and document numerous problems related to pregnancy and health.

"So far we have authored nine articles that have either been published in, submitted to or are being assessed for publication by international scientific journals. Three have already been published online," says Dr Eberhard-Gran. The project will be followed up by a study of the respondents two years after giving birth.

Story Source:

The above story is reprinted from materials provided by The Research Council of Norway.

Thursday 3 November 2011

Giant flakes make graphene oxide gel: Discovery could boost metamaterials, high-strength fibers

ScienceDaily (Oct. 20, 2011) — Giant flakes of graphene oxide in water aggregate like a stack of pancakes, but infinitely thinner, and in the process gain characteristics that materials scientists may find delicious.

A new paper by scientists at Rice University and the University of Colorado details how slices of graphene, the single-atom-thick form of carbon, arrange themselves in solution to form a nematic liquid crystal in which particles are free-floating but aligned.

That much was already known. The new twist is that if the flakes -- in this case, graphene oxide -- are big enough and concentrated enough, they retain their alignment as they form a gel. That gel is a handy precursor for manufacturing metamaterials or fibers with unique mechanical and electronic properties.

The team reported its discovery online this week in the Royal Society of Chemistry journal Soft Matter. Rice authors include Matteo Pasquali, a professor of chemical and biomolecular engineering and of chemistry; James Tour, the T.T. and W.F. Chao Chair in Chemistry as well as a professor of mechanical engineering and materials science and of computer science; postdoctoral research associate Dmitry Kosynkin; and graduate students Budhadipta Dan and Natnael Behabtu. Ivan Smalyukh, an assistant professor of physics at the University of Colorado at Boulder, led research for his group, in which Dan served as a visiting scientist.

"Graphene materials and fluid phases are a great research area," Pasquali said. "From the fundamental point of view, fluid phases comprising flakes are relatively unexplored, and certainly so when the flakes have important electronic properties.

"From the application standpoint, graphene and graphene oxide can be important building blocks in such areas as flexible electronics and conductive and high-strength materials, and can serve as templates for ordering plasmonic structures," he said.

By "giant," the researchers referred to irregular flakes of graphene oxide up to 10,000 times as wide as they are high. (That's still impossibly small: on average, roughly 12 microns wide and less than a nanometer high.) Previous studies showed smaller bits of pristine graphene suspended in acid would form a liquid crystal and that graphene oxide would do likewise in other solutions, including water.

This time the team discovered that if the flakes are big enough and concentrated enough, the solution becomes semisolid. When they constrained the gel to a thin pipette and evaporated some of the water, the graphene oxide flakes got closer to each other and stacked up spontaneously, although imperfectly.

"The exciting part for me is the spontaneous ordering of graphene oxide into a liquid crystal, which nobody had observed before," said Behabtu, a member of Pasquali's lab. "It's still a liquid, but it's ordered. That's useful to make fibers, but it could also induce order on other particles like nanorods."

He said it would be a simple matter to heat the concentrated gel and extrude it into something like carbon fiber, with enhanced properties provided by "mix-ins."

Testing the possibilities, the researchers mixed gold microtriangles and glass microrods into the solution, and found both were effectively forced to line up with the pancaking flakes. Their inclusion also helped the team get visual confirmation of the flakes' orientation.

The process offers the possibility of the large-scale ordering and alignment of such plasmonic particles as gold, silver and palladium nanorods, important components in optoelectronic devices and metamaterials, they reported.

Behabtu added that heating the gel "crosslinks the flakes, and that's good for mechanical strength. You can even heat graphene oxide enough to reduce it, stripping out the oxygen and turning it back into graphite."

Co-authors of the paper are Angel Martinez and Julian Evans, graduate students of Smalyukh at the University of Colorado at Boulder.

The Institute for Complex Adaptive Matter, the Colorado Renewable and Sustainable Energy Initiative, the National Science Foundation, the Air Force Research Lab, the Air Force Office of Scientific Research, the Welch Foundation, the U.S. Army Corps of Engineers Environmental Quality and Installation Program and M-I Swaco supported the research.

Story Source:

The above story is reprinted from materials provided by Rice University.

Journal Reference:

Budhadipta Dan, Natnael Behabtu, Angel Martinez, Julian S. Evans, Dmitry V. Kosynkin, James M. Tour, Matteo Pasquali, Ivan I. Smalyukh. Liquid crystals of aqueous, giant graphene oxide flakes. Soft Matter, 2011; DOI: 10.1039/C1SM06418E

Vivid descriptions of faces 'don't have to go into detail'

ScienceDaily (Oct. 21, 2011) — Celebrated writers such as Charles Dickens and George Eliot described characters' faces vividly without going into detail about their features, according to a research group led at Strathclyde.

Experts in literature, psychology, neurology and music suggested that vividness can be created not only by describing individual features, such as the eyes, nose or chin, but by the strength of readers' feelings about how a person is depicted.

These feelings may be triggered by the 'mirror neuron system,' in which people who see an action being performed have the same regions of the brain activated as are needed to perform the action itself -- for example, by flinching when they see someone injured.

The researchers illustrated their theory by highlighting descriptions of characters in works by writers including Dickens, Eliot, Geoffrey Chaucer and Sir Walter Scott. They found that, in many cases, the face was not explicitly mentioned but that the scientific literature suggests this may be more beneficial for forming a vivid response to the description.

Dr Elspeth Jajdelska, a lecturer in Strathclyde's Faculty of Humanities & Social Sciences, led the research. She said: "Faces are something we perceive in a different way to other objects.

"Psychological research shows that we perceive and process them as a whole, not as a set of features, and while some literary descriptions of a face supply pieces of information to be assembled like a jigsaw puzzle, others may involve a holistic picture and an immediate response to what the author has described- these may not necessarily be accurate images, in terms of the face the author has in mind, but could still be very vivid.

"There is evidence to suggest that asking for a verbal description of a face can make it less easy for the face to be recognised and other research has called the effectiveness of the photofit identification technique into question- all suggesting that piece by piece descriptions of a face may not be the ideal way to communicate face information in words.

"However, a writer's description might produce a vivid response with only a partial description if it is also holistic, or draws on emotional qualities of the face."

One of the descriptions examined was of Bill Sikes, the character in Dickens' Oliver Twist, whose black eye is said to have "displayed various parti-coloured symptoms of having been recently damaged by a blow." The researchers suggested that this description could be more vivid than one which was more precise about the discolouration.

The researchers' theory defined 'vividness' in several ways, including: something belonging to a stimulus, such as a piece of text; an emotional experience produced by such a stimulus; or how realistic the mental images produced by text are.

Dr Steve Kelly, a Senior Lecturer in Psychology in Strathclyde's Faculty of Humanities & Social Sciences, was a research partner in the project. Researchers from the University of Oxford, the University of Edinburgh and Glasgow Caledonian University were also involved.

The research paper has been published in the journal Poetics Today.

Story Source:

The above story is reprinted from materials provided by University of Strathclyde.

Biggest ever study shows no link between mobile phone use and tumors

ScienceDaily (Oct. 20, 2011) — There is no link between long-term use of mobile phones and tumours of the brain or central nervous system, finds new research published online in the British Medical Journal.

In what is described as the largest study on the subject to date, Danish researchers found no evidence that the risk of brain tumours was raised among 358,403 mobile phone subscribers over an 18-year period.

The number of people using mobile phones is constantly rising with more than five billion subscriptions worldwide in 2010. This has led to concerns about potential adverse health effects, particularly tumours of the central nervous system.

Previous studies on a possible link between phone use and tumours have been inconclusive, particularly regarding long-term use of mobile phones. Some of this earlier work took the form of case-control studies involving small numbers of long-term users and was shown to be prone to error and bias. The International Agency for Research on Cancer (IARC) recently classified radio frequency electromagnetic fields, as emitted by mobile phones, as possibly carcinogenic to humans.

The only cohort study investigating mobile phone use and cancer to date is a Danish nationwide study comparing the cancer risk of all 420,095 Danish mobile phone subscribers from 1982 until 1995 with the corresponding risk in the rest of the adult population, with follow-up to 1996 and then 2002. This study found no evidence of any increased risk of brain or nervous system tumours or any cancer among mobile phone subscribers.

So researchers, led by the Institute of Cancer Epidemiology in Copenhagen, continued this study up to 2007.

They studied data on the whole Danish population aged 30 and over and born in Denmark after 1925, subdivided into subscribers and non-subscribers of mobile phones before 1995. Information was gathered from the Danish phone network operators and from the Danish Cancer Register.

Overall, 10,729 central nervous system tumours occurred in the study period 1990-2007.

When the figures were restricted to people with the longest mobile phone use -- 13 years or more -- cancer rates were almost the same in both long-term users and non-subscribers of mobile phones.
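
The comparison rests on incidence rates per unit of person-time and their ratio. The sketch below shows that arithmetic with made-up numbers chosen to give a ratio near one; they are not figures from the BMJ paper.

    # Hypothetical numbers for illustration -- not the Danish study's actual data.
    def incidence_rate(cases, person_years, per=100_000):
        return cases / person_years * per

    long_term_users = {"cases": 356, "person_years": 2_800_000}
    non_subscribers = {"cases": 5_100, "person_years": 40_000_000}

    rate_users = incidence_rate(**long_term_users)
    rate_nonusers = incidence_rate(**non_subscribers)

    print(f"Long-term users: {rate_users:.1f} tumours per 100,000 person-years")
    print(f"Non-subscribers: {rate_nonusers:.1f} tumours per 100,000 person-years")
    print(f"Incidence rate ratio: {rate_users / rate_nonusers:.2f}")

A ratio close to 1.0, as in this toy example, is what "almost the same" rates means in practice; the published analysis additionally adjusts for age and calendar period.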

The researchers say they observed no overall increased risk for tumours of the central nervous system or for all cancers combined in mobile phone users.

They conclude: "The extended follow-up allowed us to investigate effects in people who had used mobile phones for 10 years or more, and this long-term use was not associated with higher risks of cancer.

"However, as a small to moderate increase in risk for subgroups of heavy users or after even longer induction periods than 10-15 years cannot be ruled out, further studies with large study populations, where the potential for misclassification of exposure and selection bias is minimised, are warranted."

In an accompanying editorial, Professors Anders Ahlbom and Maria Feychting at the Karolinska Institutet in Sweden say this new evidence is reassuring, but continued monitoring of health registers and prospective cohorts is still warranted.

Story Source:

The above story is reprinted from materials provided by BMJ-British Medical Journal.

Journal References:

P. Frei, A. H. Poulsen, C. Johansen, J. H. Olsen, M. Steding-Jessen, J. Schuz. Use of mobile phones and risk of brain tumours: update of Danish cohort study. BMJ, 2011; 343 (oct19 4): d6387 DOI: 10.1136/bmj.d6387

A. Ahlbom, M. Feychting. Mobile telephones and brain tumours. BMJ, 2011; 343 (oct19 4): d6605 DOI: 10.1136/bmj.d6605

Wednesday 2 November 2011

'Trading places' most common pattern for couples dealing with male depression

ScienceDaily (Oct. 21, 2011) — University of British Columbia researchers have identified three major patterns that emerge among couples dealing with male depression. These can be described as "trading places," "business as usual" and "edgy tensions."

Led by UBC researcher John Oliffe and published in the journal Social Science & Medicine, the study details how heterosexual couples' gender roles undergo radical shifts and strain when the male partner is depressed and the female partner seeks to help. Depression, a disorder often thought of as a women's health issue, is underreported in men, and little is known about how heterosexual couples respond when the male partner is depressed.

"Overall, our study underscores how women play a key role in helping their male partners manage their depression," says Oliffe, an associate professor in the School of Nursing whose work investigates masculinities and men's health with a focus on men's depression.

"Our findings suggest that gender relations are pivotal in how health decisions are made in families and for that reason, it's important to understand couple dynamics if we want to have effective interventions."

Oliffe and his UBC colleagues found that "trading places" is the most common pattern. In these relationships, the partners took on atypical masculine and feminine roles to cope with challenges caused by the men's depression. For instance, men assumed the role of homemaker while the women became the family breadwinner.

Oliffe says, "Here, women partners also broke with feminine ideals in how they provided partner support by employing tough love strategies for self-protection and a means of prompting the men's self-management of their depression."

The second most common pattern is "business as usual," when couples sought to downplay or mask any problems caused by the men's depression. Holding firm to idealized heterosexual gender roles, the women continued to support and nurture their partners. Despite their ongoing struggles with depression, the men continued to work hard to maintain their careers in typically masculine arenas, which in the study included engineering, science, law enforcement, forestry and coaching.

The third pattern, "edgy tensions," describes men and women caught in dysfunctional relationships. Each holding ideas of gender roles that differed from those of their partner, these couples grappled with resentment. The men resisted medical treatment. Instead, they used alcohol and illicit drugs, at least in part, to self-manage their depression. The women expressed ambivalence about conforming to the feminine ideal of being a "selfless nurturer," especially for men who were volatile and unpredictable. The men in turn espoused a view of themselves as head of the household.

The study was based on qualitative analysis of in-depth interviews with 26 men, diagnosed with or self-identified as having depression, and their 26 partners, from Prince George, Kelowna and Vancouver. The study participants ranged in age from 20 to 53 years old. The duration of the couples' relationships ranged from two months to 18 years; seven couples had children living at home.

The men self-identified as Anglo-Canadian, First Nations, European, Asian and Middle Eastern. Seven couples were in mixed ethnicity relationships. The men had varying levels of education ranging from some high school to graduate degrees; 14 of the 26 men were unemployed at the time of interview, and self-identified as being of low socio-economic status as a consequence.

This research received support through the Canadian Institutes of Health Research, Institute of Gender and Health.

Story Source:

The above story is reprinted from materials provided by University of British Columbia.

Journal Reference:

John L. Oliffe, Mary T. Kelly, Joan L. Bottorff, Joy L. Johnson, Sabrina T. Wong. "He’s more typically female because he’s not afraid to cry": Connecting heterosexual gender relations and men’s depression. Social Science & Medicine, 2011; 73 (5): 775 DOI: 10.1016/j.socscimed.2011.06.034

How do protein binding sites stay dry in water?

ScienceDaily (Oct. 21, 2011) — In a report to be published soon in The European Physical Journal E, researchers from the National University of the South in Bahía Blanca, Argentina, studied the conditions under which model cavity and tunnel structures resembling the binding sites of proteins stay dry without losing their ability to react -- a prerequisite for proteins to establish stable interactions with other proteins in water.

E.P. Schulz and colleagues used models of nanometric-scale hydrophobic cavities and tunnels to understand the influence of geometry on the ability of those structures to stay dry in solution.

The authors studied the filling tendency of cavities and tunnels carved in a system referred to as an alkane-like monolayer, chosen for its hydrophobic properties, to ensure that no factors other than geometrical constraints determine their ability to stay dry.

They determined that the minimum size of hydrophobic cavities and tunnels that can be filled with water is on the order of a nanometer. Below that scale, these structures stay dry because they provide a geometric shield; if a water molecule were to penetrate the cavity it would pay the excessive energy cost of giving up its hydrogen bonds. By comparison, water fills carbon nanotubes that are half that size (though slightly less hydrophobic than the alkane monolayer), making them less prone to stay dry.
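
A rough Boltzmann-factor estimate shows why that energy cost is such an effective shield. The values below -- room temperature and roughly 20 kJ/mol per hydrogen bond, with one to three bonds given up -- are generic textbook-scale assumptions, not numbers taken from the paper.

    import math

    # Rough Boltzmann-factor estimate, not the paper's calculation. A water
    # molecule entering a sub-nanometre hydrophobic cavity is assumed to give
    # up some of its roughly four hydrogen bonds.
    R = 8.314            # gas constant, J/(mol K)
    T = 300.0            # temperature, K
    E_hbond = 20e3       # assumed energy per hydrogen bond, J/mol (typical ~10-25 kJ/mol)

    for n_lost in (1, 2, 3):
        penalty = n_lost * E_hbond
        boltzmann = math.exp(-penalty / (R * T))
        print(f"Losing {n_lost} H-bond(s): relative occupancy ~{boltzmann:.1e}")

Even a single sacrificed hydrogen bond suppresses occupancy by three to four orders of magnitude under these assumptions, which is consistent with the cavities remaining dry most of the time.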

The authors also showed that the filling of nanometric cavities and tunnels with water is a dynamic process that goes from dry to wet over time. They believe that water molecules inside the cavities or tunnels are arranged in a network of strong cooperative hydrogen bonds. Their disruption by means of thermal fluctuations results in the temporary drying of the holes until new bonds are re-established.

One of the many potential applications is in biophysics, to study water-exclusion sites of proteins, and understand the physical phenomenon linked to the geometry of those sites, underpinning the widespread biological process of protein-protein associations.

Story Source:

The above story is reprinted from materials provided by Springer Science+Business Media.

Journal Reference:

E. P. Schulz, L. M. Alarcón, G. A. Appignanesi. Behavior of water in contact with model hydrophobic cavities and tunnels and carbon nanotubes. The European Physical Journal E, 2011; 34 (10) DOI: 10.1140/epje/i2011-11114-8

Biomarker detects graft-versus-host disease in cancer patients after bone marrow transplant

ScienceDaily (Oct. 21, 2011) — A University of Michigan Health System-led team of researchers has found a biomarker they believe can help rapidly identify one of the most serious complications in patients with leukemia, lymphoma and other blood disorders who have received a transplant of new, blood-forming cells.

In this procedure, known as a hematopoietic stem cell transplant, patients receive bone marrow or peripheral blood stem cells from a matched donor who is either a family member or an unrelated volunteer.

The most common fatal complication of this type of transplant is graft-versus-host disease (GVHD), where the newly transplanted immune system of the donor attacks the patient's skin and internal organs. Up to 30 percent of recipients develop GVHD in their gastrointestinal tract, which is the organ most resistant to treatment.

Without invasive tests such as biopsies, however, GVHD can be difficult to distinguish from other causes of gastrointestinal distress, such as infection or side effects from medication.

The U-M team tested blood samples from over 1,000 patients who were treated in Ann Arbor, in Germany and in Japan.

"We believe we've found a reliable biomarker in the patients' blood that is specific to graft-versus-host disease and therefore can help us to rapidly identify patients for whom standard treatment is likely to be insufficient," says James L.M. Ferrara, M.D., co-lead author of the study and director of the U-M Combined Blood and Marrow Transplant Program. "This marker can also tell us whether a patient is likely to respond to therapy and may lead to an entirely new risk assessment for the disease. The findings were recently published online ahead of print publication in the journal Blood.

The marker, known as regenerating islet-derived 3-alpha (REG3-alpha), doesn't prevent patients from still needing a biopsy, Ferrara cautions, but taken with other predictive indicators, it could help doctors to ensure patients get the most appropriate treatment as early as possible.

Doctors at U-M hope to start using the test clinically in early 2012.

Story Source:

The above story is reprinted from materials provided by University of Michigan Health System.

Journal Reference:

J. L. M. Ferrara, A. C. Harris, J. K. Greenson, T. M. Braun, E. Holler, T. Teshima, J. E. Levine, S. W. J. Choi, E. Huber, K. Landfried, K. Akashi, M. Vander Lugt, P. Reddy, A. Chin, Q. Zhang, S. Hanash, S. Paczesny. Regenerating islet-derived 3 alpha is a biomarker of gastrointestinal graft-versus-host disease. Blood, 2011; DOI: 10.1182/blood-2011-08-375006

Aggregating bandwidth for faster mobile networks

ScienceDaily (Oct. 21, 2011) — Smart phones, tablet computers and mobile broadband have begun to shift the mobile communications industry into a new phase, especially as global mobile data traffic had already exceeded voice traffic by the end of 2009.

A new study published in the Int. J. Management and Network Economics reveals that the value of mobile spectrum, the capacity to transfer data across mobile networks, is only likely to increase as the demand for data transfer grows. However, only those telecommunications companies that bought up inexpensive licences in government auctions to operate at particular frequencies of the spectrum will be in a strong position to dominate the consumer and enterprise markets, as well as to lease bandwidth to their competitors at a high profit.

Jan Markendahl of the Royal Institute of Technology and Bengt G. Mölleryd of the Swedish Post and Telecom Agency in Stockholm have demonstrated that operators that obtain more spectrum than their competitors, and that pursue network sharing and spectrum aggregation, have a competitive advantage: they have the lowest production cost, the highest margin and the highest capacity when usage takes off. Spectrum is much cheaper than the construction of new base stations, network towers, power, and site leases.

The emergence of new radio technology that allows otherwise separate blocks of frequencies to be used as if they were a single block of bandwidth means that operators supporting the relevant 3GPP standard can profit from the separate chunks of spectrum they own. Similarly, the evolution of 4G technology and devices will also allow aggregation. Indeed, mobile equipment manufacturers have already launched flexible radio equipment capable of handling all relevant frequencies and access technologies.

Data traffic across mobile networks in Sweden alone increased by more than 90% during 2010 compared to 2009, from 27,800 to 53,100 terabytes (TB). Similar increases are being experienced elsewhere. The figures are likely to rise even faster in coming years as more people opt for smartphones and the use of tablet computers becomes more widespread. Such operators are likely to benefit considerably from this growth.
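
That jump is easy to verify, and compounding it forward shows how quickly such growth accumulates; the assumption that the 2010 growth rate persists is purely illustrative, not a forecast from the study.

    # Checking the reported jump and compounding it forward. The assumption that
    # ~90% annual growth is sustained is illustrative, not the study's forecast.
    traffic_2009_tb = 27_800
    traffic_2010_tb = 53_100

    growth = traffic_2010_tb / traffic_2009_tb - 1
    print(f"Year-on-year growth 2009 -> 2010: {growth:.0%}")

    volume = traffic_2010_tb
    for year in range(2011, 2016):
        volume *= (1 + growth)
        print(f"{year}: ~{volume:,.0f} TB if the 2010 growth rate held")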

The researchers point out that the data rates a company can offer will be pivotal to its success in marketing mobile broadband services. Even minor differences will be exploited to gain brand advantage, and the operators that can best use the entire spectrum available to them will be able to beat their competitors on data speeds. As the technology evolves, the companies that bought up lots of separate chunks of spectrum in the cheap government sell-offs of bandwidth could gain the upper hand.

Story Source:

The above story is reprinted from materials provided by Inderscience, via AlphaGalileo.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Note: If no author is given, the source is cited instead.

Disclaimer: Views expressed in this article do not necessarily reflect those of ScienceDaily or its staff.



Tuesday 1 November 2011

NIPPV linked to increased hospital mortality rates in small group of patients

ScienceDaily (Oct. 21, 2011) — Although increased use of noninvasive positive-pressure ventilation (NIPPV) nationwide has helped decrease mortality rates among patients hospitalized with chronic obstructive pulmonary disease (COPD), a small group of patients requiring subsequent treatment with invasive mechanical ventilation (IMV) have a significantly higher risk of death than those placed directly on IMV, according to researchers in the United States who studied patterns of NIPPV use.

The findings were published online ahead of the print edition of the American Thoracic Society's American Journal of Respiratory and Critical Care Medicine. Patients hospitalized with COPD frequently require mechanical ventilation -- either invasive or noninvasive -- to help them breathe. IMV requires insertion of a rigid tube into the airway, which allows air to enter and exit the lungs freely. Invasive methods often carry a significant risk of infection, which can occur when bacteria gather around the tube and its fittings. As a result, NIPPV methods are used when possible to minimize infection risks. NIPPV relies on a mask to deliver pressurized air through the mouth and nose. The two common forms of NIPPV are continuous positive airway pressure (CPAP), which provides a continuous pressurized stream of air, and bilevel positive airway pressure (BPAP), which delivers different pressures for inhalation and exhalation.

"We performed the first examination of the patterns and outcomes of NIPPV treatment for acute exacerbations of COPD in clinical practice nationwide, using data from an estimated 7,511,267 million hospital admissions in the United States during 1998-2008," said Fernando Holguin, MD, MPH, an assistant professor of medicine in the Pulmonary, Allergy and Critical Care Division at the University of Pittsburgh School of Medicine. "The current study, to the best of our knowledge, is the first to report a dramatic shift towards NIPPV use for treating respiratory failure from acute exacerbations in the United States." The increase is consistent with results reported by investigators in smaller studies, he said.

For this study, researchers from the University of Pittsburgh, Emory University, the University of Illinois at Chicago and the University of Kentucky reviewed clinical patient data gathered by the Healthcare Cost and Utilization Project Nationwide Inpatient Sample (HCUP-NIS) database between 1998 and 2008. The researchers examined changes in the frequency of NIPPV and IMV use from 1998 to 2008, and compared patient demographics, income status, payer type, hospital region and hospital type among patients who initially received NIPPV, IMV or no respiratory support after hospital admission. They also examined in-hospital mortality, length-of-stay and total hospitalization charges, and compared those outcomes among patient groups.
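
The kind of group comparison described above can be illustrated with a minimal Python sketch. The table layout, column names and values below are hypothetical placeholders; they do not reflect the HCUP-NIS schema or the study's actual data.

    import pandas as pd

    # Hypothetical admission records; all columns and values are illustrative only.
    admissions = pd.DataFrame({
        "initial_support": ["IMV", "NIPPV", "NIPPV", "NIPPV_then_IMV", "none"],
        "died_in_hospital": [1, 0, 0, 1, 0],
        "length_of_stay_days": [12, 6, 5, 18, 4],
        "total_charges_usd": [60_000, 20_000, 18_000, 95_000, 9_000],
    })

    # Compare in-hospital mortality, length of stay and charges across
    # respiratory-support groups, mirroring the comparison the study describes.
    summary = admissions.groupby("initial_support").agg(
        mortality_rate=("died_in_hospital", "mean"),
        mean_length_of_stay=("length_of_stay_days", "mean"),
        mean_charges=("total_charges_usd", "mean"),
    )
    print(summary)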

At the completion of the study, they found that although the annual number of hospitalizations for acute exacerbations remained relatively constant during the 10-year period, the use of NIPPV increased progressively while the use of IMV declined. Over the entire study period there was a fourfold increase in the use of NIPPV, which overtook IMV as the most frequently used form of respiratory support for patients hospitalized with acute exacerbations in the United States.

They also found that despite a steady decline in mortality among most patients studied, patients who used NIPPV and were then transitioned to IMV had significantly higher mortality rates than other patients, and that the mortality rate in these transitioned patients increased during the study period while the mortality rates of the other groups declined. Patients in this group also experienced the greatest increase in hospital charges and the longest hospital lengths of stay.

"The concerning finding in our analysis was the high mortality in the group of patients who, despite initial treatment with NIPPV, required subsequent placement on IMV," said Dr. Holguin, who also serves as the assistant director of the University of Pittsburgh Medical Center's Asthma Institute at the University of Pittsburgh School of Medicine. "It is notable that this finding is contrary to that found in the carefully monitored patient environment of clinical trials, where those transitioned from NIPPV to IMV did not have higher mortality than patients placed on IMV from the beginning."

Dr. Holguin added that the overall trend toward greater use of NIPPV was likely due to several factors, including clinical trials linking NIPPV with a decrease in hospital mortality, increased confidence in using NIPPV and the ability to use NIPPV outside of the intensive care unit. "These results suggest that healthcare providers should continue to be aggressive with the use of noninvasive ventilation for patients with acute exacerbations, but definitely should intensively monitor sick patients, intervene early in the absence of improvement, and carefully examine if transitioning to IMV is in the interest of a patient with a poor prognosis," he said.


Story Source:

The above story is reprinted from materials provided by American Thoracic Society, via Newswise.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Journal Reference:

D. Chandra et al. Outcomes of Non-invasive Ventilation for Acute Exacerbations of COPD in the United States, 1998-2008. American Journal of Respiratory and Critical Care Medicine, 2011

Note: If no author is given, the source is cited instead.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.



Sewage contains the greatest diversity of unidentified viral populations known to date

ScienceDaily (Oct. 21, 2011) — Raw sewage provides a perfect ecosystem for studying the diversity of viral populations that remain uncharacterized. This is one of the principal conclusions of a study published in the journal mBio by a team comprising experts from the University of Barcelona's (UB) Laboratory of Water and Food Viral Pollution, Washington University and the University of Pittsburgh, under the direction of James Pipas (Department of Biological Sciences, University of Pittsburgh). The pioneering study, which applies metagenomics to the analysis of viral populations present in sewage, reveals that the viral universe is far larger than previously thought.

Some 3000 different viruses are currently recognized, but it is believed that this represents only a small proportion of the real number of viruses in nature. This new study analyses viral diversity through the examination of nucleic acid sequences from sewage samples from Pittsburgh (USA), Barcelona (Spain) and Addis Ababa (Ethiopia). According to the lecturer Rosina Gironès, co-author of the study and head of the Laboratory of Water and Food Viral Pollution at the University of Barcelona, “this is the greatest diversity of viral populations revealed in any research study, many of which infect humans. We have also seen that current databases are not always fully correct: you have to be extremely thorough when analysing these new sequences to make sure that homologies with known virus groups are in fact reliable.”

Metagenomics: opening new ground in virology

Metagenomics is a powerful new technique that enables scientists to study the genetic diversity of microorganisms in different environments. The metagenomic approach, initially used to study viral diversity in oceans, arctic lakes, faecal matter and other settings, is now revealing a level of viral diversity previously unknown to scientists. The article describes the identification of 234 known viruses belonging to 26 taxonomic families, 17 of which infect humans. However, the majority of the viral genomes detected bear little or no resemblance to known viruses.
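
The family-level tally reported above can be sketched in a few lines of Python; the virus names, families and flags below are made-up placeholders rather than output from the paper's annotation pipeline.

    from collections import Counter

    # Hypothetical homology hits: (virus, taxonomic family, infects humans).
    hits = [
        ("Human adenovirus C", "Adenoviridae", True),
        ("Human papillomavirus 112", "Papillomaviridae", True),
        ("Pepper mild mottle virus", "Virgaviridae", False),
        ("Enterobacteria phage T4", "Myoviridae", False),
    ]

    family_counts = Counter(family for _, family, _ in hits)
    human_families = {family for _, family, infects_humans in hits if infects_humans}

    print(f"Distinct families detected: {len(family_counts)}")
    print(f"Families with human-infecting viruses: {len(human_families)}")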

According to the experts behind the study, the most abundant types are plant viruses and bacteriophages (which infect bacteria). “People may not be aware of the large proportion of viruses that are found in plants,” says Rosina Gironès. “In fact, we can transmit viral plant pathogens via sewage.” The study also reveals the presence of skin-tropic viral species that can be excreted, such as the human papillomavirus 112 and the recently described human polyomavirus 6.

The paper published in mBio establishes a new framework for improving our knowledge of viral diversity and the origin of emerging pathogens. The team now intends to study the newly detected viruses and their pathogenicity further, and to fine-tune the metagenomic approach to improve sequence analyses. “We know there are viruses that we have been unable to detect, but we will be able to identify them with the right techniques. By using the novel metagenomic approach, which is generating a substantial volume of data on specific viral genome segments, we will be able to refine results on newly detected viruses, the diversity of the families observed and their impact on human health,” explains Rosina Gironès.

Viruses: more than infectious agents 

The Laboratory of Water and Food Viral Pollution, coordinated by Rosina Gironès, is an authority on the study of hepatitis A and E viruses, enteroviruses, adenoviruses, emerging human viruses, prions and various viruses that serve as indicators of faecal contamination. The projects carried out by the laboratory team have contributed to the discovery of new viruses with a bearing on human health and have reshaped the accepted paradigm of viral infection. “We have confirmed that human viruses are always present in sewage from human populations. In other words, we excrete viruses regularly, not simply during periods of widespread infection. The concept of what viruses are has changed. We no longer consider them simply as pathogenic agents; viruses form part of the microbiome of the human body, of nature in general, and can also have positive effects on the body, as recent studies have shown. I am sure that once we have definitively identified the role of viruses in the human microbiome we will obtain a clearer picture of their effects and benefits,” concludes Gironès.


Story Source:

The above story is reprinted from materials provided by Universidad de Barcelona, via AlphaGalileo.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Journal Reference:

P. G. Cantalupo, B. Calgua, G. Zhao, A. Hundesa, A. D. Wier, J. P. Katz, M. Grabe, R. W. Hendrix, R. Gironès, D. Wang, J. M. Pipas. Raw Sewage Harbors Diverse Viral Populations. mBio, 2011; 2 (3): e00180-11 DOI: 10.1128/mBio.00180-11

Note: If no author is given, the source is cited instead.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.



New mechanism inhibiting the spread and growth of cancer found in motile cells

ScienceDaily (Oct. 21, 2011) — A revolutionary discovery regarding motile cancer cells made by research scientists at VTT Technical Research Centre of Finland and the University of Turku is challenging previous conceptions.

The results were published on 25 July 2011 in the Journal of Cell Biology.

It has long been held that cells use different mechanisms for regulating migration and growth. This conception was proven false by research scientists Anja Mai and Stefan Veltel from the research team of Professor Johanna Ivaska. Their findings on aggressively spreading breast cancer cells revealed -- completely contrary to previous expectations -- that a single cell protein (p120RasGAP) acts as an important inhibitor of both cell migration and growth.

Cancer cells are characterised by traits such as uncontrollable growth and the ability to metastasise. The findings of the research team now show that the regulation of these two deadly traits in cells is interconnected, which may be an important piece of information in the future development of medicines.


Story Source:

The above story is reprinted from materials provided by Technical Research Centre of Finland (VTT).

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Journal Reference:

A. Mai, S. Veltel, T. Pellinen, A. Padzik, E. Coffey, V. Marjomaki, J. Ivaska. Competitive binding of Rab21 and p120RasGAP to integrins regulates receptor traffic and migration. The Journal of Cell Biology, 2011; 194 (2): 291 DOI: 10.1083/jcb.201012126

Note: If no author is given, the source is cited instead.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.

