4/28/2012

Big Girls Don’t Cry


A study to be published in the June 2012 issue of the Journal of Adolescent Health, examining the relationship between body satisfaction and healthy psychological functioning in overweight adolescents, has found that young women who are happy with the size and shape of their bodies report higher levels of self-esteem. They may also be protected against the negative behavioral and psychological factors sometimes associated with being overweight.


A group of 103 overweight adolescents was surveyed between 2004 and 2006, with assessments of body satisfaction, weight-control behaviors, the importance placed on thinness, self-esteem and symptoms of anxiety and depression, among other factors.

"We found that girls with high body satisfaction had a lower likelihood of unhealthy weight-control behaviors like fasting, skipping meals or vomiting," said Kerri Boutelle, PhD, associate professor of psychiatry and pediatrics at the University of California, San Diego School of Medicine. Boutelle added that the positive relationship shown in this study between a girl's happiness with her body and her behavioral and psychological well-being suggests that improving body satisfaction could be a key component of interventions for overweight youth.

"A focus on enhancing self-image while providing motivation and skills to engage in effective weight-control behaviors may help protect young girls from feelings of depression, anxiety or anger sometimes associated with being overweight," said Boutelle.

Additional contributors included first author Taya R. Cromley, PhD, of UCLA; Stephanie Knatz and Roxanne Rockwell, UC San Diego; and Dianne Neumark-Sztainer, PhD, MPH, RD and Mary Story, PhD, RD, University of Minnesota, Minneapolis.

This study was supported by a University of Minnesota Children's Vikings Grant.

Source: University of California, San Diego Health Sciences [April 27, 2012]

Analytic Thinking Can Decrease Religious Belief


A new University of British Columbia study finds that analytic thinking can decrease religious belief, even in devout believers. The study, which is published in the April 27 issue of Science, finds that thinking analytically increases disbelief among believers and skeptics alike, shedding important new light on the psychology of religious belief.


“Our goal was to explore the fundamental question of why people believe in a God to different degrees,” says lead author Will Gervais, a PhD student in UBC’s Dept. of Psychology. “A combination of complex factors influence matters of personal spirituality, and these new findings suggest that the cognitive system related to analytic thoughts is one factor that can influence disbelief.”

Researchers used problem-solving tasks and subtle experimental priming – including showing participants Rodin’s sculpture The Thinker or asking participants to complete questionnaires in hard-to-read fonts – to successfully produce “analytic” thinking. The researchers, who assessed participants’ belief levels using a variety of self-reported measures, found that religious belief decreased when participants engaged in analytic tasks, compared to participants who engaged in tasks that did not involve analytic thinking.

The findings, Gervais says, are based on a longstanding human psychology model of two distinct, but related cognitive systems to process information: an “intuitive” system that relies on mental shortcuts to yield fast and efficient responses, and a more “analytic” system that yields more deliberate, reasoned responses.

“Our study builds on previous research that links religious beliefs to ‘intuitive’ thinking,” says study co-author and Associate Prof. Ara Norenzayan, UBC Dept. of Psychology. “Our findings suggest that activating the ‘analytic’ cognitive system in the brain can undermine the ‘intuitive’ support for religious belief, at least temporarily.”

The study involved more than 650 participants in the U.S. and Canada. Gervais says future studies will explore whether the increase in religious disbelief is temporary or long-lasting, and how the findings apply to non-Western cultures.

Recent figures suggest that the majority of the world’s population believes in a God; however, atheists and agnostics number in the hundreds of millions, says Norenzayan, a co-director of UBC’s Centre for Human Evolution, Cognition and Culture. Religious convictions are shaped by psychological and cultural factors and fluctuate across time and situations, he says.

Source: University of British Columbia [April 26, 2012]

4/27/2012

Action Videogames Change Brains


A team led by psychology professor Ian Spence at the University of Toronto reveals that playing an action videogame, even for a relatively short time, causes differences in brain activity and improvements in visual attention.


Previous studies have found differences in brain activity between action videogame players and non-players, but these could have been attributed to pre-existing differences in the brains of those predisposed to playing videogames and those who avoid them. This is the first time research has attributed these differences directly to playing videogames.

Twenty-five subjects -- who had not previously played videogames -- played a game for a total of 10 hours in one- to two-hour sessions. Sixteen of the subjects played a first-person shooter game and, as a control, nine subjects played a three-dimensional puzzle game.

Before and after playing the games, the subjects' brain waves were recorded while they tried to detect a target object among other distractions over a wide visual field. Subjects who played the shooter videogame and also showed the greatest improvement on the visual attention task showed significant changes in their brain waves. The remaining subjects -- including those who had played the puzzle game -- did not.

"After playing the shooter game, the changes in electrical activity were consistent with brain processes that enhance visual attention and suppress distracting information," said Sijing Wu, a PhD student in Spence's lab in U of T's Department of Psychology and lead author of the study.

"Studies in different labs, including here at the University of Toronto, have shown that action videogames can improve selective visual attention, such as the ability to quickly detect and identify a target in a cluttered background," said Spence. "But nobody has previously demonstrated that there are differences in brain activity which are a direct result of playing the videogame."

"Superior visual attention is crucial in many important everyday activities," added Spence. "It's necessary for things such as driving a car, monitoring changes on a computer display, or even avoiding tripping while walking through a room with children's toys scattered on the floor."

The research was supported by funding from the Natural Sciences and Engineering Research Council of Canada in the form of Discovery Grants to Spence and co-author Claude Alain of the Rotman Research Institute, Baycrest Centre and U of T's Psychology Department.

The research team also included U of T PhD candidate Cho Kin Cheng, Jing Feng, a postdoc at the Rotman Research Institute, and former undergraduate student Lisa D'Angelo.

Source: University of Toronto [April 26, 2012]

Mystery of Bacterial Growth and Resistance Solved


Scientists at The Scripps Research Institute have unraveled a complex chemical pathway that enables bacteria to form clusters called biofilms. Such improved understanding might eventually aid the development of new treatments targeting biofilms, which are involved in a wide variety of human infections and help bacteria resist antibiotics.


The report, published online ahead of print on April 26, 2012, by the journal Molecular Cell, explains how nitric oxide, a signaling molecule involved in the immune system, leads to biofilm formation.

"It is estimated that about 80 percent of human pathogens form biofilms during some part of their life cycle," said Scripps Research president and CEO Michael Marletta, PhD, who led the work. "In this study, we have detailed for the first time the signaling pathway from nitric oxide to the sensor through cellular regulators and on to the biological output, biofilm formation."

"There's a lot of interest right now in finding ways to influence biofilm formation in bacteria," said lead author Lars Plate, a graduate student in Marletta's team, which recently moved to Scripps Research from the University of California, Berkeley. "Figuring out the signaling pathway is a prerequisite for that."

Biofilm formation is a critical phenomenon that occurs when bacterial cells adhere to each other and to surfaces, at times as part of their growth stage and at other times to gird against attack. In such aggregations, cells on the outside of a biofilm might still be susceptible to natural or pharmaceutical antibiotics, but the interior cells are relatively protected. This can make them difficult to kill using conventional treatments.

Biofilms can form on surgical instruments such as heart valves or catheters, leading to potentially deadly infections. Likewise, difficult-to-eliminate biofilms also play key roles in a host of conditions from gum disease to cholera, and from cystic fibrosis to Legionnaires' disease.

For years, the Marletta lab and other groups have been studying how nitric oxide regulates everything from blood vessel dilation to nerve signals in humans and other vertebrates. Past research had also revealed that nitric oxide is involved in influencing bacterial biofilm formation.

Nitric oxide in sufficient quantity is toxic to bacteria, so it's logical that nitric oxide would trigger bacteria to enter the safety huddle of a biofilm. But nobody knew precisely how.

In the new study, the scientists set out to find what happens after the nitric oxide trigger is pulled. "The whole project was really a detective story in a way," said Plate.

In vertebrates, nitric oxide can bind to something called the Heme-Nitric Oxide/Oxygen (H-NOX) binding domain on a specific enzyme, activating that enzyme and beginning the chemical cascades that lead to physiological functions such as blood vessel dilation.

Many bacteria also have H-NOX domains, including key pathogens, so this seemed the best starting point for the investigation. From there, the team turned to genomic data.

Genes for proteins that interact are often found adjacent to one another. Based on this fact, the researchers were able to infer a connection between the bacterial H-NOX domain and an enzyme called histidine kinase, which transfers phosphate chemical groups to other molecules in signaling pathways. The question was where the phosphates were going.

To learn more, the researchers used a technique called phosphotransfer profiling. This involved activating the histidine kinase and then allowing it to react separately with about 20 potential targets. Those targets to which the histidine kinase rapidly transferred phosphates had to be part of the signaling pathway. "It's a neat method that we used to get an answer that was in fact very surprising," said Plate.

The experiments revealed that the histidine kinase phosphorylated three proteins called response regulators that work together to control biofilm formation for the project's primary study species, the bacterium Shewanella oneidensis, which is found in lake sediments.

Further work showed that each regulator plays a complementary role, making for an unusually complex system. One regulator activates gene expression, another controls the activity of an enzyme producing cyclic diguanosine monophosphate, an important bacterial messenger molecule that is critical in biofilm formation, and the third tunes the degree of activity of the second.

Since other bacterial species use the same chemical pathway uncovered in this study, the findings pave the way to further explore the potential for pharmaceutical application. As one example, researchers might be able to block biofilm formation with chemicals that interrupt the activity of one of the components of this nitric oxide cascade.

Marletta's group has already explored nitric oxide's role in controlling Legionnaires' disease and, among other goals, will focus now on understanding biofilm formation in the bacterium that causes cholera.

This research was supported by the National Institutes of Health and a Chang-Lin Tien Graduate Fellowship in the Environmental Sciences.

Source: The Scripps Research Institute [April 25, 2012]

Long-held genetic theory doesn't quite make the grade


New York University biologists have discovered new mechanisms that control how proteins are expressed in different regions of embryos, while also shedding additional insight into how physical traits are arranged in body plans. Their findings, which appear in the journal Cell, call for reconsideration of a decades-old biological theory.


The researchers investigated a specific theory—morphogen theory, which posits that proteins controlling traits are arranged as gradients, with different amounts of proteins activating genes to create specified physical features. This theory was first put forth in the 1950s by mathematician and World War II code breaker Alan Turing and refined in the 1960s by Lewis Wolpert. It has been used to explain why a tiger has stripes, among other phenomena.

But some biologists have raised questions about the theory, which contends that physical features are necessarily tied to absolute concentrations of proteins within the morphogen gradient. If a certain critical mass of protein is present, then a given physical feature—for example, cells that make the skin on your forehead—will appear. If less than that critical mass is present, a different structure—say, the skin that makes your eyebrows—will appear, and a boundary will be formed between the two structures.

But alternative views have suggested that physical features are not necessarily the result of a specified number of proteins, but, rather, come from more complex interactions between multiple gradients that work against one another.

The NYU biologists explored this process by studying the fruit fly Drosophila, a powerful model for studying genetic development as it is amenable to precise genetic manipulations. They focused on one protein, Bicoid (Bcd), which is expressed in a gradient with highest levels at the end of the embryo that will become the mature fly's head.

The researchers, headed by Stephen Small, chair of NYU's Department of Biology, examined a large number of target genes that are directly activated by Bcd. Each target gene is expressed in a region of the embryo with a boundary that corresponds to a specific structure.

By examining DNA sequences associated with these target genes, the NYU researchers discovered binding sites for three other proteins—Runt, Capicua, and Kruppel—which all act as repressors. All three proteins are expressed in gradients with highest levels in the middle part of the embryo, and thus are positioned in exactly the opposite orientation compared to the Bcd activation gradient.

By changing the spatial distribution of the repressors and by manipulating their binding sites, Small and his colleagues showed that these repressors antagonize Bcd-dependent activation and are absolutely critical for establishing the correct order of boundaries that are found in a normal embryo.

In other words, contrary to Turing's theory, a single gradient of proteins does not have sufficient power to form the same body plan in each member of a species; however, if there are multiple gradients that work against each other, then the system becomes robust enough for normal development.

While the results raise questions about morphogen theory, the researchers explained that their findings did not "falsify" it, but, rather, suggested it needed some additional refinement.

Source: New York University [April 26, 2012]

4/26/2012

Women have bigger pupils than men


From an anatomical point of view, a normal, non-pathological eye is known as an emmetropic eye, and until now it has been studied very little in comparison with myopic and hypermetropic eyes. The results show that healthy emmetropic women have a wider pupil diameter than men. 

The pupil regulates the amount of light that reaches the retina [Credit: Michael Dawes]
Normal, non-pathological emmetropic eyes are the most common type in the population (43.2%), with a prevalence that ranges from 60.6% in children aged three to eight down to 29% in people older than 66. 

The study therefore establishes their anatomical pattern so that it can serve as a model for comparison with eyes that have refractive defects (myopia, hypermetropia and astigmatism) and with pathological eyes (such as those that have cataracts). 

"We know very little about emmetropic eyes even though they should be used for comparisons with myopic and hypermetropic eyes," Juan Alberto Sanchis-Gimeno, a researcher at the University of Valencia and lead author of the study, explains to SINC. 

The project, published in the journal 'Surgical and Radiologic Anatomy', reports values by gender for central corneal thickness, minimum total corneal thickness, white-to-white distance and pupil diameter in a sample of 379 emmetropic subjects. 

"It is the first study that analyses these anatomical indices in a large sample of healthy emmetropic subjects," Sanchis-Gimeno states. In recent years new technologies have been developed, such as corneal elevation topography, which allows us to increase our understanding of in vivo ocular anatomy. 

Although the research states that there are no big differences between most of the parameters analysed, healthy emmetropic women have a wider pupil diameter than men. 

"It will be necessary to investigate whether there are differences in the anatomical indices studied between emmetropic, myopic and hypermetropic eyes, and between populations of different ethnic origin," the researcher concludes. 

How the human eye works 

Light enters through the pupil, crosses the crystalline lens and is projected onto the retina, where the photoreceptor cells turn it into nerve impulses that are transferred through the optic nerve to the brain. Rays of light must be refracted as they enter the eye so that they can be focused on the retina. Most of the refraction occurs in the cornea, which has a fixed curvature. 

The pupil is a dilatable and contractile opening that regulates the amount of light that reaches the retina. The size of the pupil is controlled by two muscles: the pupillary sphincter, which closes it, and the pupillary dilator, which opens it. Its diameter is between 3 and 4.5 millimetres in the human eye, although in the dark it could reach up to between 5 and 9 millimetres. 

Source: Plataforma SINC via AlphaGalileo [April 26, 2012]

4/24/2012

Researchers discover bats may be a common source of many viral diseases


International researchers working under the aegis of the University of Bonn have traced the probable origin of not just one, but several infectious agents at the same time. Paramyxoviruses originate in ubiquitous bats, from which the pathogens have spread to humans and other mammals. This could make eradicating many dangerous diseases significantly more difficult than had been thought. The results of this study have just been published in the current issue of Nature Communications. 

[Image (c) Florian Gloza-Rausch/Uni Bonn/Noctalis Bad Segeberg]
Where do viruses dangerous to humans come from, and how have they evolved? Scientists working with Prof. Dr. Christian Drosten, Head of the Institute for Virology at the Universitätsklinikum Bonn, have made significant progress in answering this question. "We already knew from prior studies that bats and rodents play a role as carriers of paramyxoviruses," said Prof. Drosten. The many varied members of this large virus family cause, e.g., measles, mumps, pneumonias and colds. The highly dangerous Hendra and Nipah viruses cause types of encephalitis that result in death for one out of two patients. Paramyxoviruses also play a role in veterinary medicine, causing, e.g., canine distemper or rinderpest. 

Researchers double the number of known paramyxovirus species 

With support from numerous scientific institutes in Germany and around the world, the researchers tested a total of 9,278 animals from Europe, South America and Asia, including 86 bat and 33 rodent species. "These animals live in very large social communities with millions of individuals in some cases," reported the Bonn virologist. "Their close contact promotes mutual infection and provides for great variety in circulating viruses." Using molecular biology methods, the scientists identified which virus species are rampant in bats and rodents. According to their own estimates, they discovered more than 60 new paramyxovirus species. "That is about as many as the number that was already known," said Drosten. 

Bats are the original paramyxovirus hosts 

Using computational biology methods, the scientists calculated a common evolutionary tree for the new and the known viruses. They then deduced, using mathematical methods, in which host animals the viruses have most likely taken up residence during their evolutionary history. "Our analysis shows that almost all of the forebears of today's paramyxoviruses have existed in bats," explained Prof. Drosten. "Just as with influenza, where we are keeping an eye on birds as a source of new pandemic viruses, we will now have to study the bat viruses to see if they are a danger to humans." So, the current data might play a useful role in early detection and prevention of epidemics – a major new goal in virus research. 

Mumps viruses have jumped to humans 

The findings also included that the Hendra and Nipah viruses that cause encephalitis in Asia and Australia really came from Africa. "This results in an urgent need to conduct medical studies in Africa," said the Bonn virologist, adding that many disease cases on this continent remain unexplained and might possibly have been caused by such new viruses. In one case, the scientists have already found proof that bat viruses transfer directly to humans. "Our data show that the human mumps virus comes directly from bats – and can be found there to this day," reported Prof. Drosten. 

Dangerous viruses cannot be eradicated anytime soon 

These results indicate that it may not be as easy to eradicate dangerous viruses as had been assumed. Eliminating an infectious agent permanently from the population by means of vaccination requires that there be no animal hosts from which a new infection might come. "In bats, we assume that there is a vast reservoir of such agents," said Drosten. "If vaccination campaigns are stopped once a virus has been eradicated, this might present a potential risk; maybe we will have to rethink." This is why Drosten advocates taking ecological data into account when planning vaccination campaigns. Eradicating bats or other wild animals would be neither possible nor sensible. "Bats and other small wild mammals are of immeasurable value for our planet's ecosystems," Drosten said, summarizing his and his colleagues' unanimous opinion. 

Source: University of Bonn [April 24, 2012]

4/20/2012

Researchers create synthetic DNA/RNA that can evolve


Researchers have created artificial genetic material known as xeno-nucleic acids, or XNAs, that can store information and evolve over generations in a manner comparable to DNA. 


The research, reported Friday in the journal Science, has implications for the fields of molecular medicine and biotechnology, and sheds new light on how molecules first replicated and assembled into life billions of years ago. 

Living systems owe their existence to the information-carrying molecules DNA and RNA.  These fundamental chemical forms have two features essential for life: they display heredity, meaning they can encode and pass on genetic information, and they can adapt over time. 

Whether these traits could be performed by molecules other than DNA and RNA has been a long-debated issue. 

For the current study, an international team of researchers developed chemical procedures to convert DNA and RNA into six genetic polymers known as XNAs.  The process swaps the deoxyribose and ribose sugars (the “d” and “r” in DNA and RNA) for other molecules. 

The researchers demonstrated for the first time that all six XNAs could form a double helix with DNA, and were more stable than natural genetic material.  Moreover, one of these XNAs, a molecule known as anhydrohexitol nucleic acid, or HNA, was capable of undergoing directed evolution and folding into biologically useful forms. 

Philipp Holliger of MRC Laboratory of Molecular Biology in Cambridge, the study’s senior author, said the work demonstrated that heredity and evolution were possible using alternatives to natural genetic material. 

“There is nothing Goldilocks about DNA and RNA,” he told Science. 

“There is no overwhelming functional imperative for genetic systems or biology to be based on these two nucleic acids.” 

Both RNA and DNA embed data in their sequences of four nucleotides.  This information is vital for conferring hereditary traits and for supplying the coded recipe essential for building proteins from the 20 naturally occurring amino acids.  However, precisely how and when this system began remains one of the most perplexing and hotly contested areas of biology. 

According to one hypothesis, the simpler RNA molecule preceded DNA as the original informational conduit. The RNA world hypothesis proposes that the earliest examples of life were based on RNA and simple proteins.  Because of RNA’s great versatility—it is not only capable of carrying genetic information but also of catalyzing chemical reactions like an enzyme—it is believed by many to have supported pre-cellular life. 

Nevertheless, the spontaneous arrival of RNA through a sequence of purely random mixing events of primitive chemicals was, at the very least, an unlikely occurrence. 

“This is a big question,” said study leader John Chaput, a researcher at Arizona State University’s Biodesign Institute. 

“If the RNA world existed, how did it come into existence? Was it spontaneously produced, or was it the product of something that was even simpler than RNA?” 

This pre-RNA world hypothesis has been gaining ground, primarily through study of XNAs, which provide plausible alternatives to the current biological system and could have acted as chemical stepping-stones to the eventual emergence of life. 

Threose nucleic acid, or TNA, for example, is one candidate for this critical intermediary role. 

“TNA does some interesting things,” Chaput said, noting the molecule’s capacity to bind with RNA through antiparallel Watson-Crick base pairing. 

“This property provides a model for how XNAs could have transferred information from the pre-RNA world to the RNA world.” 

Nucleic acid molecules, including DNA and RNA, consist of three chemical components: a sugar group, a phosphate backbone and combinations of the four nucleobases.  By manipulating these structural elements, researchers can engineer XNA molecules with unique properties. 

However, in order for any of these molecules to have acted as a precursor to RNA in the pre-biotic epoch, they would need to have been able to transfer and recover their information from RNA. To do this, specialized enzymes, known as polymerases, are required. 

And while nature has made DNA and RNA polymerases capable of reading, transcribing and reverse transcribing normal nucleic acid sequences, no naturally occurring polymerases exist for XNA molecules. 

So the researchers, led by Holliger, painstakingly evolved synthetic polymerases that could copy DNA into XNA, and other polymerases that could copy XNA back into DNA. 

Ultimately, polymerases were found that transcribe and reverse-transcribe six different genetic systems: HNA, CeNA, LNA, ANA, FANA and TNA. The experiments demonstrated that DNA sequences could be rendered into various XNAs when the polymerases were fed the appropriate XNA substrates. 

Using these enzymes as tools for molecular evolution, the team evolved the first example of an HNA aptamer through iterative rounds of selection and amplification.  Starting from a large pool of DNA sequences, a synthetic polymerase was used to copy the DNA library into HNA. The pool of HNA molecules was then incubated with an arbitrary target. The small fraction of molecules that bound the target were then separated from the unbound pool, reverse transcribed back into DNA with a second synthetic enzyme, and amplified by PCR. After many repeated rounds, HNAs were generated that bound HIV trans-activating response RNA (TAR) and hen egg lysozyme (HEL), which were used as binding targets. 
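
The selection-and-amplification cycle described above is, in outline, an algorithm, and can be sketched as a toy simulation. Everything concrete below (the short motif standing in for the binding target, the scoring function, the pool size, mutation rate and number of rounds) is an illustrative assumption, not the laboratory protocol:

```python
import random

random.seed(1)
BASES = "ACGT"
MOTIF = "GATTACA"  # toy stand-in for the real binding target

def random_seq(n=30):
    """One member of the starting random DNA library."""
    return "".join(random.choice(BASES) for _ in range(n))

def affinity(seq):
    """Toy affinity score: best match of the motif at any position in seq."""
    return max(
        sum(a == b for a, b in zip(seq[i:i + len(MOTIF)], MOTIF))
        for i in range(len(seq) - len(MOTIF) + 1)
    )

def amplify(selected, size, mut_rate=0.02):
    """PCR-like step: copy the selected binders back up to full pool size,
    with occasional copying errors that keep the pool diverse."""
    pool = []
    while len(pool) < size:
        parent = random.choice(selected)
        pool.append("".join(
            random.choice(BASES) if random.random() < mut_rate else base
            for base in parent))
    return pool

pool = [random_seq() for _ in range(500)]
start_best = max(affinity(s) for s in pool)
for _ in range(8):
    pool.sort(key=affinity, reverse=True)   # "incubate with the target"...
    selected = pool[:50]                    # ...and keep only the bound fraction
    pool = amplify(selected, 500)           # reverse transcribe and PCR-amplify
final_best = max(affinity(s) for s in pool)
print(start_best, final_best)
```

Because each round enriches the pool for better binders before amplifying them back up, the best score climbs toward the maximum over successive rounds, which is the "synthetic Darwinian process" Chaput describes.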

“This is a synthetic Darwinian process,” Chaput said. 

“The same thing happens inside our cells, but this is done in vitro.” 

The method for producing XNA polymerases draws on Holliger’s previous, path-breaking work, and uses cell-like synthetic compartments of water/oil emulsion to conduct directed evolution of enzymes, particularly polymerases. 

By isolating self-replication reactions from each other, the process greatly improves the accuracy and efficiency of polymerase evolution and replication. 

“What nobody had really done before,” Chaput said, “is to take those technologies and apply them to unnatural nucleic acids.” 

Chaput said the study advances the case for a pre-RNA world, while revealing a new class of XNA aptamers capable of fulfilling many useful roles. 

And while many questions surrounding the origins of life remain, he is optimistic that solutions are coming into view. 

“Further down the road, through research like this, I think we’ll have enough information to begin to put the pieces of the puzzle together.” 

In an article accompanying the study in the journal Science, Gerald Joyce of the Scripps Research Institute wrote that “the work heralds the era of synthetic genetics, with implications for exobiology (life elsewhere in the Universe), biotechnology, and understanding of life itself”. 

However, he stressed that the work does not yet represent a full synthetic genetics platform. For that, a self-replicating system that does not require the DNA intermediary must be developed. 

If that happens, “construction of genetic systems based on alternative chemical platforms may ultimately lead to the synthesis of novel forms of life”. 

Source: RedOrbit [April 20, 2012]

4/11/2012

Teamwork linked to intelligence


Learning to work in teams may explain why humans evolved a bigger brain, according to a new study published on Wednesday. 

Learning to work in teams may explain why humans evolved a bigger brain [Credit: AFP]
Compared to his hominid predecessors, Homo sapiens is a cerebral giant, a riddle that scientists have long tried to solve. 

The answer, according to researchers in Ireland and Scotland, may lie in social interaction. 

Working with others helped Man to survive, but he had to develop a brain big enough to cope with all the social complexities, they believe. 

In a computer model, the team simulated the human brain, allowing a network of neurons to evolve in response to a series of social challenges. 

There were two scenarios. The first entailed two partners in crime who had been caught by the police, each having to decide whether or not to inform on the other. 

The second had two individuals trapped in a car in a snowdrift and having to weigh whether to cooperate to dig themselves out or just sit back and let the other do it. 

In both cases, the individual would gain more from selfishness. 
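
The two scenarios are standard games from evolutionary game theory (the prisoner's dilemma and the snowdrift game), and the claim that the individual gains more from selfishness can be checked directly from their payoff matrices. The numerical payoffs below are conventional textbook values, not the ones used in the study:

```python
# Payoff to "me" given (my move, partner's move); C = cooperate, D = defect.
prisoners_dilemma = {  # the two caught partners in crime
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}
snowdrift = {  # the two drivers trapped in the snow
    ("C", "C"): 3, ("C", "D"): 1,
    ("D", "C"): 5, ("D", "D"): 0,
}

def best_reply(game, partner_move):
    """The move a purely selfish individual picks against a known partner move."""
    return max("CD", key=lambda my_move: game[(my_move, partner_move)])

# Prisoner's dilemma: defection is the better reply to either move.
assert best_reply(prisoners_dilemma, "C") == "D"
assert best_reply(prisoners_dilemma, "D") == "D"
# Snowdrift: defecting against a cooperator still pays best,
# though digging alone beats sitting in the snow with a defector.
assert best_reply(snowdrift, "C") == "D"
assert best_reply(snowdrift, "D") == "C"
```

In the prisoner's dilemma defection dominates outright, and in the snowdrift game defection pays best against a cooperator, so in both games a purely self-interested player is expected to cheat; that is what makes the evolved preference for cooperation notable.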

But the researchers were intrigued to find that as the brain evolved, the individual was likelier to choose to cooperate. 

"We cooperate in large groups of unrelated individuals quite frequently, and that requires cognitive abilities to keep track of who is doing what to you and change your behaviour accordingly," co-author Luke McNally of Dublin's Trinity College told AFP. 

McNally pointed out, though, that cooperation has a calculating side. We do it out of reciprocity. 

"If you cooperate and I cheat, then next time we interact you could decide: 'Oh well, he cheated last time, so I won't cooperate with him.' So basically you have to cooperate in order to receive cooperation in the future." 

McNally said teamwork and bigger brainpower fed off each other. 

"Transitions to cooperative, complex societies can drive the evolution of a bigger brain," he said. 

"Once greater levels of intelligence started to evolve, you saw cooperation going much higher." 

The study appears in Proceedings of the Royal Society B, a journal published by Britain's de facto academy of sciences. 

Commenting on the paper, Robin Dunbar, an evolutionary anthropologist at Oxford University, said the findings were a valuable addition to our understanding of brain evolution. 

But he said there were physiological limits to cooperation. 

Man would need a "house-sized brain" to take cooperation to a perfect level on a planet filled with humans. 

"Our current brain size limits the community size that we can manage ... that we feel we belong to," he said. 

Our comfortable "personal social network" is limited to about 150, and boosting that to 500 would require a doubling of the size of the brain. 

"In order to create greater social integration, greater social cohesion even on the size of France, never mind the size of the EU, never mind the planet, we probably have to find other ways of doing it" than wait for evolution, said Dunbar. 

Source: AFP [April 11, 2012]

Do I look bigger with my finger on a trigger? Yes, says study


UCLA anthropologists asked hundreds of Americans to guess the size and muscularity of four men based solely on photographs of their hands holding a range of easily recognizable objects, including handguns. 

Photo from study. Holding a gun like this makes a man appear taller and stronger than he would otherwise, UCLA anthropologists have found [Credit: Daniel Fessler/UCLA]
The research, published April 11 in the scholarly journal PLoS ONE, confirms what scrawny thugs have long known: Brandishing a weapon makes a man appear bigger and stronger than he would otherwise. 

"There's nothing about the knowledge that gunpowder makes lead bullets fly through the air at damage-causing speeds that should make you think that a gun-bearer is bigger or stronger, yet you do," said Daniel Fessler, the lead author of the study and an associate professor of anthropology at UCLA. "Danger really does loom large -- in our minds." 

Researchers say the findings suggest an unconscious mental mechanism that gauges a potential adversary and then translates the magnitude of that threat into the same dimensions used by animals to size up their adversaries: size and strength. 

"We've isolated a capacity to assess threats in a simple way," said Colin Holbrook, a UCLA postdoctoral scholar in anthropology and co-author of the study. "Though this capacity is very efficient, it can misguide us." 

The study is part of a larger project funded by the U.S. Air Force Office of Scientific Research to understand how people make decisions in situations where violent conflict is a possibility. The findings are expected to have ramifications for law enforcement, prison guards and the military. 

"We're exploring how people think about the relative likelihood that they will win a conflict, and then how those thoughts affect their decisions about whether to enter into conflict," said Fessler, whose research focuses on the biological and cultural bases of human behavior. He is the director of UCLA's Center for Behavior, Evolution and Culture, an interdisciplinary group of researchers who explore how various forms of evolution shape behavior. 

For the study, the UCLA researchers recruited participants in multiple rounds using classified advertisements on the websites Craigslist and Mechanical Turk. In one round, 628 individuals were asked to look at four pictures of different hands, each holding a single object: a caulking gun, electric drill, large saw or handgun. 

"Tools were used as control objects to rule out the possibility that a simple link with traditionally masculine objects would explain intuitions that the weapon-holders were larger and stronger," Fessler explained. 

The individuals were then asked to estimate the height of each hand model in feet and inches based solely on the photographs of their hands. Participants also were shown six images of progressively taller men and six images of progressively more muscular men and asked to estimate which image came closest to the probable size and strength of the hand model. 

Study participants consistently judged pistol-packers to be taller and stronger than the men holding the other objects, even though the experiment's four hand models were recruited on the basis of their equivalent hand size and similar hand appearance (white and without identifying marks such as tattoos or scars). 

To rule out the possibility that a feature of any one hand might influence the estimates, researchers had taken separate pictures of each hand holding each object -- some participants saw the gun held by one hand model, others saw the same gun held by another model, and so on; they did the same thing for each of the objects. The researchers also shuffled the order in which the photos were presented. 
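The counterbalancing described above can be sketched as a simple rotation: each object is paired with a different hand model, and rotating the pairing across participant groups ensures every object is eventually seen in every hand. This is a hypothetical sketch of the design, not the authors' code, and the model labels are invented.

```python
import random

HANDS = ["model_A", "model_B", "model_C", "model_D"]
OBJECTS = ["handgun", "caulking gun", "electric drill", "large saw"]

def counterbalanced_set(rotation):
    """Pair each object with a different hand model, rotating the assignment
    so that across rotations every object appears in every hand.
    Presentation order is also shuffled, as in the study."""
    pairs = [(HANDS[(i + rotation) % len(HANDS)], obj)
             for i, obj in enumerate(OBJECTS)]
    random.shuffle(pairs)          # randomise the order photos are shown
    return pairs

# Across the four rotations, the handgun appears once in each model's hand:
hands_holding_gun = {hand for r in range(4)
                     for hand, obj in counterbalanced_set(r) if obj == "handgun"}
assert hands_holding_gun == set(HANDS)
```

This kind of rotation is what lets the researchers attribute size estimates to the object held rather than to any idiosyncrasy of a particular hand.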

On average, participants judged pistol-packers to be 17 percent taller and stronger than those judged to be the smallest and weakest men -- the ones holding caulking guns. Hand models holding the saw and drill followed gun-wielders in size and strength. 

"The function of the system is to provide an easy way for people to assess the likelihood that they would win or lose in a conflict," said Jeffrey K. Snyder, a UCLA graduate student in anthropology and a study co-author. 

Concerned that their findings might be influenced by popular culture, which often depicts gun-slingers as big and strong men, the team conducted two more studies using objects that did not seem to have a macho image: a kitchen knife, a paint brush and a large, brightly colored toy squirt gun. In the initial round, a new group of 100 subjects was asked to evaluate the danger posed by each of the objects (which were presented alone, without hands holding them). They then were asked to pick the type of person most associated with the object: a child, a woman or a man. 

Not surprisingly, individuals rated the knife most dangerous, followed by the paint brush and squirt gun. But where the most lethal object in the earlier studies -- the handgun -- would likely have been associated with men, participants in this study most often associated the most lethal object -- the kitchen knife -- with women. The paint brush was most often associated with men, and the squirt gun with children. 

In the final round of tests, a new group of 541 individuals was shown male hands holding the knife, paint brush and squirt gun and was then asked to estimate the height and muscularity of the hand models. Once again, men holding the most lethal object -- in this case, the kitchen knife -- were judged to be the biggest and strongest, followed by those holding the paint brush and the squirt gun. 

"It's not Dirty Harry's or Rambo's handgun -- it's just a kitchen knife, but it's still deadly," Holbrook said. "And our study subjects responded accordingly, estimating its holder to be bigger and stronger than the rest." 

Author: Meg Sullivan | Source: University of California - Los Angeles [April 11, 2012]

4/04/2012

Does religious faith lead to greater rewards here on Earth?


Delayed gratification: People who are good at overcoming their immediate impulses to take small rewards now — in favor of larger rewards down the road — do better in many areas of life, including academic achievement, income, job performance and health. What life experiences develop this ability? A new study published online, ahead of print, by the journal Evolution and Human Behavior, finds that religious people are better able to forgo immediate satisfaction in order to gain larger rewards in the future. The study is the first to demonstrate an association between religious commitment and a stronger preference for delayed, but more significant, rewards. 


"It's possible to analyze virtually all contemporary social concerns, from excessive credit card debt to obesity, as problems of impulsivity. So the fact that religious people tend to be less impulsive has implications for the sorts of decisions they make with their money, time, and other resources," says Michael McCullough, professor of Psychology in the College of Arts and Sciences at the University of Miami (UM), and principal investigator of this study. "Their tendency toward less impulsive decision-making might even be relevant to their stands on public policy issues, such as whether governments should be seeking to reduce their expenditures on public services and entitlement programs in the current economic environment." 

In the study, titled "Religious people discount the future less," 277 university undergraduates from a variety of religious denominations and ethnic backgrounds chose between receiving a small monetary reward made available immediately (for example, "$50 today") or a larger reward available only after a longer delay (for example, "$100 six months from now"). Participants' commitment to their religious beliefs and institutions was also measured, among other relevant variables. The data show that the extent to which participants follow religious teachings correlates positively with their ability to delay gratification. 
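A choice like "$50 today versus $100 in six months" is typically modelled with a discount function that shrinks a reward's subjective value with delay. The hyperbolic form and the discount parameter `k` below are standard in the delay-discounting literature but are illustrative here; they are not estimates reported by this study.

```python
def present_value(amount, delay_months, k=0.05):
    """Hyperbolic discounting: V = A / (1 + k * delay).
    k is an illustrative impulsivity parameter; a higher k means steeper
    discounting, i.e. more impulsive choices."""
    return amount / (1 + k * delay_months)

immediate = 50.0                      # "$50 today"

# A patient chooser (small k) values the delayed $100 above $50 now...
assert present_value(100, 6, k=0.05) > immediate   # 100 / 1.3 ≈ 76.9
# ...while a steep discounter (large k) takes the money today.
assert present_value(100, 6, k=0.5) < immediate    # 100 / 4  = 25.0
```

On this reading, the study's claim is that religious commitment is associated with a smaller effective `k` — delayed rewards lose less of their subjective value.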

The findings suggest that through religious beliefs and practices, people "develop a more patient style of decision making." According to the study, religion teaches this type of patience by directing people's attention to the distant future—the afterlife—which may cause their nearer-term future on this earth to feel subjectively closer. 

"People who are intrinsically religious and who indicate an interest in the afterlife tend to report that the future feels as though it is approaching quickly and that they spend a lot of time thinking about the future," the study says. 

Source: University of Miami [April 04, 2012]

Keep aging brains sharp


Exercising, eating a healthy diet and playing brain games may help you keep your wits about you well into your 80s and even 90s, advises a new book by researchers at George Mason University. 


"These are all cheap, easy things to do," says Pamela Greenwood, an associate professor in the Department of Psychology on Mason's Fairfax, Va. campus. "We should all be doing them anyway. You should do them for your heart and health, so why not do them for your brain as well?" 

For the past 20 years, Greenwood and Raja Parasuraman, University Professor of Psychology, have studied how the mind and brain age, focusing on Alzheimer's disease. Their book, "Nurturing the Older Brain and Mind," published by MIT Press, came out in March. The cognitive neuroscientists geared the book to middle-aged readers who want to keep their mental snap. 

"We know that if we can put off dementing illnesses even by a year or two through lifestyle changes, that will reduce the number of people with Alzheimer's disease, which is reaching epidemic proportions," Parasuraman says. 

Not everyone's brain declines when retirement age hits. "You can look at a group of 65-year-olds — some are in nursing homes, and some are running the world," Greenwood says. 

Now that more workers are staying on the job longer for economic reasons and because countries are upping the retirement age, keeping the mind agile becomes paramount, Parasuraman says. 

For the book, Parasuraman and Greenwood examined only scientific studies, theirs and others, ranging from neurological to physiological. A few surprises leaped out of the data. 

"Several old dogmas were overturned," Parasuraman says. "There's the tired old joke that we're losing brain cells as we age — maybe starting as young as 20 or 30 — and it's all downhill after that." 

Not so, new research reveals. Not only are some 60-year-olds as sharp as 20-year-olds, but their brains still create new cells. Brain cells may not grow as fast as bone or skin cells, but grow they do, particularly in the hippocampus. "It's the area of the brain that's very important to memory and is affected by Alzheimer's disease," Parasuraman says. 

Novel experiences and new learning help new brain cells become part of the circuitry. Parasuraman points to a study of terminally ill cancer patients whose brains were still forming new neurons. "If a person who's in a terminally ill state can generate new neurons, then surely healthy people can," Parasuraman says. 

Brain games and new experiences may build up "white matter," which insulates neurons as they carry signals, Greenwood says. In older brains, this white matter insulation develops holes and signals go awry. 

Older adults who play video games are gaining skills that help them navigate everyday life, Parasuraman says. "We are looking at everyday problem solving," he says. "Are you better at balancing a checkbook? Are you better at making decisions in a grocery store? We're finding you get better at those tasks (after playing the video games in the study)." 

Moving large muscle groups also builds brain matter. In one study detailed in the book, older, sedentary people began walking or did stretching exercises for 45 minutes, three times a week. "Those people actually became smarter over time," Greenwood says. "You don't have to be running Ironman marathons. You can just walk briskly three or four times a week." 

Another best bet for an active mind is a nutritious diet that limits calories to the minimum amount needed to keep a body healthy. No starvation diets, though. "The strongest evidence we have is not very pleasant, which is dietary restriction, reducing calories," Parasuraman says. "That clearly improves longevity and cognition. The evidence in animals is very strong. Such dietary restriction may never be popular. But perhaps every-other-day fasting as an approximation to it is something people would tolerate: You eat normally one day, and the next day you don't." 

Popping supplements won't fill a nutritionally deficient diet, Parasuraman says. "A lot of people think, 'I can eat junk food and then take a pill.' No. You have to eat fruits and vegetables, leafy vegetables. It has to be part of the regular diet because otherwise it's not absorbed." 

Fat cells help make up cell membranes. The unsaturated fats found in fish and olive oils may boost flexibility in these membranes. The more flexible membranes are, the better they may work, scientists theorize. Saturated fats such as butter have to go because these fats vie with healthy fats for a place in the cell membrane, Greenwood explains. 

Greenwood and Parasuraman want people to know that getting old doesn't mean getting senile. "The bottom line message of the book is really a hopeful one," Greenwood says. "There are lots of things that you can do (to keep your brain healthy)." 

Source: George Mason University [April 04, 2012]

4/01/2012

Death anxiety increases atheists' unconscious belief in God


New research suggests that when non-religious people think about their own death they become more consciously skeptical about religion, but unconsciously grow more receptive to religious belief. 


The research, from the Department of Psychology at the University of Otago in New Zealand, also found that when religious people think about death, their religious beliefs appear to strengthen at both conscious and unconscious levels. The researchers believe the findings help explain why religion is such a durable feature of human society. 

In three studies, which involved 265 university students in total, religious and non-religious participants were randomly assigned to "death priming" and control groups. Priming involved asking participants to write about their own death or, in the control condition, about watching TV. 

In the first study, researchers found that death-primed religious participants consciously reported greater belief in religious entities than similar participants who had not been death-primed. Non-religious participants who had been primed showed the opposite effect: they reported greater disbelief than their fellow non-religious participants in the control condition. 

Study co-author Associate Professor Jamin Halberstadt says these results fit with the theory that fear of death prompts people to defend their own worldview, regardless of whether it is a religious or non-religious one. 

"However, when we studied people's unconscious beliefs in the two later experiments, a different picture emerged. While death-priming made religious participants more certain about the reality of religious entities, non-religious participants showed less confidence in their disbelief," Associate Professor Halberstadt says. 

The techniques used to study unconscious beliefs include measuring the speed with which participants can affirm or deny the existence of God and other religious entities. After being primed by thoughts of death, religious participants were faster to press a button to affirm God's existence, but non-religious participants were slower to press a button denying God's existence. 

"These findings may help solve part of the puzzle of why religion is such a persistent and pervasive feature of society. Fear of death is a near-universal human experience and religious beliefs are suspected to play an important psychological role in warding off this anxiety. As we now show, these beliefs operate at both a conscious and unconscious level, allowing even avowed atheists to unconsciously take advantage of them." 

The paper co-authors also included Jonathan Jong, currently at the University of Oxford, who undertook the experiments as part of his PhD thesis, and Matthias Bluemke, currently at the University of Heidelberg. Associate Professor Halberstadt was Jong's supervisor. 

The findings from the three experiments will be published in the Journal of Experimental Social Psychology. 

Source: University of Otago [April 01, 2012]

DNA sequencing lays foundation for personalized cancer treatment


Scientists at Washington University School of Medicine in St. Louis are using powerful DNA sequencing technology not only to identify mutations at the root of a patient's tumor – considered key to personalizing cancer treatment – but to map the genetic evolution of disease and monitor response to treatment. 

The Genomics of Drug Sensitivity in Cancer project released its first results on July 15th. Researchers released a first dataset from a study that will expose 1,000 cancer cell lines (including ovarian) to 400 anticancer treatments [Washington University]
"We're finding clinically relevant information in the tumor samples we're sequencing for discovery-oriented research studies," says Elaine Mardis, PhD, co-director of The Genome Institute at the School of Medicine. "Genome analysis can play a role at multiple time points during a patient's treatment, to identify 'driver' mutations in the tumor genome and to determine whether cells carrying those mutations have been eliminated by treatment." 

This work is helping to guide the design of future cancer clinical trials in which treatment decisions are based on results of sequencing, says Mardis, who is speaking April 1 at the opening plenary session of the American Association for Cancer Research annual meeting in Chicago. She also is affiliated with the Siteman Cancer Center at the School of Medicine and Barnes-Jewish Hospital. 

To date, Mardis and her colleagues have sequenced all the DNA – the genome – of tumor cells from more than 700 cancer patients. By comparing the genetic sequences in the tumor cells to healthy cells from the same patient, they can identify mutations underlying each patient's cancer. 

Already, information gleaned through whole-genome sequencing is pushing researchers to reclassify tumors based on their genetic makeup rather than their location in the body. In patients with breast cancer, for example, Mardis and her colleagues have found numerous driver mutations in genes that have not previously been associated with breast tumors. 

A number of these genes have been identified in prostate, colorectal, lung or skin cancer, as well as leukemia and other cancers. Drugs that target mutations in these genes, including imatinib, ruxolitinib and sunitinib, while not approved for breast cancer, are already on the market for other cancers. 

"We are finding genetic mutations in multiple tumor types that could potentially be targeted with drugs that are already available," Mardis says. 

She predicts, however, that it may require a paradigm change for oncologists to evaluate the potential benefits of individualized cancer therapy. While clinical trials typically involve randomly assigning patients to a particular treatment regimen, a personalized medicine approach calls for choosing drugs based on the underlying mutations in each patient's tumor. 

"Having all treatment options available for every patient doesn't fit neatly into the confines of a carefully designed clinical trial," Mardis acknowledges. "We're going to need more flexibility." 

When mutations develop during the course of cancer is also likely to be important in decisions about treatment. In a recent study, Mardis and her team mapped the genetic evolution of leukemia and found clues to suggest that targeted cancer drugs should be aimed at mutations that develop early in the course of the disease. 

Using "deep digital sequencing," a technique developed at The Genome Institute, they sequenced individual mutations in patients' tumor samples more than 1,000 times each. This provided a read-out of the frequency of each mutation in a patient's tumor genome and allowed the researchers to map the genetic evolution of cancer cells as the disease progressed. 
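The frequency read-out amounts to a simple ratio: of the ~1,000 reads covering a site, what fraction carry the mutation? The read counts below are invented for illustration, not data from the study.

```python
def allele_frequency(variant_reads, total_reads):
    """Fraction of sequencing reads at a site that carry the mutation.
    At ~1,000x coverage, this estimates the mutation's prevalence in the
    sampled tumor cells at that time point."""
    return variant_reads / total_reads

# One mutation assayed at two time points (hypothetical counts):
diagnosis = allele_frequency(412, 1024)   # ≈ 0.40
relapse   = allele_frequency(489, 1001)   # ≈ 0.49

# A founding-clone mutation persists (or rises) as the disease progresses,
# while mutations private to a dying subclone would drop toward zero.
assert relapse > diagnosis
```

Tracking such ratios per mutation over successive samples is what lets the researchers reconstruct which mutation clusters were present at the founding of the tumor and which arose later.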

They found that as cancer evolves, tumors acquire new mutations but always retain the original cluster of mutations that made the cells cancerous in the first place. Their discovery suggests that drugs targeted to cancer may be more effective if they are directed toward genetic changes that occur early in the course of cancer. Drugs that target mutations found exclusively in later-evolving cancer cells likely would not have much effect on the disease because they would not kill all the tumor cells. 

Mardis says that sequencing the entire genome of cancer cells is essential to piecing together an accurate picture of the way cancer cells evolve. If the researchers had sequenced only the small portion of the genome that involves genes, they would not have had the statistical power to track the frequency of mutations over time. (Only 1 to 2 percent of the genome consists of genes.) 

In another study, a phase III clinical trial of post-menopausal women with estrogen-receptor positive breast cancer, the Washington University researchers have shown that sequencing can help to predict which women will respond to treatment with aromatase inhibitors. These estrogen-lowering drugs are often prescribed to shrink breast tumors before surgery. But only about half of women with estrogen-receptor positive breast cancer respond to these drugs, and doctors have not been able to predict which patients will benefit. 

Interestingly, by sequencing patients' breast tumors before and after aromatase inhibitor therapy, the researchers identified substantive genomic changes that had occurred in responsive patients, whereas the genomes of unresponsive patients remained largely unchanged by the therapy. 

"No one has ever looked at treatment response at this level of resolution," Mardis says. "It's so obvious who is responding." 

In addition, the researchers have identified a series of mutations in the breast tumors that have corresponding small-molecule inhibitor drugs that target defective proteins. This finding indicates that for women who are not responding to aromatase inhibitors, treatment options may include combining conventional chemotherapy with the indicated small-molecule inhibitor. 

"We felt it was important to show there could be therapeutic options available to patients who are resistant to aromatase inhibitors," Mardis says. "As we move forward, we think sequencing will contribute crucial information to determining the best treatment options for patients."  

Source: Washington University School of Medicine [April 01, 2012]
