This frightening aspect of scientific “progress” will be sold to the public as a great way to help cure psychological disorders.
Scientists may have just made one of the most important discoveries about the brain in a long time: the National Academies of Sciences, Engineering, and Medicine said in a report released this past week that exercise, controlling blood pressure, and some forms of brain training may help prevent mental decline, Alzheimer’s, or dementia in old age.
While there are still no proven ways to keep this mental deterioration from happening, the new report is an encouraging sign that we may have more power to slow cognitive decline than we once thought. However, more research will be needed before such strategies can be promoted as proven methods for ordinary citizens.
At the very least, these three strategies appear to do no harm, and at least two are good for your health even if they ultimately fail to prevent dementia. The report rests on evidence that changes in the brain begin long before the symptoms of Alzheimer’s and other dementias appear, which may make it possible to catch disease early.
Cognitive training, blood pressure management for people with hypertension, and increased physical activity all show modest but inconclusive evidence that they can help prevent cognitive decline and dementia, but there is insufficient evidence to support a public health campaign encouraging their adoption, says a new report from the National Academies of Sciences, Engineering, and Medicine. Additional research is needed to further understand and gain confidence in their effectiveness, said the committee that conducted the study and wrote the report.
“There is good cause for hope that in the next several years much more will be known about how to prevent cognitive decline and dementia, as more clinical trial results become available and more evidence emerges,” said Alan I. Leshner, chair of the committee and CEO emeritus, American Association for the Advancement of Science. “Even though clinical trials have not conclusively supported the three interventions discussed in our report, the evidence is strong enough to suggest the public should at least have access to these results to help inform their decisions about how they can invest their time and resources to maintain brain health with aging.”
An earlier systematic review published in 2010 by the Agency for Healthcare Research and Quality (AHRQ) and an associated “state of the science” conference at the National Institutes of Health had concluded that there was insufficient evidence to make recommendations about any interventions to prevent cognitive decline and dementia. Since then, understanding of the pathological processes that result in dementia has advanced significantly, and a number of clinical trials of potential preventive interventions have been completed and published. In 2015, the National Institute on Aging (NIA) contracted with AHRQ to conduct another systematic review of the current evidence. NIA also asked the National Academies to convene an expert committee to help inform the design of the AHRQ review and then use the results to make recommendations to inform the development of public health messaging, as well as recommendations for future research. This report examines the most recent evidence on steps that can be taken to prevent, slow, or delay the onset of mild cognitive impairment and clinical Alzheimer’s-type dementia as well as steps that can delay or slow age-related cognitive decline.
Overall, the committee determined that despite an array of advances in understanding cognitive decline and dementia, the available evidence on interventions derived from randomized controlled trials – considered the gold standard of evidence – remains relatively limited and has significant shortcomings. Based on the totality of available evidence, however, the committee concluded that three classes of interventions can be described as supported by encouraging but inconclusive evidence. These interventions are:
- Cognitive training – which includes programs aimed at enhancing reasoning and problem solving, memory, and speed of processing – to delay or slow age-related cognitive decline. Such structured training exercises may or may not be computer-based.
- Blood pressure management for people with hypertension – to prevent, delay, or slow clinical Alzheimer’s-type dementia.
- Increased physical activity – to delay or slow age-related cognitive decline.
Cognitive training has been the object of considerable interest and debate in both the academic and commercial sectors, particularly within the last 15 years. Good evidence shows that cognitive training can improve performance on a trained task, at least in the short term. However, debate has centered on evidence for long-term benefits and whether training in one domain, such as processing speed, yields benefits in others, such as in memory and reasoning, and if this can translate to maintaining independence in instrumental activities of daily living, such as driving and remembering to take medications. Evidence from one randomized controlled trial suggests that cognitive training delivered over time and in an interactive context can improve long-term cognitive function as well as help maintain independence in instrumental activities of daily living for adults with normal cognition. However, results from other randomized controlled trials that tested cognitive training were mixed.
Managing blood pressure for people with hypertension, particularly during midlife – generally ages 35 to 65 years – is supported by encouraging but inconclusive evidence for preventing, delaying, and slowing clinical Alzheimer’s-type dementia, the committee said. The available evidence, together with the strong evidence for blood pressure management in preventing stroke and cardiovascular disease and the relative benefit/risk ratio of antihypertensive medications and lifestyle interventions, is sufficient to justify communication with the public regarding the use of blood pressure management, particularly during midlife, for preventing, delaying, and slowing clinical Alzheimer’s-type dementia, the report says.
It is well-documented that physical activity has many health benefits, and some of these benefits – such as stroke prevention – are causally related to brain health. The AHRQ systematic review found that the pattern of randomized controlled trials results across different types of physical activity interventions provides an indication of the effectiveness of increased physical activity in delaying or slowing age-related cognitive decline, although these results were not consistently positive. However, several other considerations led the committee to conclude that the evidence is sufficient to justify communicating to the public that increased physical activity for delaying or slowing age-related cognitive decline is supported by encouraging but inconclusive evidence.
None of the interventions evaluated in the AHRQ systematic review met the criteria for being supported by high-strength evidence, based on the quality of randomized controlled trials and the lack of consistently positive results across independent studies. This limitation suggests the need for additional research as well as methodological improvements in the future research. The National Institutes of Health and other interested organizations should support further research to strengthen the evidence base on cognitive training, blood pressure management, and increased physical activity, the committee said. Examples of research priorities for these three classes of interventions include evaluating the comparative effectiveness of different forms of cognitive training interventions; determining whether there are optimal blood pressure targets and approaches across different age ranges; and comparing the effects of different forms of physical activity.
When funding research on preventing cognitive decline and dementia, the National Institutes of Health and other interested organizations should identify individuals who are at higher risk of cognitive decline and dementia; increase participation of underrepresented populations; begin more interventions at younger ages and have longer follow-up periods; use consistent cognitive outcome measures across trials to enable pooling; integrate robust cognitive outcome measures into trials with other primary purposes; include biomarkers as intermediate outcomes; and conduct large trials designed to test the effectiveness of an intervention in broad, routine clinical practices or community settings.
The study was sponsored by the National Institute on Aging. The National Academies of Sciences, Engineering, and Medicine are private, nonprofit institutions that provide independent, objective analysis and advice to the nation to solve complex problems and inform public policy decisions related to science, technology, and medicine. The National Academies operate under an 1863 congressional charter to the National Academy of Sciences, signed by President Lincoln.
Scientists think there may be a few things you can do to keep dementia at bay: train your brain, keep your blood pressure under control and stay active.
According to a report published Thursday by the National Academies of Sciences, Engineering and Medicine (NASEM), there is promising evidence that cognitive training, managing your blood pressure if you have hypertension and increasing your physical activity may help prevent age-related cognitive decline and dementia.
The report’s findings line up with the Alzheimer’s Association’s findings from two years ago, said Keith N. Fargo, the association’s director of Scientific Programs and Outreach. In 2015, the organization published its own review and identified two things that could help minimize the risk of cognitive decline.
“They were increasing physical activity and improving cardiovascular health,” he said.
“The ideas were there before the report,” said Dan G. Blazer, a member of the NASEM committee that conducted the study and the J.P. Gibbons Professor of Psychiatry Emeritus at Duke University Medical Center. “What is good for the heart is good for the brain. Therefore, exercise and controlling high blood pressure are good for the brain.”
And cognitive training is getting a lot of attention now, said Blazer. Cognitive training refers to programs or exercises aimed at improving reasoning, problem-solving, memory and processing speed. Sometimes they can be computer-based.
In one randomized controlled trial of 2,832 participants reviewed by the committee, the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) trial, those who received cognitive training in reasoning and speed of processing showed less decline in those areas ten years later than those who did not.
“(Cognitive training) is an area worthy of looking forward,” said Blazer.
The evidence is encouraging, but not enough to embark on a public health campaign, said Alan I. Leshner, the chair of the NASEM committee and CEO Emeritus of the American Association for the Advancement of Science. In the report, the findings were described as “encouraging, but inconclusive” evidence.
Further research needs to be done, the report added.
Even so, Fargo of the Alzheimer’s Association said the public should understand one thing.
“There are things that you can do to reduce your risk,” he said.
“You can take your own cognitive health and brain health in your hands,” he said. “You can affect it in a positive way.”
The common lineage of great apes and humans split several hundred thousand years earlier than hitherto assumed, according to an international research team headed by Professor Madelaine Böhme from the Senckenberg Centre for Human Evolution and Palaeoenvironment at the University of Tübingen and Professor Nikolai Spassov from the Bulgarian Academy of Sciences. The researchers investigated two fossils of Graecopithecus freybergi with state-of-the-art methods and came to the conclusion that they belong to pre-humans. Their findings, published today in two papers in the journal PLOS ONE, further indicate that the split of the human lineage occurred in the Eastern Mediterranean and not – as customarily assumed – in Africa.
Present-day chimpanzees are humans’ nearest living relatives. Where the last chimp-human common ancestor lived is a central and highly debated issue in palaeoanthropology. Researchers have assumed up to now that the lineages diverged five to seven million years ago and that the first pre-humans developed in Africa. According to the 1994 theory of French palaeoanthropologist Yves Coppens, climate change in Eastern Africa could have played a crucial role. The two studies of the research team from Germany, Bulgaria, Greece, Canada, France and Australia now outline a new scenario for the beginning of human history.
Dental roots give new evidence
The team analyzed the two known specimens of the fossil hominid Graecopithecus freybergi: a lower jaw from Greece and an upper premolar from Bulgaria. Using computer tomography, they visualized the internal structures of the fossils and demonstrated that the roots of premolars are widely fused.
“While great apes typically have two or three separate and diverging roots, the roots of Graecopithecus converge and are partially fused – a feature that is characteristic of modern humans, early humans and several pre-humans including Ardipithecus and Australopithecus,” said Böhme.
The lower jaw, nicknamed ‘El Graeco’ by the scientists, has additional dental root features, suggesting that the species Graecopithecus freybergi might belong to the pre-human lineage. “We were surprised by our results, as pre-humans were previously known only from sub-Saharan Africa,” said Jochen Fuss, a Tübingen PhD student who conducted this part of the study.
Furthermore, Graecopithecus is several hundred thousand years older than the oldest potential pre-human from Africa, the six to seven million year old Sahelanthropus from Chad. The research team dated the sedimentary sequence of the Graecopithecus fossil sites in Greece and Bulgaria with physical methods and got a nearly synchronous age for both fossils – 7.24 and 7.175 million years before present. “It is at the beginning of the Messinian, an age that ends with the complete desiccation of the Mediterranean Sea,” Böhme said.
Professor David Begun, a University of Toronto paleoanthropologist and co-author of this study, added, “This dating allows us to move the human-chimpanzee split into the Mediterranean area.”
Environmental changes as the driving force for divergence
As with the out-of-East-Africa theory, the evolution of pre-humans may have been driven by dramatic environmental changes. The team led by Böhme demonstrated that the North African Sahara desert originated more than seven million years ago, a conclusion based on geological analyses of the sediments in which the two fossils were found. Although the sites are geographically distant from the Sahara, the red-colored silts there are very fine-grained and can be classified as desert dust. An analysis of uranium, thorium, and lead isotopes in individual dust particles yields ages between 0.6 and 3 billion years and points to an origin in Northern Africa.
Moreover, the dusty sediment has a high content of various salts. “These data document for the first time a spreading Sahara 7.2 million years ago, whose desert storms transported red, salty dusts to the north coast of the Mediterranean Sea in its then form,” the Tübingen researchers said. This process is still observable today. However, the researchers’ modelling shows that the past dust flux – up to 250 grams per square meter per year – exceeded recent dust loadings in Southern Europe more than tenfold, comparable to the situation in the present-day Sahel zone in Africa.
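The tenfold comparison above can be checked with back-of-the-envelope arithmetic; note that the modern dust loading used below is an assumed illustrative value, not a figure from the study:

```python
# Back-of-the-envelope check of the dust-flux comparison.
# past_flux is the study's modelled upper estimate; modern_flux is an
# assumed illustrative value for present-day Southern Europe.
past_flux = 250.0   # g per square meter per year (study's upper estimate)
modern_flux = 20.0  # g per square meter per year (assumption)

ratio = past_flux / modern_flux
print(ratio)  # a ratio above 10 is consistent with "more than tenfold"
```

Any assumed modern value below 25 g per square meter per year would yield the "more than tenfold" excess the researchers describe.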
Fire, grass, and water stress
The researchers further showed that, contemporaneous with the development of the Sahara in North Africa, a savannah biome formed in Europe. Using a combination of new methodologies, they studied microscopic fragments of charcoal and plant silicate particles, called phytoliths. Many of the phytoliths identified derive from grasses and particularly from those that use the metabolic pathway of C4-photosynthesis, which is common in today’s tropical grasslands and savannahs. The global spread of C4-grasses began eight million years ago on the Indian subcontinent – their presence in Europe was previously unknown.
“The phytolith record provides evidence of severe droughts, and the charcoal analysis indicates recurring vegetation fires,” said Böhme. “In summary, we reconstruct a savannah, which fits with the giraffes, gazelles, antelopes, and rhinoceroses that were found together with Graecopithecus,” Spassov added.
“The incipient formation of a desert in North Africa more than seven million years ago and the spread of savannahs in Southern Europe may have played a central role in the splitting of the human and chimpanzee lineages,” said Böhme. She calls this hypothesis the North Side Story, recalling the thesis of Yves Coppens, known as East Side Story.
The findings are described in two studies published in PLOS ONE titled “Potential hominin affinities of Graecopithecus from the late Miocene of Europe” and “Messinian age and savannah environment of the possible hominin Graecopithecus from Europe.”
In recent years, the term “plant medicine” has come to be associated with psychedelics like mushrooms and ayahuasca, which are increasingly documented to provide mental and emotional relief to users. But according to a recent analysis from Kew Gardens in the United Kingdom, there are over 28,000 other plants currently being used as medicine throughout the world.
The second annual report from Britain’s Royal Botanic Gardens at Kew, located in London, is the result of the research and analysis of 128 scientists from 12 countries across the globe.
According to their findings, there are 28,187 plants “currently recorded as being of medicinal use.”
The report explains:
“In many regions of the world, people still rely on traditional plant-based medicines for their primary healthcare. This is especially true for many rural communities in Africa, parts of Asia, and Central and South America, where plants and knowledge of their traditional use are accessible and affordable. In other countries, many of these traditional plant-based medicines are being integrated through regulations into mainstream health systems.”
Though plant medicines are making their way into the mainstream, the researchers note that currently, just “16% (4,478) of the species used in plant-based medicines are cited in a medicinal regulatory publication.” Even so, they note data on drugs approved by the FDA and similar agencies:
“Since 1981, 1,130 new therapeutic agents have been approved for use as pharmaceutical drugs, of which 593 are based on compounds from natural sources. Thirty-eight are derived from medicinal plants. Fifteen of the 56 natural drugs registered for the treatment of cancer since 1980 are derived from medicinal plants with a long history of traditional use.”
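The proportions implied by these figures are easy to verify; a minimal arithmetic sketch using only the numbers quoted in the report:

```python
# Arithmetic check of the figures quoted from the Kew report.
total_agents = 1130     # new therapeutic agents approved since 1981
natural_based = 593     # based on compounds from natural sources
cancer_natural = 56     # natural drugs registered for cancer since 1980
cancer_plant = 15       # of those, derived from traditional medicinal plants

share_natural = natural_based / total_agents        # roughly half
share_cancer_plant = cancer_plant / cancer_natural  # roughly a quarter
regulated_share = 4478 / 28187  # species cited in a regulatory publication

print(f"{share_natural:.0%} of new agents are natural-product based")
print(f"{share_cancer_plant:.0%} of natural cancer drugs come from medicinal plants")
print(f"{regulated_share:.0%} of medicinal species appear in regulatory publications")
```

The last figure reproduces the “16% (4,478)” of 28,187 medicinal species cited earlier in the report.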
They note, for example, that “The anti-cancerous drugs vincristine and vinblastine are derived from the Madagascar periwinkle, Catharanthus roseus in the Apocynaceae family.”
“For example drugs based on Paclitaxel have been isolated from the yew tree (Taxus spp.), Camptothecin from the happy tree, (Camptotheca acuminata) and Podophyllotoxin from the May apple (Podophyllum hexandrum and P. peltatum).”
Further, researchers have discovered over 1,000 species of beneficial plants since their survey last year. As Yahoo News summarized, “new plants discovered over the past year include nine species of a climbing vine used in the treatment of Parkinson’s disease.”
The report said two plant-derived compounds, artemisinin and quinine, are ‘among the most important weapons’ against malaria, which killed over 400,000 people in 2015, Yahoo summarized.
“The report is highlighting the huge potential that there is for plants, in areas like diabetes and malaria,” said Monique Simmonds, deputy director of science at Kew, according to Yahoo. “One study documents 656 flowering plant species used traditionally for diabetes, representing 437 genera and 111 families,” the report explains.
It also points out that of “only five drugs developed specifically for the symptomatic treatment of Alzheimer’s disease, two are derived from plants.”
Some particularly medicinally rich plant families include Fabaceae (pea and bean), Lamiaceae (mint), Euphorbiaceae (spurge), Apocynaceae (dogbane), Malvaceae (mallow), Apiaceae (parsley or carrot), and Ranunculaceae (buttercup). Their key classes of compounds are alkaloids (Fabaceae), terpenes (Lamiaceae), diterpenoids (Euphorbiaceae), cardiac glycosides (Apocynaceae), organic acids (Malvaceae), coumarins (Apiaceae), and alkaloids (Ranunculaceae). Another highly useful family documented in the report is Moraceae, members of which are used in the treatment of diabetes.
Though their report offers great promise, they highlight some pitfalls. Stressing that correct labeling of plant medicines is vital, they explain:
“Product labeling is frequently misleading, with the trade name ‘ginseng’, for example, referring to 15 different species of plant, each with its own particular chemistry and therapeutic properties. Substitution by a Belgian clinic of one Chinese medicinal herb (‘Fang Ji’) with another sharing the same name, led to over 100 patients requiring kidney dialysis for the remainder of their lives.”
They also point out the threat to the plants themselves.
“Increasing demand for herbal medicines (particularly for species covered by pharmacopoeias) threatens wild populations of many of these plants,” they note, adding that “the focus of world trade on relatively few species of medicinal plants leads to sustainability and conservation issues, which ultimately lead to other plants being substituted, with potential risks to human health.”
They advocate more precise scientific labeling of plants and more “clarity on which plants have or have not been studied in drug discovery programmes.”
“Such approaches,” they contend, “will be hugely important in improving our ability to realise current and future medicinal benefits from plants.”
As pharmaceutical drugs continue to carry hazardous consequences, the healing power of natural plants appears to hold great promise for people seeking treatment without the chemical side effects of today’s popular medicines.
This article originally appeared on The Anti-Media.
Sifting through teaspoons of clay and sand scraped from the floors of caves, German researchers have managed to isolate ancient human DNA — without turning up a single bone.
Their new technique, described in a study published on Thursday in the journal Science, promises to open new avenues of research into human prehistory and was met with excitement by geneticists and archaeologists.
“It’s a bit like discovering that you can extract gold dust from the air,” said Adam Siepel, a population geneticist at Cold Spring Harbor Laboratory.
“An absolutely amazing and exciting paper,” added David Reich, a genetics professor at Harvard who focuses on ancient DNA.
Until recently, the only way to study the genes of ancient humans like the Neanderthals and their cousins, the Denisovans, was to recover DNA from fossil bones.
But they are scarce and hard to find, which has greatly limited research into where early humans lived and how widely they ranged. The only Denisovan bones and teeth that scientists have, for example, come from a single cave in Siberia.
Looking for these genetic signposts in sediment has become possible only in the last few years, with recent developments in technology, including rapid sequencing of DNA.
Although DNA sticks to minerals and decayed plants in soil, scientists did not know whether it would ever be possible to fish out gene fragments that were tens of thousands of years old and buried deep among other genetic debris.
Bits of genes from ancient humans make up just a minute fraction of the DNA floating around in the natural world.
But the German scientists, led by Matthias Meyer at the Max Planck Institute for Evolutionary Anthropology in Leipzig, have spent years developing methods to find DNA even where it seemed impossibly scarce and degraded.
“There’s been a real revolution in technology invented by this lab,” Dr. Reich said. “Matthias is kind of a wizard in pushing the envelope.”
Scientists began by retrieving DNA from ancient bones: first Neanderthals, then Denisovans.
To identify the Denisovans, Svante Pääbo, a geneticist at the Max Planck Institute and a co-author of the new paper, had only a child’s pinkie bone to work with.
His group surprised the world in 2010 by reporting that it had extracted DNA from the bone, finding that it belonged to a group of humans distinct from both Neanderthals and modern humans.
But that sort of analysis is limited by the availability of fossil bones.
“In a lot of cases, you can get bones, but not enough,” said Hendrik Poinar, an evolutionary geneticist at McMaster University.
“If you just have one small piece of bone from one site, curators do not want you to grind it up.”
Finding and analyzing ancient DNA in dirt is far more difficult than getting it out of bone. The idea was not new, noted Viviane Slon, a member of Dr. Meyer’s group and the first author of the new paper.
Other groups of researchers have found DNA in sediment before, including Dr. Poinar and Michael Hofreiter, his former student. Using a tablespoon of dirt from a cave in Colorado, Dr. Poinar’s team discovered traces of 16 animal species that had lived there. It took two weeks to do it.
Researchers who had scoured that cave for bones had spent 20 years there and had sifted through two metric tons of dirt to find bones, teeth or skin of 20 animal species — including the 16 that Dr. Poinar’s group later identified.
The new study involved searching for ancient DNA in four caves in Eurasia where humans were known to have lived between 14,000 and 550,000 years ago.
Dr. Meyer and his colleagues figured out which DNA in the cave sediment was prehistoric by looking for telltale signs of degradation at the ends of the molecules.
They then plucked out DNA from Neanderthals and Denisovans by using molecular hooks to snare genes in mitochondria — the cells’ energy factories — that are unique to these humans.
The scientists also built a robotic system to analyze the samples quickly; the old way, pipetting by hand, required several days to analyze only a fraction as many samples.
The group needed that efficiency. From different dirt samples, they recovered between 5,000 and 2.8 million DNA fragments. The number of DNA fragments per sample that were from ancient humans was minuscule and ranged from 0 to 8,822, depending on the site in the cave.
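The damage-based screening described above can be illustrated with a toy sketch. This is not the authors’ pipeline, just a hypothetical filter that flags reads carrying the characteristic C-to-T changes near fragment ends, the deamination damage that marks DNA as ancient:

```python
# Toy illustration (not the study's actual method): ancient DNA shows
# cytosine deamination, which appears as C->T mismatches concentrated
# at fragment ends when reads are aligned to a reference sequence.
def looks_ancient(read: str, reference: str, end_window: int = 3) -> bool:
    """Flag a read as ancient-like if it carries a C->T change
    (reference C, read T) within `end_window` bases of either end."""
    assert len(read) == len(reference)
    n = len(read)
    for i, (ref_base, read_base) in enumerate(zip(reference, read)):
        near_end = i < end_window or i >= n - end_window
        if near_end and ref_base == "C" and read_base == "T":
            return True
    return False

reference = "CTGGACCA"
reads = ["TTGGACCA", "CTGGACCA", "CTGGACTA"]
ancient = [r for r in reads if looks_ancient(r, reference)]
print(ancient)  # only the reads with terminal C->T damage
```

In real pipelines the signal is statistical, accumulated across thousands of aligned reads rather than a per-read yes/no, but the principle of weighting terminal C-to-T mismatches is the same.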
The discovery that it is now possible to do all this, Dr. Reich said, is just “an amazing, amazing thing.” The questions that can now be addressed seem almost endless.
Researchers could feasibly begin searching for bones in caves where DNA in the dirt indicates habitation by ancient humans. And they are likely to begin learning much more about human prehistory.
The Denisovans, for example: Tiny pieces of genes inherited from them have been found in modern humans in Papua New Guinea. How did they get there? And why these people, and not humans closer to Siberia?
With the new technique, one way to try to verify the presence of humans would be to look for ancient human DNA at the site where the bones were found or in areas nearby.
“A natural thing to do is start looking in sediments,” said Jonathan Pritchard, a professor of genetics and biology at Stanford.
Another application of the discovery, said Dr. Reich, would be to start looking for evidence of ancient human DNA in open air sites, instead of looking for bones in caves.
“If it worked, it would provide a much richer picture of the geographic distribution and migration patterns of ancient humans, one that was not limited by the small number of bones that have been found,” he said.
“That would be a magical thing to do.”
Scientists have created an “artificial womb” in the hopes of someday using the device to save babies born extremely prematurely.
So far the device has only been tested on fetal lambs. A study published Tuesday involving eight animals found the device appears effective at enabling very premature fetuses to develop normally for about a month.
“We’ve been extremely successful in replacing the conditions in the womb in our lamb model,” says Alan Flake, a fetal surgeon at Children’s Hospital of Philadelphia who led the study published in the journal Nature Communications.
“They’ve had normal growth. They’ve had normal lung maturation. They’ve had normal brain maturation. They’ve had normal development in every way that we can measure it,” Flake says.
Flake says the group hopes to test the device on very premature human babies within three to five years.
“What we tried to do is develop a system that mimics the environment of the womb as closely as possible,” Flake says. “It’s basically an artificial womb.”
The device consists of a clear plastic bag filled with synthetic amniotic fluid. A machine outside the bag is attached to the umbilical cord to function like a placenta, providing nutrition and oxygen to the blood and removing carbon dioxide.
“The whole idea is to support normal development; to re-create everything that the mother does in every way that we can to support normal fetal development and maturation,” Flake says.
Other researchers praised the advance, saying it could help thousands of babies born very prematurely each year, if tests in humans were to prove successful.
Jay Greenspan, a pediatrician at Thomas Jefferson University, called the device a “technological miracle” that marks “a huge step to try to do something that we’ve been trying to do for many years.”
The device could also help scientists learn more about normal fetal development, says Thomas Shaffer, a professor of physiology and pediatrics at Temple University.
“I think this is a major breakthrough,” Shaffer says.
The device in the fetal lamb experiment is kept in a dark, warm room where researchers can play the sounds of the mother’s heart for the lamb fetus and monitor the fetus with ultrasounds.
Previous research has shown that lamb fetuses are good models for human fetal development.
“If you can just use this device as a bridge for the fetus then you can have a dramatic impact on the outcomes of extremely premature infants,” Flake says. “This would be a huge deal.”
But others say the device raises ethical issues, including many questions about whether it would ever be acceptable to test it on humans.
“There are all kinds of possibilities for stress and pain with not, at the beginning, a whole lot of likelihood for success,” says Dena Davis, a bioethicist at Lehigh University.
Flake says ethical concerns need to be balanced against the risk of death and severe disabilities babies often suffer when they are born very prematurely. A normal pregnancy lasts about 40 weeks. A human device would be designed for those born 23 or 24 weeks into pregnancy.
Only about half of such babies survive and, of those that do, about 90 percent suffer severe complications, such as cerebral palsy, mental retardation, seizures, paralysis, blindness and deafness, Flake says.
About 30,000 babies are born earlier than 26 weeks into pregnancy each year in the United States, according to the researchers.
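Taken together, the survival and complication figures cited above imply rough annual numbers; a quick arithmetic sketch (illustrative only, since the rates are approximate):

```python
# Rough arithmetic on the prematurity figures cited in the article.
born_under_26_weeks = 30_000  # per year in the U.S., per the researchers
survival_rate = 0.5           # "only about half of such babies survive"
complication_rate = 0.9       # "about 90 percent" of survivors

survivors = born_under_26_weeks * survival_rate
with_severe_complications = survivors * complication_rate
print(int(survivors), int(with_severe_complications))
```

On these approximate rates, roughly 15,000 such babies would survive each year, and about 13,500 of them would face severe complications, the population an artificial womb would aim to help.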
Davis worries that the device is not necessarily a good solution for human fetuses.
“If it’s a difference between a baby dying rather peacefully and a baby dying under conditions of great stress and discomfort then, no, I don’t think it’s better,” Davis says.
“If it’s a question of a baby dying versus a baby being born who then needs to live its entire life in an institution, then I don’t think that’s better. Some parents might think that’s better, but many would not,” she says.
And even if it works, Davis also worries about whether this could blur the line between a fetus and a baby.
“Up to now, we’ve been either born or not born. This would be halfway born, or something like that. Think about that in terms of our abortion politics,” she says.
Some worry that others could take this technology further. Other scientists are already keeping embryos alive in their labs longer than ever before, and trying to create human sperm, eggs and even embryo-like entities out of stem cells. One group recently created an artificial version of the female reproductive system in the lab.
“I could imagine a time, you know sort of [a] ‘Brave New World,’ where we’re growing embryos from the beginning to the end outside of our bodies. It would be a very Gattaca-like world,” says Davis, referring to the 1997 science-fiction film.
There’s also a danger such devices might be used coercively. States could theoretically require women getting abortions to put their fetuses into artificial wombs, says Scott Gelfand, a bioethicist at Oklahoma State University.
Employers could also require female employees to use artificial wombs to avoid maternity leave, he says. Insurers could require use of the device to avoid costly complicated pregnancies and deliveries.
“The ethical implications are just so far-reaching,” Gelfand says.
Barbara Katz Rothman, a sociologist at the City University of New York, says more should be done to prevent premature births. She worries about the technological transformation of pregnancy.
“The problem is a baby raised in a machine is denied a human connection,” Rothman says. “I think that’s a scary, tragic thing.”
Flake says his team has no interest in trying to gestate a fetus any earlier than about 23 weeks into pregnancy.
“I want to make this very clear: We have no intention and we’ve never had any intention with this technology of extending the limits of viability further back,” Flake says. “I think when you do that you open a whole new can of worms.”
Flake doubts anything like that would ever be possible.
“That’s a pipe dream at this point,” Flake says.
Research conducted by a team of Scandinavian scientists came to a startling conclusion regarding the DTP vaccine, which is supposed to protect children from diphtheria, pertussis, and tetanus. Though they found that the vaccine can prevent those diseases, it does so at a terrible cost.
The research, which was partly funded by the Danish government, derived its data from a vaccination campaign conducted in the African nation of Guinea-Bissau during the 1980s. Initially, the campaign offered parents the opportunity to have their babies weighed every three months, and in 1981 it started giving out DTP vaccines during these sessions. Because the babies were only allowed to be vaccinated at a certain age, some were not vaccinated, which created a natural control group.
It turns out that the babies who were vaccinated had a mortality rate that was, on average, five times higher than that of the unvaccinated infants. Vaccinated girls were 9.98 times more likely to die after being vaccinated, and boys were 3.93 times more likely to die.
These figures come from children who had also received the polio vaccine, and those children, strangely, had a much lower mortality rate. The children who received only the DTP vaccine had, on average, a mortality rate ten times higher than the control group. The researchers believe that the vaccine must have stifled the immune systems of these children, leaving them open to multiple infections.
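The comparison described above is a mortality rate ratio. As a minimal sketch of how such a ratio is computed (all counts below are invented for illustration and are not the study's data):

```python
# Minimal sketch of a mortality rate ratio, the statistic described
# above. All counts here are INVENTED for illustration; they are not
# the Guinea-Bissau study's data.

def mortality_rate(deaths: int, person_years: float) -> float:
    """Deaths per person-year of follow-up."""
    return deaths / person_years

def rate_ratio(exposed: float, unexposed: float) -> float:
    """How many times higher the exposed group's mortality is."""
    return exposed / unexposed

# Invented example: 50 deaths vs. 10 deaths over equal follow-up.
vaccinated = mortality_rate(deaths=50, person_years=1000.0)
unvaccinated = mortality_rate(deaths=10, person_years=1000.0)
print(rate_ratio(vaccinated, unvaccinated))
```

A ratio of 5 would correspond to the “five times higher” average mortality the article reports.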
The researchers wrote that “It should be of concern that the effect of routine vaccinations on all-cause mortality was not tested in randomized trials. All currently available evidence suggests that DTP vaccine may kill more children from other causes than it saves from diphtheria, tetanus or pertussis. Though a vaccine protects children against the target disease it may simultaneously increase susceptibility to unrelated infections.”
The study only looked at children who were healthy before being vaccinated. Because of that, the researchers noted “The estimate from the natural experiment may therefore still be conservative.”
Contributed by Daniel Lang of The Daily Sheeple.
For hundreds of years, biologists knew of the giant shipworm only from shell fragments and a handful of dead specimens. Those specimens, despite being preserved in museum jars, had gone to mush. Still, the shipworm’s scattered remains made an outsize impression on biologists. Its three-foot-long tubular shells — the shipworm isn’t technically a worm but a bivalve — were so striking that Swedish taxonomist Carl Linnaeus included the animal in “Systema Naturae,” the book that introduced the scientific naming system.
And yet no one could get their hands on a living example of the giant shipworm, or Kuphus polythalamia. Other shipworms earned their name by eating their way into the sides of wooden boats, but no one knew where the giant shipworm lived.
“It’s sort of the unicorn of mollusks,” Margo Haygood, a marine microbiologist at the University of Utah, told The Washington Post.
The habitat of the world’s longest clam is a mystery no longer. As Haygood and her colleagues reported Monday in the Proceedings of the National Academy of Sciences, the search for the giant shipworm has come to an end.
Television news in the Philippines dealt the mortal blow to the shipworm’s near-mythical status. A TV station aired a short documentary segment about strange shellfish living in a lagoon. The show filmed the mollusks growing in the muck, as though someone had planted rows of elephant tusks. As luck would have it, a colleague of Haygood’s in the Philippines caught wind of the segment. Researchers investigated the lagoon, where they plucked a live shipworm from the mud, slipped it along with some seawater into a PVC pipe and shipped the animal to a laboratory.
“I’ve been studying shipworms since 1989 and in all that time I had never seen a living specimen of Kuphus polythalamia,” Daniel Distel, a co-author of the new study and the director of Northeastern University’s Ocean Genome Legacy Center, wrote in an email. “It was pretty spectacular to lift that tube out of its container for the first time.”
Distel carefully chipped away at the giant shipworm’s massive shell. Smaller shipworms are fleshy pink, beige or white, as are most clams. Not the giant shipworm. Its body is black.
“To see this giant gunmetal black specimen was amazing,” Distel said. “On the one hand I was pretty excited to see what it looked like inside. On the other hand it was a little intimidating to dissect this incredibly rare specimen.”
A full-grown giant shipworm reaches up to three feet long, which means that when draped across the width of a twin bed, the clam would just barely fit. “It’s quite heavy. It’s like picking up a tree branch or something even heavier,” Haygood said. “The living animal is just magnificent.”
What’s more, the giant shipworm barely has a digestive system. “It’s not feeding in any normal way,” Haygood said.
The clam has a mouth and a small stomach, but its gills are supersize. Living within those gills are bacteria. That’s not unusual for shipworms: The clams, as a rule, have symbiotic relationships with microbes. Usually, though, the microbes help shipworms digest wood.
In the case of the giant shipworm, the scientists found grains of sulfur packed into the bacteria. The marine biologists suspect that, at some point in the shipworm’s evolution, the animal traded its wood-digesting bacteria for bacteria that feed off sulfur compounds.
The study “provides a fascinating example of symbiont displacement, a phenomenon we are only just beginning to observe more regularly in nature, thanks to advances in sequencing which have provided us with the tools to unravel the evolutionary history of microbes,” said Nicole Dubilier, director of the Max Planck Institute for Marine Microbiology, who was not involved in the study. “What we are now seeing is unexpected: symbioses are not as stable as we previously assumed.”
The symbiotic arrangement between microbe and giant shipworm was similar to one found in deep-sea hydrothermal vents. Thousands of feet below the surface, beyond the reaches of sunlight, tube worms also get their nutrients from bacteria that consume sulfides. Despite their similar names, though, tube worms and shipworms aren’t close relatives. Tube worms are annelids — they’re actual worms, like earthworms, not clams.
But the symbiotic bacteria in both deep-sea worms and the lagoon-living clams are related to each other. “So this is a case of convergent evolution,” Distel said. That is, both the worms and clams independently arrived at the same conclusion: Housing bacteria inside their bodies was a fine way to stay nourished.
Haygood said the presence of the sulfide-consuming bacteria suggested that the lagoon, perhaps filled with rotting wood or other organic matter, produced hydrogen sulfide.
The discovery lends support to a hypothesis proposed by Distel in 2000 about the origins of animals that live in deep-sea vents. In Distel’s theory, mussels that lived in wood and harbored the sulfide-eating bacteria might have sunk to the vents. Far below, they flourished on sulfide released from the vents.
“Wood provided an ecological bridge, helping them to invade the vents,” he said. The discovery of the new shipworm indicated that shallow lagoons could have served as the location for the switch in bacteria types: First the wood served directly as food for clams. But once the clams began to take in the sulfur-loving bacteria, the wood provided a source of the hydrogen sulfide for the microbes.
“This is an extremely rare example where we were actually able to find fairly direct evidence about how this particular symbiosis evolved,” in which the clams traded one type of bacteria for the other, Distel said.
The “new” finding by U.S. scientists has therefore provided proof of what the Nazis knew in 1933: that criminal behavior is largely a genetic, hereditary issue.
Duplicating a conclusion made by National Socialist scientists over 80 years ago, scientists in Europe and the United States have this week announced the identification of two genes which in a mutated form are found in a “substantially higher frequency” in violent offenders — meaning that such criminal traits are likely to be inherited.
According to the study titled “Genetic background of extreme violent behavior,” published in the journal Molecular Psychiatry (Molecular Psychiatry, October 28, 2014, doi:10.1038/mp.2014.130), “in developed countries, the majority of all violent crime is committed by a small group of antisocial recidivistic offenders”—in other words by a small group of people who constantly reoffend.
Until now, the study said, no one has identified any genes which contribute to recidivistic violent offending or severe violent behavior, such as homicide.
However, the new study from two independent cohorts of Finnish prisoners “revealed that a monoamine oxidase A (MAOA) low-activity genotype (contributing to low dopamine turnover rate) as well as the CDH13 gene (coding for neuronal membrane adhesion protein) are associated with extremely violent behavior (at least 10 committed homicides, attempted homicides or batteries).”
The study continued: “No substantial signal was observed for either MAOA or CDH13 among non-violent offenders, indicating that findings were specific for violent offending, and not largely attributable to substance abuse or antisocial personality disorder.
“These results indicate both low monoamine metabolism and neuronal membrane dysfunction as plausible factors in the etiology of extreme criminal violent behavior, and imply that at least about 5–10 percent of all severe violent crime in Finland is attributable to the aforementioned MAOA and CDH13 genotypes.”
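The “attributable to” figure in the quote is a population attributable fraction, a standard epidemiological quantity. A minimal sketch using Levin’s formula, with invented prevalence and relative-risk inputs (not the study’s estimates):

```python
# Population attributable fraction (Levin's formula): the share of
# cases that would not occur if the exposure (here, a genotype) were
# absent, assuming the relative risk reflects a causal effect.

def population_attributable_fraction(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Invented inputs for illustration (not the study's estimates):
paf = population_attributable_fraction(prevalence=0.4, relative_risk=1.3)
print(f"{paf:.1%}")  # prints "10.7%"
```

Even a modest relative risk can account for several percent of all cases when the exposure is common, which is how a genotype carried by many people can be linked to 5–10 percent of severe violent crime.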
Study leader Jari Tiihonen and colleagues analyzed the genes of 895 Finnish individuals found guilty of criminal behavior, and classified them by crimes committed, ranging from non-violent offenses (such as drug or property crimes) to extremely violent offenses (10 or more severe violent crimes, consisting of varying degrees of homicide and battery).
The authors found a possible link between violent offenses and MAOA, with the strongest association in the extremely violent offending group.
Through additional research, including a genome-wide association study, the authors identified a variant of cadherin 13 (CDH13)—a gene involved in neural connectivity that has been linked to impulse control—in extremely violent offenders. When compared to the control population, non-violent offenders were not observed to exhibit either variant to a greater degree, indicating that these genetic variants may be specific to extremely violent behavior. The authors also suggest that the low dopamine recycling associated with the MAOA genotype may result in higher aggression levels during intoxication, increasing the risk of violent behavior.
In effect, these genes affect complex brain chemistry, which in turn alters behavior. As all genes are inherited, these behavioral traits are passed on from parents to children, creating the well-known phenomenon of criminality running through families.
For example, a 2010 study published in the journal Psychological Medicine (Psychol. Med. 2011 Jan; 41(1):97-105. doi: 10.1017/S0033291710000462. Epub 2010 Mar 25), titled “Violent crime runs in families: a total population study of 12.5 million individuals,” found “strong familial aggregation of interpersonal violence among first-degree relatives [e.g. odds ratio (OR) sibling 4.3, 95 percent confidence interval (CI) 4.2-4.3], lower for more distant relatives (e.g. OR cousin 1.9, 95 percent CI 1.9-1.9).
“Familial risks were stronger among women, in higher socio-economic strata, and for early onset interpersonal violence. There were crime-specific effects (e.g. OR sibling for arson 22.4, 95 percent CI 12.2-41.2), suggesting both general and subtype-specific familial risk factors for violent behavior” and concluded that “The observed familiality should be accounted for in criminological research, applied violence risk assessment, and prevention efforts.”
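The odds ratios quoted above come from 2x2 comparisons of relatives of violent offenders against the general population. A minimal sketch of how an odds ratio and its 95 percent confidence interval are computed, with invented counts (not the registry data):

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio for a 2x2 table:
        a = exposed cases,   b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases,
    with a Wald confidence interval computed on the log scale."""
    odds_ratio = (a * d) / (b * c)
    log_se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * log_se)
    upper = math.exp(math.log(odds_ratio) + z * log_se)
    return odds_ratio, lower, upper

# Invented counts for illustration (not the Swedish registry data):
or_, lo, hi = odds_ratio_ci(a=120, b=880, c=30, d=970)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The extremely tight intervals quoted in the study (e.g. 4.2-4.3) reflect its sample of 12.5 million individuals; smaller counts, as here, give much wider intervals.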
There are also many other anecdotal examples of how criminal behavior is passed from generation to generation (for example, “Crime Runs in the Family,” Sept. 9, 2002, ABC News), and of course the notorious Bogle family in America, whose extended familial incarceration has cost American taxpayers millions of dollars.
According to a report on the Bogles, published in 2002, “For all this criminal activity, the Bogle clan is merely an extreme example of a phenomenon that prison officials, the police and criminal justice experts have long observed, that crime often runs in families.
“Justice Department figures show that 47 percent of inmates in state prisons have a parent or other close relative who has also been incarcerated, said Allen J. Beck of the Bureau of Justice Statistics. Similarly, the link between the generations is so powerful that half of all juveniles in custody have a father, mother or other close relative who has been in jail or prison, Mr. Beck said.”
Significantly, the scientists who have completed the latest study took into account environmental factors—whether or not people had a history of substance abuse, antisocial personality disorders or childhood maltreatment—but these factors did not alter the outcome.
The MAOA gene has been linked to the metabolism of dopamine, a neurotransmitter that plays a role in addiction and the ability to experience pleasure.
In 1934, the German government passed legislation titled the “Law against Dangerous Habitual Criminals,” the result of a December 1933 Ministry of Justice circular requesting that all courts, prosecutors and prison officials report serious “criminals who might suffer from a genetic disease” to the recently established Hereditary Health Courts for a sterilization hearing (Inventing the Criminal: A History of German Criminology, 1880–1945, Richard F. Wetzell, p. 258).
The point of that law was to prevent serious criminals—those convicted of three or more violent crimes—from having children and thereby reproducing the criminal gene in society.
In the official commentary accompanying the Law on Habitual Criminals, the German Ministry of Justice declared that the “task of protecting the nation from the inferior offspring of genetically diseased criminals lies in the area of eugenics, not criminal law” (ibid, p. 260).
The “new” finding by the scientists has therefore provided proof of what the National Socialist government knew in 1933: that criminal behavior is largely a genetic, hereditary issue, and that it can be combated with a strict eugenics program.
* What the new study failed to point out is the fact that scientific studies have shown that American blacks are fifty times more likely to have the variant of MAOA that is associated with violent behavior.
As detailed in the book A Troublesome Inheritance by Nicholas Wade (Penguin Press, May 15, 2014), a research team led by Michael Vaughn of Saint Louis University looked at the MAOA promoters in 2,524 American youths. Of the blacks in the sample, 5 percent carried two MAOA promoters, a condition found to be associated with higher levels of delinquency.
“Members of the two-promoter group were significantly more likely to have been arrested and imprisoned than African Americans who carried three or four promoters. The same comparison could not be made in white, or Caucasian, males, the researchers report, because only 0.1 percent carry the two-promoter allele,” Wade pointed out.