Tuesday, July 24, 2012

New Species of Sharks


Local biologist discovers 79 new species of sharks

  • Posted: Monday, July 23, 2012 12:11 a.m.
    UPDATED: Monday, July 23, 2012 9:49 a.m.
Hollings Marine Laboratory researcher Chenhong Li prepares shark specimens for DNA testing. Studying genetics, the researchers have discovered 79 new species of shark.
FORT JOHNSON — Just when you thought it was safe to go back in the water, Gavin Naylor has discovered 79 new species of sharks.
Yep, that’s right, 79 more than the 1,200 or so species of sharks and rays already known to be out there. Nearly 40 known species are found in waters in the South Carolina region.
As if sharks like the Atlantic sharpnose, sandbar, blacknose, finetooth, bull, tiger and occasional great white weren’t enough.
And, oh, it gets worse.
“We’re pretty sure there are a lot more (species) out there. We’re just a bunch of nerdy scientists doing the best we can. I’m pretty sure we’ve barely scratched the surface,” Naylor said.
OK, breathe a little.
Naylor is a biologist at the Hollings Marine Laboratory, working jointly for College of Charleston and the Medical University of South Carolina.
He’s studying the genetic sequencing of sharks, to find keys in the DNA that might explain how and why organisms “sprout new features,” or evolve.
It’s pretty involved science, but it could have profound “real world” applications to genetic medicine and other fields.
Working with Naylor is researcher Chenhong Li, who invented a way to “velcro” individual genes, getting them to stick to molecules so they can be isolated. “It allows us to ‘suck’ out particular genes and compare them across organisms,” Naylor said.
The technique potentially could allow genetic coding to be compared among 1,200 genes of as many as 100 different specimens at a time.
How important is that?
“We used to do it one gene by one gene,” Li said.
Most of the new shark species are “cryptic,” virtually indistinguishable from known species — except the DNA doesn’t match. The team came across them while using DNA to ensure they had the species they thought they did. They’re using sharks for the study because sharks are the oldest surviving vertebrate animals that, like us, have jaws.
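The logic of that DNA check can be sketched in a few lines of code. This is a toy illustration only, with made-up sequences and an arbitrary threshold, not the lab's actual pipeline: specimens that look identical to a known species get flagged as cryptic candidates when their DNA diverges too far from the reference.

```python
# Toy sketch of cryptic-species detection (hypothetical sequences and
# threshold, not the lab's actual pipeline): specimens that look identical
# are flagged when their DNA diverges too far from a reference sequence.

def divergence(seq_a, seq_b):
    """Fraction of mismatched bases between two aligned sequences."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    mismatches = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return mismatches / len(seq_a)

def flag_cryptic(reference, specimens, threshold=0.02):
    """Return ids of specimens whose divergence exceeds the threshold."""
    return [sid for sid, seq in specimens.items()
            if divergence(reference, seq) > threshold]

reference = "ACGTACGTACGTACGTACGT"
specimens = {
    "shark_01": "ACGTACGTACGTACGTACGT",  # identical: same species
    "shark_02": "ACGTACGAACGTACTTACGT",  # 2 mismatches: cryptic candidate
}
print(flag_cryptic(reference, specimens))  # ['shark_02']
```

Real barcoding work, of course, involves alignment, multiple genes and statistical models; the point here is only that the same morphology can hide measurably different DNA.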
Don’t start that movie music in your head. More species of sharks doesn’t mean there are more sharks out there, just more types.
Besides, only five of the species are likely to take a nip out of us. “The incidence of interaction between sharks and humans is not going to change,” a good-humored Naylor said. “Nothing’s changed, just our knowledge.”
Reach Bo Petersen at 937-5744, @bopete on Twitter or Bo Petersen Reporting on Facebook.

Thursday, July 19, 2012

GeoEngineering

Interesting Idea....


Dumping iron at sea can bury carbon for centuries, study shows

Iron fertilisation creates algae blooms that later die off and sink, taking the absorbed carbon deep towards the ocean floor
A magnified view of the plankton three weeks after its fertilisation with iron. Photograph: Philipp Assmy/Awi/EPA
Dumping iron into the sea can bury carbon dioxide for centuries, potentially helping reduce the impact of climate change, according to a major new study. The work shows for the first time that much of the algae that blooms when iron filings are added dies and falls into the deep ocean.
Geoengineering technologies – those aimed at alleviating global warming – are controversial, with critics warning of unintended environmental side effects and of complacency creeping into global deals to cut carbon emissions. But Prof Victor Smetacek, at the Alfred Wegener Institute for Polar and Marine Research in Germany, who led the new research, said: "The time has come to differentiate: some geoengineering techniques are more dangerous than others. Doing nothing is probably the worst option."
Dave Reay, senior lecturer in carbon management at the University of Edinburgh, said: "This represents a whole new ball game in terms of iron fertilisation as a geoengineering technique. Maybe deliberate enhancement of carbon storage in the oceans has more legs than we thought but, as the scientists themselves acknowledge, it's still far too early to run with it."
A 2009 report from the Royal Society, the UK's science academy, concluded that while cutting emissions is the first priority, careful research into geoengineering was required in case drastic measures – such as trying to block sunlight by pumping sulphate into the atmosphere – were one day needed.
Prof John Shepherd, chair of the report, said on Wednesday: "It is important that we continue to research these technologies but governance of this research is vital to protect the oceans, wider environment and public interests."
Smetacek's team added seven tonnes of iron sulphate to the ocean near Antarctica, where iron levels are extremely low. The addition of the missing nutrient prompted a massive bloom of phytoplankton to begin growing within a week. As the phytoplankton, mostly species of diatom, began to die after three weeks, they sank towards the ocean floor, taking the carbon they had incorporated with them.
The scientists chose the experiment location carefully: a 60km-wide, self-enclosed ocean eddy that acted as a giant "test tube", making it possible to compare what happened within the eddy with control points outside it. After a month of monitoring nutrient and plankton levels from the surface to the depths, the team concluded that at least half of the bloom had fallen to depths below 1,000m and that a "substantial portion was likely to have reached the sea floor" at 3,800m.
The scientists conclude in the journal Nature that the carbon is therefore likely to be kept out of the atmosphere for many centuries or longer.
A dozen other experiments have shown that iron can prompt phytoplankton blooms, but this is the first study to show that the carbon the plants take up is deeply buried. Other researchers recognise the significance of this but warn of issues that might prevent iron fertilisation of the ocean from being a useful geoengineering technique.
"The ocean's capacity for carbon sequestration in low-iron regions is just a fraction of anthropogenic CO2 emissions, and such sequestration is not permanent — it lasts only for decades to centuries," said Ken Buesseler, at the Woods Hole Oceanographic Institution in the US.
Smetacek said ocean iron fertilisation could bury at most 1 gigatonne of CO2 per year, compared with annual emissions of 8-9Gt, of which 4Gt accumulates in the atmosphere. But sequestering even some CO2 could make the difference between crossing and avoiding a climate "tipping" point, where feedback effects lead to runaway global warming, he said: "I don't see what will stop Arctic sea ice from decreasing."
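A quick back-of-the-envelope check puts those figures in proportion. This sketch uses only the numbers quoted in the article, not independent estimates:

```python
# Back-of-envelope check of the figures quoted above, in gigatonnes of
# CO2 per year (values taken from the article, not independent estimates).
annual_emissions = 8.5          # midpoint of the quoted 8-9 Gt range
atmospheric_accumulation = 4.0  # the share that stays in the atmosphere
max_iron_sequestration = 1.0    # Smetacek's stated upper bound

share_of_emissions = max_iron_sequestration / annual_emissions
share_of_accumulation = max_iron_sequestration / atmospheric_accumulation

print(f"{share_of_emissions:.0%} of emissions")        # 12% of emissions
print(f"{share_of_accumulation:.0%} of accumulation")  # 25% of accumulation
```

In other words, even at its ceiling, iron fertilisation would offset roughly a quarter of the CO2 that actually accumulates in the air each year.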
Michael Steinke, director of marine biology at the University of Essex, said: "Will this open up the gates to large-scale geoengineering using ocean fertilisation? Likely not, since the logistics of finding the right spot for such experiments are difficult and costly."
Smetacek responded that ocean iron fertilisation is much cheaper than other possible geoengineering techniques. He acknowledged more experiments were needed over longer periods to examine, for example, how many of the diatoms were eaten by krill, and then by whales, meaning they did not fall to the ocean floor.
On the ethics of geoengineering, Smetacek, who is a vegetarian, told the Guardian: "We could reduce emissions significantly and increase the scope for sequestration on land [by freeing grazing land for forestry] if we managed to convert the global population to vegetarianism. Would that be geoengineering?"
Iron filings and carbon burial. Illustration: guardian.co.uk

Wednesday, July 18, 2012

EvoDevo: Graph Computer Genetics


Your Laptop Can Now Analyze Big Data

New software makes it possible to do in minutes on a small computer what used to be done by large clusters of computers.
Computer scientists from Carnegie Mellon University have devised a framework for running large-scale computations for tasks such as social network or Web search analysis efficiently on a single personal computer.
The software could help developers working on many modern tasks: for example, designing a new recommendation engine using social network connections. In order to make effective recommendations—"your friends liked this movie, so here is another movie that you haven't seen yet, but you will probably like"—the software has to be able to analyze the connections between the members of a social network. This type of task is called graph computation, and it is increasingly common. But working with large-scale data sets (such as online social networks) usually requires the processing horsepower of many computers clustered together, such as those offered by Amazon's cloud-based EC2 service.
The new software, called GraphChi, exploits the capacious hard drives that are becoming ever more common in personal computers. A graph would normally be held in fast temporary memory (RAM) for analysis; with GraphChi, it is stored on the hard drive instead.
"PCs don't have enough RAM to hold an entire Web graph, but they do have hard drives, which can hold a lot of information," says Carlos Guestrin, codirector of Carnegie Mellon's Select Lab, where GraphChi was developed. But hard drives are slow compared to RAM for reading and writing data, which tends to slow down computation. So Guestrin's student Aapo Kyrola designed a faster, less random method of accessing the hard drive.
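The core trick can be illustrated with a minimal sketch. This is not GraphChi's actual code (its real "parallel sliding windows" algorithm is far more sophisticated), but it rests on the same observation: stream the edge list from disk front to back, so every read is sequential, and keep only small per-vertex accumulators in RAM.

```python
# Minimal sketch of disk-based graph processing (NOT GraphChi's actual
# code): one sequential pass over an edge list stored on disk, keeping
# only small per-vertex counters in memory. Sequential reads let a hard
# drive run at full speed; random seeks are what make disks slow.
from collections import defaultdict

def degree_counts(edge_file_path):
    """One sequential pass over an on-disk edge list ('src dst' per line)."""
    out_degree = defaultdict(int)
    in_degree = defaultdict(int)
    with open(edge_file_path) as f:
        for line in f:                 # front-to-back read: no disk seeks
            src, dst = line.split()
            out_degree[src] += 1
            in_degree[dst] += 1
    return out_degree, in_degree
```

The same pattern extends to iterative computations like PageRank if the per-vertex values are also kept in sequentially read shards, which is roughly what GraphChi's sliding-window scheme arranges.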
According to Guestrin, a Mac Mini running GraphChi can analyze Twitter's social graph from 2010—which contains 40 million users and 1.2 billion connections—in 59 minutes. "The previous published result on this problem took 400 minutes using a cluster of about 1,000 computers," Guestrin says.
As technology gets more networked, and data sets get larger, graph computation is becoming more and more relevant in many domains, says David A. Bader, a graph computation expert at Georgia Tech. "Trying to understand how the human brain works or trying to make sense of medical patient records involve graph computing," he says.
Graph analysis also drives the development of new web products, says Jeremy Kepner, a researcher at MIT. "Document search, ad placement, route planning, travel reservations, and cyber security all rely on graph analysis," he says. "Enabling web developers to construct these analyses on their desktop computers catalyzes these industries and accelerates product development."
Guestrin adds that GraphChi can handle "streaming graphs," which more accurately model large networks by showing how relationships change over time. Bader and others at Georgia Tech have created a graph computation framework, called Stinger, that's optimized for supercomputers working with massive streaming graphs.
"The scales of these problems will obviously keep growing," says Guestrin. But he says GraphChi is capable of effectively handling many large-scale graph-computing problems without resorting to cloud-based solutions or supercomputers.
"A researcher in computational biology could do large-scale computations on their PC; a developer working on a data-center algorithm can test it on their laptop before pushing it to the cloud," Guestrin says. "Big data is everywhere now, but some big data isn't as big as it once was, relatively speaking. Tools like GraphChi will let many companies and startups solve all their graph-computing needs on a single machine. It's cost effective, and it drives innovation, too."

Ecology of Disease

http://www.nytimes.com/2012/07/15/sunday-review/the-ecology-of-disease.html?ref=science



The Ecology of Disease

THERE’S a term biologists and economists use these days — ecosystem services — which refers to the many ways nature supports the human endeavor. Forests filter the water we drink, for example, and birds and bees pollinate crops, both of which have substantial economic as well as biological value.
If we fail to understand and take care of the natural world, it can cause a breakdown of these systems and come back to haunt us in ways we know little about. A critical example is a developing model of infectious disease that shows that most epidemics — AIDS, Ebola, West Nile, SARS, Lyme disease and hundreds more that have occurred over the last several decades — don’t just happen. They are a result of things people do to nature.
Disease, it turns out, is largely an environmental issue. Sixty percent of emerging infectious diseases that affect humans are zoonotic — they originate in animals. And more than two-thirds of those originate in wildlife.
Teams of veterinarians and conservation biologists are in the midst of a global effort with medical doctors and epidemiologists to understand the “ecology of disease.” It is part of a project called Predict, which is financed by the United States Agency for International Development. Experts are trying to figure out, based on how people alter the landscape — with a new farm or road, for example — where the next diseases are likely to spill over into humans and how to spot them when they do emerge, before they can spread. They are gathering blood, saliva and other samples from high-risk wildlife species to create a library of viruses so that if one does infect humans, it can be more quickly identified. And they are studying ways of managing forests, wildlife and livestock to prevent diseases from leaving the woods and becoming the next pandemic.
It isn’t only a public health issue, but an economic one. The World Bank has estimated that a severe influenza pandemic, for example, could cost the world economy $3 trillion.
The problem is exacerbated by how livestock are kept in poor countries, which can magnify diseases borne by wild animals. A study released earlier this month by the International Livestock Research Institute found that more than two million people a year are killed by diseases that spread to humans from wild and domestic animals.
The Nipah virus in South Asia, and the closely related Hendra virus in Australia, both in the genus of henipah viruses, are the most urgent examples of how disrupting an ecosystem can cause disease. The viruses originated with flying foxes, Pteropus vampyrus, also known as fruit bats. They are messy eaters, no small matter in this scenario. They often hang upside down, looking like Dracula wrapped tightly in their membranous wings, and eat fruit by masticating the pulp and then spitting out the juices and seeds.
The bats have evolved with henipah over millions of years, and because of this co-evolution, they experience little more from it than the fruit bat equivalent of a cold. But once the virus breaks out of the bats and into species that haven’t evolved with it, a horror show can occur, as one did in 1999 in rural Malaysia. It is likely that a bat dropped a piece of chewed fruit into a piggery in a forest. The pigs became infected with the virus, and amplified it, and it jumped to humans. It was startling in its lethality. Out of 276 people infected in Malaysia, 106 died, and many others suffered permanent and crippling neurological disorders. There is no cure or vaccine. Since then there have been 12 smaller outbreaks in South Asia.
In Australia, where four people and dozens of horses have died of Hendra, the scenario was different: suburbanization lured infected bats that were once forest-dwellers into backyards and pastures. If a henipah virus evolves to be transmitted readily through casual contact, the concern is that it could leave the jungle and spread throughout Asia or the world. “Nipah is spilling over, and we are observing these small clusters of cases — and it’s a matter of time that the right strain will come along and efficiently spread among people,” says Jonathan Epstein, a veterinarian with EcoHealth Alliance, a New York-based organization that studies the ecological causes of disease.
That’s why experts say it’s critical to understand underlying causes. “Any emerging disease in the last 30 or 40 years has come about as a result of encroachment into wild lands and changes in demography,” says Peter Daszak, a disease ecologist and the president of EcoHealth.
Emerging infectious diseases are either new types of pathogens or old ones that have mutated to become novel, as the flu does every year. AIDS, for example, crossed into humans from chimpanzees in the 1920s when bush-meat hunters in Africa killed and butchered them.
Jim Robbins is a frequent contributor to the Science section of The New York Times.

Using Genetic Codes to Detect and Trace Food Poisoning


Harnessing Gene Codes as Sleuths of Food Ills

WASHINGTON — A new public database aims to catalog the genetic codes of 100,000 types of bacteria found in food, vastly increasing the amount of data that scientists can use to trace the causes of food-borne illness.
The free database, being set up at the University of California, Davis, will enable scientists to pinpoint not only what food carries the bacteria responsible for a given outbreak — raw tuna in sushi, for example — but also what country it came from. And while responses to such outbreaks have typically taken weeks, the new database is expected to reduce that to days.
“It’s actually a big deal from a scientific standpoint,” said Steven M. Musser, the Food and Drug Administration official who announced plans for the database on Thursday.
Genetic sequencing of bacteria is relatively new. To date, scientists have identified as many as 3,000 sequences, and only about 1,000 of those are related to food.
The Centers for Disease Control and Prevention has the largest such database, but the gene maps it contains are only partial, not enough to determine which food the illness came from or its geographic origin, said Dr. Musser, director of the office of regulatory science at the F.D.A.’s Center for Food Safety and Applied Nutrition.
Cataloging gene codes is time-consuming. Salmonella alone has about 2,700 different strains, almost three times as many as all the sequences for food-borne bacteria that have been cataloged to date. Dr. Musser said his laboratory had cataloged just 500 in about three years of work; it is contributing those sequences, all related to salmonella, to the project.
But the database, which includes contributions from the disease centers and from the biotechnology company Agilent Technologies, aims to have mapped 100,000 sequences in five years, making it the single largest genome project in the world, said Bart C. Weimer, a professor of microbiology at U.C. Davis who is directing the project.
The cost of such work has dropped sharply in recent years, he said, but having enough people trained to sort through all the data was the main concern.
The first sequencing started in March, Dr. Weimer said, shortly after researchers at the F.D.A. and at the university realized they were working simultaneously on similar things and decided to join forces.
“You need a big volume of information to make an impact in the public health arena,” he said. “That improves accuracy and the capability of doing things fast.”
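Why a bigger library means faster, more accurate tracing can be sketched with a toy matcher. The strains, foods and sequences below are hypothetical, and real pipelines use far more robust genomic methods; the idea is simply that an outbreak isolate is compared against every catalogued genome, and the closest entry's metadata names the likely food source.

```python
# Toy sequence matcher (hypothetical strains and sequences, not the
# database's actual software): score each catalogued genome by how many
# short substrings (k-mers) it shares with the outbreak isolate.

def kmers(seq, k=8):
    """All overlapping substrings of length k in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def best_match(isolate, library, k=8):
    """Return the library entry sharing the most k-mers with the isolate."""
    isolate_kmers = kmers(isolate, k)
    return max(library,
               key=lambda entry: len(isolate_kmers & kmers(entry["genome"], k)))

# Hypothetical two-entry library; the real one aims for 100,000 genomes.
library = [
    {"strain": "Salmonella_A", "food": "raw tuna",
     "genome": "ACGTACGTTTGACCGTAGGCT"},
    {"strain": "Salmonella_B", "food": "ground beef",
     "genome": "TTTTGGGGCCCCAAAATTTTG"},
]
isolate = "ACGTACGTTTGACCGTAG"  # sequence from a patient sample
print(best_match(isolate, library)["food"])  # raw tuna
```

The more complete the library, the more likely the top hit is a true match rather than a distant relative, which is the "volume improves accuracy" point Dr. Weimer makes above.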

Salmon Genetically changing


OBSERVATORY

Alaskan Salmon Evolve Along With the Climate

Alaskan salmon are apparently evolving to adapt to climate change.
Researchers have suspected that temperature-driven changes in migration and reproduction behaviors — which have happened in many species — may be evidence of natural selection at work. Now there is genetic evidence to confirm the hypothesis.
For their study, published online last week in Proceedings of the Royal Society B, the scientists studied Alaska pink salmon in a small stream near Juneau where there have been complete daily counts of all adult fish since 1971.
The salmon migrated in two distinct populations, one appearing toward the end of August, the other starting in September. In 1979, scientists introduced a neutral genetic marker into the later-migrating population so it could be identified and tracked without affecting its fitness.
In the 1980s, the genetically marked late migrators made up about a third of the population. But as streams started warming earlier in the year, the proportion began to decrease rapidly — to just 5 percent by 2011 — even though overall abundance did not change.
These were rapid changes, not gradual evolutionary shifts. The late-migrating fish practically disappeared within a few years.
“It’s sort of reassuring that organisms have the potential to adapt to the really major changes that are occurring,” said the lead author, Ryan P. Kovach, a doctoral student at the University of Alaska, Fairbanks. “But the fact that climate change is already forcing evolutionary changes should be raising red flags for resource managers and the general public alike.”