Department of Interior Leads the Way on Scientific Integrity

The Department of the Interior has released a new scientific integrity policy that sets a strong standard for the administration, according to the Union of Concerned Scientists.

Below is a statement from Michael Halpern, Program Manager at the Center for Science and Democracy:

“The Department of the Interior’s new scientific integrity policy is simplified, streamlined, and clear. With this policy, Interior continues to stand at the front of the pack in the Obama administration’s quest to create strong scientific integrity standards within government.

“Outside pressure on the Department of Interior to politicize science is strong, so it’s critical that the department respond with strong policies to protect science and scientists from political interference in their work.

“While the different bureaus within the Department of Interior have been uneven in terms of embracing reform, headquarters has devoted significant resources to implementing and improving its scientific integrity policies, and this new policy is no exception. The new policy and handbook provide significantly more specifics on the purview of the policy and how it will be carried out. The new online training program and intra-departmental scientific integrity council should go a long way to making sure that the policy becomes part of the department’s culture. To fully realize the gains it has made and ensure external accountability, Interior should make clear that completed investigations will continue to be reported publicly.”

Halpern explores the strengths and weaknesses of the policy in a new blog post.

Reality & Imagination Flow in Opposite Directions

As real as a daydream may seem, its path through your brain runs opposite to the route taken by signals from the outside world.

Aiming to discern discrete neural circuits, researchers at the University of Wisconsin–Madison have tracked electrical activity in the brains of people who alternately imagined scenes or watched videos.

“A really important problem in brain research is understanding how different parts of the brain are functionally connected. What areas are interacting? What is the direction of communication?” says Barry Van Veen, a UW-Madison professor of electrical and computer engineering. “We know that the brain does not function as a set of independent areas, but as a network of specialized areas that collaborate.”

Van Veen, along with Giulio Tononi, a UW-Madison psychiatry professor and neuroscientist, Daniela Dentico, a scientist at UW–Madison’s Waisman Center, and collaborators from the University of Liege in Belgium, published results recently in the journal NeuroImage. Their work could lead to the development of new tools to help Tononi untangle what happens in the brain during sleep and dreaming, while Van Veen hopes to apply the study’s new methods to understand how the brain uses networks to encode short-term memory.

During imagination, the researchers found an increase in the flow of information from the parietal lobe of the brain to the occipital lobe — from a higher-order region that combines inputs from several of the senses out to a lower-order region.

In contrast, visual information taken in by the eyes tends to flow from the occipital lobe — which makes up much of the brain’s visual cortex — “up” to the parietal lobe.

“There seems to be a lot in our brains and animal brains that is directional, that neural signals move in a particular direction, then stop, and start somewhere else,” says Van Veen. “I think this is really a new theme that had not been explored.”

The researchers approached the study as an opportunity to test the power of electroencephalography (EEG) — which uses sensors on the scalp to measure underlying electrical activity — to discriminate between different parts of the brain’s network.

Brains are rarely quiet, though, and EEG tends to record plenty of activity not necessarily related to a particular process researchers want to study.

To zero in on a set of target circuits, the researchers asked their subjects to watch short video clips before trying to replay the action from memory in their heads. Others were asked to imagine traveling on a magic bicycle — focusing on the details of shapes, colors and textures — before watching a short video of silent nature scenes.

Using an algorithm Van Veen developed to parse the detailed EEG data, the researchers were able to compile strong evidence of the directional flow of information.
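The article does not describe Van Veen’s algorithm itself, but the general idea of inferring directed information flow can be illustrated with a simple Granger-style comparison: if one region’s past activity improves prediction of another region’s present activity, information is flowing in that direction. Below is a minimal Python sketch on synthetic data; the signal names, lag, and model order are hypothetical, and the method is a stand-in for, not a reproduction of, the study’s EEG analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lag = 2000, 3

# Hypothetical signals: "occipital" drives "parietal" with a short delay,
# mimicking the perception direction described in the article.
occipital = rng.standard_normal(n)
parietal = np.zeros(n)
parietal[lag:] = 0.8 * occipital[:-lag] + 0.5 * rng.standard_normal(n - lag)

def prediction_gain(target, driver, p=5):
    """Ratio of residual variance without vs. with the driver's past (>1 means the driver helps)."""
    rows, y = [], []
    for t in range(p, len(target)):
        rows.append(list(target[t - p:t]) + list(driver[t - p:t]))
        y.append(target[t])
    X, y = np.array(rows), np.array(y)

    def resid_var(Xmat):
        coef, *_ = np.linalg.lstsq(Xmat, y, rcond=None)
        return np.var(y - Xmat @ coef)

    # Restricted model (target's own past only) vs. full model (plus the driver's past)
    return resid_var(X[:, :p]) / resid_var(X)

print(f"occipital -> parietal: {prediction_gain(parietal, occipital):.2f}")
print(f"parietal -> occipital: {prediction_gain(occipital, parietal):.2f}")
```

In this toy example the first ratio comes out well above 1 and the second close to 1, which is the kind of asymmetry a directional analysis looks for.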

“We were very interested in seeing if our signal-processing methods were sensitive enough to discriminate between these conditions,” says Van Veen, whose work is supported by the National Institute of Biomedical Imaging and Bioengineering. “These types of demonstrations are important for gaining confidence in new tools.”

Three Times More Mercury in the Oceans

Mercury is a naturally occurring element as well as a by-product of human activities such as burning coal and making cement. It is also highly toxic to nearly all life forms.

Estimates of “bioavailable” mercury – forms of the element that can be taken up by animals and humans – play an important role in everything from drafting an international treaty designed to protect humans and the environment from mercury emissions to establishing the public policies behind warnings about seafood consumption.

Yet surprisingly little is known about how much mercury in the environment is the result of human activity, or even how much bioavailable mercury exists in the global ocean. Until now.

A recent paper by a group that includes researchers from the Woods Hole Oceanographic Institution (WHOI), Wright State University, Observatoire Midi-Pyrenees in France, and the Royal Netherlands Institute for Sea Research appears in this week’s edition of the journal Nature. It provides the first direct calculation of pollution-derived mercury in the global ocean, based on data obtained from 12 sampling cruises over the past eight years. The work, which was funded by the U.S. National Science Foundation and the European Research Council and led by WHOI marine chemist Carl Lamborg, also provides a look at the global distribution of mercury in the marine environment.

The group started by looking at data sets that offer detail about oceanic levels of phosphate, a substance that is both better studied than mercury and that behaves in much the same way in the ocean. Phosphate is a nutrient that, like mercury, is taken up into the marine food web by binding with organic material. By determining the ratio of phosphate to mercury in water deeper than 1,000 meters (3,300 feet) that has not been in contact with Earth’s atmosphere since the Industrial Revolution, the group was able to estimate mercury in the ocean that originated from natural sources such as the breakdown, or “weathering,” of rocks on land.
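As a back-of-the-envelope illustration of that ratio approach (not the paper’s actual calculation), one can multiply a deep-water mercury-to-phosphate ratio by an ocean phosphate inventory to get a natural mercury inventory. The numbers below are made up for the sake of the sketch.

```python
# Illustrative only: hypothetical values, not those reported in the Nature paper.
hg_to_p_ratio = 1.0e-7          # assumed Hg:P molar ratio in pre-industrial deep water
ocean_phosphate_mol = 3.0e15    # assumed total ocean phosphate inventory (mol)
HG_MOLAR_MASS = 200.59          # g/mol

natural_hg_mol = hg_to_p_ratio * ocean_phosphate_mol
natural_hg_tons = natural_hg_mol * HG_MOLAR_MASS / 1e6   # grams -> metric tons
print(f"Estimated natural mercury inventory: {natural_hg_tons:,.0f} metric tons")
```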

Their findings agreed with what they would expect to see given the known pattern of global ocean circulation. North Atlantic waters, for example, showed the most obvious signs of mercury from pollution because that is where surface waters sink under the influence of temperature and salinity changes to form deep and intermediate water flows. The Tropical and Northeast Pacific, on the other hand, were seen to be relatively unaffected because it takes centuries for deep ocean water to circulate to those regions.

But determining the contribution of mercury from human activity required another step. To obtain estimates for shallower waters and to provide basin-wide numbers for the amount of mercury in the global ocean, the team needed a tracer – a substance that could be linked back to the major activities that release mercury into the environment in the first place. They found it in one of the most well studied gases of the past 40 years – carbon dioxide. Databases of CO₂ in ocean waters are extensive and readily available for every ocean basin at virtually all depths. Because much of the mercury and CO₂ from human sources derive from the same activities, the team was able to derive an index relating the two and use it to calculate the amount and distribution of mercury in the world’s ocean basins that originated from human activity.
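The same tracer logic can be sketched in a couple of lines: scale the anthropogenic CO₂ measured in a water parcel by an assumed mercury-to-CO₂ emission ratio to estimate that parcel’s pollution mercury. The values below are illustrative placeholders, not the index the team actually derived.

```python
# Illustrative only: hypothetical values, not the study's derived index.
anthropogenic_co2_umol_per_kg = 45.0   # assumed excess CO2 in a shallow-water sample (umol/kg)
hg_per_co2_mol_ratio = 2.0e-8          # assumed mercury emitted per mole of CO2 from shared sources

anthropogenic_hg_mol_per_kg = anthropogenic_co2_umol_per_kg * 1e-6 * hg_per_co2_mol_ratio
print(f"Pollution mercury in this parcel: {anthropogenic_hg_mol_per_kg * 1e12:.2f} pmol/kg")
```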

Analysis of their results showed rough agreement with the models used previously – that the ocean contains about 60,000 to 80,000 tons of pollution mercury. In addition, they found that ocean waters shallower than about 100 m (300 feet) have tripled in mercury concentration since the Industrial Revolution and that the ocean as a whole has shown an increase of roughly 10 percent over pre-industrial mercury levels.

“With the increases we’ve seen in the recent past, the next 50 years could very well add the same amount we’ve seen in the past 150,” said Lamborg. “The trouble is, we don’t know what it all means for fish and marine mammals. It likely means some fish also contain at least three times more mercury than 150 years ago, but it could be more. The key is that now we have some solid numbers on which to base continued work.”

“Mercury is a priority environmental poison detectable wherever we look for it, including the global ocean abyss,” says Don Rice, director of the National Science Foundation (NSF)’s Chemical Oceanography Program, which funded the research. “These scientists have reminded us that the problem is far from abatement, especially in regions of the world ocean where the human fingerprint is most distinct.”

 

New Paper on Life Under Antarctic Ice

The first breakthrough paper to come out of a massive U.S. expedition to one of Earth’s final frontiers shows that there’s life and an active ecosystem one-half mile below the surface of the West Antarctic Ice Sheet, specifically in a lake that hasn’t seen sunlight or felt a breath of wind for millions of years.

The life takes the form of microorganisms that live beneath the enormous Antarctic ice sheet and convert ammonium and methane into the energy required for growth. Many of the microbes are single-celled organisms known as Archaea, said Montana State University professor John Priscu, the chief scientist of the U.S. project, called WISSARD, that sampled the sub-ice environment. He is also a co-author of the paper, whose author list is dominated by MSU researchers, in the Aug. 21 issue of Nature, an international weekly journal covering all fields of science and technology.

“We were able to prove unequivocally to the world that Antarctica is not a dead continent,” Priscu said, adding that data in the Nature paper is the first direct evidence that life is present in the subglacial environment beneath the Antarctic ice sheet.

Priscu said he wasn’t entirely surprised that the team found life after drilling through half a mile of ice to reach Subglacial Lake Whillans in January 2013. An internationally renowned polar biologist, Priscu researches both the South and North Poles. This fall will be his 30th field season in Antarctica, and he has long predicted the discovery.

More than a decade ago, he published two manuscripts in the journal Science describing for the first time that microbial life can thrive in and under Antarctic ice. Five years ago, he published a manuscript where he predicted that the Antarctic subglacial environment would be the planet’s largest wetland, one not dominated by the red-winged blackbirds and cattails of typical wetland regions in North America, but by microorganisms that mine minerals in rocks at subzero temperatures to obtain the energy that fuels their growth.

Following more than a decade of traveling the world presenting lectures describing what may lie beneath Antarctic ice, Priscu was instrumental in convincing U.S. national funding agencies that this research would transform the way we view the fifth largest continent on the planet.

Although he was not really surprised about the discovery, Priscu said he was excited by some of the details of the Antarctic find, particularly how the microbes function without sunlight at subzero temperatures and the fact that DNA sequencing revealed the dominant organisms to be archaea. Archaea is one of the three domains of life, the others being Bacteria and Eukarya.

Many of the subglacial archaea use the energy in the chemical bonds of ammonium to fix carbon dioxide and drive other metabolic processes. Another group of microorganisms uses the energy and carbon in methane to make a living. According to Priscu, the source of the ammonium and methane is most likely the breakdown of organic matter that was deposited in the area hundreds of thousands of years ago, when Antarctica was warmer and the sea inundated West Antarctica. He also noted that, as Antarctica continues to warm, vast amounts of methane, a potent greenhouse gas, will be liberated into the atmosphere, further enhancing climate warming.

The U.S. team also proved that the microorganisms originated in Lake Whillans and weren’t introduced by contaminated equipment, Priscu said. Skeptics of his previous studies of Antarctic ice have suggested that his group didn’t actually discover microorganisms, but recovered microbes they brought in themselves.

Extensive tests were conducted at MSU two years ago on WISSARD’s borehole decontamination system to ensure that it worked, and Priscu led a publication in an international journal presenting results of these tests. This decontamination system was mated to a one-of-a-kind hot water drill that was used to melt a borehole through the ice sheet, which provided a conduit to the subglacial environment for sampling.

Every day in Antarctica, Priscu said, he would tell his team to keep it simple. To prove that an ecosystem existed below the West Antarctic Ice Sheet, he wanted at least three lines of evidence: they had to see, under the microscope, microorganisms that came from Lake Whillans and not from contaminated equipment; they had to show that the microorganisms were alive and growing; and the organisms had to be identifiable by their DNA.

When the team found those things, he knew they had succeeded, Priscu said.

The Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) project officially began in 2009 with a $10 million grant from the National Science Foundation. Now involving 13 principal investigators at eight U.S. institutions, the researchers drilled down to Subglacial Lake Whillans in January 2013. The microorganisms they discovered are still being analyzed at MSU and other collaborating institutions.

Co-author Brent Christner said species are hard to determine in microbiology, but “We are looking at a water column that probably has about 4,000 things we call species. It’s incredibly diverse.”

Planning to drill again this austral summer in a new Antarctic location, Priscu said WISSARD was the first large-scale multidisciplinary effort to directly examine the biology of an Antarctic subglacial environment. The Antarctic Ice Sheet covers an area 1 ½ times the size of the United States and contains 70 percent of Earth’s freshwater, and any significant melting can drastically increase sea level. Lake Whillans, one of more than 200 known lakes beneath the Antarctic Ice Sheet and the primary lake in the WISSARD study, fills and drains about every three years. The river that drains Lake Whillans flows under the Ross Ice Shelf, which is the size of France, and feeds the Southern Ocean, where it can provide nutrients for life and influence water circulation patterns.

The chance to explore the world under the West Antarctic Ice Sheet is an unparalleled opportunity for the U.S. team, as well as for the several MSU-affiliated researchers who are part of that team and wrote or co-authored the Nature paper, Priscu said.

Christner said the team that wrote the paper in Nature is the dream team of polar biology. Besides the MSU-affiliated scientists, the co-authors include Amanda Achberger, a graduate student at Louisiana State University; Carlo Barbante, a geochemist at the University of Venice in Italy; Sasha Carter, a postdoctoral researcher at the University of California, San Diego; and Knut Christianson, a postdoctoral researcher from St. Olaf College in Minnesota and New York University.

“I hope this exciting discovery will touch the lives (both young and old) of people throughout the world and inspire the next generation of polar scientists,” Priscu said.

The microorganisms that came out of Subglacial Lake Whillans were “incredibly diverse,” and the microbial cells came in a variety of shapes. The yellow arrow points to a rod-shaped cell as seen through a scanning electron microscope. Credit: Image courtesy of WISSARD

 

Bees Infected With Mutated Plant Virus?

A viral pathogen that typically infects plants has been found in honeybees and could help explain their decline, researchers in the U.S. and China report.

The routine screening of bees for frequent and rare viruses “resulted in the serendipitous detection of Tobacco Ringspot Virus, or TRSV, and prompted an investigation into whether this plant-infecting virus could also cause systemic infection in the bees,” says Yan Ping Chen from the U.S. Department of Agriculture’s Agricultural Research Service (ARS) laboratory in Beltsville, Maryland, an author on the study.

“The results of our study provide the first evidence that honeybees exposed to virus-contaminated pollen can also be infected and that the infection becomes widespread in their bodies,” says lead author Ji Lian Li, at the Chinese Academy of Agricultural Science in Beijing.

“We already know that honeybees, Apis mellifera, can transmit TRSV when they move from flower to flower, likely spreading the virus from one plant to another,” Chen adds.

Notably, about 5% of known plant viruses are pollen-transmitted and thus potential sources of host-jumping viruses. RNA viruses tend to be particularly dangerous because they lack the 3′-5′ proofreading function which edits out errors in replicated genomes. As a result, viruses such as TRSV generate a flood of variant copies with differing infective properties.

One consequence of such high replication rates is that populations of RNA viruses are thought to exist as “quasispecies,” clouds of genetically related variants that appear to work together to determine the pathology of their hosts. These sources of genetic diversity, coupled with large population sizes, further facilitate the adaptation of RNA viruses to new selective conditions such as those imposed by novel hosts. “Thus, RNA viruses are a likely source of emerging and reemerging infectious diseases,” explain these researchers.

Toxic viral cocktails appear to have a strong link with honey bee Colony Collapse Disorder (CCD), a mysterious malady that abruptly wiped out entire hives across the United States and was first reported in 2006. Israel Acute Paralysis Virus (IAPV), Acute Bee Paralysis Virus (ABPV), Chronic Paralysis Virus (CPV), Kashmir Bee Virus (KBV), Deformed Wing Bee Virus (DWV), Black Queen Cell Virus (BQCV) and Sacbrood Virus (SBV) are other known causes of honeybee viral disease.

When these researchers investigated bee colonies classified as “strong” or “weak,” TRSV and other viruses were more common in the weak colonies than in the strong ones. Bee populations with high levels of multiple viral infections began failing in late fall and perished before February, these researchers report. In contrast, bees in colonies with fewer viral assaults survived through the cold winter months.

TRSV was also detected inside the bodies of Varroa mites, “vampire” parasites that transmit viruses between bees while feeding on their blood. However, unlike in honeybees, the mite-associated TRSV was restricted to the gastric cecum, indicating that the mites likely facilitate the horizontal spread of TRSV within the hive without becoming diseased themselves. The fact that infected queens lay infected eggs convinced these scientists that TRSV could also be transmitted vertically from the queen mother to her offspring.

“The increasing prevalence of TRSV in conjunction with other bee viruses is associated with a gradual decline of host populations and supports the view that viral infections have a significant negative impact on colony survival,” these researchers conclude. Thus, they call for increased surveillance of potential host-jumping events as an integrated part of insect pollinator management programs.

Suicidal Temperatures Ahead

New research from the University of New South Wales (UNSW) could solve one of the great unknowns in climate sensitivity: the role of cloud formation and whether it will have a positive or negative effect on global warming.

“Our research has shown climate models indicating a low temperature response to a doubling of carbon dioxide from preindustrial times are not reproducing the correct processes that lead to cloud formation,” said lead author from UNSW’s Centre of Excellence for Climate System Science, Professor Steven Sherwood.

“When the processes are correct in the climate models, the level of climate sensitivity is far higher. Previously, estimates of the sensitivity of global temperature to a doubling of carbon dioxide ranged from 1.5°C to 5°C. This new research takes away the lower end of climate sensitivity estimates, meaning that global average temperatures will increase by 3°C to 5°C with a doubling of carbon dioxide.”

The key to this narrower but much higher estimate can be found in the observations around the role of water vapour in cloud formation.

Observations show when water vapour is taken up by the atmosphere through evaporation the updraughts often rise up to 15 km to form heavy rains, but can also rise just a few km before returning to the surface without forming such rains.

In addition, where updraughts rise only this smaller distance, they reduce total cloud cover, because they pull more vapour away from the higher cloud-forming regions than when only the deep updraughts are present.

Climate models that show a low global temperature response to carbon dioxide do not include enough of this lower-level process. They instead simulate nearly all updraughts rising to 15 km.

These deeper updraughts alone do not have the same effect; the result is increased reflection of sunlight and a reduced sensitivity of the global climate to atmospheric carbon dioxide.

However, real-world observations show that this is not how the atmosphere behaves.

When the processes are correct in the climate model, this produces cycles that take water vapour to a wider range of heights in the atmosphere, causing fewer clouds to form in a warmer climate. This increases the amount of sunlight and heat entering the atmosphere and increases the sensitivity of our climate to carbon dioxide or any other perturbation.

When water vapour processes are correctly represented, the sensitivity of the climate to a doubling of carbon dioxide – which will occur in the next 50 years – means we can expect a temperature increase of at least 3°C and more likely 4°C by 2100.

“Climate sceptics like to criticise climate models for getting things wrong, and we are the first to admit they are not perfect, but what we are finding is that the mistakes are being made by those models which predict less warming, not those that predict more,” said Professor Sherwood.

“Rises in global average temperatures of this magnitude will have profound impacts on the world and the economies of many countries if we don’t urgently start to curb our emissions.”

Methane Hydrates – Climate Tipping Point?

Methane hydrates are fragile, and it does not take much for them to dissociate and release methane that rises toward the ocean’s surface. At the sea floor, the ice-like solid fuel composed of water and methane is stable only at high pressure and low temperature. In some areas, for instance in the North Atlantic off the coast of Svalbard, scientists have regularly detected gas flares. The reasons for their occurrence were unclear, but one hypothesis was that warmer water might be dissolving the gas hydrates. Comprehensive investigations over the past years by an international team of researchers, led by scientists from GEOMAR Helmholtz Centre for Ocean Research Kiel, have now shown that the gas flares are most likely caused by natural processes.

“In 2008, when we observed the outgassing of methane for the first time, we were alarmed”, reports Professor Christian Berndt, lead author of the study from GEOMAR. “The gas originates from depths where the hydrates should normally be stable. But we knew that a relatively small warming might melt the hydrates”, Berndt explains. Thus, the key question was to find out what causes the outgassing. Step by step, several expeditions that took place in the following years helped to solve the mystery.

One of the most obvious assumptions was that global warming has already extended into these regions of the North Atlantic. However, the investigations, partly carried out with the German research submersible JAGO, pointed clearly to natural causes. “On one hand, we have found that the seasonal variations in temperature in this region are sufficient to push the stability zone of gas hydrates more than a kilometre up and down the slope,” Professor Berndt explains. “Additionally, we discovered carbonate structures in the vicinity of methane seeps at the seafloor,” Dr. Tom Feseker from MARUM adds. “These are clear indicators that the outgassing likely takes place over very long time periods, presumably for several thousand years,” Feseker continues.

Does this mean that global warming has no impact on potential methane release from the seafloor off Svalbard? Certainly not, because over long periods of time the deep ocean will also warm, and the polar regions in particular are affected. Here, enormous amounts of methane hydrate are stored in the ocean floor. “As a powerful greenhouse gas, methane represents a particular risk for our climate. A release of large amounts of the gas would further accelerate global warming,” says Prof. Berndt. “Therefore, it is necessary to continue long-term monitoring, particularly in such critical regions as off Svalbard,” the geophysicist concludes.

Ocean Acidity to Skyrocket – Reduce CO2 Now!

A group of ‘experts’ have noted the obvious and agreed on ‘levels of confidence’ in relation to ocean acidification. They have issued statements summarizing a state of knowledge that is so conservative as to be counterproductive in the struggle to create awareness of the dire future that lurks around the corner if we don’t act now.

The summary was led by the International Geosphere-Biosphere Programme and results from the world’s largest gathering of experts on ocean acidification ever convened. The Third Symposium on the Ocean in a High CO2 World was held in Monterey, California (September 2012), and attended by 540 experts from 37 countries. The summary will be launched at the UNFCCC climate negotiations in Warsaw, 18 November, for the benefit of policymakers who want support for a measured response and continued rush to planetary destruction.

The ‘experts’ concluded that marine ecosystems and biodiversity are likely to change as a result of ocean acidification, even though they have already been changing rapidly. They also came up with a brilliant theory that ocean acidification will have far-reaching consequences for society and that economic losses from declines in shellfish aquaculture and the degradation of tropical coral reefs may be substantial owing to the sensitivity of mollusks and corals to ocean acidification, which is also already occurring.

One of the lead authors of the summary, and chair of the symposium, Ulf Riebesell of GEOMAR Helmholtz Centre for Ocean Research Kiel said: “What we can now say with high levels of confidence about ocean acidification sends a clear message. Globally we have to be prepared for significant economic and ecosystem service losses. But we also know that reducing the rate of carbon dioxide emissions will slow acidification. That has to be the major message for the COP19 meeting.”

One outcome emphasized by the ‘experts’ is that if society continues on the current high emissions trajectory, cold water coral reefs, located in the deep sea, may be unsustainable and tropical coral reef erosion is likely to outpace reef building this century, which has already happened in many areas.

The really boneheaded statement by the ‘experts’ is that significant emissions reductions to meet the two-degree target by 2100 could ensure that half of surface waters presently occupied by tropical coral reefs remain favorable for their growth. That is not even remotely true. CO2 levels will have to drop back to 350 ppm for ocean life to have any chance of avoiding massive destruction, and even then it is unlikely that the oceans can be saved, because it takes so long for carbon dioxide to cycle out of the atmosphere and oceans.

Author Wendy Broadgate, Deputy Director at the International Geosphere-Biosphere Programme, wisely said: “Emissions reductions may protect some reefs and marine organisms but we know that the ocean is subject to many other stresses such as warming, deoxygenation, pollution and overfishing. Warming and deoxygenation are also caused by rising carbon dioxide emissions, underlining the importance of reducing fossil fuel emissions. Reducing other stressors such as pollution and overfishing, and the introduction of large scale marine protected areas, may help build some resilience to ocean acidification.”

The summary for policymakers makes 21 statements about ocean acidification with a range of confidence levels from “very high” to “low”.

These include:

Very high confidence

  • Ocean acidification is caused by carbon dioxide emissions from human activity to the atmosphere that end up in the ocean.
  • The capacity of the ocean to act as a carbon sink decreases as it acidifies.
  • Reducing carbon dioxide emissions will slow the progress of ocean acidification.
  • Anthropogenic ocean acidification is currently in progress and is measurable.
  • The legacy of historical fossil fuel emissions on ocean acidification will be felt for centuries.

High confidence 

  • If carbon dioxide emissions continue on the current trajectory, coral reef erosion is likely to outpace reef building some time this century.
  • Cold-water coral communities are at risk and may be unsustainable.
  • Mollusks (such as mussels, oysters and pteropods) are one of the groups most sensitive to ocean acidification.
  • The varied responses of species to ocean acidification and other stressors are likely to lead to changes in marine ecosystems, but the extent of the impact is difficult to predict.
  • Multiple stressors compound the effects of ocean acidification. 

Medium confidence

  • Negative socio-economic impacts on coral reefs are expected, but the scale of the costs is uncertain.
  • Declines in shellfisheries will lead to economic losses, but the extent of the losses is uncertain.
  • Ocean acidification may have some direct effects on fish behaviour and physiology.
  • The shells of marine snails known as pteropods, an important link in the marine food web, are already dissolving.

The summary is disappointing to many because it merely states the obvious and is another example of scientific caution muting the vital information that humanity needs to alter its path away from extinction. It is time for scientists to start showing some leadership and not water down the facts, linear data projections and logical assumptions. Even the most bold statements about climate change are proving to be wildly off the mark. Many predictions made a few years ago for decades into the future are happening now.

 

Solar Activity Plays a Minimal Role in Climate Change

The study, by Professor Terry Sloan at the University of Lancaster and Professor Sir Arnold Wolfendale at the University of Durham, finds that neither changes in the activity of the Sun, nor its impact in blocking cosmic rays, can be a significant contributor to global warming.

The results have been published today, 8 November, in IOP Publishing’s journal Environmental Research Letters.

Changes in the amount of energy from the Sun reaching the Earth have previously been proposed as a driver of increasing global temperatures, as has the Sun’s ability to block cosmic rays. It has been proposed that cosmic rays may have a role in cooling the Earth by encouraging clouds to form, which subsequently reflect the Sun’s rays back into space.

According to this proposal, in periods of high activity the Sun blocks some of the cosmic rays from entering the Earth’s atmosphere, so that fewer clouds form and the Earth’s surface temperatures rise.

In an attempt to quantify the effect that solar activity—whether directly or through cosmic rays—may have had on global temperatures in the twentieth century, Sloan and Wolfendale compared data on the rate of cosmic rays entering the atmosphere, which can be used as a proxy for solar activity, with the record of global temperatures going back to 1955.

They found a small correlation between cosmic rays and global temperatures occurring every 22 years; however, the changing cosmic ray rate lagged behind the change in temperatures by between one and two years, suggesting that the cause may not be down to cosmic rays and cloud formation, but may be due to the direct effects of the Sun.

By comparing the small oscillations in cosmic ray rate, which was taken from data from two neutron monitors, and temperature with the overall trends in both since 1955, Sloan and Wolfendale found that less than 14 per cent of the global warming seen during this period could be attributable to solar activity.
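The kind of comparison described here (fitting the temperature record with both a long-term trend and a solar-cycle-linked term, then asking how much of the warming the solar term can account for) can be sketched roughly as below. The synthetic data, coefficients, and attribution formula are invented for illustration and are not taken from Sloan and Wolfendale’s analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1955, 2001)

# Synthetic stand-ins: a ~22-year solar modulation, an anti-correlated cosmic-ray
# proxy, and a temperature record dominated by a steady warming trend.
solar_cycle = np.sin(2 * np.pi * (years - 1955) / 22)
cosmic_ray_proxy = -solar_cycle + 0.1 * rng.standard_normal(years.size)
temperature = 0.012 * (years - 1955) + 0.02 * solar_cycle + 0.05 * rng.standard_normal(years.size)

# Least-squares fit: temperature ~ a*(trend) + b*(cosmic-ray proxy) + c
X = np.column_stack([years - 1955, cosmic_ray_proxy, np.ones(years.size)])
a, b, c = np.linalg.lstsq(X, temperature, rcond=None)[0]

trend_warming = a * (years[-1] - years[0])                                 # warming carried by the trend term
solar_swing = abs(b) * (cosmic_ray_proxy.max() - cosmic_ray_proxy.min())  # largest swing the proxy term allows
print(f"Fraction attributable to the solar proxy: {solar_swing / (trend_warming + solar_swing):.1%}")
```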

Furthermore, the researchers reviewed their own previous studies and surveyed the relevant literature for other evidence of a link between solar activity and increasing global temperatures. Their findings indicated that, overall, changing solar activity, either directly or through cosmic rays, cannot have contributed more than 10 per cent to global warming in the twentieth century.

They concluded that the paleontological evidence, derived from carbon and oxygen isotopes, was “weak and confused” and that a more up-to-date study linking cosmic rays with low-level cloud cover was flawed because the correlation only occurred in certain regions rather than across the entire globe.

Sloan and Wolfendale also discussed the results from the CLOUD experiment at CERN, where researchers are looking at ways in which cosmic rays can ionize, or charge, aerosols in the atmosphere, which can then influence how clouds are formed. They also examined instances where real-world events produced large-scale ionization in the atmosphere.

Events such as the Chernobyl nuclear disaster and nuclear weapons testing would have been expected to have affected aerosol production in the atmosphere, but no such effects could be seen.

Professor Sloan said: “Our paper reviews our work to try and find a connection between cosmic rays and cloud formation with changes in global temperature.

“We conclude that the level of contribution of changing solar activity is less than 10 per cent of the measured global warming observed in the twentieth century. As a result of this and other work, the IPCC state that no robust association between changes in cosmic rays and cloudiness has been identified.”

From Friday 8 November, this paper can be downloaded from http://iopscience.iop.org/1748-9326/8/4/045022/article

 

Topography Slows Antarctic Ice Loss

Narrow stripes of dirt and rock beneath massive Antarctic glaciers create friction zones that slow the flow of ice toward the sea, researchers at Princeton University and the British Antarctic Survey have found. Understanding how these high-friction regions form and subside could help researchers understand how the flow of these glaciers responds to a warming climate.

Just as no-slip strips on flooring prevent people from slipping on a wet floor, these ribs or “tiger stripes” — named in reference to Princeton’s tiger mascot — provide friction that hinders the glaciers from slipping along the underlying bed of rock and sediment, the researchers report online in the journal Science.

Researchers at Princeton University and the British Antarctic Survey used mathematical modeling and data from satellites and ground-penetrating radar to infer the existence of stripes or ribs (in red) indicating areas of high friction between the glacier and the underlying bedrock. These high-friction ribs slow the movement of ice toward the sea. The image on the left is the Pine Island Glacier and the image on the right is the Thwaites Glacier, both in West Antarctica. (Image courtesy of Olga Sergienko, Program in Atmospheric and Oceanic Sciences)

The researchers discovered these tiger stripes, which occur in large, slippery regions under the glaciers, using mathematical modeling based on data from the National Snow and Ice Data Center and the British Antarctic Survey. The work was conducted by Olga Sergienko, an associate research scientist in Princeton’s Program in Atmospheric and Oceanic Sciences, and Richard Hindmarsh, a scientist at the British Antarctic Survey.

Researchers would like to understand what factors determine the flow of glaciers, which are massive, moving ice sheets that, when they flow into the ocean, can contribute substantially to sea-level rise. The researchers studied two glaciers, the Pine Island Glacier and the Thwaites Glacier in West Antarctica, which together contribute about 10 percent of the observed sea-level rise over the past 20 years, despite their small areas. The Pine Island Glacier moves at a velocity of about 1.5 miles per year, according to the researchers.

Studying the bottom of these glaciers directly is next to impossible because the ice, which is more than a mile and a half thick, cannot be seen through. Instead, the researchers used satellite measurements of the ice velocity and ground-penetrating radar collected from airplane flyovers to detect bedrock and surface topography, as well as field observations. Using the data, Sergienko created a mathematical model that calculated what happens inside the glacier as it flows along the bedrock. The model predicted the formation of the tiger stripes or ribs, which Hindmarsh had theorized some years earlier.

The friction at the interface of the bedrock and glacier ice is a major factor in the speed of a glacier, Sergienko said. When friction is high, the glacier moves slowly. When friction is low, as when melting ice provides a liquid layer that allows the ice to slide over the bedrock, the glacier moves more quickly.

The tiger stripes, which the researchers also call ribs due to their slightly curved structure, lie at roughly 30-degree angles to the direction of the glacier’s movement. These ribs arise and decay in response to natural processes over roughly 50 to 100 years, according to the researchers’ calculations. The process is strongly affected by how water, which comes from ice melting due to the inherent heat trapped in the Earth, infiltrates the space between the ice sheet and the bedrock, the researchers found.

“The ribs may play an important role in buffering the effects of a warming climate, since they slow the movement of ice that reaches the ocean and contributes to sea-level rise,” said Sergienko. “These changes can happen independently of climate change, too,” she added.

More investigations are needed to verify models of rib formation, according to the researchers. “Our guess is that these ribs are related to typical landforms that exist in the formerly glaciated areas of North America and Europe,” said Hindmarsh. “Great examples are the drumlins — raised areas of soil and rock — that make up the hills in Seattle or Glasgow,” he said.

The study reveals new patterns of friction that help control the speed of ice flow and determine the effect of Antarctic ice on sea level, according to Douglas MacAyeal, a professor of glaciology at the University of Chicago who was not involved in the work. “This is strongly suggestive of a new style of physical controls over friction, like water flow in the thin zone between the rock of the bed and the ice,” he said. “The results of this study will drive new theoretical and observational efforts to understand what causes this pattern.”

The study, “Regular patterns in frictional resistance of ice-stream beds seen by surface data inversion,” was published online in the journal Science on Nov. 7. The research was supported by National Science Foundation grant CMG-0934534 and the British Antarctic Survey Polar Science for Planet Earth program.