Amazon Deforestation Decreases but Still High

According to official data released by the Brazilian environment ministry, deforestation dropped by 18% between August 2013 and July 2014, a period in which 4,848 km² of the Amazon rainforest was destroyed. In the previous year, between August 2012 and July 2013, deforestation had increased 29%, to 5,871 km².

The recent decrease in deforestation is an indication that agreements like the Soy Moratorium and the Cattle Agreement continue to mitigate the continued expansion of the powerful Brazilian agribusiness sector in the Amazon. The landmark Soy Moratorium was recently renewed for another 18 months. This voluntary agreement between global soy traders, the Brazilian government and civil society is credited with contributing to the reduction of deforestation in the Amazon. Such voluntary agreements matter all the more because corruption within the Brazilian government ensures that laws against deforestation are seldom enforced.

Zero Deforestation is necessary to protect the Amazon rainforest and to reduce the threats of climate change. Deforestation in the Amazon is a principal source of greenhouse gas emissions and has already been linked to extreme climate events.

The renewed decrease in deforestation rates shows that Brazil can continue to grow without increasing deforestation and the CO2 emissions that come with it. Now the government needs to ensure that the reduction continues all the way to Zero Deforestation, and to adopt an energy matrix based on renewables such as solar and wind.

Consumers can do their part to reduce deforestation by adopting a healthy, non-GMO, plant-based diet. Most deforestation is driven by growing soy to feed animals that are then slaughtered for human consumption, an extremely inefficient use of land: it can take 8–16 kilos of soy or corn to produce 1 kilo of meat.
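
To make that land-use arithmetic concrete, here is a minimal sketch in Python using only the feed-conversion range quoted above; the function name and example quantity are illustrative, not figures from a cited study:

```python
# Illustrative arithmetic only: the 8-16 kg feed-per-kg-meat range is the
# article's figure; everything else here is a simple worked example.

FEED_PER_KG_MEAT = (8.0, 16.0)  # kg of soy/corn per kg of meat (article)

def feed_required(kg_meat: float) -> tuple[float, float]:
    """Return the (low, high) kg of feed crops consumed to produce kg_meat."""
    lo, hi = FEED_PER_KG_MEAT
    return kg_meat * lo, kg_meat * hi

lo, hi = feed_required(1.0)
print(f"1 kg of meat consumes roughly {lo:.0f}-{hi:.0f} kg of crops,")
print("so eating those crops directly would feed many times more people.")
```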

The soy grown in the Amazon is almost all genetically modified and requires massive amounts of glyphosate herbicide, chemical fertilizers and pesticides. The proteins in GE soy can be toxic and can cause a wide range of diseases in animals and humans. The genetic inserts can also be unstable, and the inserted genes can transfer to human gut bacteria, creating a wide range of adverse conditions in the human body.

Glyphosate causes DNA damage to microbes in the soil, resulting in mutations that not only radically alter soil biology but also give rise to new pathogens that emerge from the soil and infect wildlife. When ingested by humans, glyphosate causes DNA damage that can lead to cancer, organ failure, birth defects and mutation of intestinal bacteria.

Glyphosate is linked to a wide range of human ailments and many scientists have demanded that it be banned.

Republicans Continue Anti-Science Agenda

House Republicans continued their attack on science-based protections for public health and the environment today, passing on a largely party-line vote a bill that would bar the Environmental Protection Agency from using certain peer-reviewed health research when developing standards to protect public health.

Scott Slesinger, legislative director for the Natural Resources Defense Council, made the following statement:

“The bill would deny EPA the ability to rely upon peer-reviewed medical studies that involve commitments to patient confidentiality when the agency carries out its statutory responsibilities to safeguard public health and the environment.  This is a backdoor way to make it harder to protect the public from toxic threats and should not stand.”

The action came a day after a similarly partisan House vote on another anti-science bill that would put even more industry representatives on the EPA’s already corrupted independent Science Advisory Board and make other changes that would compromise the board’s scientific integrity and independence. The White House has threatened to veto both bills.

Fracking Steals & Contaminates Desperately Needed Water

The oil and gas industry insists that hydraulic fracturing of natural gas and oil wells does not threaten America’s water supplies. But a new report by Environmental Working Group finds that hundreds of “monster wells” across the country were fracked with 10 to 25 million gallons of water each – and many that used the most water were in drought-stricken areas.

The report, titled Monster Wells, found that between April 2010 and December 2013, a total of more than 3.3 billion gallons of water were used to drill 261 such wells, with the greatest number in Texas, Pennsylvania and Colorado. About two-thirds were drilled in areas suffering from drought, depleting scarce freshwater resources and threatening to pollute groundwater.

“The amount of water used in these wells is staggering,” said Bill Walker, coauthor of the report and EWG consultant. “The water used to frack a single monster well could meet the water needs of a drought-stricken county in Texas twice over.”

The U.S. Environmental Protection Agency says the average amount of water used to frack a well is 50,000 to 5 million gallons. But EWG’s analysis found that this is hardly the upper limit.
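
The report’s own totals put the average monster well far beyond that range; a quick back-of-the-envelope check using only the figures quoted above:

```python
# Back-of-the-envelope check on the figures quoted above.
GALLONS_TOTAL = 3.3e9      # water used across all "monster wells" (report)
NUM_WELLS = 261            # wells identified, Apr 2010 - Dec 2013 (report)
EPA_TYPICAL_MAX = 5e6      # upper end of EPA's typical per-well range

avg_per_well = GALLONS_TOTAL / NUM_WELLS
print(f"Average monster well: {avg_per_well / 1e6:.1f} million gallons")
print(f"~{avg_per_well / EPA_TYPICAL_MAX:.1f}x the top of EPA's typical range")
# -> roughly 12.6 million gallons per well, ~2.5x EPA's 5-million-gallon figure
```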

The industry typically downplays the amount of water used, the resource depletion it causes in drought areas and the risk of polluting groundwater. Since drilling companies often are not required to report their water use, there are likely many more undisclosed monster wells across the country.

“It’s time we give towns, cities and counties the right to make decisions about how their resources are used, especially in drought areas,” Soren Rundquist, EWG’s landscape and remote sensing analyst, who coauthored the report, said. “That means the industry must be required to report – for every well – how much water was used, the source of the water and how it was disposed of.”

The impact on water supplies does not end once a well is fracked. When millions of gallons of water laced with chemicals and sand are injected into underground rock formations to free trapped gas or oil, huge quantities of contaminated water come back up. Because of the high costs and technological challenges of treating this water, most of it is re-injected into deep disposal wells, which can leak and pollute groundwater.

The report recommends:

  • State or local authorities should require oil and gas companies to obtain water use permits for each well they drill. Applications for permits should disclose not only the amount of water to be used but its source and type and how it will be recycled or disposed of.
  • State and local authorities should have the power to deny or limit permits for wells they judge to require excessive amounts of water.
  • In times of officially declared drought, oil and gas drilling operations should be subject to the same kind of water use restrictions imposed on citizens, farmers, communities, recreational activities and other industries. Ensuring access to clean, safe, affordable drinking water should always be the top priority.
  • To improve reporting and tracking of water use, the industry-operated FracFocus website must be replaced with an independent database, overseen by the EPA and modeled on its Toxics Release Inventory.

The full report, which includes the location and other data for every monster well, can be found here.

Big Waves Emerge in the Arctic Ocean

The once ice-covered expanse of the Arctic Ocean now has a season of increasingly open water, which is predicted to extend across the whole Arctic before the middle of this century. Storms thus have the potential to create Arctic swell – huge waves that could add a new and unpredictable element to the region.

A University of Washington researcher, Jim Thomson, conducted the first study of waves in the middle of the Arctic Ocean and detected house-sized waves – up to 5 meters (16 feet) high – at the peak of a September 2012 storm. The research also traces the sources of those big waves: high winds, which have always howled through the Arctic, combined with the new reality of open water in summer. The results were recently published in Geophysical Research Letters.

Arctic ice used to retreat less than 100 miles from the shore. In 2012, it retreated more than 1,000 miles. Wind blowing across an expanse of water for a long time creates whitecaps, then small waves, which then slowly consolidate into big swells that carry huge amounts of energy in a single punch.

The size of the waves increases with the fetch, or travel distance over open water. So more open water means bigger waves. As waves grow bigger they also catch more wind, driving them faster and with more energy.
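
This fetch dependence can be made quantitative. The sketch below uses a textbook empirical relation from the JONSWAP experiments for fetch-limited wave growth in deep water – an approximation we add for illustration, not a formula from Thomson’s paper; the 15 m/s storm wind is likewise an assumed value:

```python
import math

def significant_wave_height(wind_speed: float, fetch: float,
                            g: float = 9.81) -> float:
    """Empirical JONSWAP fetch-limited estimate of significant wave height (m).

    wind_speed: 10-m wind speed in m/s; fetch: open-water distance in meters.
    Hs = 0.0016 * U * sqrt(F / g), a deep-water, steady-wind approximation.
    """
    return 0.0016 * wind_speed * math.sqrt(fetch / g)

wind = 15.0  # m/s, an assumed strong Arctic storm wind
for fetch_km in (100, 500, 1000):  # roughly the historic vs. 2012 ice retreats
    hs = significant_wave_height(wind, fetch_km * 1e3)
    print(f"fetch {fetch_km:>4} km -> significant wave height ~{hs:.1f} m")
```

With a 15 m/s wind, the estimated wave height roughly triples as the fetch grows from 100 km to 1,000 km – consistent in scale with the 5-meter seas observed in 2012.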

Shipping and oil companies have been eyeing the opportunity of an ice-free season in the Arctic Ocean. Yet the emergence of big waves in the Arctic could be bad news for them. “Almost all of the casualties and losses at sea are because of stormy conditions, and breaking waves are often the culprit,” Thomson said.

Bigger waves also break up the remaining summer ice floes, leading to more open water. Waves breaking on the shore could also accelerate coastal erosion, where melting permafrost is already making shores more vulnerable.

The observations were made as part of a bigger project, using a sensor anchored to the seafloor 50 meters (more than 150 feet) below the surface in the middle of the Beaufort Sea, about 350 miles off Alaska’s north slope and in the middle of the ice-free summer waters. It measured wave height from mid-August until late October 2012.

Thomson will be out on Alaska’s northern coast from late July until mid-August deploying sensors to track waves, hoping to better understand the physics of the sea-ice retreat and what drives wave heights.

Burning Affects Global Warming and Health

A new study by Stanford University civil and environmental engineering professor Mark Z. Jacobson has now comprehensively quantified the role of biomass burning, showing that it plays a much bigger part in climate change and human health problems than previously thought.

Carbon emissions from human activities are the main contributors to global warming. Excluding biomass burning, emissions from coal-fired power plants, automobiles, concrete factories, cattle feedlots and other sources stand at more than 39 billion tons annually.

Jacobson, the director of Stanford’s Atmosphere/Energy Program and a senior fellow at the Woods Institute for the Environment and the Precourt Institute for Energy, found that almost 8.5 billion tons of atmospheric carbon dioxide – about 18 percent of all anthropogenic carbon dioxide emissions – comes from biomass burning.
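
Those two figures are mutually consistent, as a quick check of the quoted numbers shows:

```python
# Consistency check on the quoted figures.
NON_BIOMASS = 39.0   # billion tons CO2/yr, excluding biomass burning (article)
BIOMASS = 8.5        # billion tons CO2/yr from biomass burning (article)

share = BIOMASS / (NON_BIOMASS + BIOMASS)
print(f"Biomass burning share of anthropogenic CO2: {share:.1%}")  # -> ~17.9%
```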

But Jacobson’s research also demonstrates that the black carbon and brown carbon produced by biomass burning magnify the thermal impacts of such fires: they cause much more global warming per unit weight than other human-associated carbon sources.

Black and brown carbon particles increase atmospheric warming in three ways. First, they enter the water droplets that form clouds. When sunlight penetrates a droplet containing black or brown carbon particles, the carbon absorbs the light energy and converts it to heat. Carbon particles floating in the spaces between the droplets also absorb scattered sunlight, likewise converting it into heat.

Second, heating the cloud reduces its relative humidity, causing the cloud to dissipate. Because clouds reflect sunlight, cloud dissipation lets more sunlight reach the ground and the seas, ultimately resulting in warmer ground and air temperatures.

Finally, carbon particles released from burning biomass settle on snow and ice, which would otherwise reflect sunlight. The particles absorb heat, melting the snow and ice and exposing the dark soil and dark seas beneath, which contributes to further warming.

Although biomass burning, which includes the combustion of agricultural and lumber waste, is often promoted as a “sustainable” alternative to burning fossil fuels, these effects cannot be ignored.

Exposure to biomass-burning particles also affects human health: the study calculates that they cause about 250,000 premature deaths every year.

Climate Change is Fueling Forest Disturbances

Climate change is already altering the environment. Long-lived ecosystems such as forests are particularly vulnerable to the comparatively rapid changes in the climate system. A recent international study published in Nature Climate Change shows that damage from wind, bark beetles, and wildfires has increased drastically in Europe’s forests in recent years. “Disturbances like windthrow and forest fires are part of the natural dynamics of forest ecosystems, and are not, therefore, a catastrophe for the ecosystem as such. However, these disturbances have intensified considerably in recent decades, which increasingly challenges the sustainable management of forest ecosystems”, says Rupert Seidl, BOKU Vienna, the principal researcher involved in the study.

The authors show that damage caused by forest disturbance has increased continuously over the last 40 years in Europe, reaching 56 million cubic meters of timber per year in the period 2002–2010. Scenario analyses for the coming decades suggest a continuation of this trend: the study estimates that forest disturbances will increase damage by another million cubic meters of timber every year over the next 20 years. This annual increase amounts to the approximate timber volume stocking on a forest area corresponding to 7,000 soccer fields. The scientists identified climate change as the main driver behind this increase: under assumed stable climatic conditions, no substantial further increases in forest disturbances beyond current levels were found in their simulations. Damage from forest fires was estimated to increase particularly on the Iberian Peninsula, while bark beetle damage increased most strongly in the Alps. Wind damage would rise most notably in Central and Western Europe.
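
The soccer-field comparison is plausible under round-number assumptions; in the sketch below, the pitch size (a FIFA-standard 105 m × 68 m) and the stocking density of 200 m³ of timber per hectare are our assumptions, not figures from the study:

```python
# Rough check of the "7,000 soccer fields" equivalence.
EXTRA_DAMAGE_M3 = 1.0e6          # additional m3 of timber per year (study)
PITCH_M2 = 105 * 68              # FIFA-standard pitch area (assumption)
STOCKING_M3_PER_HA = 200.0       # assumed typical European stocking

area_ha = EXTRA_DAMAGE_M3 / STOCKING_M3_PER_HA
fields = area_ha * 10_000 / PITCH_M2
print(f"{area_ha:.0f} ha of stocked forest ~= {fields:.0f} soccer fields")
# -> about 5,000 ha, i.e. roughly 7,000 pitches
```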

Increasing disturbances amplify climate change

There is a strong feedback from forest disturbances on the climate system. Currently, Europe’s forests mitigate climate change by taking up large quantities of the greenhouse gas carbon dioxide. The carbon loss from increasing tree mortality and disturbance could, however, reduce this uptake and reverse the positive effects of forest management aimed at reducing climate change. The climate-induced increase in forest disturbance could thus further amplify the progression of climate change. Adapted management strategies, such as increased biodiversity and optimized thinning interventions in Europe’s forests, can buffer these carbon losses and support the climate change mitigation function of forests. Europe’s forest management will thus need to adapt to changing disturbance regimes in order to keep sustaining the diverse set of ecosystem services provided to society, the study concludes.

Climate Relicts May Help Researchers Understand Climate Change

While hiking through the Ozarks’ characteristic oak and hickory forests as a teenager, ecologist Scott Woolbright discovered something decidedly uncharacteristic for the region: prickly pear cacti growing on an exposed, rocky ledge.

In a recent paper published in Trends in Ecology and Evolution, Woolbright describes how populations and communities like these, known as climate relicts, can help scientists understand how ecological communities are affected by climate change.

Rocky, well-drained slopes in the Ozarks often create habitat “islands” within the surrounding forest known as glade ecosystems, said Woolbright, who is a postdoctoral fellow at the Institute for Genomic Biology (IGB) in the Genomic Ecology of Global Change research theme. In the Ozarks, glades often help to preserve isolated communities of cacti and other desert and prairie species that dominated the area during the Hypsithermal, a period of warming that occurred four to eight thousand years ago.

Ecologists have recently begun to discuss climate relicts as potential “natural laboratories” for studying the evolution of single plant species. Woolbright and co-authors suggest expanding such studies to include interactions between plants and other organisms that can drive community and ecosystem patterns.

It can be very difficult to replicate the long-term effects of climate change over very large geographic areas in the laboratory or field. But isolated climate relicts that are distributed across landscapes create “natural experiments” that help to overcome these problems of scale.

Using the genomic technologies he’s learned at the IGB, Woolbright hopes to develop a research program that investigates climate-driven changes in species interactions at the gene level. While such a program would contribute to basic community and ecosystem research, it also has significant implications for ecological conservation and restoration.

“We’re learning that you often can’t just go out and preserve a single species,” Woolbright said. “Interactions with other species can play very important roles in species survival. If we don’t take those interactions into account, we can miss things that are really important.”

Many climate relicts are threatened by small population size, ongoing environmental change in already stressful environments, invasions from species in adjacent non-relict communities, and human encroachment.

Woolbright said it will take the cooperation of many stakeholders to conserve relicts for their historical, ecological and aesthetic value.

Thomas Whitham, Catherine Gehring, and Gerard Allan from Northern Arizona University, as well as Joseph Bailey from the University of Tennessee, were co-authors on the study.

The IGB’s fellows program supported Woolbright, who was inspired to pursue a career in climate change ecology by his encounter with Ozark glades.

Eating Less Meat – A Solution to Reduce Water Use

Researchers at Aalto University have found that eating less meat would protect water resources, especially in dry areas around the world, as meat production requires more water than other agricultural products.

A growing population and climate change are likely to increase the pressure on already limited water resources, and diet change has been suggested as one measure that could contribute to adequate food security for the growing population.

The researchers assessed the impact of diet change on global water resources under four scenarios, in which meat consumption was gradually reduced while dietary recommendations for energy supply, protein and fat were still met. The study, published in Environmental Research Letters, is the first global-scale analysis focusing on changes in national diets and their impact on the blue and green water use of food consumption.

Food supply for a growing population

The global population is expected to exceed 9 billion by 2050, adding over 2 billion mouths to be fed to the current population, according to the UN. By reducing the animal product contribution in the diet, global green water (rainwater) consumption decreases by up to 21%, while for blue water (irrigation water) the reduction would be up to 14%. In other words, by shifting to a vegetarian diet we could secure an adequate food supply for an additional 1.8 billion people without increasing the use of water resources. The potential savings are, however, distributed unevenly, and, more importantly, their potential to alleviate water scarcity varies widely from country to country.

Regional differences

The researchers at Aalto University found substantial regional differences in the potential of diet change to reduce water use. In Latin America, Europe, Central and Eastern Asia, and Sub-Saharan Africa, diet change mainly reduces green water use. In Finland, for example, switching to a meat-free diet would decrease a Finn’s daily green water use by over 530 litres, while at the same time increasing blue water use by nearly 50 litres. In the Middle East region, North America, Australia and Oceania, blue water use would also decrease considerably. In South and Southeast Asia, on the other hand, diet change does not result in water savings, as diets in these regions are already largely based on a minimal amount of animal products.
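
A minimal sketch of the net effect in the Finland example, using only the two figures quoted above (the green/blue split matters because blue water is the scarcer, actively withdrawn resource):

```python
# Net daily water change for a Finn switching to a meat-free diet.
GREEN_SAVED_L = 530.0   # litres/day of green (rain) water saved (article)
BLUE_ADDED_L = 50.0     # litres/day of blue (irrigation) water added (article)

net_saving = GREEN_SAVED_L - BLUE_ADDED_L
print(f"Net change: {net_saving:.0f} litres/day saved overall,")
print(f"but blue-water demand rises by {BLUE_ADDED_L:.0f} litres/day.")
```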

Northern Pacific’s Anoxic Zone May Diminish Due to Climate Change

A commonly held belief that global warming will diminish oxygen concentrations in the ocean looks like it may not be entirely true. According to new research published in Science magazine, just the opposite is likely the case in the northern Pacific Ocean, whose anoxic zone is expected to shrink in coming decades because of climate change – even though warmer water holds less dissolved gas.

An international team of scientists came to that surprising conclusion after completing a detailed assessment of changes since 1850 in the eastern tropical northern Pacific’s oxygen minimum zone (OMZ). An ocean layer typically beginning a few hundred to a thousand meters below the surface, an OMZ is by definition the zone with the lowest oxygen saturation in the water column. OMZs are a consequence of microbial respiration and can be hostile environments for marine life.

Using core samples of the seabed from three locations, the scientists measured the isotopic ratio of nitrogen-15 to nitrogen-14 in the organic matter therein; this ratio can be used to estimate the extent of anoxia in these OMZs. Core depth correlates with age, giving the team a picture of how the oxygen content varied over the time period.
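
For reference, the isotope ratio is conventionally reported in delta notation relative to atmospheric N₂ – a standard definition we add here, not one spelled out in the article:

```latex
\delta^{15}\mathrm{N}\ (\text{‰}) =
\left(
  \frac{\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\text{sample}}}
       {\left(^{15}\mathrm{N}/^{14}\mathrm{N}\right)_{\text{standard}}}
  - 1
\right) \times 1000
```

Higher sedimentary δ¹⁵N indicates more water-column denitrification, which occurs only where oxygen is nearly absent – hence the ratio’s use as a proxy for anoxia.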

From 1990 to 2010, the nitrogen isotope record indicates that oxygen content steadily decreased in the area, as expected. But before that, and particularly clearly from about 1950 to 1990, oceanic oxygen steadily increased, which, according to co-author Robert Thunell, a marine scientist at the University of South Carolina, runs counter to conventional wisdom.

“The prevailing thinking has been that as the oceans warm due to increasing atmospheric greenhouse gases, the oxygen content of the oceans should decline,” Thunell says. “That’s due to two very simple processes.

“One, as water becomes warmer, the solubility of oxygen decreases in it, so it can hold less oxygen. And two, as the surface of the ocean warms, its density decreases and the oceans become more stratified. When that happens, the surface waters that do have oxygen don’t mix down into the deeper waters of the ocean.”

But that just covers the supply side of oxygen in the ocean, Thunell says. Just as important is the oxygen demand, particularly for the degradation of sinking organic matter.

Phytoplankton grow in surface waters, and they are the primary producers of this organic matter. After they die, their detritus slowly sinks from the surface to the sea floor, and there is a layer in the water column, the OMZ, where microbes consume much of the detritus, a process that depletes oxygen through bacterial respiration.

The extent of oxygen deprivation in the OMZ largely reflects how much phytoplankton is being produced on the surface, Thunell says. Plenty of phytoplankton production at the surface means less oxygen underneath.

And that, the team thinks, is why the oxygen concentrations in the Pacific Ocean so clearly increased from 1950 to 1990. Phytoplankton production is enhanced by strong winds (because they cause upwelling of nutrients from deeper waters) and diminished by weaker winds, and the scientists found evidence that trade winds were weaker during that period.

Looking at two different measures of wind intensity (the East-West difference in sea level pressure and the depth of the thermocline) over the time periods involved, they conclude that trade winds were diminishing over the course of 1950 to 1990, but then picked up from 1990 to 2010.

They’re not sure why wind strength increased around 1990, but think it may be related to the Pacific Decadal Oscillation. “A lot of people are familiar with ENSO, or El Niño, which is a kind of interannual climate variability,” Thunell says. “The Pacific Decadal Oscillation is analogous to a super-ENSO, but one that varies on decadal time scales.”

Over the course of coming decades, though, trade wind speed is expected to decrease from global warming, Thunell says, and the result will be less phytoplankton production at the surface and less oxygen utilization at depth, causing a concomitant increase in the ocean’s oxygen content.

“That has some important implications for fisheries,” he says. “One of the issues over the past 20 to 30 years is that oxygen has been declining and these oxygen minimum zones have been expanding, which could have a negative impact on fisheries.”

“But if the last 20 to 30 years are not the norm because of these unusually strong trade winds, then there won’t necessarily be that impact on the fisheries. If the trend reverses, and we go back to weaker trade winds – as people predict will happen because of the warming oceans – then the decrease in oxygen in the oceans that we’ve been seeing may be reversed.”

It’s a matter of both supply and demand.

Study Reveals Dynamics of Microbes and Nitrate

Human tampering with global carbon balances has received massive public attention because of its effects on global warming, but we pay less attention to another set of chemical processes we are similarly disrupting: human inputs to the nitrogen cycle. Unfortunately, the story of nitrogen transformations in the biosphere is also less well understood.

In modern times, humans developed the technology to turn nitrogen gas from the atmosphere into a biologically available form for use as fertilizer. Before this, bio-available or “fixed” nitrogen was created only sparingly by natural forces such as lightning and nitrogen-fixing bacteria, while some fixed nitrogen was always being eliminated by denitrifying bacteria, which return nitrogen to the atmosphere. This change to the planet’s chemical balance has been critical, because the majority of life-essential nitrogen now comes from human, not environmental, sources. Run-off from this massive, systematic input of fixed nitrogen into managed natural and engineered environments (through agriculture and aquaculture) has led to immediate problems like eutrophication in lakes, reservoirs and the oceans (with resulting dead zones), but it also has long-term effects on the atmosphere.

Like carbon, nitrogen is naturally cycled in local ecosystems and integrated into global environmental processes, facilitated by microbial communities. It is well established that microbes take biologically usable nitrate (NO₃⁻) and process it along one of two critically important pathways: denitrification, which returns it to biologically inert nitrogen gas (N₂), or ammonification, which turns nitrate into ammonium (NH₄⁺) and keeps the nitrogen biologically usable.
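
In terms of overall redox chemistry, the two competing fates of nitrate can be written as the standard textbook half-reactions below (our addition, not equations taken from the study itself):

```latex
\begin{aligned}
\text{denitrification:} \quad
  & 2\,\mathrm{NO_3^-} + 12\,\mathrm{H^+} + 10\,e^-
    \;\rightarrow\; \mathrm{N_2} + 6\,\mathrm{H_2O} \\
\text{ammonification:} \quad
  & \mathrm{NO_3^-} + 10\,\mathrm{H^+} + 8\,e^-
    \;\rightarrow\; \mathrm{NH_4^+} + 3\,\mathrm{H_2O}
\end{aligned}
```

Ammonification accepts eight electrons per nitrate versus five per nitrogen atom for denitrification, so it consumes more electron donor (organic carbon) per unit of nitrate – a bioenergetic asymmetry consistent with the carbon-versus-nitrogen limitation results described below.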

Because microbial ecosystems are vastly complex and difficult to monitor in nature, the environmental factors that determine whether microbial communities follow the denitrification or the ammonification pathway have been poorly understood. A variety of conditions – including temperature, pH, the carbon-to-nitrogen ratio, and sulfide concentration – have been suggested as potential determining elements, but measurements taken directly from the environment have yielded conflicting or incomplete evidence.

Now, by performing 15 long-term experiments with microbial communities sampled from nitrate-filtering sediments in a sandy tidal flat, researchers have been able to conclusively test the effect of controlled conditions and specific nutrient supplies on the fate of nitrate in nitrogen-cycling processes. The structure and activity dynamics of the communities were carefully monitored under controlled experimental conditions, using a suite of sophisticated bioinformatics techniques including metagenomics, transcriptomics and proteomics.

The study appeared in Science. Lead researcher Marc Strous’ co-authors are Beate Kraft, Halina E. Tegetmeyer, Timothy G. Ferdelman and Jeanine S. Geelhoed from the Max Planck Institute for Marine Microbiology; Ritin Sharma and Robert L. Hettich from the University of Tennessee–Oak Ridge National Laboratory Graduate School of Genome Science and Technology; and Martin G. Klotz of the Department of Biological Sciences at the University of North Carolina at Charlotte.

The researchers found that three specific initial factors conclusively determined whether the nitrate supply underwent denitrification or ammonification, regardless of other conditions: the nitrite-to-nitrate ratio, the carbon-to-nitrogen ratio, and – hitherto unknown – the time it takes the microbes to duplicate themselves as a community (their “generation time”). Further, the team found that different specific cohorts of bacteria achieved dominance in the system to perform the processes, depending on the set conditions.

Strous notes that the basic chemical pathways and their energetics, depending on conditions, allow different assemblages of bacteria to work together, each doing its part in the process. The three factors identified by the team’s experiments are the critical switches that determine whether denitrification or ammonification dominates the microbial ecosystem, in that the factors select the specific bacteria that succeed in taking on major parts of the process.

Among the three factors the experiments identified as critical, the ratio of the two oxidized forms of fixed nitrogen, nitrate (NO₃⁻) and nitrite (NO₂⁻), is the most basic: when nitrite is more abundant (nitrate can change to nitrite and vice versa), the process that dominates is inevitably denitrification.

When nitrite is more abundant, the denitrifying bacteria dominate because they are more efficient and faster. The fundamental speed and efficiency advantage of the denitrification process also helps explain the second factor the study found – microbial generation time. Regardless of the ratio of the fixed nitrogen supply, when conditions encouraged rapid generation (a generation time of less than 1.7 days), the denitrifying bacterial groups came to dominate, either rapidly or eventually, while slow growth conditions favored ammonifying bacteria.
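
Taken together, the findings suggest a simple qualitative decision rule. The sketch below is an illustrative toy model of the outcomes described in this article, not code from the study; the 1.7-day generation-time threshold is the study’s figure, while the nitrite cutoff of 1.0 is a hypothetical placeholder for the boundaries the paper quantifies:

```python
def dominant_pathway(nitrite_to_nitrate: float,
                     generation_time_days: float,
                     carbon_limited: bool) -> str:
    """Toy decision rule for which nitrate fate dominates a sediment community.

    Qualitative logic only, paraphrasing the article: abundant nitrite,
    fast community growth (< 1.7 days, the study's threshold), or carbon
    limitation each favor denitrification; otherwise ammonification wins.
    The nitrite cutoff of 1.0 is a hypothetical placeholder.
    """
    if nitrite_to_nitrate > 1.0:        # nitrite more abundant than nitrate
        return "denitrification"
    if generation_time_days < 1.7:      # fast-growing community (study's figure)
        return "denitrification"
    if carbon_limited:                  # carbon scarce relative to nitrogen
        return "denitrification"
    return "ammonification"

print(dominant_pathway(0.2, 3.0, carbon_limited=False))  # -> ammonification
```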

The same kind of fundamental chemical-energy reasoning seems to help explain the third critical factor the team identified – the carbon-to-nitrogen ratio.

“As has already been proposed in the literature, the carbon to nitrogen ratio is an important factor because, based on the bioenergetics, carbon limitation will favor denitrification – denitrifying organisms will have a higher productivity under those conditions. Whereas the ammonifying microorganisms will have higher productivity when the nitrogen is limited. That follows from the way these organisms respire these compounds,” Strous said.

“We understand this better now, and this understanding is not trivial because the outcome of this competition is really important in determining primary productivity – because ammonium from ammonification is directly re-usable by the primary producers and with denitrification the nitrogen is mainly lost. So this hard-to-study bacterial competition is really important in determining the primary productivity. In any modeling you do on how human impact causes global change, this is a very important piece of the puzzle.”

Klotz stresses how the study shows the importance of having new “omics” tools – metagenomics, transcriptomics, proteomics, etc. – for studying in real time the metabolic activity and changes going on in complex bacterial communities.