Government, council and retailer-backed report says ban on landfill could save UK £17bn and heat 600,000 homes
Can the idea of sustainability be determined by metrics? The answer is "Of course", as any type of improvement can be measured. We understand it is far more efficient to recycle aluminum than to produce it in the first place, a value we call embodied energy. However, since refining represents a significant proportion…
One of the downsides to large-scale solar power is finding space suitable for the installation of a large area of PV panels or mirrors for CSP. These are long-term installations, and will have an impact on the land and its uses. There are potential objections to committing areas of undeveloped or pristine land to solar power. Solar…
Jessica Ernst's long fight to challenge legislation putting energy regulator above the law reaches top court.
Duane Tilden's insight:
"[...] After years of legal wrangling, Jessica Ernst and Alberta's powerful energy regulator finally squared off in the Supreme Court of Canada yesterday.
For almost two hours, all nine justices questioned lawyers from both sides in a case that will determine if legislation can grant government agencies blanket immunity from lawsuits based on the Charter of Rights and Freedoms.
At times the debate was so bogged down in legal jargon and little known cases that it felt as though the participants were holding a conversation in a foreign language. [...]
Ernst alleges the Alberta Energy Regulator violated her rights by characterizing her as a "criminal threat" and barring all communication with her.
The claims are part of her multipronged lawsuit related to the regulation of fracking. She says fracking contaminated aquifers near her homestead near Rosebud, about 110 kilometres east of Calgary, and is seeking $33 million in damages. [...]
The Supreme Court hearing dealt with Ernst's allegation that the provincial energy regulator denied her the right to raise her concerns about groundwater contamination. She argues that the legislation shielding the regulator from citizens' lawsuits should not bar charter claims.
Lawyers for Ernst, the BC Civil Liberties Association and the David Asper Centre for Constitutional Rights all argued that the Alberta Energy Regulator's immunity clause undermined the spirit of Canada's charter, which is designed to protect citizens from government abuses of power.
It is patently unfair to allow a government to violate a citizen's basic freedoms and then deny them an appropriate remedy in the courts, especially when the charter itself grants that right, they argued. [...]
Eight years ago, Ernst sued Alberta Environment, the Energy Resources Conservation Board (which has since become the Alberta Energy Regulator) and Encana, one of Canada's largest unconventional gas drillers. She claimed her well water had been contaminated by fracking and government agencies had failed to investigate the problems.
But the regulator argued that it couldn't be sued because it had an immunity clause that protected it from civil action.
After an Alberta Court of Appeal agreed, Ernst's lawyers appealed the matter to the Supreme Court in 2014.
Initially three provincial governments and the federal government announced their intention to intervene in the case.
"But once they looked at the arguments, they withdrew," said Murray Klippenstein, another of Ernst's lawyers, after yesterday's hearing.
"So there was no government here to support the argument of the [regulator]," added Klippenstein. "It kind of shows in a common sense sort of way how ridiculous the position is."
The case made legal history, too. "This is the first time the Supreme Court has heard a case about human rights with an environmental context," noted Lynda Collins, a professor of law at the University of Ottawa's Centre for Environmental Law and Global Studies.
She said the case concerns the right of a citizen to pinpoint environmental wrongs, such as groundwater contamination, without being penalized by a regulatory body.
Whenever a regulator allegedly takes punitive measures against a citizen addressing key environmental issues in the public interest, "you have a serious allegation," added Collins. [...]
The case is being closely watched by Canada's oil and gas industry. In 2014, Borden Ladner Gervais, Canada's largest national full-service law firm, included the Ernst case in a top 10 list of important judicial decisions affecting the energy industry.
"The Ernst case has brought into focus the potential for regulator or provincial liability arising out of oil and gas operations.... If Ernst proceeds to trial, it will likely provide more guidance on the scope of the duty of care and the standard of care required by the province and the oil and gas operator to discharge their duties in the context of hydraulic fracturing."
The fracking industry has been the subject of scores of lawsuits across North America. Landowners have sued over property damage and personal injury related to industry-caused earthquakes, air pollution and the contamination of groundwater.
In one major Texas case, a jury awarded one family $3 million. The verdict found that Aruba Petroleum "intentionally created a private nuisance" through its drilling, fracking and production activities at 21 gas wells near the Parrs' Wise County home over a three-year period between 2008 and 2011. [...]"
The natural gas leak from a storage facility in the hills above Los Angeles is shaping up as a major ecological disaster as it vents large amounts of methane, a potent greenhouse gas.
Duane Tilden's insight:
A runaway natural gas leak from a storage facility in the hills above Los Angeles is shaping up as a significant ecological disaster, state officials and experts say, with more than 150 million pounds of methane pouring into the atmosphere so far and no immediate end in sight.
The rupture within a massive underground containment system — first detected more than two months ago — is venting gas at a rate of up to 110,000 pounds per hour, California officials confirm. The leak already has forced evacuations of nearby neighborhoods, and officials say pollutants released in the accident could have long-term consequences far beyond the region.
Newly obtained infrared video captures a plume of gas — invisible to the naked eye — spouting from a hilltop in the Aliso Canyon area above Burbank, like smoke billowing from a volcano. Besides being an explosive hazard, the methane being released is a powerful greenhouse gas, more potent than carbon dioxide in trapping heat in the lower atmosphere.
Scientists and environmental experts say the Aliso Canyon leak instantly became the biggest single source of methane emissions in all of California when it began two months ago. The impact of greenhouse gases released since then, measured over a 20-year time frame, is the equivalent of emissions from six coal-fired power plants or 7 million automobiles, environmentalists say. [...]
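The figures above can be sanity-checked with simple arithmetic. A minimal sketch, with two stated assumptions not taken from the article: a leak duration of roughly 60 days at the peak rate, and a 20-year global warming potential (GWP20) of about 84 for methane (an IPCC AR5 value).

```python
# Back-of-the-envelope check of the Aliso Canyon figures quoted above.
# Assumptions (not from the article): ~60 days of venting at the peak
# rate, and a 20-year GWP of ~84 for methane (IPCC AR5).

LB_PER_KG = 2.20462

rate_lb_per_hr = 110_000          # peak venting rate reported by officials
days = 60                         # "more than two months" since detection
total_lb = rate_lb_per_hr * 24 * days
print(f"total methane: {total_lb / 1e6:.0f} million lb")   # ~158 million lb

gwp20 = 84                        # assumed AR5 20-year GWP for CH4
ch4_tonnes = 150e6 / LB_PER_KG / 1000   # the article's ">150 million lb"
co2e_tonnes = ch4_tonnes * gwp20
print(f"CO2-equivalent over 20 years: {co2e_tonnes / 1e6:.1f} million tonnes")
```

At the reported rate, the "more than 150 million pounds" total is consistent with roughly two months of continuous venting, which matches the article's timeline.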
Methane is a potent greenhouse gas with a global warming potential more than 25 times that of carbon dioxide. Climate change threatens the health and welfare of current and future generations. Children, older adults, people with heart or lung disease and people living in poverty may be most at risk from the health impacts of climate change. In addition to methane, landfills also emit other pollutants, including the air toxics benzene, toluene, ethylbenzene and vinyl chloride.
Image Source: http://www.environmentalleader.com/
Duane Tilden's insight:
>"Release Date: 08/14/2015"<
When it comes to human-caused climate change, urban warming is a big player.
Duane Tilden's insight:
>"Perhaps no other climatic variable receives more attention in the debate over CO2-induced global warming than temperature. Its forecast change over time in response to rising atmospheric CO2 concentrations is the typical measure by which climate models are compared. It is also the standard by which the climate model projections tend to be judged; right or wrong, the correctness of global warming theory is most often adjudicated by comparing model projections of temperature against real-world measurements. And in such comparisons, it is critical to have a proper baseline of good data; but that is easier acknowledged than accomplished, as multiple problems and potential inaccuracies have been identified in even the best of temperature data sets.
One particular issue in this regard is the urban heat island effect, a phenomenon by which urban structures artificially warm background air temperatures above what they normally would be in a non-urbanized environment. The urban influence on a given station’s temperature record can be quite profound. In large cities, for example, urban-induced heating can be as great as Tokyo’s 10°C, making it all the more difficult to detect and discern a CO2-induced global warming signal in the temperature record, especially since the putative warming of non-urbanized areas of the planet over the past century is believed to be less than 1°C. Yet, because nearly all long-term temperature records have been obtained from sensors initially located in towns and cities that have experienced significant growth over the past century, it is extremely important that urbanization-induced warming – which can be a full order of magnitude greater than the background trend being sought – be removed from the original temperature records when attempting to accurately assess the true warming (or cooling!) of the natural non-urban environment. A new study by Founda et al. (2015) suggests this may not be so simple or straightforward a task.
Working with temperature records in and around the metropolitan area of Athens, Greece, Founda et al. set out to examine the interdecadal variability of the urban heat island (UHI) effect, since “few studies focus on the temporal variability of UHI intensity over long periods.” Yet, as they note, “knowledge of the temporal variability and trends of UHI intensity is very important in climate change studies, since [the] urban effect has an additive effect on long term air temperature trends.”
Such findings as these are of significant relevance in climate change studies, for they clearly indicate the UHI influence on a temperature record is not static. It changes over time and is likely inducing an ever-increasing warming bias on the temperature record, a bias that will only increase as the world’s population continues to urbanize in the years and decades ahead. Consequently, unless researchers routinely identify and remove this growing UHI influence from the various temperature data bases used in global change studies, there will likely be a progressive overestimation of the influence of the radiative effects of rising CO2 on the temperature record. "<
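The bias-removal step the excerpt describes can be illustrated with a toy calculation. All numbers below are synthetic and for illustration only (they are not from Founda et al.); a real analysis would estimate the UHI component from paired urban/rural stations.

```python
# Toy illustration of removing an urban heat island (UHI) trend from a
# station record. All numbers are synthetic; a real analysis would
# estimate the UHI component from paired urban/rural stations.

years = list(range(1900, 2001))
background = 0.008   # assumed true non-urban warming, degC per year
uhi = 0.030          # assumed urban-induced warming, degC per year
temps = [14.0 + (background + uhi) * (y - 1900) for y in years]

def slope(xs, ys):
    """Ordinary least-squares slope."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

raw_trend = slope(years, temps)   # what the urban station actually shows
corrected = raw_trend - uhi       # subtract the estimated UHI trend
print(f"raw trend: {raw_trend:.3f} degC/yr, corrected: {corrected:.3f}")
```

As the excerpt notes, when the urban component (0.030 here) dwarfs the background signal (0.008), even a modest error in the UHI estimate swamps the trend being sought.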
Japan is due to switch on a nuclear reactor for the first time in nearly two years on Tuesday, as Prime Minister Shinzo Abe seeks to reassure a nervous public that tougher standards mean the sector is
Duane Tilden's insight:
Abe and much of Japanese industry want reactors to be restarted to cut fuel imports, but opinion polls show a majority of the public oppose the move after the nuclear crisis triggered by the earthquake and tsunami in March 2011.
In the worst nuclear disaster since Chernobyl 25 years earlier, the meltdowns at the Fukushima Daiichi plant caused a release of radioactive material and forced 160,000 from their homes, with many never to return.
The crisis transfixed the world as the government and the Fukushima operator, Tokyo Electric Power (Tepco), fumbled their response and took two months to confirm that the reactors had undergone meltdowns.
Kyushu Electric Power said it aimed to restart its No. 1 reactor at its Sendai plant at 0130 GMT on Tuesday (2130 ET on Monday).
The plant on the west coast of Kyushu island is the furthest away of Japan's reactors from Tokyo, where protesters regularly gather outside Abe's official residence to oppose atomic energy.
At nearly 1,000 km (600 miles) from the capital, Sendai is closer to Shanghai or Seoul.
A successful restart would mark the culmination of a process whereby reactors had to be relicensed, refitted and vetted under tougher standards that were introduced following the disaster.
While two reactors were allowed to restart for one fuelling cycle in 2012, the whole sector has been shut down since September 2013, forcing Japan to import record amounts of expensive liquefied natural gas.
As well as cutting energy costs, showing it can reboot the industry safely is crucial for Abe's plans to export nuclear technology, said Malcolm Grimston, a senior research fellow at Imperial College in London.
"Japan also has to rehabilitate itself with the rest of the world's nuclear industry," said Grimston.
At the Sendai plant, Kyushu Electric expects to have power supply flowing within a few days if all goes to plan. It aims to start the station's No. 2 unit in October.
The head of Japan's atomic watchdog said that the new safety regime meant a repeat of the Fukushima disaster would not happen, but protesters outside the Sendai plant are not convinced.
"You will need to change where you evacuate to depending on the direction of the wind. The current evacuation plan is nonsense," said Shouhei Nomura, a 79-year-old former worker at a nuclear plant equipment maker, who now opposes atomic energy and is living in a protest camp near the plant.
Of Japan's 25 reactors at 15 plants for which operators have applied for permission to restart, only five at three plants have been cleared for restart. [...]"<
CHICAGO, Aug. 4, 2015 /PRNewswire/ -- According to a new study ... energy efficiency is recognized among U.S. higher education institutions as key to fulfilling their schools' core mission, with almost 9 out of 10 respondents expecting to increase or maintain energy efficiency investments next year.
Photo: Lillis Complex, University of Oregon
Duane Tilden's insight:
>" [...] Eighty-eight percent of respondents also agree that energy efficiency is the most cost effective way to meet their energy needs while at the same time reducing greenhouse gas emissions and cutting costs.
The biggest factor driving schools' energy efficiency efforts is cost savings, according to the survey conducted with higher education facility leaders, with environmental benefits and industry standards rounding out the top three reasons for becoming more energy efficient. However, obstacles exist to achieving these objectives. While 92 percent of respondents stated that their school had a culture that encourages energy efficiency practices, organizational barriers are challenging their ability to achieve efficiency goals. Fifty-nine percent view this as the biggest obstacle, with insufficient funding and lack of a clear definition of success also ranking highly.
Another factor impacting institutions is aging infrastructure, with 59 percent indicating that the average age of their buildings exceeds 15 years, and only one in five reporting that the average age of their building is below 10 years. As facility leaders look to upgrade existing buildings, compatibility with new technology ranks as most important when considering making an investment. Compatibility with legacy systems outranked quality of the product and technology advancements of the solution.
"A majority of the higher education buildings that stand today are expected to be in operation for the next few decades," said Tara Canfield, Segment Director, Education and Commercial Office Buildings at Schneider Electric. "Tremendous opportunities exist to improve energy efficiency and reduce waste in these existing buildings. In particular, by integrating building systems, facility managers can view energy use from a single interface, identify long-term opportunities for savings and continuously optimize their facility to yield the highest levels of efficiency over time. This integration also enables organizations to better use data from the Internet of Things, turning building insights into meaningful action that will improve operations." [...]
This survey was conducted by Redshift Research in June 2015 among 150 U.S. facilities leaders in higher educational establishments. Respondents have responsibility related to purchasing energy solutions, and their biggest responsibilities included facility management and operations management. Results of any sample are subject to sampling variation. [...]"<
Nexen Energy apologized Friday for a major leak in an Alberta pipeline that was only installed last year and said a warning system failed to detect it.
Duane Tilden's insight:
>" [...] A contractor discovered the leak Wednesday about 35 kilometres southeast of Fort McMurray, Alta. Nexen shut down the pipeline soon after, but not before some five million litres of bitumen, produced water and sand spilled into muskeg.
Nexen, which was taken over by China’s CNOOC Ltd. in 2013, says the affected area is about 16,000 square metres, mostly along the pipeline’s route. [...]
John Bennett, national program director of the Sierra Club Canada Foundation, said he was worried.
“We’re always concerned when petroleum products get spilled into the environment. There’s always damage, and it’s usually permanent of some nature,” said Bennett. “It’s full of toxic elements that should not be released into the environment.” "<
The warming effects of aircraft vapor trails could be eased with fewer night flights, especially during winter, the report says.
Duane Tilden's insight:
Nicola Stuber, first author of the study, to be published in tomorrow's edition of the journal Nature, suggests that contrails' overall impact on climate change is similar in scope to that of aircraft CO2 emissions over a hundred-year period.
Aircraft are believed to be responsible for 2 to 3 percent of human CO2 emissions. Like other high, thin clouds, contrails reflect sunlight back into space and cool the planet.
However, they also trap energy in Earth's atmosphere and boost the warming effect, the study says. [...]
Contrails are artificial clouds that form around the tiny aerosol particles in airplane exhaust.
They appear only in moist, very cold (less than −40ºF/−40ºC) air—usually at altitudes of 5 miles (8 kilometers) or higher.
Some contrails can last for a day or longer, though they gradually disperse and begin to resemble natural clouds.
Contrails Mystery
Scientists disagree about the extent of contrails' climate impact.
"The jury is out on the impact of contrails," said Patrick Minnis, an atmospheric scientist at NASA's Langley Research Center in Langley, Virginia.
David Travis, a climatologist at the University of Wisconsin-Whitewater, notes that some recent studies suggest that contrails have little impact on global climate change but have a greater regional warming impact.
"I prefer to think of contrails as a regional-scale climate problem, as they are most common in certain regions of the world, such as western Europe, eastern and central U.S., and parts of eastern Asia," he said.
"This is due to a combination of dense air traffic in these areas and favorable atmospheric conditions to support contrail persistence once they form."
Because of their locations and short life spans, contrails are a difficult study subject.
"The greatest impediment to understanding the contrail impacts on weather and climate is the poor state of knowledge of humidity in the upper troposphere [3.8 to 9.3 miles/6 to 15 kilometers in altitude]," NASA's Minnis said.
"Until we can measure it properly and extensively, and model it and its interaction with cirrus clouds and contrails, we will continue to have large uncertainties about the effect of contrails."
Winter is Contrail Season
At the high altitudes favored by commercial airlines, the air is much more humid in winter, so contrails are twice as likely in that season, study co-author Stuber said.
"We also found that flights between December and February contribute half of the annual mean climate warming, even though they account for less than a quarter of annual air traffic," she said of her U.K.-based research.
Study leader Piers Forster, of England's University of Leeds, suggests that contrails' current impact on the atmosphere is likely to increase as air traffic grows. [...]"<
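Stuber's winter figures imply a striking per-flight effect, which a quick calculation makes explicit. One assumption: the study hedges the traffic share as "less than a quarter", so 25% is used below as an upper bound.

```python
# If December-February flights are at most 25% of annual traffic but
# cause 50% of the annual contrail warming, a winter flight warms about
# 3x more than a flight in the rest of the year. 0.25 is the article's
# upper bound on the winter traffic share.

winter_traffic = 0.25
winter_warming = 0.50

per_flight_winter = winter_warming / winter_traffic            # 2.0
per_flight_rest = (1 - winter_warming) / (1 - winter_traffic)  # ~0.67
ratio = per_flight_winter / per_flight_rest
print(f"a winter flight is ~{ratio:.1f}x as warming as a non-winter flight")
```

Since the 25% traffic share is an upper bound, the true per-flight ratio is at least 3, which is why the study suggests shifting flights out of winter nights.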
Thyne says he’s not the only one who’s been subjected to undue pressure from the oil and gas industry. He says he knows of faculty around the nation who have been targeted as well, including an engineer at Cornell University who called for an outright fracking ban in his state.
Sales of smart building technologies almost could triple to $17.4 billion between 2014 and 2019. That’s driving a flood of investment from corporations and venture capitalists alike.
Duane Tilden's insight:
>" [...] As of this week, you can add cloud software company Lucid to the list of energy-efficiency startups — particularly those that monitor building power consumption for lighting and climate-control systems — attracting substantial cash infusions this year. Among those contributing to the $14.2 million Series B round disclosed by Lucid this week: GE Ventures, Autodesk, Formation 8 and Zetta Venture Partners.
Lucid plans to use the new funds for enhancements to BuildingOS, a cloud service that analyzes data from more than 160 hardware and software building technologies.
“Lucid’s technology is rapidly connecting many disparate building systems together, making the vision of truly connected buildings and real-time management possible,” said Ben Sampson, an associate with GE Ventures.
Its reference accounts include Genentech, along with more than a half-dozen educational institutions such as Cornell University and Stanford University.
Lucid joins a respectable list of companies attracting private capital this year, as businesses and organizations become more comfortable with gathering data from the Internet of Things.
Research firm Mercom Capital Group reports that startups focused on smart grid and energy efficiency raised more than $325 million in the first quarter.
Two deals last quarter that explicitly focused on building management or analytics: Blue Pillar, which scored a $14 million deal after more than 250 deployments; and Enbala Power Networks, which raised $11 million.
All told, the last year has been incredibly active in the sector, reaching $944 million in 2014. Those investments covered more than 111 deals at a time when the broader field of cleantech has suffered a decline in available capital, according to a separate report from Lux Research.
“While cleantech is declining from its peak of 291 deals in 2008, building energy deals have risen steadily since then, growing by 208 percent over the same period,” Lux wrote in its presentation about funding trends.
One of the more notable deals over the past two years was Distech Controls, which raised about $37 million in May 2013. [...]
Why so active?
The spike in funding reflects the rather bullish revenue projections for building energy management technologies over the next decade. Depending on how broadly you view the market, projections vary dramatically.
If you focus just on building energy management, revenue is likely to reach around $2.4 billion this year, growing almost fivefold to $10.8 billion by 2024, according to the forecast from Navigant Research.
Players in the space include not only a slew of startups, but also multinational companies such as Siemens and Intel.
“Building energy management systems (BEMS) represent an important evolutionary step in the approach to facilities and operations management,” said Casey Talon, senior analyst, commenting on that projection. “As the market matures, more integrated and sophisticated BEMS solutions are delivering energy efficiency improvements while also enabling comprehensive business intelligence and strategic management.”
Indeed, if you consider smart buildings from a more holistic perspective, the growth potential is much larger — up to $17.4 billion by 2019, compared with $6.3 billion last year, according to IDC Energy Insights. In North America, spending is being driven by large corporate operational efficiency initiatives. "<
As gridlock continues to be a problem in the United States, exacerbated by crumbling infrastructure, the American public has reportedly approved up to $200 billion for rapid and mass transit. According to the American Public Transportation Association (APTA), the 49 ballot measures totalling nearly US$200 billion that were voted on were the largest in history.…
Figure 1: Chart showing recent drop in Diesel Car sales, AID Newsletter
"[...] Germany’s Bundesrat has passed a resolution to ban the internal combustion engine starting in 2030, Germany’s Spiegel Magazin writes. Higher taxes may hasten the ICE’s departure. An across-the-aisle Bundesrat resolution calls on the EU Commission in Brussels to pass directives assuring that…
Environment Canada recently released images showing air emissions modelling results across Alberta. These images are a reminder of how a small number of large sources mix together to pollute the air Albertans breathe, resulting in increased risks to human health.
Duane Tilden's insight:
"[...] SO2 and NOx emissions impact human health not only because they can cause direct harm, but also because they can react in the atmosphere to create fine particulate matter (PM2.5). The Alberta government has found that NOx and SO2 are the main causes of past incidents where PM2.5 concentrations have exceeded Canada’s air quality standards.
PM2.5 can cause asthma attacks, hospitalizations and even premature death, as we’ve summarized before. It’s a particular concern in Alberta, where PM2.5 is putting us on track to have the worst air quality in Canada, and Edmonton’s pollution levels are exceeding Toronto's.
These images underscore the cumulative impacts of a small number of very large industrial emissions sources — particularly coal plants, the oilsands and refineries — in addition to distributed industrial activities such as oil and gas operations. Those may all be separate sources, but their emissions end up in the same air. Pollutants from these different sources mix together in the air Albertans breathe, resulting in increased risks to human health. [...]
Alberta is unique in the western half of North America for its mid- and high-level readings. The province more closely resembles the densely populated mid-Atlantic region of the United States, or the coal-burning Midwest, than our western neighbours.
Problem spots near coal plants, refineries and the oilsands
Another image shows how SO2 and NOx released into the atmosphere return to ground level, or “deposit.” The image reveals a clear concentration (the orange and red spots) of the two pollutants being deposited around both Edmonton and the oilsands in northeast Alberta.
Edmonton is sandwiched between three large coal-burning power plants, which are clustered near Wabamun Lake west of Edmonton, and refineries on the east side of the city.
Air pollution on the move
The video that AEMERA posted shows modelled SO2 plumes from large emitters across British Columbia, Alberta and Saskatchewan. The three-dimensional plumes reflect SO2 concentrations of at least three parts per billion. How the plumes travel was modelled using real weather conditions from a four-week period in the fall of 2013.
The video visually represents where SO2 is generated, how it moves through the atmosphere and where it eventually lands. As SO2 deposits on the ground, the land surface in the video changes colour to indicate where higher depositions are modelled. Although the specifics will differ for other pollutants, the video is representative of how airborne pollutants generally are dispersed and deposited.
It’s not particularly surprising to see that SO2 pollution originates from oil and gas production, coal plants and the oilsands — Alberta’s three largest-emitting sectors, by far. But seeing how much of the province is affected by these plumes may come as a shock.
The video shows that major industrial emissions do not blow in the direction of the prevailing wind pattern. Rather, they shift directions and can be combined with pollutants emitted in different areas. This raises concerns about environmental evaluations for new industrial emitters, since those evaluations focus on a much smaller area around the polluter — and focus on prevailing winds — rather than these dynamic wind patterns.
The data used for the oilsands is from 2010, so it discounts the emissions growth in that region over the last five years. The data for the rest of the sources is from 2006. In terms of coal emissions, these images correspond closely to today’s reality: NOx and SO2 in 2014 are at nearly the same levels as in 2006. [...]"
A hydraulic fracturing operation near Fox Creek, Alta., has been shut down after an earthquake hit the area Tuesday.
Duane Tilden's insight:
The magnitude 4.8 quake was reported at 11:27 a.m., says Alberta Energy Regulator, which ordered the shutdown of the Repsol Oil & Gas site 35 kilometres north of Fox Creek.
Carrie Rosa, spokeswoman for the regulator, says "the company has ceased operations … and they will not be allowed to resume operations until we have approved their plans."
Rosa added the company is working with the energy regulator to ensure all environmental and safety rules are followed.
In a statement, Repsol confirmed the seismic event and said the company was conducting hydraulic fracturing operations at the time it happened. [...]
Jeffrey Gu, associate professor of geophysics at the University of Alberta, said the area surrounding Fox Creek has been experiencing a proliferation of quakes lately.
He estimates in the last six months there have been hundreds of quakes in the area ranging in magnitude from 2.0 to 3.0.
But it is not considered a risky area given such a low population, said Gu, who added that Fox Creek and the surrounding region are carefully monitored by the energy regulator.
"There are faults in this area that have been mapped, have been reported in that area, but nothing of significance," he said.
"It's a relatively safe area without major, major faults."
Still, Gu said, there were two fairly large quakes in the area in January 2015, one of which had a magnitude of 4.4.
He wasn't able to confirm that they were caused by fracking, but said it is "highly probable."
The energy regulator said at the time that the 4.4 magnitude quake was likely caused by hydraulic fracturing. [...]
Demand response (DR) energy distribution appears to be gaining momentum in the United States and elsewhere. In the U.S., however, the DR sector is awaiting a Supreme Court decision that will have great impact on the evolution of the technology, administrative and business models.
Duane Tilden's insight:
"[...] A lot is going on besides the Supreme Court case, however. Technology evolutions in two discrete areas are converging to make DR a hot topic. The tools necessary to determine where energy is being stored, where it is needed and when to deliver it have developed over decades in the telecommunications sector. Secondly, the more recent rush of advanced battery research is making it possible to store energy and provide the flexibility necessary for demand response to really work. Mix that with the growing ability to generate energy on premises through solar, wind and other methods and a potent new distributed structure is created.
In October, Advanced Energy Economy (AEE) released a report entitled “Peak Demand Reduction Strategy,” which was prepared for it by Navigant Research. The research found that the upside is high. For instance, for every $1 spent on reducing peak demand, savings of $2.62 and $3.26 or more can be expected in Illinois and Massachusetts, respectively. The most progress has been made in the United States, the report found. Last year, the U.S. accounted for $1.25 billion of the total worldwide $2 billion demand response market, according to JR Tolbert, the AEE’s Senior Director of State Policy. The U.S. market, he wrote in response to questions emailed by Energy Manager Today, grew 14 percent last year compared to 2013.
The report painted a bright picture for the future of demand response. “The key takeaway from this report is that by passing peak demand reduction mandates into law, or creating peak demand reduction programs, policy makers and utilities could significantly reduce costs for ratepayers, strengthen reliability of the electricity system, and facilitate compliance with the Clean Power Plan,” Tolbert wrote. “As states plan for their energy future, demand response should be a go-to option for legislators and regulators.” [...]"
WASHINGTON (Reuters) - The U.S. Environmental Protection Agency will propose regulations on Tuesday aimed at cutting methane emissions from the oil and gas sector by 40 to 45 percent over the next decade.
Duane Tilden's insight:
>"WASHINGTON (Reuters) - The U.S. Environmental Protection Agency will propose regulations on Tuesday aimed at cutting methane emissions from the oil and gas sector by 40 to 45 percent over the next decade from 2012 levels, a source familiar with the issue said on Monday.
The regulations on methane are one part of the Obama administration's strategy to curb greenhouse gases and combat climate change.
The targets in Tuesday's proposal are in line with a January announcement by the Obama administration that it wanted to reduce oil and gas industry methane emissions by up to 45 percent from 2012 levels by 2025, the source said.
Earlier this month, President Barack Obama unveiled the final version of his plan to tackle greenhouse gases from coal-fired power plants, requiring carbon emissions from the sector be cut 32 percent from 2005 levels by 2030.
(Reporting By Valerie Volcovici; Writing by Mohammad Zargham; Editing by Peter Cooney)"<
A California woman, for one, who wants to ease the drought, put disabled vets to work, and make some money
Duane Tilden's insight:
>" [...] The shade balls of Los Angeles are 4 inches in diameter, hollow, polyethylene orbs [...] The Los Angeles Department of Water and Power has now dumped 96 million balls into local reservoirs to reduce evaporation and block sunlight from encouraging algae growth and toxic chemical reactions. The balls are coated with a chemical that blocks ultraviolet light and helps the spheres last as long as 25 years. Las Virgenes, north of L.A., now uses shade balls, too. [...]
The U.S. Environmental Protection Agency has encouraged the nation’s water managers in recent years to find ways to cover or contain their resources, to prevent sunlight from reacting with chlorine and possibly creating carcinogens, says Ed Osann, a senior policy analyst at the Natural Resources Defense Council. The shade balls shouldn’t pose a pollution problem in themselves, he says, since “everything that comes in contact with drinking water has to be a certified material.” Chase says the balls are designed not to degrade.
The shade balls are a novel way to protect drinking water, and Californians’ latest attempt to adjust to their four-year drought. [...]"<
Liquefied natural gas (LNG) as a transportation fuel option is back on the competitive race track, thanks to a part of the temporary (three-month) highway funding bill passed by the U.S. Senate Thursday, according to natural gas vehicle (NGV) advocates. The House-passed version had a similar provision.
Image Source: www.freightlinertrucks.com
Duane Tilden's insight:
>" [...] At a Congressional hearing last December, the global energy and procurement director for Atlanta-based UPS called for "removing barriers" to NGVs, adding that if Congress really wanted to accelerate the adoption of LNG use in heavy-duty trucks and more use of U.S.-produced natural gas supplies, it needed to eliminate "disproportionate taxing of LNG compared with diesel fuel."
Noting that President Obama was expected to sign the latest measure, Newport Beach, CA-based Clean Energy Fuels Corp. said the new leveling provision will effectively lower the tax on LNG by 14.1 cents/gal. Twenty-six state legislatures have already taken similar action, a Clean Energy spokesperson told NGI.
Clean Energy CEO Andrew Littlefair said the use of LNG in heavy-duty trucks, locomotives and large marine vessels has been growing steadily in North America, and "anyone who cares about a cleaner environment and energy independence should be very grateful for what the U.S. Congress has done, making LNG much more competitive."
Executives with America's Natural Gas Alliance (ANGA), and the NGVAmerica and American Gas Association (AGA) trade associations echoed Littlefair's sentiments.
"We applaud Congress for including language to equalize the federal highway excise tax on LNG," said ANGA CEO Marty Durbin. "This provision has garnered strong bipartisan support over the years, and we are thrilled to see it become law."
Calling the action a "common-sense change" that will mean greater fuel cost savings, NGVAmerica President Matt Godlewski said the passage of the LNG provision is great news for trucking fleets that are looking for clean-burning fuels. His calculation places the excise tax on LNG at 24.3 cents/DGE, compared to its current 41.3 cents/DGE level, Godlewski said.
"Currently, fleets operating LNG-powered trucks are effectively taxed for their fuel at a rate 70% higher than that of diesel fuel," he said.
An AGA spokesperson clarified the number to point out that the current federal excise tax on both diesel and LNG is 24.3 cents/gallon, but because LNG does not have the same energy content/gallon of fuel, it takes 1.7 gallons of LNG to equal a gallon of diesel. "Since the excise tax is based on volume (gallons) -- not energy content -- LNG is taxed at 170% of the rate of diesel on an energy equivalent basis," he said.
"This provision provides the level playing field that natural gas has needed to reach its full potential as a transportation fuel," said Kathryn Clay, AGA vice president for policy strategy.
Each of the trade groups has been lobbying Congress for some time to take this corrective action on LNG. Under the new provision, the energy equivalent of a diesel gallon of LNG is defined as having a Btu content of 128,700, which AGA said is equal to 6.06 pounds of LNG.
Separately, the new measure defines the energy equivalent of a gallon of compressed natural gas (CNG) as having a Btu content of 115,400, or 5.66 pounds of CNG. [...]"<
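The excise-tax arithmetic quoted above can be sketched in a few lines. This is only an illustration of the energy-equivalence reasoning in the article, using the figures it cites (24.3 cents/gallon excise tax, 1.7 gallons of LNG per diesel gallon of energy), not an official tax calculation.

```python
# Sketch of the LNG/diesel excise-tax equivalence described above.
# All figures come from the article; this is illustrative only.

DIESEL_TAX_PER_GAL = 0.243   # federal excise tax, $/gallon (volume basis)
LNG_GAL_PER_DGE = 1.7        # gallons of LNG with the energy of 1 gallon of diesel
DGE_BTU = 128_700            # Btu content defined as one diesel gallon equivalent
LNG_LB_PER_DGE = 6.06        # pounds of LNG per DGE, per AGA

# Old rule: LNG taxed per liquid gallon, so on an energy-equivalent
# basis a truck pays 1.7x the diesel rate.
old_lng_tax_per_dge = DIESEL_TAX_PER_GAL * LNG_GAL_PER_DGE
print(f"Old LNG tax: {old_lng_tax_per_dge * 100:.1f} cents/DGE")  # ~41.3

# New rule: LNG taxed per DGE, matching diesel on energy content.
premium = old_lng_tax_per_dge / DIESEL_TAX_PER_GAL - 1
print(f"Old premium over diesel: {premium:.0%}")  # 70%
```

This reproduces both numbers in the excerpt: 24.3 cents/gal on 1.7 gallons works out to the 41.3 cents/DGE Godlewski cites, a 70% premium over diesel on an energy basis.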
Four years after the Fukushima nuclear disaster, which has caused incredible and ongoing destruction, authorities have in the meantime tried to cover up the serious consequences...
Image source: www.businessinsider.com
Duane Tilden's insight:
Fukushima will likely go down in history as the biggest cover-up of the 21st century. Governments and corporations are not leveling with citizens about the risks and dangers; at the same time, truth itself, the ethical glue that holds together trust and belief in society's institutions, is at risk of falling into shambles. Ultimately, this is an example of how societies fail.
Tens of thousands of Fukushima residents remain in temporary housing more than four years after the horrific disaster of March 2011. Some areas on the outskirts of Fukushima have officially reopened to former residents, but many of those former residents are reluctant to return home because of widespread distrust of government claims that it is okay and safe. [...]
According to Japan Times as of March 11, 2015: “There have been quite a few accidents and problems at the Fukushima plant in the past year, and we need to face the reality that they are causing anxiety and anger among people in Fukushima, as explained by Shunichi Tanaka at the Nuclear Regulation Authority. Furthermore, Mr. Tanaka said, there are numerous risks that could cause various accidents and problems.”
Even more ominously, Seiichi Mizuno, a former member of Japan’s House of Councillors (Upper House of Parliament, 1995-2001) in March 2015 said: “The biggest problem is the melt-through of reactor cores… We have groundwater contamination… The idea that the contaminated water is somehow blocked in the harbor is especially absurd. It is leaking directly into the ocean. There’s evidence of more than 40 known hotspot areas where extremely contaminated water is flowing directly into the ocean… We face huge problems with no prospect of solution.”
At Fukushima, each reactor required one million gallons of water per minute for cooling, but when the tsunami hit, the backup diesel generators were drowned. Units 1, 2, and 3 had meltdowns within days. There were four hydrogen explosions. Thereafter, the melting cores burrowed into the container vessels, maybe into the earth. [...]
Following the meltdown, the Japanese government did not inform people of the ambient levels of radiation that blew back onto the island. Unfortunately and mistakenly, people fled away from the reactors to the highest radiation levels on the island at the time.
As the disaster happened, enormous levels of radiation hit Tokyo. The highest radiation detected in the Tokyo Metro area was in Saitama with cesium radiation levels detected at 919,000 becquerel (Bq) per square meter, a level almost twice as high as Chernobyl’s “permanent dead zone evacuation limit of 500,000 Bq” (source: Radiation Defense Project). For that reason, Dr. Caldicott strongly advises against travel to Japan and recommends avoiding Japanese food.
Even so, post the Fukushima disaster, Secretary of State Hillary Clinton signed an agreement with Japan that the U.S. would continue importing Japanese foodstuff. Therefore, Dr. Caldicott suggests people not vote for Hillary Clinton. One reckless dangerous precedent is enough for her. [...]
Mari Yamaguchi, Associated Press (AP), June 12, 2015: “Four years after an earthquake and tsunami destroyed Japan’s Fukushima nuclear power plant, the road ahead remains riddled with unknowns… Experts have yet to pinpoint the exact location of the melted fuel inside the three reactors and study it, and still need to develop robots capable of working safely in such highly radioactive conditions. And then there’s the question of what to do with the waste… serious doubts about whether the cleanup can be completed within 40 years.” [...]
According to the Smithsonian, April 30, 2015: “Birds Are in a Tailspin Four Years After Fukushima: Bird species are in sharp decline, and it is getting worse over time… Where it’s much, much hotter, it’s dead silent. You’ll see one or two birds if you’re lucky.” Developmental abnormalities of birds include cataracts, tumors, and asymmetries. Birds are spotted with strange white patches on their feathers.
Maya Moore, a former NHK news anchor, authored a book about the disaster: The Rose Garden of Fukushima (Tankobon, 2014), about the roses of Mr. Katsuhide Okada. Today, the garden has perished: “It’s just poisoned wasteland. The last time Mr. Okada actually went back there, he found baby crows that could not fly, that were blind. Mutations have begun with animals, with birds.” [...] "<
The amount of water required to hydraulically fracture oil and gas wells varies widely across the country, according to the first national-scale analysis and map of hydraulic fracturing water usage detailed in a new USGS study accepted for publication in Water Resources Research, a journal of the American Geophysical Union.
Duane Tilden's insight:
>" [...] from 2000 to 2014, median annual water volume estimates for hydraulic fracturing in horizontal wells had increased from about 177,000 gallons per oil and gas well to more than 4 million gallons per oil well and 5.1 million gallons per gas well. Meanwhile, median water use in vertical and directional wells remained below 671,000 gallons per well. For comparison, an Olympic-sized swimming pool holds about 660,000 gallons.
“One of the most important things we found was that the amount of water used per well varies quite a bit, even within a single oil and gas basin,” said USGS scientist Tanya Gallegos, the study’s lead author. “This is important for land and resource managers, because a better understanding of the volumes of water injected for hydraulic fracturing could be a key to understanding the potential for some environmental impacts.”
Horizontal wells are those that are first drilled vertically or directionally (at an angle from straight down) to reach the unconventional oil or gas reservoir and then laterally along the oil or gas-bearing rock layers. This is done to increase the contact area with the reservoir rock and stimulate greater oil or gas production than could be achieved through vertical wells alone.
However, horizontal wells also generally require more water than vertical or directional wells. In fact, in 52 out of the 57 watersheds with the highest average water use for hydraulic fracturing, over 90 percent of the wells were horizontally drilled.
Although there has been an increase in the number of horizontal wells drilled since 2008, about 42 percent of new hydraulically fractured oil and gas wells completed in 2014 were still either vertical or directional. The ubiquity of the lower-water-use vertical and directional wells explains, in part, why the amount of water used per well is so variable across the United States.
The watersheds where the most water was used to hydraulically fracture wells on average coincided with parts of the following shale formations:
Eagle Ford (within watersheds located mainly in Texas)
Haynesville-Bossier (within watersheds located mainly in Texas & Louisiana)
Barnett (within watersheds located mainly in Texas)
Fayetteville (within watersheds located in Arkansas)
Woodford (within watersheds located mainly in Oklahoma)
Tuscaloosa (within watersheds located in Louisiana & Mississippi)
Marcellus & Utica (within watersheds located in parts of Ohio, Pennsylvania, West Virginia and within watersheds extending into southern New York)
Shale gas reservoirs are often hydraulically fractured using slick water, a fluid type that requires a lot of water. In contrast, tight oil formations like the Bakken (in parts of Montana and North Dakota) often use gel-based hydraulic fracturing treatment fluids, which generally contain lower amounts of water. [...]"<
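The scale of the water volumes in the USGS excerpt is easier to grasp using the article's own Olympic-pool comparison (about 660,000 gallons per pool). A quick sketch, using only figures quoted above:

```python
# Express the USGS median water-use figures quoted above in
# Olympic-sized pools (~660,000 gallons each). Figures from the article.

OLYMPIC_POOL_GAL = 660_000

median_use_gal = {
    "horizontal oil well (2014)":  4_000_000,
    "horizontal gas well (2014)":  5_100_000,
    "vertical/directional (max)":    671_000,  # article: "remained below" this
    "horizontal well (2000)":        177_000,
}

for well, gallons in median_use_gal.items():
    print(f"{well}: ~{gallons / OLYMPIC_POOL_GAL:.1f} pools")
```

A 2014 horizontal gas well thus uses roughly 7.7 pools of water, versus about a quarter of a pool for a typical horizontal well in 2000 — the thirty-fold growth the study documents.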
By John Timmer, Ars Technica Air travel has come under fire for its potential contributions to climate change. Most people probably assume that its impact comes through carbon emissions, given that aircraft burn significant amounts of fossil fuel to stay aloft. But the carbon released by air travel remains a relatively minor part of the…
Duane Tilden's insight:
>" [...] Others include the emissions of particulates high in the atmosphere, the production of nitrogen oxides and the direct production of clouds through contrail water vapor.
Over time, these thin lines of water evolve into “contrail cirrus” clouds that lose their linear features and become indistinguishable from the real thing.
Although low-altitude clouds tend to cool the planet by reflecting sunlight, high-altitude clouds like cirrus have an insulating effect and actually enhance warming.
To figure out the impact of these cirrus clouds, the authors created a module for an existing climate model (the ECHAM4) that simulated the evolution of aircraft-induced cirrus clouds (they could validate some of the model’s output against satellite images of contrails).
They found hot spots of these clouds over the United States and Europe, as well as the North Atlantic travel corridor.
Smaller effects were seen in East Asia and over the northern Pacific. Over central Europe, values peaked at about 10 percent, in part because the output of the North Atlantic corridor drifted in that direction.
On their own, aircraft-generated cirrus clouds produce a global climate forcing of about 40 milliwatts per square meter. (In contrast, the solar cycle results in changes of about a full watt per square meter.) But these clouds suppressed the formation of natural cirrus clouds, which partially offset the impact of the aircraft-generated ones, reducing the figure to about 30 mW/m². That still leaves it among the most significant contributions to the climate produced by aircraft.
Some reports have suggested we might focus on making engines that emit less water vapor, but the water is a necessary byproduct of burning hydrocarbon. We’ll almost certainly be accomplishing that as a result of rising fuel prices, and will limit carbon emissions at the same time.
The nice thing is that, in contrast to the long atmospheric lifespan of CO2, if we can cause any changes in cloud formation, they’ll have an impact within a matter of days. [...]"<
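The forcing figures in the excerpt fit together with simple arithmetic; a minimal check, using only the numbers quoted above:

```python
# Back-of-envelope with the radiative forcing numbers from the excerpt:
# aircraft cirrus alone adds ~40 mW/m^2, but suppression of natural
# cirrus offsets part of that, leaving a net ~30 mW/m^2.

gross_cirrus_forcing = 0.040  # W/m^2, aircraft-generated cirrus alone
net_cirrus_forcing   = 0.030  # W/m^2, after natural-cirrus suppression
solar_cycle_swing    = 1.0    # W/m^2, cited for scale

offset = gross_cirrus_forcing - net_cirrus_forcing
print(f"Natural-cirrus offset: {offset * 1000:.0f} mW/m^2")
print(f"Net cirrus forcing vs solar cycle: {net_cirrus_forcing / solar_cycle_swing:.0%}")
```

The suppression of natural cirrus thus cancels about 10 mW/m², and the remaining net forcing is on the order of 3% of the solar cycle's swing — small in absolute terms, yet large among aviation's individual contributions.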
The Environmental Protection Agency is expected to propose rules requiring heavy trucks to increase their fuel economy by up to 40 percent by 2027.
Duane Tilden's insight:
>" [...] This week, the E.P.A. is expected to propose regulations to cut greenhouse gas emissions from heavy-duty trucks, requiring that their fuel economy increase up to 40 percent by 2027, compared with levels in 2010, according to people briefed on the proposal. A tractor-trailer now averages five to six miles a gallon of diesel. The new regulations would seek to raise that average to as much as nine miles a gallon. A truck’s emissions can vary greatly, depending on how much it is carrying.
The hotly debated rules, which cover almost any truck larger than a standard pickup, are the latest in a stack of sweeping climate change policy measures on which President Obama hopes to build his environmental legacy. Already, his administration has proposed rules to cut emissions from power plants and has imposed significantly higher fuel efficiency standards on passenger vehicles.
The truck proposals could cut millions of tons of carbon dioxide pollution while saving millions of barrels of oil. Trucks now account for a quarter of all greenhouse gas emissions from vehicles in the United States, even though they make up only 4 percent of traffic, the E.P.A. says.
But the rules will also impose significant burdens on America’s trucking industry — the beating heart of the nation’s economy, hauling food, raw goods and other freight across the country.
It is expected that the new rules will add $12,000 to $14,000 to the manufacturing cost of a new tractor-trailer, although E.P.A. studies estimate that cost will be recouped after 18 months by fuel savings.
Environmental advocates say that without regulation, the contribution of American trucks to global warming will soar.
“Trucking is set to be a bad actor if we don’t do something now,” said Jason Mathers, head of the Green Freight program at the Environmental Defense Fund.
But some in the trucking industry are wary.
“I’ll put it this way: We told them what we can do, but they haven’t told us what they plan to do,” said Tony Greszler, vice president for government relations for Volvo Group North America, one of the largest manufacturers of big trucks. “We have concerns with how this will play out.”
The E.P.A., along with the National Highway Traffic Safety Administration, began its initial phase of big truck fuel economy regulation in 2011, and those efforts have been widely seen within the industry as successful. But meeting the initial standards, like using more efficient tires, was not especially difficult by comparison. [...]"
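The payback logic behind the proposal can be sketched from the figures quoted above. The mpg range and the $12,000–$14,000 cost premium come from the article; the annual mileage (100,000 miles) and diesel price ($2.50/gallon) are illustrative assumptions of mine, not figures from the E.P.A. analysis, which is why this rough estimate differs from the E.P.A.'s 18-month figure.

```python
# Rough payback sketch for the truck rule described above.
# ANNUAL_MILES and DIESEL_PRICE are illustrative assumptions;
# mpg figures and cost premium are from the article.

ANNUAL_MILES = 100_000   # assumed long-haul annual mileage
DIESEL_PRICE = 2.50      # assumed $/gallon
CURRENT_MPG = 6.0        # article: tractor-trailers average 5-6 mpg
TARGET_MPG = 9.0         # article: rules would raise that to as much as 9
COST_PREMIUM = 13_000    # midpoint of the quoted $12,000-$14,000

gallons_saved = ANNUAL_MILES / CURRENT_MPG - ANNUAL_MILES / TARGET_MPG
annual_savings = gallons_saved * DIESEL_PRICE

print(f"Fuel saved: ~{gallons_saved:,.0f} gal/yr")
print(f"Payback: ~{COST_PREMIUM / annual_savings * 12:.0f} months")
```

Under these assumptions a truck saves roughly 5,600 gallons a year, so the added manufacturing cost is recovered within about a year; with lower mileage or cheaper diesel the payback stretches toward the E.P.A.'s 18-month estimate.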