
The Missing Lynx

Ross Hunter


Deer, on the surface, seem like one of the most graceful and harmless animals one could come across in the UK. Any traveller on a British railway will be familiar with the sight of this fragile-looking mammal grazing on a verdant field. Yet, as unassuming as they may seem, deer pose perhaps the greatest threat to the biodiversity of the country. With no natural predators except ourselves, they have been allowed to breed and feed unhindered. Despite attempts to cull a sufficient proportion of them each year to maintain some semblance of natural predation, their population continues to rise. This year it is estimated that around two million deer live in the forests and countryside of Great Britain: a vastly unsustainable number.

It’s not that deer are inherently damaging animals. Rather, the overabundance of any creature is detrimental to a natural world that requires balance in order to be sustainable, as the rapacious effect our own species has had on the planet proves. Too many deer harm the diversity of precious woodland habitats and spoil suitable environments for other species, particularly migrant songbirds like the nightingale. Not to mention their habit of destroying sapling trees and causing thousands of road accidents a year. Should their population continue to explode, these problems will accelerate. The remaining British wilderness exists in an unnatural state where deer hold dominion over the land. The king of the British forest is a reclusive herbivore. Hardly an exciting prospect for the growing ecotourism industry, is it?

So, what’s the solution?

Currently, the measures being implemented to control the UK deer population are ineffective, and the philosophy behind them troubling. The British Deer Society, a charity claiming to promote the existence and sustainability of deer for generations to come, have written a peculiar phrase on their website: ‘…our requirements take precedence over those of other creatures, and truly “wild” habitats no longer exist in Britain’

The BDS evidently believe in the philosophy that caused this mess in the first place: that we matter more than the animals we share the planet with. Restricting nature to convenient corrals like parks and fenced-in forests is not really how one goes about creating ‘truly wild’ habitats. Deer are not cattle. They have been able to procreate so freely because throughout history we have scoured the British land of its natural predators and replaced them with livestock. Yet, as natural disasters and the problems of global warming prove, despite our technological intelligence we are still very much at the mercy of nature. The issue of deer overpopulation is no different. Even the most prolific deerstalker will not be able to solve this problem with brute force. To find a solution, a new philosophy is needed.

Lo and behold, this month saw the launch of the charity Rewilding Britain. The charity actively campaigns for the reintroduction of several species into the UK, from wolves to wild boar. In creating a more diverse ecological landscape, they hope to rebuild the natural processes that the British wilderness has lost over time. From supporting the establishment of areas of seabed kept free from dredging and trawling to new methods of farming that allow animals to roam freely around estates, Rewilding Britain take a wholly different view from the BDS of how we should form a relationship with the environment. Of course, the concept of rewilding is driven by human intervention and could therefore be argued to be unnatural. However, if human intervention banished these habitats and species from Britain in the first place, then surely it is our responsibility to bring them back. Rewilding isn’t just an apologetic conservation ideal, though. It could actually solve some of the UK’s most pressing environmental problems, such as deer overpopulation.

Enter the Eurasian lynx: an elusive, carnivorous feline that once kept the deer population in Britain within sustainable levels. Since its departure, the only things killing deer in this country have been the landed gentry, unsuspecting motorists, disgruntled farmers and people carrying out controlled culls (which, as the Deer Initiative have admitted, are largely ineffective at actually controlling the population). A 2013 study stated that, for the deer population in Britain to remain within sustainable levels, up to 50% of it would have to be shot each year. That works out, at current estimates, at around one million deer every year: a monumental task even for the most ardent hunter.

Rewilding Britain propose that instead of exerting the enormous effort and financial resources such a gigantic cull would require, we simply allow the laws of nature to solve the problem for us. If deer have no natural predators, then let’s introduce one: the lynx.

This isn’t exactly a new idea. Lynx have been reintroduced to several habitats across Europe after human impact pushed them out, and the Lynx UK Trust are currently seeking the go-ahead for a trial reintroduction to Britain. However, should this come closer to becoming a reality, sceptics will certainly make their voices heard. Despite the evident benefits lynx could provide to the UK wilderness, from managing the deer population to adding much-needed diversity to our forests, doubts are already being broadcast by academics and farmers alike. The latter group will surely campaign for a compensation system, even though lynx have been found to be responsible for the deaths of less than 0.5% of the available sheep in a given area (far fewer than the deaths caused by disease). Professor Chris Thomas of the University of York has also stated that the endangered capercaillie could be under threat should lynx be introduced, although he provided no evidence for that assumption in his article on the BBC News website. If the Lynx UK Trust begin to garner more success, doubts such as these will only become more numerous.

It seems to me that the human race has a great deal to atone for when it comes to the environment. We’re directly responsible for the extinction of several species; we continually destroy wilderness to make way for houses with bare, lifeless gardens; we do not even respect the creatures around us enough to know their names as common knowledge. If we can make up for our plethora of ruinous errors, if only a little, then I see no justifiable barrier to reintroducing the lynx to Britain. Plus, who wouldn’t love to catch sight of a pointy-eared predator skulking through a British forest? It’s a hell of a lot more exciting than seeing deer through the window of a train.


Nuclear Power and Other Power Sources: How Do They Stack Up?


Most everyone dreads the idea of nuclear war because of the abject devastation it would inflict on planet Earth. Yet few connect the dots between nuclear weapons and nuclear power: the same energy that makes atomic bombs and nuclear missiles so threatening is also harnessed to power electrical grids and other infrastructure. When properly contained, nuclear power is the cleanest and most abundant energy source available. With all the concern over climate change and environmental degradation, this raises a huge question: why is the United States of America not generating more, much more, nuclear energy?

Capital Investment vs. Production Costs

Looking at it from one angle, a larger nuclear energy capacity is a no-brainer. Making electricity from nuclear sources is cheaper than using coal, gas or petroleum, i.e. fossil fuels. On average, in 2011 dollars, electricity cost 21.56, 3.23 and 4.51 cents per kilowatt-hour from petroleum, coal and natural gas, respectively. Nuclear power came in at 2.10 cents per kilowatt-hour, according to data reported to the Federal Energy Regulatory Commission (FERC). Yet these simple ongoing production costs fail to tell the full story.

To increase the power-generating capacity of nuclear sources, additional plants are necessary. Some argue that the savings in electricity production mean the nuclear utilities pay for themselves. What, though, are they paying for, and how long until the payoff? Engineering and constructing a nuclear power plant is very expensive. In fact, 74 percent of the cost of nuclear-sourced electricity lies in the capital cost of creating the physical facility and technology for that purpose. Some estimates range from six billion to nine billion dollars per plant. Others estimate over $5,300 per kilowatt of capacity before a plant begins paying for itself, in 20 to 30 years. These figures make the prospect cost-prohibitive to many decision makers in government and business.
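
To see how those figures fit together, here is a minimal back-of-the-envelope sketch in Python using the per-kWh costs quoted above. The 90 percent capacity factor and the choice of natural gas as the comparison fuel are illustrative assumptions, and financing, fuel and maintenance differences are ignored.

    # Rough payback estimate for 1 kW of nuclear capacity, using the article's
    # per-kWh figures. The 90% capacity factor and the use of natural gas as the
    # comparison fuel are illustrative assumptions, not reported data.
    CAPITAL_COST_PER_KW = 5_300      # dollars per kW of capacity
    NUCLEAR_COST_PER_KWH = 0.0210    # dollars per kWh generated
    GAS_COST_PER_KWH = 0.0451        # dollars per kWh generated
    CAPACITY_FACTOR = 0.90           # assumed fraction of the year the plant runs

    kwh_per_kw_per_year = 8_760 * CAPACITY_FACTOR
    saving_per_kwh = GAS_COST_PER_KWH - NUCLEAR_COST_PER_KWH
    annual_saving = kwh_per_kw_per_year * saving_per_kwh      # about $190 per kW

    payback_years = CAPITAL_COST_PER_KW / annual_saving
    print(f"Approximate payback: {payback_years:.0f} years")  # roughly 28 years

Even this crude estimate lands squarely in the 20-to-30-year range quoted above.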

Plentiful Energy at Low Costs without Nuclear Power

Were we still living through the oil shocks and embargoes of the 1970s, the sense of urgency about nuclear power in the US would be much higher. The abundance of new discoveries and the advancement of technology have made fossil fuels more available at modest prices. Coal and petroleum prices are each well below their peaks. With the advent of hydraulic fracturing, or “fracking,” natural gas is ever more accessible and affordable. Though people may worry about the environmental effects of burning these substances, they are likely to keep using them to maintain a decent household cash flow.

Still, even the renewable alternatives to traditional fuels are dropping in price. In terms of sheer volume, wind turbines and solar panels, for instance, have yet to match the output of fossil fuels, much less the overwhelming energy yield of nuclear. Nevertheless, their contribution to production in the United States is growing while their financial outlays are shrinking. Alongside those two renewable sources, hydroelectric power, biomass and geothermal each come in under 10 cents per kilowatt-hour. According to Forbes magazine, this makes them highly competitive with oil and gas financially.

Lack of Knowledge

The absence of urgency mentioned above relates to a third factor in why Americans are not expanding their nuclear production capacity. Generations have grown up poorly informed about the potential and reality of nuclear power. A dangerous accident at Pennsylvania’s Three Mile Island facility in 1979 scared public officials and policy makers into backing off a pro-nuclear agenda. The improvements in today’s safety protocols, and their replication across the industry, have done little to restart a national conversation. Granted, the United States operates 97 nuclear reactors, more than any other country. Yet only four more are under design or construction, compared with 20 for China.

Furthermore, France relies on nuclear for three-quarters of its electricity; several eastern European nations, half; South Korea, in excess of 30 percent; while the U.S. can claim around 20 percent. Clearly, public knowledge of how clean and abundant atomic energy can be is meager; past accidents, by contrast, including the Fukushima Daiichi and Chernobyl meltdowns of recent decades, were reported widely by media outlets.

Advocates of nuclear power have work to do to bring Americans on board. Otherwise, dirtier, cheaper sources will continue to reign.


Francisco Reynés: “We have to consider gas as the energy source with the most potential in the future”


Francisco Reynés, executive chairman of Naturgy (formerly Gas Natural Fenosa), has spoken about the role of gas as the energy source with the greatest potential for the future, at the 6th IEF-IGU Ministerial Gas Forum held in Barcelona, Spain.

Reynés explained that the world “needs to talk about the different uses of natural gas and the gas technologies and innovations towards a sustainable energy future. We have to address the role of gas in the world as a future energy source, not only as a transition source of energy”.

“The uses of gas are, as we all know, well beyond those of power generation. Gas provides sources for non-energy uses, such as petrochemicals or fertilizers, which have no clear substitute”, he added.

On this point, Reynés explained that “all of this will benefit and service the economic growth and development of the countries and economies around the globe. It is, indeed, a joint effort which we must all face with the utmost priority and the maximum care”.

Reynés also insisted on cooperation between governments, producers and even consumers to strengthen the security of gas supply on international markets. “The challenge for the future is how energy systems will evolve to meet greenhouse gas emission goals, and more stringent fuel quality standards while at the same time they respond to growing demand for affordable access to reliable energy services”, he concluded.

The 6th IEF-IGU Ministerial Gas Forum aims to sharpen a collective focus on energy policies, market trends, and technology options that enable the gas industry to deliver inclusive growth and successful transformations for a secure, inclusive and sustainable energy future. Energy and climate policies, gas technologies and innovations as well as market fundamentals are ever more co-dependent but also vary across geographies.


You can’t fight nature, but you can be ready for whatever she throws at you


The human race has got used to being in control of its surroundings, and yet we will never be able to truly prevent some of the most devastating catastrophes that our planet can throw our way. Still, we strive to protect all the things we have built and worked hard for, and technology is helping us to do that on a day-to-day basis.

Tsunamis are a reality and we need to be prepared for them

Despite all the advances in our technology, we have not yet found a way to avert the most fatal of natural disasters. The fact remains that our planet is far larger than we can possibly control, and despite being considerably safer than it was billions of years ago in the early days of the Earth’s life, it still has the capacity to be volatile and terrifying.

Some of the most devastating tsunamis in recent history have taken place in the last 60 years, with catastrophic loss of life and billions needed in humanitarian aid and reconstruction. The effects will last a lifetime for many areas as they try to recover and rebuild.

It is impossible to forget the Tohoku earthquake and subsequent tsunami in 2011. The consequences were absolutely devastating.

Striking Japan on the 11th of March, the earthquake reached an eye-watering magnitude of 9.0 and generated a 33-foot-high wall of water that travelled as far as 6 miles inland. Some reports even record waves as high as 133 feet, with a 97-foot wave smashing into the city of Ofunato.

Around 25,000 people were killed or reported missing, and 125,000 buildings were damaged or destroyed. More worryingly, the Fukushima I Nuclear Power Plant was also struck, causing a nuclear meltdown. The disaster is rated at the highest level of the International Nuclear Event Scale. The impact of this event is still being fully understood; radiation from the plant has been detected as much as 200 miles away, and many areas remain uninhabitable and will be for many years to come.

The loss of human life from a tsunami that strikes with no warning can be staggering. Take, for example, the Boxing Day tsunami of 2004 in the Indian Ocean. An unbelievable death toll of 230,000 was recorded across 14 countries, including Indonesia, Sri Lanka, India and Thailand. The undersea earthquake was recorded at magnitude 9.3, generating waves up to 93 feet high. Some waves hit land within 15 minutes; others took as much as 7 hours.

Even those with time to evacuate were hard hit, mostly because of the complete lack of a tsunami warning system, which meant very densely populated coastal areas were taken by surprise.

Early warnings save lives

By comparison, although damage to buildings and general destruction was widespread, the 2009 Samoa earthquake and tsunami saw a considerably lower death toll.

With a magnitude 8.1 earthquake and waves reaching 45 feet high that travelled up to a mile inland, 189 deaths were recorded. The loss of life would have been far higher had it not been for the Pacific Tsunami Warning Center, which gave people time to evacuate and reach higher ground.

There are several ways in which a tsunami can be detected, from direct recognition of the signs (an earthquake is quite hard to miss) to technological warnings from tsunami detection and forecasting systems. These are based on a combination of data collected by environmental sensors and tsunami modelling driven by that data.

For example, monitoring seismic activity and the magnitude of an earthquake can give an excellent early warning of tsunami potential. However, it cannot be taken in isolation. For larger earthquakes it is easy to underestimate the size of the quake, and therefore to miscalculate the tsunami potential.
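
As a purely illustrative sketch of that first seismic screen, the rule below flags large, shallow, undersea quakes for further checking against sea-level data. The magnitude and depth thresholds are hypothetical and far simpler than the criteria real warning centres apply.

    # Illustrative first-pass screen only: the magnitude and depth thresholds are
    # hypothetical, not the criteria an operational warning centre would use.
    def tsunami_watch_needed(magnitude, depth_km, undersea):
        """Flag large, shallow, undersea quakes for a closer look at sea-level data."""
        return undersea and magnitude >= 7.0 and depth_km <= 70.0

    print(tsunami_watch_needed(9.0, 29.0, True))    # True: investigate further
    print(tsunami_watch_needed(6.2, 300.0, True))   # False: deep and relatively small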

Rapid sea level monitoring will give the best warning

When managing the data collected, those carrying out the analysis face a hard decision: declare a tsunami imminent and issue the warning to the public so that emergency plans can be activated, at the risk of a costly unnecessary evacuation; or hold back, and risk far worse if the wave is real.

They also need to be able to indicate clearly from the modelling how large the waves will be and when they will strike. Importantly, they need to know when the danger will be over so that people can return safely to the evacuated areas.

The issue is that tsunami detection and forecasting require near-real-time observations from both coastal sea level instruments and open-ocean sensors. Fundamental gaps in coverage still exist, especially in open water. This puts at risk both the ability to give warning and the ability to learn more about the behaviour of tsunamis after the fact, which would further refine the accuracy of the modelling in future. More coverage is needed, and the durability of the equipment is a key factor.

New technology paramount for the detection of tsunamis

The installation of new tsunami buoys is without doubt the next step in addressing the coverage issue, and these buoys need to be smart, with a built-in tsunami early detection and warning system. Each buoy needs to be able to detect an event and send that information to be analysed centrally.

Pressure sensors deployed at water depths of up to 7,000 metres can detect height variations at the water surface, and to resist the effects of the harsh elements and environments they must be of the highest quality. It is now possible to obtain floats manufactured with a closed-cell polyethylene foam sheet that prevents water absorption.

In terms of positioning and communication, everything can be managed through GPS, with redundancy for communications provided via satellite, a reaction time of less than one minute, and power from a dual solar power system. Buoys this durable provide much greater confidence that there will be no failure of service in remote locations.

They are able to transmit a message compatible with the NOAA Tsunami Warning System and to monitor changes in the sea level column to within 1 mm. This kind of monitoring will be paramount in buying enough time for evacuation and preventing the loss of life seen previously.
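
For illustration only, and assuming readings arrive as a regular stream of sea-level values, the sketch below shows the basic idea behind such event detection: compare each new reading with a rolling baseline and flag deviations beyond a threshold. Real systems, such as NOAA's DART buoys, compare readings against tide predictions and use more sophisticated tests; the 3 cm trigger level and 240-sample window here are hypothetical.

    # Minimal sketch of the idea, not NOAA's algorithm: compare each new sea-level
    # reading against a rolling baseline and flag deviations beyond a threshold.
    from collections import deque

    THRESHOLD_M = 0.03       # illustrative trigger level (a 3 cm anomaly)
    BASELINE_SAMPLES = 240   # recent readings used to approximate the tidal baseline

    class SeaLevelMonitor:
        """Flags readings that depart sharply from a rolling sea-level baseline."""

        def __init__(self):
            self.history = deque(maxlen=BASELINE_SAMPLES)

        def update(self, level_m):
            """Add a reading; return True if it looks like a tsunami-scale anomaly."""
            anomaly = False
            if len(self.history) == BASELINE_SAMPLES:
                baseline = sum(self.history) / BASELINE_SAMPLES
                anomaly = abs(level_m - baseline) > THRESHOLD_M
            self.history.append(level_m)
            return anomaly   # True -> switch the buoy into event-reporting mode

    monitor = SeaLevelMonitor()
    # Feed in readings (in metres) as they arrive: alarm = monitor.update(reading)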
