A joke about the morbid state of New Delhi is circulating among Delhiites (people from Delhi): while citizens’ lives were disrupted last November by ‘note-bandi’ (the ban on currency notes), this November presents an even tougher test with ‘saans-bandi’, a ban on breathing. The receding autumn, or advent of winter, was once a beloved season for many in the city, who welcomed the change with a complete revamp of their wardrobes in colourful woollens. It is now characterized by bleak skies, an air of gloom and a little bit of grey in everything you see outside your house.
For the past three days, I have been acutely aware of the air I am breathing, have felt unproductive and apprehensive in spells for no good reason, and have felt the need to confine myself to my house for as long as possible. These are some of the less apparent effects of the thick blanket of smog that has engulfed the national capital region. While people donning different types of masks, on the roads and on Snapchat, serve as a constant visual reminder of how the city is choking, a flurry of articles and news updates has taken over my feed. One was a horrifying viral video of vehicles ramming into each other in poor visibility on the Noida-Agra Expressway as people scrambled to get themselves and their children out of the way; other articles argued that breathing in Delhi for a day is currently the equivalent of smoking twenty cigarettes.
A sudden state of emergency
Less than two days ago, when the air quality in Delhi visibly worsened, Chief Minister Arvind Kejriwal likened the city to a ‘gas chamber’. The PM10 and PM2.5 levels in different parts of the capital have rocketed above what is considered safe, and SAFAR (System of Air Quality and Weather Forecasting And Research) has declared the air quality ‘severe’ for at least the next three days, after which it may drop to a still unsafe ‘very poor’ level. In some parts of the city, monitors detected an AQI (air quality index) of 999, the highest possible reading, which suggests the true level may be even higher. Visibility during the early hours has also dropped sharply. Among the causes of the pollution, slow winds at this time of year have been identified as the prime contributor, along with stubble burning by farmers in the neighbouring states of Punjab and Haryana. Combined with dust particles in the air, emissions from the vehicles that plague the region’s roads throughout the day, and those from factories and construction activities, these factors make up a recipe for uninhabitable conditions.
Making amends: A scramble for order
The Indian Medical Association on 7th November declared Delhi to be in a state of public health emergency, urging the Delhi government and other bodies to take adequate steps to minimize the risk to citizens, especially young children and the elderly, who are most likely to suffer from the effects of pollution. As the situation worsened, the government ordered all schools in the capital to remain shut till Sunday and rolled out plans to implement the odd-even scheme for vehicles from next week. Parking fees throughout the city have been increased fourfold and metro fares substantially reduced for the time being to promote the use of public transport. The National Green Tribunal (NGT) has also banned all construction and industrial activities till November 14 in a bid to give the citizens of Delhi a breath of better-quality air. Mr. Kejriwal has approached his counterparts in Punjab and Haryana over the issue of stubble burning by farmers, but it remains to be seen how the move plays out in the coming days.
As the government battles the situation, the public is taking whatever protective measures it can. A growing number of doctors and specialists have advised people to skip morning walks and outdoor activities so as not to inhale excessive quantities of toxic pollutants. Some doctors have even advised their patients to leave the city for the time being if possible. Sales of air purifiers for homes and masks for travelling outside have surged; nearly everyone has become an expert on filters, and ‘N95’ and ‘N99’ have become trending words everywhere from pharmacies to WhatsApp conversations.
A year ago, while New Delhi wrestled with much the same conditions, UNICEF called on the rest of the world to treat the situation as a wake-up call. “It is a wake up call that very clearly tells us: unless decisive actions are taken to reduce air pollution, the events we are witnessing in Delhi over the past week are likely to be increasingly common”, it said in a statement. Even if we are doing better than last year, it is still not enough, and less than a minute in the open is all it takes to be convinced of that. As the world battles the effects of climate change, India’s bid for a major global footprint in the coming decades is bound to take a serious hit if so many of its cities, and especially its capital, continue to be unlivable for a chunk of time at the end of every year.
Nuclear Power and Other Power Sources: How Do They Stack Up?
Almost everyone dreads the idea of nuclear war because of the abject devastation it would inflict on planet Earth. Yet few connect the dots between nuclear weapons and nuclear power — the same energy that makes atomic bombs and nuclear missiles so threatening is also harnessed to power electrical grids and other infrastructure. When properly contained, nuclear power is the cleanest and most abundant energy source available. With all the concern over climate change and environmental degradation, this raises a huge question: why is the United States of America not generating more — much more — nuclear energy?
Capital Investment vs. Production Costs
From one angle, a larger nuclear energy capacity is a no-brainer. Generating electricity from nuclear sources is cheaper than using coal, gas or petroleum, i.e. fossil fuels. On average, in 2011 dollars, electricity cost 21.56, 3.23 and 4.51 cents per kilowatt-hour from petroleum, coal and natural gas, respectively. Nuclear power came in at 2.10 cents per kilowatt-hour, according to data received by the Federal Energy Regulatory Commission (FERC). Yet these ongoing production costs alone fail to tell the full story.
To raise the power-generating capacity of nuclear sources, additional plants are necessary. Some argue that the savings in electricity production mean nuclear utilities pay for themselves. What, though, are they paying for…and how long until the payoff? Engineering and constructing a nuclear power plant is very expensive. In fact, 74 percent of the cost of nuclear-sourced electricity lies in the capital costs of creating the physical facility and technology. Some estimates range from six billion to nine billion dollars. Others estimate over $5,300 per kW before a plant begins paying for itself…in 20 to 30 years. These figures make the prospect cost-prohibitive to many decision makers in government and business.
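A quick back-of-the-envelope calculation shows how the 20-to-30-year payoff figure can be reached from the numbers above. This sketch compares nuclear against natural gas, the cheapest fossil rival cited; the 90 percent capacity factor is an assumption of mine (typical for nuclear plants, but not stated in the article):

```python
# Rough payback estimate for nuclear capital costs, using the figures cited above.
# ASSUMPTION (not from the article): plants run at a ~90% capacity factor.

capital_cost_per_kw = 5300.0   # up-front cost in dollars per kW of capacity
nuclear_cost_kwh = 0.0210      # $/kWh ongoing production cost (FERC, 2011 dollars)
gas_cost_kwh = 0.0451          # $/kWh for natural gas, the cheapest fossil alternative

capacity_factor = 0.90
kwh_per_kw_year = 24 * 365 * capacity_factor   # ~7,884 kWh generated per kW each year

# Each kWh generated by nuclear instead of gas saves the difference in fuel cost.
savings_per_kw_year = (gas_cost_kwh - nuclear_cost_kwh) * kwh_per_kw_year

payback_years = capital_cost_per_kw / savings_per_kw_year
print(f"Approximate payback: {payback_years:.0f} years")
```

Under these assumptions the payback lands at roughly 28 years, squarely inside the 20-to-30-year window the estimates describe.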
Plentiful Energy at Low Costs without Nuclear Power
Were we still living through the oil shocks and embargoes of the 1970s, the urgency around nuclear power in the US would be much higher. An abundance of discoveries and advancing technology have made fossil fuels available at modest prices. Coal and petroleum prices are each low compared to their peaks. With the advent of hydraulic fracturing, or “fracking,” natural gas is ever more accessible and affordable. Though people may worry about the environmental effects of burning these substances, they are likely to keep using them to maintain a decent household cash flow.
Lack of Knowledge
The absence of urgency mentioned above relates to a third factor in why Americans are not expanding their nuclear production capacity: generations have come of age that are not well informed about the potential and reality of nuclear power. A dangerous accident at Pennsylvania’s Three Mile Island facility in the 1970s scared public officials and policy makers into backing off a pro-nuclear agenda. The improvements embodied in today’s safety protocols have been ineffective at rebooting a national conversation. Granted, the United States operates 97 nuclear reactors, more than any other country. Yet only four more are under design or construction, compared to 20 for China.
Furthermore, France relies on nuclear for three-quarters of its electricity; several eastern European nations, half; South Korea, in excess of 30 percent; while the U.S. can claim around 20 percent. Clearly, public knowledge of how clean and abundant atomic energy is remains meager; past accidents, including the Fukushima Daiichi and Chernobyl meltdowns of recent decades, were, by contrast, reported widely by media outlets.
Advocates of nuclear power have work to do to bring Americans on board. Otherwise, dirtier, cheaper sources will continue to reign.
Francisco Reynés: “We have to consider gas as the energy source with the most potential in the future”
Francisco Reynés, executive chairman of Naturgy (formerly Gas Natural Fenosa), spoke about the role of gas in the world as the energy source with the greatest potential for the future at the 6th IEF-IGU Ministerial Gas Forum, held in Barcelona, Spain.
Francisco Reynés explained that the world “needs to talk about the different uses of natural gas and the gas technologies and innovations towards a sustainable energy future. We have to address the role of gas in the world as a future energy source, not only as a transition source of energy”.
“The uses of gas are, as we all know, well beyond those of power generation. Gas provides sources for non-energy uses, such as petrochemicals or fertilizers, which have no clear substitute”, he added.
On this point, Francisco Reynés explained that “all of this will benefit and service the economic growth and development of the countries and economies around the globe. It is, indeed, a joint effort which we must all face with the utmost priority and the maximum care”.
Reynés also insisted on cooperation between governments, producers and even consumers to strengthen the security of gas supply on international markets. “The challenge for the future is how energy systems will evolve to meet greenhouse gas emission goals, and more stringent fuel quality standards while at the same time they respond to growing demand for affordable access to reliable energy services”, he concluded.
The 6th IEF-IGU Ministerial Gas Forum aims to sharpen a collective focus on energy policies, market trends, and technology options that enable the gas industry to deliver inclusive growth and successful transformations for a secure, inclusive and sustainable energy future. Energy and climate policies, gas technologies and innovations as well as market fundamentals are ever more co-dependent but also vary across geographies.
You can’t fight nature, but you can be ready for whatever she throws at you
The human race has grown used to being in control of its surroundings, yet we will never be able to truly prevent some of the most devastating catastrophes our planet can throw our way. Still, we strive to protect everything we have built and worked hard for, and technology is helping us do that on a day-to-day basis.
Tsunamis are a reality and we need to be prepared for them
Despite all our technological advances, we have not yet found a way to avert the most fatal natural disasters. The fact remains that our planet is far larger than we can possibly control, and despite being considerably safer than in the Earth’s early days billions of years ago, it still has the capacity to be volatile and terrifying.
Some of the most devastating tsunamis in recent history have taken place in the last 60 years, with catastrophic loss of life and billions needed in humanitarian aid and reconstruction. The effects will last a lifetime for many areas as they try to recover and rebuild.
It is impossible to forget the Tohoku earthquake and subsequent tsunami in 2011. The consequences were absolutely devastating.
Striking Japan on 11th March, the earthquake reached an eye-watering magnitude of 9.0 and generated a 33-foot wall of water travelling as far as 6 miles inland. Some reports even record waves as high as 133 feet, with a 97-foot wave smashing into the city of Ofunato.
Around 25,000 people were killed or reported missing, and 125,000 buildings were damaged or destroyed. More worryingly still, the Fukushima I Nuclear Power Plant was also struck, causing a nuclear meltdown. The disaster is rated at the highest level of the International Nuclear Event Scale. Its impact is still being fully understood; radiation from the plant has been detected as much as 200 miles away, and many areas remain uninhabitable and will be for many years to come.
The loss of human life can be staggering when a tsunami hits with no warning. Take, for example, the Boxing Day Tsunami of 2004 in the Indian Ocean. An unbelievable death toll of 230,000 was recorded across 14 countries, including Indonesia, Sri Lanka, India and Thailand. The undersea earthquake was recorded at magnitude 9.3, generating waves up to 93 feet high. Some waves hit land within 15 minutes; others took as long as 7 hours.
Even those with time to evacuate were hard hit, mostly because the complete lack of a tsunami warning system meant very densely populated coastal areas were taken by surprise.
Early warnings save lives
By comparison, although damage to buildings and general destruction was widespread, the 2009 Samoa earthquake and tsunami saw a considerably lower death toll.
With a magnitude 8.1 earthquake and waves reaching 45 feet high that travelled up to a mile inland, 189 casualties were recorded. The loss of life would have been far higher were it not for the Pacific Tsunami Warning Centre, which gave people time to evacuate and reach higher ground.
There are several ways in which a tsunami can be detected, from the recognition of natural signs (an earthquake can be quite hard to miss) to technological warnings from tsunami detection and forecasting systems. The latter are based on a combination of data collected by environmental sensors and the use of that data for tsunami modelling.
For example, monitoring seismic activity and the magnitude of an earthquake can give an excellent warning of tsunami potential. However, it cannot be taken in isolation: for larger earthquakes it is easy to underestimate the size of the quake, and therefore to miscalculate the tsunami potential.
Rapid sea level monitoring will give the best warning
When managing the data collected, those carrying out the analysis face a hard decision: declare a tsunami imminent and issue a public warning so that emergency plans can be activated, at the risk of a costly, unnecessary evacuation if none arrives, or hold back and risk leaving people with no time to escape.
They also need the modelling to indicate clearly how large the waves will be and when they will strike. Importantly, they need to know when the danger will have passed so that people can return safely to the evacuated areas.
The issue is that tsunami detection and forecasting require near-real-time observations from both coastal sea level instruments and open-ocean sensors. Fundamental gaps in coverage still exist, especially in open water. This jeopardizes both the ability to give warning and the ability to learn more about the behaviour of tsunamis after the fact, which would further refine the accuracy of the modelling in the future. More coverage is needed, and the durability of the equipment is a key factor.
New technology paramount for the detection of tsunamis
The installation of new tsunami buoys is without doubt the next step in addressing the coverage issue, and these buoys need to be smart, with a built-in tsunami early detection and warning system able to detect an event and send that information to be centrally analysed.
Pressure sensors deployed at water depths of up to 7,000 metres can detect height variations at the water surface, and to resist the harsh elements and environments they must be of the highest quality. It is now possible to obtain floats manufactured from closed-cell polyethylene foam sheet, which prevents water absorption.
In terms of positioning and communication, everything can be managed through GPS, with redundant communications via satellite, a reaction time of less than one minute and power from a double solar power system. Buoys this durable give much better confidence that there will be no failure of service in remote locations.
They can transmit a message compatible with the NOAA Tsunami Warning System and monitor changes in the sea level column to within 1 mm. This kind of monitoring will be paramount for buying enough time for evacuation and preventing the loss of life seen previously.
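The physics behind this is worth spelling out: a bottom-mounted pressure sensor infers sea-surface height from the hydrostatic relation, where a change in water-column height of dh produces a pressure change of rho × g × dh. The sketch below uses standard physical constants (not figures from the article) to show why resolving 1 mm of sea level at 7,000 m depth is such a demanding measurement:

```python
# Hydrostatic relationship used by bottom-pressure tsunami sensors:
# a sea-surface height change dh produces a pressure change dP = rho * g * dh.
# Constants are standard physical values, not taken from the article.

RHO_SEAWATER = 1025.0   # kg/m^3, typical seawater density
G = 9.81                # m/s^2, gravitational acceleration

def height_change_from_pressure(delta_p_pascal: float) -> float:
    """Convert a bottom-pressure change (Pa) into a sea-surface height change (m)."""
    return delta_p_pascal / (RHO_SEAWATER * G)

# A 1 mm rise in sea level changes the bottom pressure by only ~10 Pa...
one_mm_signal = RHO_SEAWATER * G * 0.001

# ...superimposed on an ambient pressure of roughly 70 MPa at 7,000 m depth.
ambient_pressure = RHO_SEAWATER * G * 7000
```

In other words, the sensor must resolve a signal about seven orders of magnitude smaller than the steady pressure it sits under, which is why sensor quality and durability dominate the design.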