Technology

Stuxnet: a New Era in Global Security

Alexandra Goman

Stuxnet was malware that attacked an Iranian nuclear facility (along with a couple of other industrial sites around the world). It was discovered in 2010, although it had been operating for quite a while before that. What makes it remarkable is that it crossed the line between the cyber and physical domains, showing that it was possible to use code to damage critical infrastructure. Before Stuxnet, the debate in national and global security about whether critical infrastructure could be targeted and damaged through information systems had been purely theoretical. After Stuxnet, it was evident that cyberspace could be exploited to launch cyberattacks that cause physical damage. So what actually happened?

On June 17, 2010, Sergey Ulasen of a small security company in Belarus received a technical-support request from a customer in Iran. The customer reported arbitrary BSODs (the stop error displayed after a system crash) and computer reboots. After careful examination and a routine check for system malfunction, it was discovered that a malware infection was probably involved (The Man Who Found Stuxnet – Sergey Ulasen in the Spotlight). Stealthy and carrying a strange payload, the malware was later named Stuxnet, after a file name found in its code. The computer worm infected at least 14 industrial sites in Iran, along with the uranium-enrichment plant in Natanz.

It carried genuine digital certificates (which guarantee that a file can be trusted) from recognized companies, and it was well developed and direct. The malware was able to determine whether a system was the target it was looking for; if it was not, Stuxnet did nothing and moved on to the next system. This “fingerprinting of the control systems” proved that it was not just an average malicious program but a targeted piece of malware that was meant to destroy.

Although Stuxnet relied on a person to install it (via a USB flash drive), the worm then spread on its own between computers running the Windows operating system. It infected other machines over the local network, regardless of whether they were connected to the Internet. It could also infect other USB flash drives and jump to further computers through them. Moreover, it proliferated very quickly.

Once the worm infects a system, it waits, checking whether the necessary parameters are met. As soon as they are, it activates a sequence that causes the industrial process to destroy itself. Symantec, a company that provides cybersecurity software and services, conducted a thorough analysis of Stuxnet and found that Iran, Indonesia and India were the most affected countries in the early days of the infection. The nuclear facility at Natanz was among the hardest hit.

The principle is that the malware identifies a target, records data from the industrial process, and uses that data to establish what normal operations look like. It then replays the pre-recorded data on the personnel's computers, so that operators believe the centrifuges are running normally when in fact they are not. In the end, it erases itself from the system so that it cannot be traced or found.
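The record-and-replay behaviour described above can be illustrated with a toy sketch. Everything here is hypothetical — the fingerprint values, the function names, the sensor interface are all invented; the real Stuxnet targeted specific Siemens PLC configurations with far more sophistication. The sketch only shows the pattern: stay dormant unless the fingerprint matches, record a baseline, then sabotage while replaying the baseline to operators.

```python
# Toy illustration of the "fingerprint, record, replay" pattern.
# All names and values are invented; this is NOT Stuxnet's actual logic.

TARGET_FINGERPRINT = {"plc_model": "S7-315", "frequency_converters": 33}

def matches_target(system_profile):
    # Act only if the infected host controls the exact hardware sought.
    return all(system_profile.get(k) == v for k, v in TARGET_FINGERPRINT.items())

def run(system_profile, read_sensor, write_display, sabotage):
    if not matches_target(system_profile):
        return  # Not the target: stay dormant and do nothing.
    # 1. Record what normal operation looks like.
    baseline = [read_sensor() for _ in range(10)]
    # 2. Sabotage while replaying recorded "normal" data to the operators.
    sabotage()
    for value in baseline:
        write_display(value)  # Operators see normal-looking, pre-recorded data.
```

The key design point is the selectivity: on any machine that does not match the fingerprint, the code takes no action at all, which is what kept Stuxnet hidden for so long.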

The International Atomic Energy Agency inspected the Natanz facility and confirmed (International Atomic Energy Agency, 2010) that the centrifuges were malfunctioning and producing less than 20% of the expected amount of enriched uranium. At the time, however, the reason was unknown. The most detailed damage assessment came later from the Institute for Science and International Security in Washington, which claimed that Stuxnet destroyed 984 centrifuges. Iran, however, has never confirmed such a number, and the IAEA has not given precise information on the damage.

Stuxnet crossed the line between code that merely infects software or digital programs and code that damages physical equipment. This brought about a technological revolution. Before, viruses were used by cyber pranksters and petty vandals to crash the computers of innocent victims; state-to-state attacks and cyberwar were not seriously discussed, as they seemed to belong to science-fiction scenarios. Stuxnet changed this perception and opened a new era in global security.

A former chief of industrial control systems cyber security research said that Stuxnet was “the first view of something … that doesn’t need outside guidance by a human – but can still take control of your infrastructure. This is the first direct example of weaponized software, highly customized and designed to find a particular target.” It is not hard to imagine similar malicious programs being developed in the future and used to achieve military and/or political goals.

Many believe that the cyberattacks on Iran's nuclear facility were meant to slow down the Iranian nuclear program. However, enrichment recovered within a year, and the program suffered no permanent damage. Some experts even say the attack had no effect on the nuclear program at all and that the whole situation around Stuxnet was over-hyped by the media. Others argue that the evidence on the malware is inconclusive and that Stuxnet may, in fact, have sped up the Iranian nuclear program. The media reaction may have been exaggerated because of the secrecy around cyber issues, but in the end Stuxnet made a good story.

As for the parties involved, the attack has not been tied to a specific name or country. Yet it is widely believed to have been launched by the U.S. and Israel. The sophistication of the program required a considerable amount of resources, including extensive financial support and skilled specialists, which is why many security companies and experts attribute the complex malware to one or more states. Among them is Kaspersky Lab, a multinational cybersecurity company, which says the attack was launched with a specific motivation in mind: the attackers wanted access to industrial control systems, which monitor and control a facility's infrastructure and processes. (Similar systems are used in power plants, communication systems, airports, and even military sites.) Moreover, such an attack required a significant amount of intelligence data, so Kaspersky Lab is convinced that it was likely supported by a nation state.

Although the identity of the attacker is still unknown, many experts in international politics believe the attack was clearly politically motivated and aimed to slow down the development of Iran's nuclear program. The United States and Israel both deny involvement in Stuxnet; however, some leaked information (WikiLeaks, a CBC interview with former CIA director Michael Hayden, etc.) suggests the claims might have some credibility. Regardless of the claims made, it is important to highlight that no country has officially declared that it launched an offensive cyberattack.

All in all, Stuxnet has revolutionized the way we look at malicious digital programs and boosted the debate about cyber tools used for political purposes. After all, we live in a highly digitalized world where we depend on technology, and the military is no exception. Digital technologies are being widely incorporated into military planning and operations. Modern nuclear and conventional weapons systems rely on information systems for launching, targeting, and command and control, including the technologies that govern safety and security. It is clear that future military conflicts will all include a digital aspect and cyber technologies. Stuxnet was just an early version of software that could potentially destroy an industrial site, specifically a nuclear facility. Had such malware fully achieved its goals, the consequences would have been disastrous and could have caused an international crisis.

After all, as experts once said, “Major concern is no longer weapons of mass destruction, but weapons of mass disruption” (Cetron and Davies, 2009).


Specialist in global security and nuclear disarmament. Excited about international relations, curious about cognitive, psycho- & neuro-linguistics. A complete traveller.



Nanofiber market growth expected to continue throughout 2019 and into 2020


The field is now seeing phenomenal growth and investment as newer, slicker and cheaper technologies, such as electrospinning, are allowing for more and more applications, particularly in the field of drug delivery. 

Nanofiber is not a new technology; in fact, microfibers have been in use, particularly in the textile industry, for many years. Even in the global filtration and separation technology market, current forecasts for the coming year predict growth of around 6% in demand, and that is before you factor in the explosion in alternative global drug-delivery methods due to the increase in chronic diseases, new pharmaceutical products and technological advances. Major manufacturers are exploring the production of nanomaterials by electrospinning as the next big step forward for their business.

What is electrospinning and how does it work? 

Put quite simply, electrospinning is a method by which nanomaterials are made. It is incredibly versatile: the range of raw materials that can be used is very wide, allowing for different properties in the finished product.

Starting with a polymer solution or melt, using materials such as collagen, cellulose, silk fibroin, keratin, gelatin or polysaccharides, chain entanglement takes place in the solution. An electrical force is then applied to pull threads of electrically charged material into a jet that can be whipped or spun into fibers as the solvents in the solution evaporate.

Finally, the dry fiber is formed into a membrane or material, depending on the intended use. The material will have some great functional properties, such as a large surface-area-to-volume ratio, high porosity and strength.

Nanomaterials are revolutionising the advancement of new materials, and for companies looking to be the leaders in new developments and pushing industry forward with new technologies this is an area that will help them stay at the top of their game.  

Why is it worth the research and development?

With virtually limitless applications, electrospinning can be used in any industry. Not just in the production of textiles, where breathable, lightweight or protective clothing might be required, but also in the creation of filtration systems, and in medicinal and pharmaceutical products. 

It even has use in the packaging of food and other consumables, and there is some research being put into the creation of food. There are already companies who have managed to scale their electrospinning processes. 

The versatility of the process and the potential for creating groundbreaking new products is only part of the story. Another reason this is a good direction for your research and development team is that it is relatively quick and easy to set up with the help of a good electrospinning equipment company. There is a range of machinery available, from small worktop ‘proof of concept’ electrospinning machines for small laboratories to large pre-production-scale machines. This means that start-up and installation costs are far lower than for many other production processes.

The user interface of this machinery has also advanced with the times, making it far simpler to operate and carry out the processes with only a passing knowledge of polymers and electrostatics. Training up the workforce takes no time at all.

The world is already seeing the benefits of this technology, particularly in the field of health and medicine. For example, wound patches and organ membranes are artificially made and used during surgical procedures; due to the molecular structure of the material, it can graft with living biological tissue. Pharmaceutical implants and patches for the slow release of medicine are another use. This is a field that will continue to grow as new discoveries are made.




9 disruptive technologies that will bloom before 2019 ends


Since the beginning of time, each new technological invention has meant a change of paradigm for the way people work. However, in recent years the frequency of changes has accelerated to such an extent that companies have to renew themselves and their daily procedures almost every season. Usually they are small changes or mere adaptations, but sometimes an innovation appears that makes the previous mechanisms obsolete. This is what is known as disruptive technology.

2019 is a disruptive year as far as technology is concerned: the trend of innovation continues at an accelerated pace, deepening the technological revolution. Innovative industries keep evolving, overcoming barriers once imaginable only in Isaac Asimov’s sci-fi novels or in TV series and films such as Black Mirror or Gattaca. Here are the technological trends making a disruptive change in the digital transformation.

1. 5G mobile networks

Some companies have started to launch pilot experiments with this technology. 5G prepares the ground for browsing at speeds of up to 10 gigabits per second from mobile devices.

2. Artificial intelligence (AI)

This will be the year of AI’s definitive take-off. It is now on political agendas: the European Commission has mandated that member states develop a strategy on the matter by the middle of the year.

3. Autonomous devices

Robots, drones and autonomous mobility systems are some of the innovations related to AI. They all aim to automate functions that were previously performed by people. This trend goes beyond mere automation through rigid programming models, as it explores AI to develop advanced behaviors that interact more naturally with the environment and users.

4. ‘Blockchain’

Finally, this technology is no longer associated only with the world of cryptocurrencies, and experts are starting to notice its likely application in other fields. In congresses such as the annual IoT World Congress by Digitalizing Industries (coming in October 2019), we will witness the actual implementation of many projects based on ‘blockchain’, which will try to solve the challenges the technology still faces in fields such as banking and insurance. It will also be a decisive year for the deployment of ‘decentralised organisations’ operating around smart contracts.

5. Advanced analytics

‘Big data’ is taking a step further with this trend, which combines the technology with artificial intelligence. Machine-learning techniques will transform the way data analysis is developed, shared and consumed. It is estimated that the capabilities of advanced analytics will soon be widely adopted not only to work with information, but also to embed it in the business applications of Human Resources, Finance, Sales, Marketing and Customer Service departments, in order to optimize decisions through deep analysis of data.

6. Digital twins

Digital twins are one of the disruptive technologies that will have the greatest impact on the simulation and analysis of industrial processes. A digital twin is a virtual representation of a real-world entity or system, capable of maximizing the benefits of companies’ digital transformation. Many companies and organizations are already implementing these representations and will develop them over time, improving their ability to collect and visualize the right data, apply improvements to it, and respond effectively to business objectives.
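As a rough, hypothetical sketch of the idea: at its simplest, a digital twin is a software object that mirrors telemetry reported by a physical asset, so the virtual copy can be queried and analysed without touching the real machine. All names below (`DigitalTwin`, `pump-17`, the telemetry fields) are invented for illustration.

```python
# Minimal sketch of a digital twin: a virtual object that mirrors the
# state reported by a physical asset. All names are hypothetical.

class DigitalTwin:
    def __init__(self, asset_id):
        self.asset_id = asset_id
        self.history = []  # every telemetry reading received so far

    def sync(self, reading):
        """Ingest one telemetry reading from the real-world asset."""
        self.history.append(reading)

    def average(self, metric):
        """Analyse the mirrored data instead of querying the machine."""
        values = [r[metric] for r in self.history if metric in r]
        return sum(values) / len(values)

# The twin accumulates readings and can answer questions offline.
twin = DigitalTwin("pump-17")
twin.sync({"temperature_c": 60.0})
twin.sync({"temperature_c": 64.0})
print(twin.average("temperature_c"))  # 62.0
```

Real implementations add much more (3-D models, physics simulation, live synchronization), but the core pattern is this mirrored state plus analysis on the copy.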

7. Enhanced Edge Computing

Edge computing is a trend mostly applied to the Internet of Things. It consists of placing intermediate processing points between connected objects, so that information is processed and other tasks are performed closer to where the user receives the content, reducing traffic and latency. In other words, processing stays near the endpoint rather than on a centralized cloud server. Rather than creating a new architecture, however, cloud and edge computing will evolve as complementary models, with cloud services managed as a centralized service that runs not only on centralized servers, but also on local distributed servers and on the edge devices themselves.
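The traffic-reduction idea can be sketched in a few lines: an edge device reduces many raw sensor samples to one compact summary and ships only the summary to the cloud. This is a hypothetical illustration; the function name and payload shape are invented.

```python
# Hypothetical edge-computing sketch: summarise raw samples locally and
# upload only the compact result, instead of every individual reading.

def summarise_at_edge(raw_readings):
    """Runs on the edge device: reduce many samples to one summary record."""
    return {
        "count": len(raw_readings),
        "min": min(raw_readings),
        "max": max(raw_readings),
        "mean": sum(raw_readings) / len(raw_readings),
    }

# The device uploads one small record instead of thousands of samples.
samples = [20.1, 20.3, 19.8, 20.0]
payload = summarise_at_edge(samples)
print(payload["count"], payload["min"], payload["max"])  # 4 19.8 20.3
```

The latency benefit follows the same logic: decisions that depend only on local data (an alarm threshold, say) can be taken on the device without a round trip to the cloud.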

8. Immersive experiences in intelligent spaces

Chatbots integrated into different conversation platforms and voice assistants are transforming the way people interact with the digital world, as are virtual reality (VR), augmented reality (AR) and mixed reality (MR). The combination of these technologies will profoundly change how we perceive our surroundings, creating intelligent spaces where more immersive, interactive and automated experiences can be delivered for a specific group of people or for specific scenarios in an industry.

9. Digital ethics and privacy

Digital ethics and privacy are issues of increasing interest to individuals, organizations and governments. It is no coincidence that people are increasingly concerned about how their personal information is being used by public- and private-sector entities, so in the coming months companies will need to address these concerns proactively in order to gain users’ trust.




You haven’t virtualized yet – why you should do so as soon as possible


Virtualization is not new; it has been around for some time and is one of the key ways a business can protect its IT infrastructure and reduce costs.

Opting for cloud VDI (virtual desktop infrastructure) is absolutely the way forward for businesses, but there could be many reasons why you haven’t been able to make the change yet.

Maybe you have not had a good enough network to support externally hosted desktops and applications, or you are a smaller business that is only just beginning to think about moving to a virtual enterprise structure. It could also be that you are suffering from the hangover of an older infrastructure, with your own onsite servers just coming to the end of their asset lifetime. Either way, your next move should be to look at virtualization, and here is why.

The savings can be substantial

Without a doubt, the biggest reason is the cost savings you will make. Any company or business needs to be fully aware of the bottom line, and while the project to virtualize will need a little investment, in the long term it will save your business a lot more.

For example, you will no longer need onsite servers. Hardware is expensive, and to keep up with technological change it needs to be replaced every few years. Servers also need to be upgraded, require server engineers to manage them, need a specialised location with adequate cooling, and use a lot of electricity. And this is before you even begin to think about the licences for the operating systems and applications.

Increased reliability and security

With security becoming ever more important, especially if you hold any personal data, you need to be sure that you have adequate security measures in place to protect your IT services. By virtualizing applications in a data centre via the cloud, you can make sure that those provisions meet exactly what you need.

You can also increase uptime and availability for your users through better mirroring and failover provisions. Data centres are geared towards maximum uptime, and even if something goes wrong with a server, users will likely never know, as the services move over to alternative servers. To create and host this type of infrastructure yourself would require a whole IT department!

Increased productivity for your workforce

By moving to desktop virtualization, your employees will be able to access their documents and applications from almost any device. From mobile devices, tablets and laptops, they will be able to do whatever they need, whenever and wherever they need it. For companies operating internationally or with a lot of travel involved, this is absolutely vital.

It can also set the scene for flexible working – already proven to make the workforce much more productive. It also means that, should a device break down, it is simple enough to switch to another.

Management of company devices is also a lot simpler, with setup and deployment happening remotely. All your installations, updates and patches, backups and virus scans can be controlled centrally. It also means much better management of software assets.

In addition your service provider should be able to provide a whole range of support for your IT teams, with access to many disciplines and expertise to keep you running at your maximum 24 hours a day if needed.

Desktop virtualization is definitely the way forward for any business. It makes end-user environments much more secure. Reliability and uptime are better, which also keeps end users happy and productive in their own work. No more lost working hours due to broken servers. Approached strategically, this can revolutionise your business and its operations well into the future.
