

FROM THE DEPTHS OF THE CRYSTAL BALL:

THE RISKS OF TOMORROW REVEALED TODAY
Dr Lynn T Drennan, Glasgow Caledonian University

Abstract
There is no doubt that our ability to manage future risks is improved by learning from events that have occurred in the past. However, nothing stays the same forever and no two incidents will ever repeat themselves in exactly the same way. What are the trends now that might give us cause for concern in the future? Are we ignoring or underestimating current threats that will pose serious health or environmental dangers in years to come? Perhaps we need to take our cues not only from the research scientists and engineers but also from writers and film-makers. Will the ‘day after tomorrow’ bring a melting of the polar ice caps and a frozen world? Will nanotechnology lead to the development of robots that can think for themselves? And does size matter? As we seek to create the tallest buildings and the largest ships and planes, what additional risks do these structures present to the people who occupy them? If we are to manage the risks of tomorrow, we need to understand both the past and the present, but most of all we need to use our imaginations and, perhaps, a crystal ball that will help us look into the future and prepare for the surprises that lie ahead.

Introduction

Our ability to manage future risks is undoubtedly improved if we can learn lessons from events that have happened in the past, and use that learning wisely. We know that no two incidents will ever repeat themselves in exactly the same way, but there are often common features relating to a failure – whether that be a failure of design, manufacture or usage, or the impact of the forces of nature – that are relevant to other organizations and societies. By examining these, we can learn valuable lessons that may help prevent similar incidents in the future or, at least, enable us to respond more effectively when an adverse event does occur.

One way in which many societies examine major failures is through a public inquiry. In the UK, we have seen this approach used following air and rail crashes (Lockerbie; Clapham; Ladbroke Grove), oil rig explosions (Piper Alpha), failures in the medical system (Bristol hospital heart babies; mass murderer GP, Dr Harold Shipman), football ground disasters (Bradford; Hillsborough) and the sinking of passenger ships (Herald of Free Enterprise; Marchioness). The same is true in many other countries, including the United States, where the 9/11 Commission recently published its report into the terrorist attacks on the US in 2001.
The key questions that are asked, following these major incidents, are ‘why did it happen?’ and ‘how can it be prevented in future?’ In this respect, we largely rely on being able to look to the past as a means of predicting the future. This can be helpful when we are looking at a situation, system or product that is gradually undergoing a process of change, but it is of no help whatsoever when faced with something that is completely new and outwith human experience.
The problem is that our thinking can often be hampered by our knowledge and experience of past events. Hindsight is an exact science. If only we had known, understood or acted differently in the past, perhaps we could have prevented a major failure from occurring. But hindsight is of no help when trying to predict, and manage, the technology of the future and the risks it may bring.

Future technology: science fiction or science fact?

If we try to look forward 5, 10, 50 or more years into the future, what kind of a world do we see? Will new technology have made our lives better? Will we have found a cure for all our major diseases? Will we be traveling freely in space? Replicating food and transporting our bodies from one place to another, at the touch of a button? That was the vision of Gene Roddenberry, who wrote the scripts for ‘Star Trek’, the television series that first appeared in 1966, three years before man had even landed on the moon. Roddenberry imagined a future in which Earth was a war-free and happy place, from which men and women conducted great adventures in space, and the only risks came from aggressive alien species or strange gravitational pulls.

How much of Roddenberry’s vision was science fiction and how much science fact? David A. Batchelor, a NASA scientist who wrote “The Science of Star Trek” in 1993, attempted to separate the real (or potential) science from the science fiction writer’s fantasies. In it, he observed the following:


  • Ship’s Computer: most of the things it does are plausible. It extends the autopilot and navigational systems that already exist, and responds to voice commands, which is also feasible.

  • Matter-Antimatter Power Generation: this was based on real physics and would be a logical fuel for such craft.

  • Impulse Engines: within the bounds of real, possible future engineering.

  • Androids: the president of the American Association for Artificial Intelligence, when asked what the ultimate goal in his field would be, answered ‘Lieutenant Commander Data’ (the android from Star Trek: The Next Generation) – creating such an android would be a historic feat of cybernetics.

  • Alien Beings: scientists accept that life probably does exist in other solar systems, but what shape it might take is debatable. It is good to know that those of us who are worried about being abducted from Earth by aliens can still obtain coverage via the UFO Abduction Insurance Policy offered by the Saint Lawrence Agency of Altamonte Springs, Florida.

  • Sensors and Tricorders: a whole variety of sensing and recording equipment exists and some can create 3-dimensional imaging.

  • Warp Drive, Holodecks, Replicators and Transporters: unfortunately, in Batchelor’s view, these exist only in the writers’ imaginations. They are science fiction ‘magic’ – there to assist the storyline, but unlikely ever to come true.

Roddenberry himself said “the funny thing is that everything is science fiction at one time or another”. Our grandparents could scarcely have imagined the world of mobile phones, digital photography and notebook computers with which we are all familiar today. What writers such as Roddenberry, Arthur C. Clarke, Isaac Asimov and many others have done is to give us a vision of what might be possible in the future.

The film “I, Robot”, released this year and starring Will Smith as a ‘robophobic’ cop who does not like or trust robots, was based on a series of short stories written by Asimov in the 1940s. In the film, it appears that a robot has overcome one of the fundamental laws of robotics – that a robot cannot harm a human or allow harm to come to a human – and has killed its inventor. Smith’s character is sent to investigate, and discovers a sinister plot intended to result in robots taking over the world, and ruling humans.
The idea that man might invent a computer or robot that could eventually think and act for itself is not new. In Arthur C. Clarke’s “2001: A Space Odyssey”, the on-board computer HAL kills some of the astronauts in order to protect the mission. In “I, Robot”, the writers explore our fears that robotics and nanotechnology, which will be discussed more fully later, might develop independently of human intervention, creating a situation over which we will have no control.

For other ideas of future risk, we could cast our minds back to disaster movies of the 70s, 80s and 90s, such as “Towering Inferno”, “Earthquake”, “Armageddon”, “Deep Impact”, “Twister” and “Dante’s Peak”. In “Towering Inferno” we watched the predicament of a group of people trapped in a high-rise, burning building, searching desperately for a means of escape. The parallels with the World Trade Center attack are sadly apparent. Fire chiefs have admitted, following 9/11, that it is almost impossible to rescue people trapped above the level of a fire in a high-rise building. In “Earthquake”, “Twister” and “Dante’s Peak”, we are at the mercy of the natural environment, while “Armageddon” and “Deep Impact” heightened fears of a massive asteroid or comet striking Earth and wiping out large tracts of the developed world.

More recently, we have been presented with a vision of “The Day After Tomorrow”, in which man’s (and in particular the United States Government’s) reluctance to control carbon emissions leads to a melting of the polar ice caps, massive weather changes, and a fast-freeze of half the planet, resulting in the deaths of millions of people. Science fiction or science fact?

Current and future risks: the jury is still out
Steven Fink, author of “Crisis Management: Planning for the Inevitable”, argues that we are constantly surrounded by warning signs – or ‘prodromes’, as he calls them – that failures and crises are likely to occur. Unfortunately, we are prone to either ignoring or undervaluing the information that is presented to us. Often, we hear only what we want to hear.
‘Big Tobacco’ argued that smoking was not harmful, long after medical evidence proved conclusively that it was. Even today, the tobacco lobby still argues against any risk associated with passive smoking. With the benefit of hindsight, we now better understand the risks associated with products such as cigarettes, asbestos, silicone breast implants, thalidomide and HRT. So too do the people whose lives were ruined, or cut short, as a result of being exposed to these products. However, the list of issues where experts currently disagree over the levels of risk is almost endless. Examples include:


  • Mobile / cell phones

  • Mobile phone masts

  • Electricity sub-stations and pylons

  • Climate change

  • MMR (measles, mumps and rubella) vaccination

  • Ritalin

  • GM foods

  • New variant CJD, and

  • Nanotechnology

When experts disagree, it becomes even more difficult for the public to make up its mind as to what is, or is not, acceptable risk. An understanding of how the public perceive individual risk issues, and the factors that influence such perception, is an essential component of risk management. For the most part, threats that may impact on the well-being of children are most vigorously opposed. Thus the alleged risks arising from the combined MMR vaccination, proximity to mobile phone masts, and/or the consumption of genetically-modified foodstuffs, have gained a lot of publicity in local communities and generated action by well-organised environmental groups. This opposition has been highly vocal, and effective, despite a lack of evidence that any real risk exists.

On the other hand, parents seem happy to purchase mobile phones for their relatively young children, arguing that it keeps them safer, as they can get in touch with their children more easily and can be contacted quickly in an emergency. Yet, the widespread use of mobile phones among young people flies in the face of research which indicates that physical damage may be done to our brains, and in particular to the still-developing brains of children. A recent study suggested that there was a 30% increase in brain cancer in regular mobile phone users, and that damage to brain cells could also lead to early Alzheimer’s disease. Yet, for the moment, the public seem to perceive the benefits of mobile phones as outweighing any possible risks associated with their use. These views may, however, change as more information reaches the public domain.

In looking at future risk, therefore, it may be wise to start with an examination of the potential risks that arise from our lifestyles, preferences and practices today.



Here’s looking at you, kid
A quick look at the television schedules, both in the UK and the US, suggests that we are obsessed with taking extraordinary measures to attain personal improvement. Dramas like “Nip/Tuck”, reality shows such as “Extreme Makeover” and “10 Years Younger”, and documentary series with titles such as “Cosmetic Surgery Live” demonstrate the popularity and prevalence of cosmetic ‘enhancement’ today. According to the ABC website, “Extreme Makeover” gives men and women:

‘a truly Cinderella-like experience. A real-life fairy tale in which their wishes come true, not just to change their looks, but their lives and destinies. This magic is conjured through the skills of an “Extreme Team” including the nation’s top plastic surgeons, eye surgeons and cosmetic dentists, along with a talented team of hair and makeup artists, stylists and personal trainers.’

Liposuction, nose jobs, tummy tucks, breast implants, dermal fillers, injections of Botox… These are serious procedures, not without their risks, yet in 2002, in the United States alone, 6.9 million cosmetic procedures were carried out – more than three times the number in 1997. Business is also booming in Asia and Latin America, and it is not just women who are going under the knife; men are increasingly demanding treatment, including implants to enhance various parts of the male anatomy.
If we know about the risks posed by silicone breast implants – once considered ‘safe’ until they were shown to leak into the body – should we not be concerned about the effects of other substances that are being used for cosmetic enhancement today? For example, the most popular treatment in the US in 2002 was the injection of Botox® to reduce wrinkles. Botox® is a potent neurotoxin. Essentially it paralyses the muscles used to create a frown, thereby making the brow seem smoother. Treatment is often started in the late 20s and 30s, as women seek prevention rather than a cure for lines and wrinkles. How can we know what the effect might be of prolonged use of Botox® over twenty, thirty or forty years?

Other popular treatments, such as dermal fillers to puff up lips and fill in facial lines, are routinely injected in an effort to restore a youthful look. Originally, bovine collagen was used, but some patients were allergic and others feared catching ‘mad cow disease’ in the wake of the BSE scare. New products use hyaluronic acid, which is expected to last for up to six months following treatment. Unfortunately, for some women, the use of dermal fillers in the lip area has left them with what is referred to, rather unkindly, as a ‘trout pout’ – after the fish of the same name – meaning that their lips are left unnaturally over-sized, and not at all what they expected or wanted. Some doctors have expressed concern about the use of materials that stay in the body and fail to age with nature; however, others seem happy to employ ‘permanent’ fillers, with attractive-sounding names such as ‘Radiance’, which critics say risk creating unsightly lumps and bumps that may require surgery later.

Lasers first came into use in the 1980s to ‘resurface’ the skin, burning off the top layers to reveal fresh skin underneath but necessitating several weeks of recovery; these have now been replaced by newer technologies. Radio frequencies are being used to heat up the skin’s collagen, offering the prospect of a ‘scalpel-free facelift’, and one manufacturer of a light-emitting diode device (similar to a TV remote control), which exposes the skin to specific pulses of light intended to stimulate cells to produce new collagen, believes that such devices could be sold for home use. Home use or home abuse? If such products were freely available for use in the home, the potential for misuse and over-use would multiply.
Yet, as these less invasive technologies have developed and become easier to deliver, the range of people willing to carry out the procedures has increased. Family doctors, beauty therapists and hairdressers now offer these ‘added value’ services to their patients or clients. For those of us concerned with the management of risk, it is not only the technology with which we must be concerned, but also the skills of the operator. Currently, there are few regulatory restrictions on individuals who wish to offer non-invasive cosmetic enhancement treatments. This may need to change.

Despite our obvious concern with appearance, obesity has become a major problem in the United States and in other developed countries. In 2002, the World Health Organisation ranked obesity 10th on its list of avoidable causes of death, with the related issues of high blood pressure and high cholesterol taking 3rd and 7th positions respectively. Our increasing dependence on eating out – particularly on ‘fast food’ – traveling everywhere by car, and taking minimal exercise have all been blamed as contributory factors.
Morgan Spurlock, in his film “Super Size Me”, conducted an experiment that required him to eat and drink only from McDonald’s for one whole month. The effect on his body and his health was dramatic and, from his doctors’ viewpoint, extremely worrying. McDonald’s has contributed its side of the story on a website, www.supersizeme-thedebate.co.uk, and no-one would disagree that eating such an unbalanced diet would be bad for anyone. However, it was the addictive nature of the products, with their high fat and sugar contents, and the specific marketing aimed at very young children, with which Spurlock was most concerned. McDonald’s has, coincidentally, stopped offering to ‘super-size’ its products and has introduced healthier options in its restaurants. Although the corporation last year successfully defended a lawsuit brought by two overweight teenagers who claimed that eating McDonald’s food had caused their obesity and ill-health, the likelihood is that McDonald’s, and other major food chains, may well find themselves the target of further lawsuits in the future. The argument that individuals should exercise personal responsibility falls short, as it has done in the case of tobacco manufacturers, if it can be successfully argued that the product in question has known addictive elements.

All this ill-health and surgical self-improvement leads inevitably to a stay in hospital. Once considered a relatively safe and clean environment, hospitals are now known to make you sick. The number of people infected with the superbug MRSA has increased sharply in recent years. Bacteria have become resistant to antibiotics and are therefore more difficult to treat effectively. As is often the case, the elderly and the very young are most at risk; however, MRSA has also caused death and serious injury to relatively young and otherwise healthy individuals who contracted the infection while in hospital. Worryingly, some experts believe that it will take a breakthrough akin to the discovery of penicillin before humans can regain the upper hand over these bugs.

As for the risks of surgery itself, we can now add to our list of potential threats that of contracting new variant CJD from surgical instruments. Scientists now believe that traditional sterilizing methods do not kill the prions that transmit vCJD and that if instruments have been used on a patient who has been infected with vCJD, whether that fact is known or not, then there is the possibility of transmitting infection to another patient. These risks can be overcome if disposable instruments are used, but this is not always practical. Failing that, improved methods of sterilization are called for.

Does size matter? From the very, very small to the very, very big
If it is true that we often fear what we cannot see, hear or touch – for example nuclear radiation, or a virulent disease such as SARS – then when it comes to considering nanotechnology perhaps we should be afraid… very afraid. Nanotechnology is a generic term for a large number of applications and products that contain tiny particles, measured in nanometers, each one thousand-millionth of a metre. Put another way, 1 millimetre equals 1 million nanometers. Although man has constantly sought to reduce the size of products such as computers, phones, televisions and music systems, nanotechnology takes this into a whole new sphere.
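
Spelled out as a worked conversion (a minimal restatement of the figures just quoted, nothing more):

$$1\ \text{nm} = 10^{-9}\ \text{m}, \qquad \text{so} \qquad 1\ \text{mm} = 10^{-3}\ \text{m} = 10^{6}\ \text{nm}.$$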

At the nano scale, the old laws no longer apply. Electrically insulating substances become conductive; insoluble substances become soluble; nanoparticles can travel almost unhindered throughout the human body. Some have likened the development of nanotechnology to a new industrial revolution. Whether this is true or not, there can be no doubt that nanotechnology is big business. The US Government has committed $3.7bn to the budget of the National Nanotechnology Initiative for the period 2005-2008, while the EU committed $1.2bn for nanotechnology research in 2003-04.

The term ‘nanotechnology’ was popularised by Dr Eric Drexler in his book “Engines of Creation”, published in 1986. Dr Drexler also famously coined the phrase “grey goo” when he mooted the idea that self-replicating assemblers might, if not carefully designed, consume everything around them as they turned it into more assemblers, creating a grey goo that could engulf the planet. Needless to say, this idea was later taken up by one of the world’s most famous fiction writers, Michael Crichton, in his novel “Prey”, which was published in 2003. The storyline is described as follows:

In the Nevada desert, an experiment has gone horribly wrong. A cloud of nanoparticles – micro-robots – has escaped from the laboratory. This cloud is self-sustaining and self-reproducing. It is intelligent and learns from experience. For all practical purposes, it is alive. It has been programmed as a predator. It is evolving swiftly, becoming more deadly with each passing hour. Every attempt to destroy it has failed. And we are the prey.
Are we in the realms of science fiction again? Or are there some real facts here that we should be concerned about, now, before it is too late? The Royal Society and the Royal Academy of Engineering in the UK certainly believe that there is an immediate need for further research into certain aspects of nanotechnology. In their joint report “Nanoscience and nanotechnologies: opportunities and uncertainties”, published in July 2004, they state that while such technologies offer many benefits now and in the future, more public debate is needed about their development. Specifically, they saw an immediate need for research to address uncertainties about the effects of nanoparticles on health and the environment.

Nanoparticles themselves are not new: particles of nano size already exist, for example salt in the ocean air or carbon from diesel exhaust. Although naturally occurring nanoparticles can be harmful, their effect tends to be short-lived. Artificially and commercially manufactured nanoparticles behave differently. They can enter the bloodstream through inhalation or via the digestive tract, and may be absorbed through the skin. Whether introduced involuntarily from the environment, or deliberately as part of a medical procedure, nanoparticles can penetrate all the body’s organs, even the brain, and can certainly cross the placenta to a growing foetus.

All of this should give us cause for concern as to current and future health risks associated with such technology, yet products – some of which come into close contact with the body – are now on the market. These include:


  • Suntan lotion that does not leave a white film on the skin

  • More absorbent babies’ nappies

  • More effective vitamin pills

  • Clothing that repels sweat and stains

  • More effective air-tight cling film

  • Harder-wearing tyres

  • Better car paint

  • Bouncier tennis balls

Those involved in the manufacture of products containing nanoparticles may be most at risk, as existing personal protective equipment (PPE) is not designed to protect against such tiny particles. Yet, in contrast to debates on the safety of nuclear power or genetic modification, for example, the public does not yet seem concerned with the threat of nanotechnology. Nonetheless, the media is taking more interest in the debate and even our own Prince Charles has contributed to the discussion. In an article published in the “Independent on Sunday” on 11th July 2004, Prince Charles refers to the Royal Society study and writes:



I am particularly struck by the evidence provided by a recently-retired Professor of Engineering at Cambridge University, Professor John Carroll…Referring to the thalidomide disaster, he says it ‘would be surprising if nanotechnology did not offer similar upsets unless appropriate care and humility is observed.’ He ends by pointing out that ‘it may not be easy to steer between a Luddite reaction and a capitulation to the brave new technological world, especially when money, jobs and business are at risk.’

Those are my sentiments too…

It has been estimated that only 29% of the UK population are aware of nanotechnology. Currently it is not a major public issue. The question is whether government and business should be adopting the ‘precautionary principle’ – taking the necessary measures to protect people and the environment at an early stage, even if the scientific uncertainties regarding the risks have not yet been fully demonstrated. If Michael Crichton’s book is made into a film and reaches a wide audience, that may be a further catalyst for increasing public debate and concerns over this issue. Will the perceived benefits outweigh the perceived concerns? Will nanotechnology generate the same level of fear and mistrust that has accompanied bio-engineering and result in widespread opposition and pressure for legislation?

The final report of the Royal Society/Royal Academy of Engineering suggests that a precautionary approach should be taken. The report states that:
There is virtually no information available about the effect of nanoparticles on species other than humans or about how they behave in the air, water or soil, or about their ability to accumulate in food chains. Until more is known about their environmental impact we are keen that the release of nanoparticles and nanotubes to the environment is avoided as far as possible.
Specifically, we recommend as a precautionary measure that factories and research laboratories treat manufactured nanoparticles and nanotubes as if they were hazardous and reduce them from waste streams and that the use of free nanoparticles in environmental applications such as remediation of groundwater be prohibited.

The report goes on to recommend that all relevant regulatory bodies should consider whether existing regulations are appropriate to protect humans and the environment, not only with regard to current applications of nanotechnology, but also for those that might occur in the future, and ensure that any regulatory ‘gaps’ are plugged. While not pursuing the prospect that we may be destroyed by grey goo, or over-run by intelligent nano-robots, it does seem that we are faced with the rapid development of a technology that we do not yet fully understand. Without wishing to halt the advance of scientific development, there do seem to be strong arguments, in this case, to err on the side of caution and ensure that in grasping the opportunities that nanotechnology presents, we are also able to manage the risks.

From the very, very small, let’s now turn our attention to the very, very big. The designers of aircraft, ships and buildings seem to compete with one another to create the world’s largest / tallest / biggest structures to contain and transport humanity. The Sears Tower in Chicago, constructed in 1974, was listed as the world’s tallest building until overtaken in 1998 by the Petronas Towers 1 and 2 in Kuala Lumpur, Malaysia. In turn, these were overtaken by Taipei 101, completed this year in Taiwan and designed to withstand the typhoons and earthquakes prevalent in that region.

The twin towers of the World Trade Center in New York City were ranked 5th and 6th until their destruction on September 11th 2001. The proposed Freedom Tower, designed by Daniel Libeskind and set to be built on the site of the former WTC, is expected to rise to 1,776 ft and rank as the world’s (next) tallest building, until that too is surpassed – as undoubtedly it will.
But it is not only buildings for which size apparently matters. In January 2004, the world’s longest, tallest and widest passenger ship – the Queen Mary 2 – set sail from Southampton, England for Fort Lauderdale in Florida. Costing an estimated $800m, Cunard’s QM2 is billed as the grandest ship ever built, with 1,253 crew looking after 2,620 passengers – a ratio of approximately one crew member to every two passengers.
Not to be outdone, the airline industry has weighed in – literally – with the largest passenger aircraft so far, the Airbus A380. This double-deck plane has 555 seats and is expected to be 15% to 20% cheaper to operate than the Boeing 747, carry 35% more passengers and have a 10% greater range, making it capable of non-stop journeys of up to 15,000 km. Rolls-Royce and GE/Pratt & Whitney have both been working on a new breed of engine capable of lifting this giant into the air, while airports around the world are considering what alterations they need to make to runways and docking stations (not to mention the logistics of dealing with 555 passengers and all their luggage) to accommodate aircraft of this size. On September 10th, Airbus announced that it had 139 firm orders and commitments for the A380, which it expects to go into service in 2006.

Tallest, longest, heaviest, widest… none of these factors, in themselves, pose additional risk. A larger ship is not inherently less safe than a smaller one. What does have to be considered, however, is the greater number of people that will be affected should something go wrong. If an Airbus A380 were to crash, the loss of life would be considerably more than from a Boeing 747. Fire, or a terrorist alert, affecting a mega-building would involve the evacuation or rescue of thousands of people. An outbreak of food poisoning or legionella on a vessel the size of the QM2 could have a devastating impact in a short space of time. Our expectations for health, safety and environmental protection need therefore to expand to accommodate the enhanced risk that these giant structures present.


And now, the weather report
Recent weeks have seen some of the most devastating hurricanes in living memory batter the Caribbean islands and the coast of Florida. Bangladesh has experienced its worst floods in fifty years, resulting in the deaths of 600 people. In August, the small Cornish village of Boscastle was nearly washed away when 185 mm of rain fell in just five hours, creating flash floods that caught everyone off-guard. What everyone wants to know is whether such events are part of global climate change.
According to the UK’s Environment Agency, the answer is ‘yes’. 2002 was the second warmest year on the planet since records began 150 years ago; 1998 was the warmest, with 1997 and 2001 close behind. All the evidence suggests that the 1990s were the warmest decade in the warmest century of the last millennium. The Environment Agency supports this statement with evidence of disappearing glaciers, worsening Mediterranean droughts, unprecedented storms in Latin America and Asia and, as the ice caps melt, rising sea levels.
Although there are some natural explanations for climate change, such as solar and volcanic activity, most scientists agree that the gases emitted by burning coal and other fossil fuels contribute to the ‘greenhouse effect’. This argument has persuaded most of the world’s governments to sign up to the Kyoto Protocol and to commit to a reduction in greenhouse gas emissions. However, the United States and Australia have yet to ratify the Protocol, and no formula has yet been found to include developing nations in the framework.

Are we doing too little, too late? Just how far-fetched is the scenario painted in the film “The Day After Tomorrow”, where climate change is no longer slow and incremental but accelerates at an unexpected and unprecedented speed, creating a situation in which governments are no longer in control?

There is no doubt that the time to act is now. Timely action can avert disaster. [Otherwise changes will happen] within the lifetime of my children certainly, and possibly within my own…

I mean a challenge so far-reaching in its impact and irreversible in its destructive power that it alters radically human existence.
These are not the words of a Greenpeace or Friends of the Earth activist, but of Prime Minister Tony Blair, in a speech he gave in London on 15 September 2004.
From a risk perspective, in the short term, we need to consider the location, construction and protection of buildings and other structures. Governments, in turn, will need to address the increased likelihood that they will be called on to provide emergency assistance when weather-related disasters strike. In the short to medium term, more drastic measures – including legislation – may be necessary to force both business and domestic consumers to change their energy-inefficient ways. In the words of Tony Blair, if left unchecked, climate change will “result in catastrophic consequences for our world”.

The crystal ball
Most of the threats that have been discussed in this paper are predictions based on what we currently know, or think we know. The challenge for us all, in risk management, is to go beyond ‘planning for the inevitable’ and to ‘think the unthinkable’. Reporting on the al-Qaeda terrorist attacks, the 9/11 Commission found that:
Across the government, there were failures of imagination, policy, capabilities and management.
The most important failure was one of imagination.

Imagination is therefore the most valuable skill we can bring to the process of identifying and controlling future risks. When you look into the future, what do you see? Will the developed world be choked by its own fumes, obese and dependent on medical procedures for its survival? Will the destruction of our environment lead to ever increasing levels of skin cancer and asthma, and bring about catastrophic climate change? Will nanotechnology prove to be a blessing or a curse? Finally, will our desire to seek ‘blame’ and obtain compensation continue to spiral out of control or is there a glimmer of hope that governments will begin to set sensible limits on compensation?

I’m glad that I don’t have a crystal ball. Hindsight may be a wonderful thing, but looking into the future seems to present us with one doomsday scenario after another. However, history shows us that man is nothing if not resilient and innovative. If we can use our imagination to create great opportunities, then we can also use it in the search for creative solutions to the risks that new technologies present. A crystal ball is, as Scott Adams, author of “The Dilbert Future”, suggests, just one method that we can employ:
There are many methods of predicting the future. For example you can read horoscopes, tea leaves, tarot cards, or crystal balls. Collectively, these methods are known as ‘nutty methods’.
Or you can put well-researched facts into sophisticated computer models, more commonly referred to as ‘a complete waste of time’.
Hopefully, you have not found this paper a complete waste of time. So now I must leave the question with you, as risk engineers and risk managers: when you look into the depths of the crystal ball, what future risks do you see, and how best can we manage them?
References

Batchelor, D.A., (1993) The Science of Star Trek, http://ssdoo.gsfc.nasa.gov/education/just_for_fun/startrek.html

Fink, S., (2002) Crisis Management: Planning for the Inevitable

The Environment Agency, http://www.environment-agency.gov.uk/

The Royal Society and The Royal Academy of Engineering, (2004), Nanoscience and nanotechnologies: opportunities and uncertainties, http://www.nanotec.org.uk/finalReport.htm

Other web resources

AssTech, http://www.asstech.com/en/newsletter/index.html

ABC Extreme Makeover, http://abc.go.com/primetime/extrememakeover/

BBC News, www.bbc.co.uk

I, Robot, www.irobot.com

Star Trek, www.startrek.com

Super Size Me, http://www.supersizeme.com/

Swiss Re, Emerging Risks – a challenge for liability underwriters, http://www.swissre.com/INTERNET/pwswpspr.nsf/fmBookMarkFrameSet?ReadForm&BM=../vwAllbyIDKeyLu/ESTR-5LDGZB?OpenDocument

The Day After Tomorrow, www.thedayaftertomorrow.com



© Dr Lynn Drennan ‘From the depths of the crystal ball: the risks of tomorrow revealed today’

Zurich Risk Engineering Global Workshop 2004





