Verification: How much is enough?
|Jun 10 2011|
|VERTIC Blog >> Arms Control and Disarmament|
Andreas Persbo, Wilton Park, West Sussex
Good morning everyone and welcome to the third session. It is, of course, always a pleasure to be back at Wilton Park. And I am very pleased to see so many friends and colleagues here. I'm also very pleased to have two former VERTIC directors, Trevor Findlay and Angela Woodward, around the table. And I hear that Patricia Lewis may join us over Skype from California later this evening.
This conference has got off to a good start. However, taking the speaker's chair was not something that I was prepared for. I was hoping that if I paid Mark enough money, he would allow me to participate and enjoy the discussion without having to open my mouth. Sadly, though, I didn't have enough money to satisfy him. I even tried to buy him drinks yesterday. It did not work.
Jokes aside, I am filling in for my good friend and colleague Ole Reistad of the Institute for Energy Technology. He informed me a few days ago that he could not attend for personal reasons. It is a shame, as I think he would have enjoyed the conference and made a great contribution.
This session asks a deceptively simple, and very difficult, question. Namely, how much verification is enough?
From a general perspective, it cannot be answered. But if we start to examine the question more closely, we might get close to an answer. How much verification is enough? Enough for what? Enough for whom? Yesterday, we heard several speakers examine the importance of multilateralism, and of multilateral verification in particular, within the context of their specific regimes. Tibor Toth, for instance, made a strong case for multilateral verification.
My impression from his presentation is that the verification regime - as far as testing is concerned - works. And, beyond doubt, the CTBT verification regime is an important experiment. When it was set up, no one quite knew what the end result would look like. We still don't know what the end result will look like. It's still under development. But the future looks bright.
What about the environment?
Yesterday, we heard Vitaly talk about the development of the climate change treaty's monitoring and verification system. This is clearly a young regime, and it may benefit from input from those in the arms control and disarmament field. He stressed the importance of a strong enforcement process. And verification and enforcement go hand in hand. Over the years to come, I hope to get more involved in environmental issues.
We are leaving the Holocene epoch - the stable era spanning the last 10,000 years, which has witnessed the rise of our species to become the masters of the planet. Like it or not, we are entering the Anthropocene. The Age of Man. Our actions have already had an unimaginable impact on our common habitat. So great, indeed, that some researchers are claiming that we've entered into a slow-moving mass extinction event. Climate change, in my mind, is the greatest threat to humankind. And verification questions, as we heard yesterday, are at the heart of the debate.
I took away a simple observation from yesterday’s sessions. On the surface, the task of verifying an arms control agreement may seem fundamentally different from that of verifying an environmental agreement. Look deeper, however, and commonalities start to emerge. Both systems are about the production of accurate and verifiable data. Both systems are put in place to judge governments’ adherence to treaty objectives. And both systems struggle with the limits of knowledge, and the constraints of politics.
This is, perhaps, most pertinent in the environmental field. Here, the entire carbon market rests on data that can be monitored, verified and reported. Uncertainty is a persistent problem. Undeniably, errors are prevalent in the data. A recent study by the Carnegie Institution for Science and the USDA Forest Service, for instance, used LiDAR to measure above-ground carbon density for Hawaii. It found that the actual concentration is about 55 per cent lower than the value estimated by IPCC Tier 1 methods, the most basic of the estimation standards.
A 55 per cent error bar. While this may sound dramatic, it is how you deal with this uncertainty that matters. True, good decisions can only be made on the basis of good data. Let's take an example from economics. You might be concerned if I wanted to buy a one-pound item from you and presented a banknote worth somewhere between 55 pence and 1 pound 80 pence. There is a significant transaction risk. If you and I like gambling, we might make the deal anyway, and the market works. But in most cases, we won't.
High uncertainty may be acceptable when we're starting a human endeavour, and while we're collecting data. But perhaps not for a compliance determination. It is therefore important that verification systems always strive to reduce the uncertainty of their data. Naturally, I am not ignorant of the point that if the aim is to control nuclear weaponry, for instance, this level of initial uncertainty may be unacceptable.
Sometimes, states, or markets, insist on almost impossible standards of verification before they take the first step. No deviation, no error, is accepted. This is far too common. We see it in the arguments over the verifiability of the Comprehensive Nuclear Test Ban Treaty. The system is exceptionally sensitive - the on-site component in particular uses state-of-the-art technology with very high detection rates - yet its opponents claim that the treaty is unverifiable. Opponents raise all kinds of evasion scenarios. Some are believable: testing in a salt mine, for instance, could significantly muffle the seismic signal. Some defy reason, such as nuclear testing on the far side of the moon.
The strength of the sceptics' arguments varies, but they are often amusing. My favourite is US Senator Helms' intervention in the 1979 SALT II hearings, addressing the use of the term 'adequate verification' by Secretary of Defense Harold Brown: '… the repeated use of the qualification "adequately" bothers me. And I guess Mrs. Brown would be a little suspicious of you if you were to come home tonight and tell her that you were adequately faithful to her, wouldn't she?'
Mr Brown responded, ‘… in that case as in this, I suppose it would depend on the alternative offered’.
The alternative offered. How much uncertainty are you willing to accept?
How much verification is enough? It depends on the case at hand.
But clearly, the best is often the enemy of the good. Many of you in this room have direct, first-hand experience of this. Relatively small errors in data collection can be blown out of proportion. Just consider the relatively recent row over Climategate and its consequences. The credibility of the IPCC was needlessly called into question.
I think we all need to accept that some will not be convinced by the evidence, no matter how strong it is. Some believe that the Earth was created some ten thousand years ago. Some believe that global warming is not, in fact, happening, despite overwhelming, and deeply troubling, data to the contrary.
Requiring a system to detect marginal patterns of violations - or marginal changes in emissions - is very problematic. It places a very high burden on verification efforts and raises the verification standard to almost unachievable heights. It forces the system to monitor activities very closely, and no matter how sensitive your monitoring equipment is, you will have errors. This leads to doubts and false alarms. And that, I would argue, leads to bad decisions. Sometimes even to war.
How much verification is enough? At the end of the day, this depends on what you want to verify. Or what you want to monitor. In some cases, high levels of verifiability, and correspondingly low levels of uncertainty, will be required. In others, uncertainty can be tolerated.
For many years, I worked together with the UK Atomic Weapons Establishment to develop a regime for the verified dismantlement of nuclear warheads. Quite early on, we realised that the level of confidence required is very high. This is natural. After all, consider the national security implications of one state having a hidden nuclear arsenal. And this is perhaps not the biggest problem. Consider the impact of one state believing or suspecting that another possesses a nuclear arsenal.
In the environmental field, it would seem that the tolerance for uncertainty is different; that it is somewhat higher.
At the end of the day, perhaps, what matters is the whos and the whats. What is the significance of a marginal violation, and does it matter who is under suspicion? My impression, watching verification systems at work, is that it does matter. Whether or not that is a good thing is very much open to debate.
Thank you for your attention.
Last changed: Jun 10 2011 at 11:56 AM