The use of analytical tools generally involves many uncertainties. These can be technical uncertainties regarding data, methodological assumptions, value choices or even paradigmatic differences. There are a number of options for dealing with such uncertainties. Below we briefly discuss the most important ones.
New Measurements
The most straightforward answer to uncertainties consists of new measurements. These can pertain to new dose-response experiments in the laboratory, to the validation of extrapolations from laboratory to field, or to the validation of field models like the multimedia dispersion models. This is the high road of uncertainty abatement. But it is costly in both time and money, and will often be no option for a given practical case study.
The Choice of Robust Indicators
A next possibility is to choose indicators which are rather robust. However, the choice of more robust (more certain) indicators will come at the cost of accuracy. For example, the impacts of chlorine policies may be assessed in terms of impact category indicators, like those used in LCA. These have a rather high resolving power, but many of them are quite uncertain. In contrast, these policies can be assessed in terms of total kilograms of chlorine emitted, as is generally done in SFA studies. Such a metric is very robust and may therefore arouse significantly less resistance in a policy debate (Tukker 1998). But very important differences between the emitted substances will then be obscured. Going even one step further, one may leave quantification altogether and choose qualitative indicators like ‘made from recycled material’ or ‘biodegradable’. This may further reduce public resistance to the results, but will again be less informative.
Uncertainty Analysis, Sensitivity Analysis and Scenario Analysis
Given a set of indicators, their uncertainty can be assessed in terms of standard errors. These errors will depend on many links in the chain of processes underlying the indicator at hand. Furthermore, the errors will pertain to uncertainty in data, to methodological assumptions or to value choices regarding these different links. Consequently, the results of uncertainty analyses will soon become very complex and may well pile up uncertainty upon uncertainty. A more sophisticated approach is Monte Carlo simulation. For every uncertain element underlying an indicator, the probability of its different possible values is assessed. Then a series of computation runs is made, in which the different uncertain elements are fixed independently, each according to its own probability distribution. The final result will show a more realistic range of outcomes, avoiding an artificial accumulation of uncertainties.
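The Monte Carlo procedure can be sketched in a few lines. Everything in the sketch below — the parameter names, the distributions and the indicator formula — is a hypothetical illustration, not real life cycle inventory data.

```python
import random

# Hypothetical uncertain inputs to an indicator, each characterized by
# its own probability distribution (here: normal, given as mean and
# standard deviation). Names and values are illustrative only.
PARAMETERS = {
    "emission_factor":    (2.0, 0.3),
    "energy_use":         (5.0, 0.8),
    "transport_distance": (150.0, 25.0),
}

def indicator(sample):
    """A simple illustrative indicator combining the three inputs."""
    return (sample["emission_factor"] * sample["energy_use"]
            + 0.01 * sample["transport_distance"])

def monte_carlo(runs=10_000, seed=1):
    rng = random.Random(seed)
    results = []
    for _ in range(runs):
        # Fix every uncertain element independently, each drawn from
        # its own distribution, then recompute the indicator.
        sample = {name: rng.gauss(mu, sigma)
                  for name, (mu, sigma) in PARAMETERS.items()}
        results.append(indicator(sample))
    results.sort()
    # Report a realistic 95% range of outcomes rather than naively
    # stacking all worst-case errors on top of one another.
    return results[int(0.025 * runs)], results[int(0.975 * runs)]

low, high = monte_carlo()
```

Because the uncertain elements are sampled independently, the resulting range is narrower and more realistic than the one obtained by combining all worst-case values at once.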
If no uncertainty values can be given, a sensitivity analysis can be performed, starting from deliberate changes in the modeling conditions. Changes which are deemed reasonable can be made in the input data, in the methodological assumptions or in the value choices underlying the different steps in the methodology. The consequences of such changes for the final result can then be calculated. This procedure is used quite often, as it places rather low demands on study resources and still provides important insights into the robustness of the final results.
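A minimal one-at-a-time sensitivity analysis might look as follows. The baseline data, the indicator formula and the 20 per cent change are all illustrative assumptions, not values from a real study.

```python
# One-at-a-time sensitivity analysis: apply a deliberate, deemed-reasonable
# change (+20%) to each input in turn and record the relative effect on
# the final result. All names and values are hypothetical.
BASELINE = {"emission_factor": 2.0, "energy_use": 5.0, "transport_distance": 150.0}

def indicator(data):
    # Illustrative indicator formula.
    return data["emission_factor"] * data["energy_use"] + 0.01 * data["transport_distance"]

def sensitivity(change=0.20):
    base = indicator(BASELINE)
    effects = {}
    for name in BASELINE:
        varied = dict(BASELINE)
        varied[name] *= 1 + change              # vary one input, hold the rest
        effects[name] = (indicator(varied) - base) / base
    return effects

effects = sensitivity()
```

The output ranks the inputs by influence: inputs whose variation barely moves the result can safely be left uncertain, while dominant inputs deserve further attention.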
Sensitivity analysis is generally performed for separate parameters, regarding data, methods or value choices. In scenario analysis, sets of choices are combined into consistent packages. Thus we can calculate a worst case, a most likely or a best case scenario.
Scenario analyses thus help to structure the results of sensitivity analyses in order to make them more comprehensible for decision making purposes.
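Such scenario packages can be sketched as below; the scenario values and the indicator formula are purely illustrative assumptions.

```python
# Scenario analysis: individual data and modeling choices are bundled
# into consistent packages rather than varied one at a time. All values
# are hypothetical illustrations.
SCENARIOS = {
    "worst case":  {"emission_factor": 2.6, "energy_use": 6.5, "transport_distance": 200.0},
    "most likely": {"emission_factor": 2.0, "energy_use": 5.0, "transport_distance": 150.0},
    "best case":   {"emission_factor": 1.5, "energy_use": 4.0, "transport_distance": 100.0},
}

def indicator(data):
    # Illustrative indicator formula.
    return data["emission_factor"] * data["energy_use"] + 0.01 * data["transport_distance"]

# One comprehensible number per scenario for the decision maker.
results = {name: indicator(data) for name, data in SCENARIOS.items()}
```

Instead of a long list of separate parameter sensitivities, the decision maker receives three comprehensible outcomes bracketing the plausible range.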
International Harmonization
International standardization in the field of analytical tools predominantly focuses on terminology, on technical frameworks and on procedural requirements. But it may also go one step further, in harmonizing the use of best available data or methods. Thus the Intergovernmental Panel on Climate Change (IPCC), established under the auspices of WMO and UNEP, among other things establishes the best available knowledge about climate change due to different greenhouse gases in terms of the well-known global warming potentials (GWPs).
Likewise, the World Meteorological Organization (WMO) establishes best values for the stratospheric ozone depletion potential (ODP) of different substances. Recently, a combined research program has been defined by SETAC and UNEP to identify best available practice for other impact categories as well. Although considerable uncertainties may be involved, such harmonization guides practical application and helps to avoid arbitrariness in selecting best data or models.
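Applying harmonized characterization factors is a simple weighted sum. In the sketch below, the 100-year GWP values are approximate illustrations of the kind published by the IPCC (authoritative figures are revised between assessment reports), and the emission inventory is hypothetical.

```python
# Aggregating greenhouse gas emissions into a single climate indicator
# using global warming potentials (GWPs). The 100-year GWP values are
# approximate illustrations, not authoritative IPCC figures.
GWP_100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

# A hypothetical emission inventory, in kilograms per functional unit.
emissions_kg = {"CO2": 1000.0, "CH4": 5.0, "N2O": 0.5}

# Weighted sum: each emission times its characterization factor.
co2_eq = sum(GWP_100[gas] * kg for gas, kg in emissions_kg.items())
# 1000*1 + 5*28 + 0.5*265 = 1272.5 kg CO2-equivalents
```

Because every practitioner uses the same published factors, two studies of the same inventory yield the same CO2-equivalent score, which is precisely the arbitrariness-avoiding effect of harmonization described above.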
Procedural Checks
The above options for dealing with uncertainty all concern technical characteristics. Quite another approach starts from the other side, that is, from the decision procedure in which the results of the analytical tools are to be used. For instance, the results can be reviewed by an independent panel of experts, or even by a panel of stakeholders. If the results pass such a review procedure, this may well contribute more to the credibility of the results than any of the above technical procedures. For this reason, much attention is currently paid to the possibilities of incorporating analytical tools like LCA in explicit decision procedures in which both independent experts and the relevant stakeholders have a clearly defined input. Examples include a European directive which gives guidance on the acceptability of the type of packaging to be used (when a company is allowed to use non-reusable materials), and a directive which guides the choice between waste management options.
Paradigmatic Differences
The most fundamental problem can be that analytical tools involve paradigmatic assumptions which are not shared by the different stakeholders in a decision process. Thus there is a major gap between a risk approach, as used in tools like LCA and ERA, focusing on emissions which actually take place, and a precautionary approach, focusing on the inherent risks of a process. Such a gap cannot be bridged by improving the models or data used, or by better public participation in the decision process. Such differences can lead to grave frustrations regarding the application of quantitative analytical tools like LCA or ERA.
Examples are the historic public debate on the acceptability of nuclear power installations, the debate on the environmental risks of the chlorine industry and materials like PVC (Tukker 1998), and, more recently, the debate on the use of genetically modified organisms (GMOs). Generally one will have to go back to the precise questions being asked and to the way risks are approached. The use of quantitative analytical tools presupposes agreement on these points.
CONCLUSIONS
Life cycle assessment (LCA) is one of the major approaches in the field of industrial ecology. It involves a cradle-to-grave analysis of product systems, that is, of the total of processes involved in the provision of a certain function. It is complementary to other tools, such as environmental risk assessment, focusing on the environmental impacts of single activities or single substances, and substance flow analysis, focusing on the metabolism of substances in the economy as well as in the environment. LCA is a formal, quantitative tool. The main contributing organizations are SETAC, responsible for its scientific development, ISO, responsible for its international standardization, and UNEP, taking a leading position in the enhancement of its global use. LCA appears to be increasingly used by industry, from operational decisions, like the purchasing of materials, up to strategic decisions. Like other formal analytical tools, LCA has a number of clear limitations. Some of these can be tackled by technical measures, some by procedural measures. But some limitations concern paradigmatic differences regarding the way one wants to cope with risks. Decision procedures involving stakeholders with a risk approach versus stakeholders with a precautionary approach cannot easily be supported by LCA or other formal and quantitative environmental assessment tools.