You can navigate to other sections of the report by following these links:

  1. What Overseer is
  2. The Overseer farm file
  3. What Overseer produces
  4. How the Overseer engine works
  5. Uncertainty in model results
  6. Overseer Glossary

Uncertainty in model results

The outputs of Overseer, as with those of every model, involve a level of uncertainty. Overseer was designed to use easily obtainable input data, which means the model relies on simplifications of complex processes. This introduces a level of uncertainty into the modelled estimates.

Obtaining actual measurements of leaching carries with it a level of uncertainty, and generating an accepted value for farm-scale nutrient loss is considered extremely challenging1.

How Overseer handles uncertainty

There are multiple sources of uncertainty in Overseer; they can broadly be grouped into measured and modelled uncertainty.

Measured data contains the spatial and temporal variability inherent in all biological systems, such as differences in soil moisture measured across a paddock. Any model input that comes from a measured source carries this uncertainty with it.

Modelling uncertainty includes all the variability associated with data entry as well as the uncertainty within the modelling procedures including:

  • Differences between users in how they enter data
  • Variability in how well the data records represent the actual farm system
  • Errors in input and boundary condition data, model structure, parameter values, and the observations used for calibration; errors of omission; and the commensurability of modelled and observed variables and parameters
  • The unknown ‘unknowns’

The diagram below provides a conceptual example of the relationship between measured and modelled uncertainty.

Quantifying and accounting for sources of uncertainty in models is particularly challenging, especially for a model describing complex farm systems like Overseer. A report by Ledgard and Waller (2001)2 estimated an uncertainty of 25-30% for the model's N predictions, a figure that has since been widely quoted. However, this estimate did not include errors associated with measurements or uncertainty from data inputs, so it captures only part of the picture.
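To see why a prediction-only estimate understates total uncertainty, consider how independent relative uncertainties combine. The sketch below uses the standard quadrature rule; the 27% figure is the mid-point of the quoted 25-30% range, while the measurement and data-input values are illustrative placeholders, not published estimates.

```python
import math

# Hypothetical relative uncertainties (as fractions) for independent sources.
# Only the first reflects the published 25-30% range; the others are
# placeholder values chosen purely for illustration.
model_prediction = 0.27   # mid-point of the quoted 25-30% range
measurement = 0.15        # placeholder: field measurement variability
data_inputs = 0.10        # placeholder: user data-entry variability

# Independent relative uncertainties combine in quadrature, so the total
# always exceeds the largest single component.
total = math.sqrt(model_prediction**2 + measurement**2 + data_inputs**2)
print(f"combined relative uncertainty: {total:.2f}")  # ~0.32, i.e. ~32%
```

The point of the sketch is structural: whatever the true component values, the combined uncertainty is necessarily larger than the prediction-error term alone.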

Since 2001 the Overseer model has changed considerably, so the current model version likely has a different uncertainty estimate. An updated uncertainty analysis would be useful; however, quantifying every source of uncertainty in the N leaching value produced by Overseer is not feasible. The more practical focus is therefore on reducing uncertainty, for which there are several options.

Evaluating Overseer and reducing model uncertainty

All models should be calibrated (adjusted to match measured values) to confirm that the model delivers a known outcome for specific conditions. To provide further confidence in the model, results should also be validated (compared against an independent data set)3. Together, calibration and validation are known as evaluation. The diagram below demonstrates how these processes work.
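The calibration/validation split described above can be sketched in a few lines. This is a generic illustration with synthetic data, not Overseer's actual evaluation procedure: a simple linear model is calibrated (fitted) on one part of the data and validated against an independent hold-out set.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic, purely illustrative data: a linear relationship between an
# input (e.g. annual rainfall, mm) and a response (e.g. N leaching) + noise.
x = rng.uniform(400, 1500, size=60)
y = 0.02 * x + rng.normal(0, 3, size=60)

# Calibration: adjust model parameters to match measured values.
calib_x, calib_y = x[:40], y[:40]
slope, intercept = np.polyfit(calib_x, calib_y, 1)

# Validation: compare predictions against an independent data set.
valid_x, valid_y = x[40:], y[40:]
pred = slope * valid_x + intercept
rmse = np.sqrt(np.mean((pred - valid_y) ** 2))
print(f"validation RMSE: {rmse:.2f}")
```

Keeping the validation set independent of the calibration set is what makes the reported error an honest measure of how the model performs on data it has not seen.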



Overseer is a complex model with multiple sub-models generating different results. The pastoral N leaching model has been continually validated and calibrated as it has been developed, whereas the P loss model is based on a single calibration process.

Because of its wide coverage, Overseer has not been evaluated for all conditions or types of systems in New Zealand: the data needed to do so is not available, and it is unrealistic to obtain data for every farm system and set of conditions in the country. However, continuing to source data from a greater range of systems and conditions is important for ongoing validation and calibration of Overseer, to increase confidence in model outputs and reduce modelling uncertainty.

Uncertainty around model estimates tends to be lower when they reflect conditions similar to those in the calibration data set (where we have the most information). The model can extrapolate outside these conditions; however, the uncertainty is greater because less information supports the modelled estimate. The relative difference in prediction uncertainty within and outside the calibration conditions is described in the diagram below. Annual rainfall is an example: N leaching estimates under high rainfall (>1500 mm per year) have far greater uncertainty than those under moderate rainfall, because of the lack of measured data from high-rainfall situations.
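The widening of uncertainty outside the calibration range is a general statistical effect, not specific to Overseer. The sketch below uses the standard prediction standard-error formula for simple linear regression on synthetic data: the error grows with the squared distance of the prediction point from the mean of the calibration data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data covering moderate rainfall only (600-1200 mm/yr);
# values are illustrative, not Overseer data.
x = rng.uniform(600, 1200, size=30)
y = 0.02 * x + rng.normal(0, 2, size=30)

slope, intercept = np.polyfit(x, y, 1)
n = len(x)
resid = y - (slope * x + intercept)
s = np.sqrt(np.sum(resid**2) / (n - 2))      # residual standard error
sxx = np.sum((x - x.mean()) ** 2)

def pred_se(x0):
    """Standard error of a new prediction at x0 (simple linear regression).
    Grows with the squared distance of x0 from the calibration mean."""
    return s * np.sqrt(1 + 1/n + (x0 - x.mean())**2 / sxx)

inside = pred_se(900)     # within the calibration range
outside = pred_se(1800)   # extrapolating to high rainfall
print(f"SE inside range:  {inside:.2f}")
print(f"SE outside range: {outside:.2f}")  # noticeably larger
```

This mirrors the diagram's message: estimates made outside the conditions represented in the calibration data carry greater uncertainty.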



Most of the calibration data used to date comes from flat, pastoral dairy enterprises with free-draining soils and moderate rainfall. To strengthen the calibration dataset and reduce uncertainty in model results, datasets from outside these conditions are therefore required, e.g. cropping, beef and sheep enterprises, clay and shallow soil types, and rainfall zones >1200 mm.

Opportunities to reduce uncertainty in Overseer results

There are many opportunities to reduce uncertainty in Overseer outputs; the main ones are listed below:
  • Following the user guide and using the OVERSEER Best Practice Data Input Standards, to ensure the best quality data is used to describe the farm.
  • Improving the understanding and description of farm systems - and how they are entered into Overseer.
  • Using best practice evaluation, validation and calibration processes to review and develop the model. This requires:
    • Increasing the number of datasets of field measurements sitting outside the existing/typical calibration dataset range e.g. high rainfall, clay soils, enterprises other than pastoral/grazed.
    • Continually increasing the number of farmlet scale datasets for use in validation and calibration.
    • Using consistent methods for scientific measurements and data accumulation.
    • Linking to systems such as daily management monitoring.
    • Undertaking model comparisons and inter-model scale comparisons.
1 Shepherd et al. 2013
2 Ledgard and Waller 2001
3 Shepherd et al. 2015

Go to the next section:  Overseer Glossary