Institute for Environmental Futures
AI4NetZero
The AI4NetZero project is carried out by an interdisciplinary team of researchers from the University of Leicester, Loughborough University, and Bristol University, led by Professor Heiko Balzter and managed by Dr Stephen Wright. The project addresses the UK's commitment to achieving net-zero greenhouse gas (GHG) emissions by 2050. With ruminant and peatland farming accounting for approximately 10% of UK GHG emissions, the project will leverage AI-supported Digital Twins to explore strategies for reducing GHGs in agriculture. These Digital Twins, referred to as 'Self-Learning Digital Twins', will blend mathematical models and diverse data sources, enabling stakeholders, including policymakers, farmers, and supply chains, to examine and develop sustainable farming practices and peatland restoration approaches. The initiative not only seeks to promote environmental sustainability but also advocates for the ethical and socially responsible application of machine learning in a context-specific manner.
Project blogs
Our blogs feature insights from contributors who have played a key role in our community of practice within the AI for Net Zero Sustainable Land Management project. Through our regular workshops, we've cultivated a collaborative space for sharing knowledge and networking, and these blog posts capture the valuable perspectives and ideas that have emerged from these sessions.
AI4NetZero: Efficient deep learning based time-series data imputation for missing values
By Dr Amjad Ali, Staff
Time series datasets used to train deep learning models for downstream prediction tasks, as well as the data streams monitored continuously for predictions using the trained model, often contain inherent missingness. Missing values are everywhere in time series data, arising from unexpected situations such as sensor malfunction or breakage and IoT signal distortions. Handling missing data in time series is critical for high-stakes applications, such as clinical research and digital healthcare systems, particularly in critical care settings where patient monitoring is continuous. Deep learning models, having gained significant popularity, offer state-of-the-art solutions for the imputation task as well.
In 2023, Zhang et al. (https://doi.org/10.3233/faia230625) proposed a deep learning-based imputation model, named Tri-Attention Bi-GRU (TABiG), which, unlike other models in the literature, addresses two objectives simultaneously rather than just one of them: temporal dependence along each series and correlation across variables. The TABiG imputation model leverages bi-directional Gated Recurrent Units (GRUs) and multi-level attention mechanisms to accurately impute missing time-series data by capturing the complex coupling among the variables. This study applies the TABiG model, together with several impactful data pre-processing strategies, to impute missing values in clinical datasets of pneumonia patients. Pneumonia is a leading cause of hospitalization and intensive care unit (ICU) admissions, making accurate data imputation essential for guiding treatment decisions.
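To make the general idea concrete, here is a minimal sketch in TensorFlow/Keras of a bidirectional-GRU reconstruction network using the masked-input convention common in imputation work; this illustrates the general approach only, not the actual TABiG architecture, which adds its multi-level attention mechanisms on top.

import tensorflow as tf

def build_imputer(seq_len, n_vars, units=32):
    # Input: the series with missing entries replaced by zero, concatenated
    # with a 0/1 mask marking which entries were observed.
    inputs = tf.keras.Input(shape=(seq_len, 2 * n_vars))
    x = tf.keras.layers.Bidirectional(
        tf.keras.layers.GRU(units, return_sequences=True))(inputs)
    # Reconstruct every variable at every time step; training minimises the
    # reconstruction error on observed entries only, so the network learns
    # to fill in the unobserved ones.
    outputs = tf.keras.layers.Dense(n_vars)(x)
    return tf.keras.Model(inputs, outputs)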
The study involves training the model on a real dataset of pneumonia patients with the variables blood pressure, Early Warning Score (EWS), respiratory rate, heart rate, and temperature. Various models are trained and tested with differing sequence lengths (time intervals), hyperparameters, and versions of the TABiG model. Model evaluation is based on the mean absolute error (MAE), mean relative error (MRE), and root mean squared error (RMSE), with results showing promising accuracy on the imputation task.
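For illustration, these metrics can be computed only at positions that were artificially masked out, which is the usual evaluation protocol for imputation (the study's exact protocol is an assumption here):

import numpy as np

def imputation_metrics(truth, imputed, mask):
    # mask is True at the positions that were hidden from the model.
    err = (truth - imputed)[mask]
    mae = np.abs(err).mean()
    mre = np.abs(err).sum() / np.abs(truth[mask]).sum()  # one common MRE definition
    rmse = np.sqrt((err ** 2).mean())
    return mae, mre, rmse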
The obtained accurate reconstruction of missing data has the potential to improve downstream clinical tasks such as mortality prediction (a classification problem) and length of stay (LoS) prediction (a regression problem).
AI4NetZero: GEMINI-UK: towards a new national capability for greenhouse gas observations
By Neil Humpage, Research Associate, National Centre for Earth Observation, University of Leicester
One of the main approaches to inferring greenhouse gas emissions is to measure concentrations of greenhouse gases in the atmosphere and map them back to their emissions sources using an atmospheric transport model. Using this method, emissions are currently calculated for the UK with concentration data obtained from in-situ air samples, acquired via inlets at four tall tower sites around the British Isles. An alternative approach to measuring the abundances of greenhouse gases in the atmosphere involves ground-based remote sensing. This technique works by using a high-resolution spectrometer to measure the radiance spectrum of direct sunlight at shortwave infrared wavelengths. Given our knowledge of the absorption properties of the molecules we're interested in, we can then infer the total column concentrations of carbon dioxide and methane, along with other gases, by looking at how much light is absorbed at the specific wavelengths these gases absorb at. Essentially, the higher the concentration of the gas of interest, the more sunlight is absorbed at that wavelength as it passes through the atmosphere from the top to the surface. The same principle is used to infer column concentrations on a global scale using satellites, which measure sunlight that has been reflected back to space by the Earth's surface, passing through the atmosphere twice on the way. These two remote sensing geometries provide complementary information on the concentrations of greenhouse gases in the atmosphere: satellites observe the whole Earth once every few days at the same local time of day, whereas ground-based instruments can observe diurnal and longer-term temporal changes at a specific location.
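A toy calculation illustrates this principle, which is the Beer-Lambert absorption law; the numbers below are arbitrary and serve only to show that a higher column concentration means less transmitted sunlight at the absorbing wavelength.

import numpy as np

def transmitted_radiance(i_top, cross_section, column_density):
    # Beer-Lambert law: I = I0 * exp(-sigma * N), where N is the total
    # column of absorbing molecules along the light path.
    return i_top * np.exp(-cross_section * column_density)

i0 = 1.0        # sunlight entering the atmosphere (arbitrary units)
sigma = 2e-23   # absorption cross-section at the chosen wavelength (cm^2), illustrative
for column in [3.5e21, 4.0e21, 4.5e21]:   # column densities in molecules per cm^2
    print(column, transmitted_radiance(i0, sigma, column))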
A number of ground-based remote sensing sites have been established around the world as part of TCCON, the Total Carbon Column Observing Network, which is used to validate satellite data products as well as to support studies of the global carbon cycle. TCCON stations use Bruker 125HR Fourier transform spectrometers, which in combination with a solar tracker provide very high spectral resolution measurements of sunlight at near- and shortwave-infrared wavelengths. Though they are able to perform these measurements with a high degree of precision and accuracy, they are very expensive and logistically challenging to set up and operate. A cheaper, more portable alternative has been developed by Bruker, the EM27/SUN, allowing ground-based remote sensing observations of greenhouse gas columns to be made in parts of the world which are underserved by TCCON, whilst still providing data of useful quality. Their relatively low cost also allows multiple instruments to be set up around a city or region to provide denser coverage, whilst their portability enables them to be temporarily located near emissions sources in different locations on a campaign basis. The Earth Observation Science group at the University of Leicester has some experience with the EM27/SUN instruments, including a deployment in Uganda looking at methane from wetlands (Humpage et al. 2024) and a project focused on urban emissions in London, which involved three EM27/SUNs being located across the city for a period of over 18 months.
Over the coming months, a new network of EM27/SUN instruments called GEMINI-UK (Greenhouse gas Emissions Monitoring network to Inform Net-zero Initiatives for the UK) will be set up around the UK, with the goal of providing long-term, continuous observations of greenhouse gas column concentrations that are sensitive to the UK’s carbon emissions. These data will feed into a wider programme called GEMMA (Greenhouse gas Emissions Measurement Modelling Advancement), which aims to provide frequent, timely estimates of the country’s carbon footprint driven by a range of atmospheric measurements and inverse models. Initial work has focused on preparing the weatherproof enclosures for the instruments, running tests alongside the TCCON site operated by the Rutherford Appleton Laboratory in Harwell, Oxfordshire, and calculating the estimated impact of GEMINI-UK on the atmospheric inversions using simulated measurements.
In conclusion, ground-based remote sensing of greenhouse gases has proved to be a very useful tool for carbon cycle science. The data these measurements provide is complementary to that from satellite and in-situ observations, and is the primary means for validating both satellite and atmospheric chemistry model datasets. In particular, the portable nature of the EM27/SUN instrument has opened up new science possibilities, allowing measurements to be made in different parts of the world underserved by existing measurement networks, and providing the opportunity to set up multi-instrument networks on the scale of cities, regions, or (in the case of GEMINI-UK) nations, giving us a new perspective on the influence of greenhouse gas emissions on the atmosphere.
Reference
Humpage, N., Boesch, H., Okello, W., Chen, J., Dietrich, F., Lunt, M. F., Feng, L., Palmer, P. I., and Hase, F.: Greenhouse gas column observations from a portable spectrometer in Uganda, Atmos. Meas. Tech., 17, 5679–5707, https://doi.org/10.5194/amt-17-5679-2024, 2024.
AI4NetZero: Physics-informed graph neural networks as a self-learning digital twin for sustainable land management
By Dr Craig Bowers, Co-Investigator
The exchange of Greenhouse Gases (GHG) between the land and atmosphere is governed by the complex dynamics of many interconnected variables. A digital twin for sustainable land management mirrors the environmental system, and the insights it provides lead to actions that influence future states of the physical system. The Joint UK Land Environment Simulator (JULES) is an idealised, deterministic model driven by past climate conditions; it can take several days to generate results. Eddy covariance flux towers record real measurements of environmental conditions in-situ, but only a limited number are available.
A user gives the digital twin a set of input variables such as climate conditions, soil characteristics and crop type. The user then receives predicted outputs of GHG emissions and crop yield (a proxy for profit) associated with the given inputs. The user can then change some of the inputs and interrogate the model to see how those changes influence the returned outputs. This is essentially a high-dimensional, nonlinear regression problem in which the digital twin learns the line of best fit between inputs and outputs.
Graph Neural Networks for regression are Machine Learning models that not only learn a mapping between input variables and output variables, but also learn from the interconnected relationships across variables. Each layer of the network integrates the correlations between neighbouring nodes in the graph, in turn influencing the next layer of the network. The final layer of the model maps the model weights to the associated outputs, and supervised training optimises the model weights by minimising the mean squared error between predicted outputs and true outputs, referred to as the loss function.
The training dataset is first filtered to include only the input features and output targets. This filtered dataset is then scaled using standard scaling from scikit-learn. Each row of the training dataset becomes a single sample consisting of both features and targets. The correlation matrix is then computed between the input feature sequences; this is used as a weighted adjacency matrix for the graph neural network.
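A minimal sketch of this preprocessing in Python follows; the file name and feature column names are illustrative assumptions, while gpp_gb, resp_p_gb and resp_s_gb are target names used later in this post.

import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

feature_cols = ["t_air", "precip", "sw_down", "soil_moisture"]   # assumed names
target_cols = ["resp_p_gb", "resp_s_gb", "gpp_gb"]

df = pd.read_csv("jules_training_data.csv")      # hypothetical training file
df = df[feature_cols + target_cols]              # keep only features and targets

scaler = StandardScaler()
X = scaler.fit_transform(df[feature_cols])
y = StandardScaler().fit_transform(df[target_cols])

# Correlation matrix between input features, used as a weighted adjacency matrix.
A = np.corrcoef(X, rowvar=False)                 # shape: (n_features, n_features)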
The model learns by supervised learning using the training dataset from JULES. The model is trained for multiple epochs (n_epochs = 1000), with the training process isolated in its own loop. For each batch of training samples, the train_step function is called, taking the model itself, the optimisation procedure (Adam), the batch features, the batch targets and the epoch number. A custom_loss function is also called within train_step. A percentage of the training dataset is held out to validate the loss of the model's predictions at every epoch.
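A minimal sketch of such a training loop is shown below, assuming the build_gnn and custom_loss functions sketched in the following paragraphs, a tf.data pipeline (train_ds) for batching, and a held-out validation split (val_x, val_y); all of these names are illustrative.

import tensorflow as tf

optimizer = tf.keras.optimizers.Adam()
n_epochs = 1000
model = build_gnn(A, n_features=len(feature_cols), n_targets=len(target_cols))

def train_step(model, optimizer, batch_x, batch_y, epoch):
    # One optimisation step: forward pass, loss, gradients, weight update.
    with tf.GradientTape() as tape:
        pred = model(batch_x, training=True)
        loss = custom_loss(batch_x, batch_y, pred, epoch)   # sketched below
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

for epoch in range(n_epochs):
    for batch_x, batch_y in train_ds:        # tf.data.Dataset of (features, targets)
        loss = train_step(model, optimizer, batch_x, batch_y, epoch)
    # Validate the model's predictions on the held-out split at every epoch.
    val_loss = tf.reduce_mean(tf.square(model(val_x, training=False) - val_y))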
The graph neural network is built using TensorFlow and Keras. The input to each graph layer in the network is a combination of the input features and the adjacency matrix; this is multiplied by the network weights for that layer, and a nonlinear activation function (ReLU) is applied. Multiple hops allow input features beyond a given feature's local neighbourhood to be taken into account. The final layer of the model is a Dense, fully connected layer with an output for each output target.
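The sketch below shows one way such a network could be written in TensorFlow/Keras; the layer sizes and number of hops are illustrative choices, not the project's exact configuration.

import tensorflow as tf

class GraphLayer(tf.keras.layers.Layer):
    def __init__(self, units, adjacency):
        super().__init__()
        self.A = tf.constant(adjacency, dtype=tf.float32)  # weighted adjacency (feature correlations)
        self.dense = tf.keras.layers.Dense(units)

    def call(self, x):
        # Combine each feature with its neighbourhood via the adjacency matrix,
        # multiply by the layer weights, then apply the ReLU activation.
        x = tf.matmul(x, self.A)
        return tf.nn.relu(self.dense(x))

def build_gnn(adjacency, n_features, n_targets, hops=2):
    inputs = tf.keras.Input(shape=(n_features,))
    x = inputs
    for _ in range(hops):   # multiple hops widen each feature's neighbourhood
        x = GraphLayer(n_features, adjacency)(x)
    # Final Dense, fully connected layer with one output per target.
    outputs = tf.keras.layers.Dense(n_targets)(x)
    return tf.keras.Model(inputs, outputs)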
The physics-informed custom_loss function takes, for each sample in a batch, the features, targets, predicted targets and the epoch number. In this example, the relationship between Gross Primary Productivity and Net Solar Radiation is used: we want the difference between the model's prediction for GPP and the value implied by its known relationship with Rad_net to be as small as possible. The physics-informed loss is added to the mean squared error loss commonly used for regression problems.
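A minimal sketch of such a custom_loss is given below, using the non-negativity penalty on resp_p_gb and gpp_gb described below; the target ordering and the penalty weight are assumptions.

import tensorflow as tf

RESP_P_IDX, GPP_IDX = 0, 2   # assumed positions of resp_p_gb and gpp_gb in the target vector

def custom_loss(features, targets, predictions, epoch, phys_weight=1.0):
    # Standard regression loss on the data.
    mse = tf.reduce_mean(tf.square(predictions - targets))
    # Physics-informed term: these carbon fluxes should not be negative,
    # so any negative prediction is penalised.
    negatives = (tf.nn.relu(-predictions[:, RESP_P_IDX])
                 + tf.nn.relu(-predictions[:, GPP_IDX]))
    return mse + phys_weight * tf.reduce_mean(negatives)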
The sum of the weights for each input feature represents the contribution of each input to the predicted output. The greater the sum, the more significant the contribution. This gives us a relative importance of each input feature towards the predicted output. Activation heatmaps show how much each neuron in the network is being used. A distributed activation strength shows that no single feature is dominating the output.
With a loss function that uses only the mean squared error on the data, each black dot shows a predicted output against the true output from the test dataset retained during model training; the blue line is the least-squares regression line of best fit, and the red line represents a perfect prediction. We then add extra terms to the loss function and train the model again using the same data. In this example, we penalise negative predictions of both resp_p_gb and gpp_gb. Future iterations will include the relationship between Gross Primary Productivity and Global Solar Radiation.
Performing inference is simply a case of giving the model a set of input variables that correspond to the input features. The model then predicts the output targets for each set of inputs. Since the model has learned a mapping from inputs to outputs, we (the user) can investigate what the Net Ecosystem CO_2 Exchange will be for different crop types, under different weather conditions and with different soil characteristics.
Key Takeaways
Our model can make predictions for a particular year. We use 2019, since we have true (JULES) data for it, and so can quantify the error in our predictions, and we have weather data stored for this period. The year is the first filter on this table. We then give the model a location (longitude, latitude), which filters the table further. Finally, we filter for just the columns that we use as input features to the model. Predicted NEE is combined from the predicted resp_p_gb, resp_s_gb and gpp_gb.
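A minimal sketch of this filtering and prediction step follows; the file name is hypothetical, lon, lat, scaler, feature_cols and model are assumed from the earlier sketches, and the sign convention NEE = plant respiration + soil respiration - GPP is an assumption based on the standard definition of net ecosystem exchange.

import pandas as pd

df = pd.read_csv("jules_driving_data.csv")                    # hypothetical file
df = df[df["year"] == 2019]                                   # first filter: the year
df = df[(df["longitude"] == lon) & (df["latitude"] == lat)]   # then the location
X = scaler.transform(df[feature_cols])                        # keep only the input features

pred = pd.DataFrame(model.predict(X), columns=["resp_p_gb", "resp_s_gb", "gpp_gb"])
pred["nee"] = pred["resp_p_gb"] + pred["resp_s_gb"] - pred["gpp_gb"]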
Filtering for an entire year means we can show a time-series.
We can now look at what the model tells us about what it expects NEE to be for the given year and at the given location for a selection of different crops. The user can then decide which crop best fits their land management practices for achieving a particular CO_2 emissions profile.
We can give the model data from an Eddy Covariance Flux Tower. We filter for the corresponding input features and output targets (almost). Note: we cannot currently check the accuracy of our crop yield predictions for this data. But we can check the accuracy of the resp_p_gb and resp_s_gb predictions by comparing them with known measurements of TER, and we can also check our predictions for NEE.
Conclusions
Feeding the model with (almost) live Flux Tower measurements using an IoT system may also be a future option (possibly not for this project). Using ML to forecast Flux Tower data is also on the agenda. Our Physics-informed Graph Neural Network acts as the prediction engine in a Self-learning Digital Twin for Sustainable Land Management. This approach allows a user to virtually plant crops and check their GHG emissions without having to do it in real-life. This enables the best decisions to be made with some level of confidence.
AI4NetZero: Unlocking the secrets of soil: predicting carbon dioxide emission from arable peatland soil through microbial insights
By Dr Gavers K. Oppong, Research Fellow
The drainage of natural peatlands for arable purposes has transformed them from carbon sinks into carbon sources due to disruptions in microbial communities. Arable peatlands contribute significantly to carbon dioxide (CO₂) emissions to the atmosphere, and urgent mitigation is needed to reduce this carbon flux. The role of soil microorganisms in mitigating CO₂ emissions from arable peatlands, however, remains largely unexplored. In this presentation, I highlight my ongoing work on incorporating bacterial and fungal abundance and diversity into predicting CO₂ emissions from arable peatlands under different crop scenarios. I outline the methods used for field and laboratory data collection and explain the differences between gas measurements using Li-Cor systems and Eddy covariance flux towers.
Key Takeaways
Vegetation and soil moisture appear to be key drivers of CO₂ emissions from the soil. Integrating bacterial and fungal sequencing data will offer insights into whether crop type influences the soil microbiome, and how this, in turn, impacts CO₂ emissions from the soil.
Conclusion
Understanding the complex interactions between soil microbiomes and environmental factors is crucial for developing sustainable agricultural practices that minimize the climate impact of arable peatlands.
Community of practice event 2 October 2024
By M. Ali, Research Associate
AI for Net Zero Community of Practice workshops provide a platform for learning, sharing, and growing, and are only successful in achieving that when participants from different backgrounds come together to share their work, expertise, and ideas. I would like to thank all the presenters for their engaging presentations and willingness to share their knowledge, which greatly enriched our discussions and provided us with new perspectives and practical takeaways. To our participants, thank you for your active engagement and thoughtful contributions. Your questions, comments, and collaborative spirit helped foster a dynamic and enriching environment for all.
Overview of Topics Covered at Our Community of Practice
Our recent Community of Practice event on 2 October 2024 featured a diverse array of topics, reflecting the wide-ranging interests and expertise of our members.
Here's a glimpse into the varied topics we explored and key takeaways:
Unlocking the Secrets of Soil: Predicting CO2 Emissions from Arable Peatlands through Microbial Insights - Dr Gavers, University of Leicester
The drainage of peatlands for arable purposes has resulted in disruptions in microbial communities that significantly alter their carbon emissions. Dr Gavers' presentation investigated the role of soil microorganisms in mitigating CO2 emissions from arable peatlands by incorporating bacterial and fungal abundance and diversity into predicting CO2 emissions. He described his field and laboratory data collection efforts and explained differences between gas measurements using Li-Cor and Eddy covariance flux towers.
Machine Learning Accelerated Atmospheric Transport for Inference of Greenhouse Gas Fluxes – Dr Nawid Keshtmand, University of Leicester
Dr Nawid discussed the development of a GNN architecture capable of emulating an atmospheric dispersion model, which can be used to generate emission footprints. He highlighted how their ongoing work on adding prototypes to the GNN model will further improve performance and classification accuracy when reporting on emissions.
PINNs on Graphs using Symbolic Regression – Dr Craig
The Joint UK Land Environment Simulator (JULES) is a deterministic model of land surface driven by past climate conditions and mathematical equations. It can take several days to generate results. An alternative is to create a machine learning-based JULES surrogate using physics-informed neural network graphs. Graph Neural Networks for regression are Machine Learning models that not only learn a mapping between input variables and output variables, but they also learn based on the interconnected relationships across variables. This JULES surrogate model can significantly accelerate the processing of JULES without compromising on the accuracy of the original deterministic model.
A Multi-head Self-attention LSTM Model for UK GHG Forecasting – Dr Bashar
Despite global efforts, GHG levels (methane and carbon dioxide) have continued to rise, impacting global warming and climate change. These emissions depend on multiple environmental variables, leading to a highly complex prediction task. Dr Bashar presented a time-series machine learning model for the prediction of methane emissions. The novel Long Short-Term Memory (LSTM) model is combined with a multi-head self-attention mechanism to improve the prediction accuracy by better capturing relevant features and dependencies between them. He talked about his LSTM model architecture, data pre-processing approach, how it was trained and evaluated using the UK DECC dataset, and future plans to improve the model.
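As an illustration of this kind of architecture, here is a minimal Keras sketch combining an LSTM with multi-head self-attention; the window length, layer sizes and head count are illustrative choices, not the exact model presented.

import tensorflow as tf

def build_forecaster(window=24, n_features=6, heads=4, units=64):
    # Input: a window of past multivariate observations.
    inputs = tf.keras.Input(shape=(window, n_features))
    x = tf.keras.layers.LSTM(units, return_sequences=True)(inputs)
    # Multi-head self-attention lets every time step attend to every other
    # step, helping the model capture dependencies between relevant features.
    attn = tf.keras.layers.MultiHeadAttention(num_heads=heads, key_dim=units // heads)(x, x)
    x = tf.keras.layers.LayerNormalization()(x + attn)   # residual connection
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    outputs = tf.keras.layers.Dense(1)(x)                # next-step GHG concentration
    return tf.keras.Model(inputs, outputs)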
Efficient Deep Learning-based Time-Series Data Imputation for Missing Values – Dr Amjad, University of Leicester
Most greenhouse gas emissions datasets exist as time-series data; however, they contain missing values due to unexpected situations such as sensor malfunctions, signal distortion, or other errors. In this talk, Dr Amjad presented deep learning-based imputation methods, such as RNNs, generative models, and Transformer-based models, to address missingness in the data. These techniques can detect complex temporal patterns and correlations in the data to fill gaps and missing values. They can be applied at the pre-processing stage, to prepare the training data in its best shape, or at the post-processing stage, to further improve the output of the machine learning model.
The Greenhouse Gas Emissions Monitoring Network to Inform Net Zero Initiatives for the UK (GEMINI UK): Towards a New National Capability for Greenhouse Gas Observations – Dr Neil, University of Leicester
Greenhouse gas emissions can be inferred by measuring atmospheric concentrations and tracing them to their sources using in-situ air sampling or remote sensing techniques. Ground-based remote sensing, like the TCCON network, and satellites provide complementary data, with TCCON offering high-precision local measurements and satellites providing global coverage. The EM27/SUN instrument is a cheaper, portable alternative that expands monitoring capabilities to underserved regions. The GEMINI-UK network will deploy these instruments across the UK to support continuous observation and improve carbon footprint estimates through the GEMMA programme. These methods are key to validating models and understanding the global carbon cycle. The goal of GEMINI-UK is to develop a national, UK-wide capability for GHG emissions monitoring.
AI4NetZero: Real-time digital optimisation and decision making for energy and transport systems
By Dr Georgios Rigas, Principal Investigator
The energy and transport systems that underpin key renewable or zero-emission technologies involve multi-physics processes (e.g., reacting hydrogen with multi-phase flows relevant to aviation), are multi-scale and turbulent (e.g., the optimisation of wind farms’ power output), and need adaptive solutions (e.g., the intelligent cooperation of road vehicles to minimise aerodynamic resistance). On the one hand, fast solutions to these problems are available through cheap models, but these solutions are not optimal. On the other hand, optimal solutions can (in principle) be achieved by simulation or experimentation, but the cost and time required are prohibitive.
In this project, we seamlessly combine two disciplines: physics-based modelling, which is generalisable and robust but may require tremendous computational cost, and machine learning, which is adaptive and fast to evaluate but not easily made generalisable and robust. The intersection of the two spawns scientific machine learning, which maximises the strengths and minimises the weaknesses of the two approaches.
The outcome of this is real-time digital twins that enable the static or dynamic optimisation of hydrogen combustion systems, wind-farm layouts and operation, and active road vehicle aerodynamics. Using Bayesian optimisation and Reinforcement Learning, an energy efficiency improvement of 10-30% can be achieved at zero cost using existing infrastructure/hardware and requiring only software modifications. The improved energy efficiency of the proposed approaches has been demonstrated through high-fidelity simulations and wind-tunnel experiments in the National Wind Tunnel Facilities.
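As a purely illustrative toy, the snippet below runs Bayesian optimisation (via scikit-optimize) over two turbine yaw angles against a made-up power function; the actual studies used high-fidelity simulations and wind-tunnel experiments rather than anything this simple.

import numpy as np
from skopt import gp_minimize

def negative_farm_power(yaw):
    # Hypothetical stand-in for farm power as a function of two yaw angles
    # (degrees); negated because gp_minimize minimises.
    y1, y2 = yaw
    return -(np.cos(np.radians(y1)) ** 3 + 0.3 * np.sin(np.radians(y1 - y2)))

result = gp_minimize(negative_farm_power,
                     dimensions=[(-30.0, 30.0), (-30.0, 30.0)],
                     n_calls=30, random_state=0)
print("best yaw angles:", result.x, "estimated power:", -result.fun)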
AI4NetZero: Using AI for greenhouse gas emissions reporting
By Nawid Keshtmand, Research Associate
The presentation focused on the importance of validating inventory estimates of greenhouse gas emissions using top-down estimates. This is important as there could be misreporting or gaps in the science which could lead to errors in inventory estimates.
The presentation then looked at the process of performing a top-down atmospheric estimate, which consists of various stages, with the most computationally expensive step being the use of the physics-based atmospheric dispersion model. The next part discussed the amount of data obtained from greenhouse gas measurement sites as well as from satellite measurements. Due to its computational cost, the top-down emissions estimate pipeline is currently unable to process the large amount of data which comes in from satellite measurements. To address this issue, we look at reducing the computational cost of the pipeline by replacing the physics-based atmospheric dispersion model with a machine learning-based emulator.
The emulator we use is a graph neural network (GNN) which consists of an encoder, processor and decoder. The inputs of the GNN are meteorological variables (such as temperature, pressure, wind vectors, atmospheric boundary layer, etc.) and the output of the GNN is a footprint. We test the GNN on satellite data from Brazil, training the neural network on two years of data (2014 and 2015) and testing on data from 2016. We see that the footprints generated by the GNN are similar to the ground-truth footprints generated by the physics model, and we are now using the emulated footprints to infer Brazil’s emissions and comparing the result with the estimate obtained using the ground-truth footprints. My research in particular examines how we can use prototypes as an additional input to the GNN in order to improve its performance. Prototypes can be chosen in various ways, such as using random prototypes, prototypes generated using k-means, or prototypes hand-picked using expert knowledge. We examine the effect of using prototypes by comparing the performance of the model with a prototype against the situation where no prototype is added. We did this using an oracle scenario in which we choose the prototype closest to the footprint we want to predict (which would not be possible in practice). It was seen that by adding the prototype to the GNN, we can reduce the mean squared error between the predicted and true footprints.
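As a sketch of how the k-means and oracle options could look, consider the snippet below; the array shapes, cluster count and variable names are illustrative assumptions.

import numpy as np
from sklearn.cluster import KMeans

# footprints_train: array of shape (n_samples, n_cells), flattened training footprints.
kmeans = KMeans(n_clusters=8, random_state=0).fit(footprints_train)
prototypes = kmeans.cluster_centers_          # k-means prototypes

def oracle_prototype(target_footprint):
    # Oracle selection used in the experiments: pick the prototype closest to
    # the footprint we want to predict (not possible in practice).
    dists = np.linalg.norm(prototypes - target_footprint, axis=1)
    return prototypes[np.argmin(dists)]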
Key takeaways
We have developed a GNN architecture which is used as an emulator of the atmospheric dispersion model. We can generate footprints which we are using to infer Brazil’s emissions. We are working to improve the performance of the model by adding prototypes as an additional input to the GNN model.
Conclusion
We have successfully used a GNN to emulate an atmospheric dispersion model.
AI4NetZero: Visualisation updates in the digital twin of agricultural peatlands
By Dr Frank McQuade, Bloc Digital Visualisation Lead
This project has made strides in developing the visualisation application for Net Ecosystem Exchange (NEE) data. This summary highlights the project's achievements in building the application, its current capabilities, and its shift towards a user-friendly web interface.
The initial phase focused on creating a workflow for the application. This involved selecting an area of interest and building a Unity-based program. A crucial step was integrating this application with the digital twin data via a cloud-based API. The current application showcases these capabilities by connecting to both the JULES emulator and Earth Observation models. Users can select a crop type and climate model, allowing the application to generate NEE data specific to that region and choice. This empowers researchers and stakeholders to explore NEE data across various scenarios.
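For illustration only, a client-side request to such a cloud API might look like the sketch below; the endpoint, parameter names and response format are entirely hypothetical.

import requests

resp = requests.get(
    "https://example.org/api/nee",       # hypothetical endpoint
    params={"crop": "winter_wheat", "climate_model": "ukcp18", "region": "east_anglia"},
    timeout=30,
)
resp.raise_for_status()
nee_series = resp.json()                 # e.g. a time series of NEE values for the selection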
Beyond the application, the project established a robust cloud-based server. This server acts as the backbone, managing the application securely and facilitating communication with digital twin data. It also manages connections to various data sources, ensuring access to the latest information.
The project is now transitioning towards a web interface. This next-generation user interface offers several advantages, among them the ability to incorporate external data sources, including the successful integration with OpenStreetMap. This opens doors to richer visualisations by overlaying additional geospatial information.
The most significant advantage of the web interface is its user-friendliness. Unlike the current application, the web interface won't require remote installation, making it readily accessible to a wider user community, including stakeholders who may not have the technical expertise for software installations.
In conclusion, the visualisation project has established a powerful proof of concept for sharing NEE data. The current application demonstrates core functionalities and integrates with existing data sources, and the cloud-based server provides a reliable foundation for future enhancements. The next step in the project is the transition towards a user-friendly web interface, to allow broader engagement and further exploration of stakeholder needs. This web interface will be instrumental in creating a more robust and flexible implementation that will allow greater capabilities as the project moves forward.
AI4NetZero: Challenges of translating policy goals, priorities and interventions
By Dr Hibist Kassa, Policy Research Fellow
The Intergovernmental Panel on Climate Change's Sixth Assessment Report highlights that GHG emissions have continued to increase, with unequal historical and ongoing contributions arising from unsustainable energy use, land use and land-use change, lifestyles, and patterns of consumption and production across regions, between and within countries, and among individuals. The report recognises that ecosystem-based adaptation approaches, such as urban greening and restoration of wetlands and upstream forest ecosystems, have been effective in reducing flood risks and urban heat. Global GHG emissions in 2030 (implied by the Nationally Determined Contributions of October 2021) make it likely that warming will exceed 1.5°C during the 21st century and make it harder to limit warming below 2°C. Hansen, Kharecha and Sato (2024) conclude that the 12-month mean of global temperature has risen to 1.56°C relative to 1880-1920.
A combination of gaps between projected emissions from implemented policies and those from NDCs, which themselves fall far short of what is required to curb emissions, as well as shortfalls in finance flows, has undermined meeting climate goals across all sectors and regions. This lack of global leadership is further deepened by the U-turn in delaying or reversing UK commitments to reducing carbon emissions. This amplifies uncertainty in a policy context where there has been a shift from payments per hectare for landscape management towards one that values farm and animal welfare practices as public goods. Since agriculture is a devolved area, the Climate Change Committee (2023) and the National Farmers' Union have made calls for greater policy co-ordination at a UK level. Garvey and Jordan (2023) suggest that in seeking de-Europeanisation via Brexit, the UK became locked into disengagement with the EU and divergence between the devolved nations. Brexit, in creating the opportunity to craft new environmental regulations, also required new institutions in a period of capacity eroded by austerity.
This creates conditions of uncertainty for environmental regulation that have a bearing on vulnerable ecosystems such as peatlands and on animal welfare practices in ruminant production. These are the use-cases for the development of the Digital Twin in this project, which will be used by policymakers, tenant farmers, land managers and other stakeholders. The aim is to simulate real-world conditions to guide policymaking that encourages farm practices that reduce emissions intensity. While agriculture can play a role in reducing emissions, energy, transport and the built environment remain key emissions reduction sectors. To this extent, there are concerns about the challenges of achieving absolute emissions reductions. Some of these result from the following:
- Carbon offsetting permits the fossil fuel industry, weapons manufacturers and airlines to continue to pollute without taking any measures to reduce their own emissions; see also Scottish Woodlands' restricted approach to carbon trading
- Greenwashing techniques: superficial changes to product packaging, minor increases in plant-based products, and continued promotion of meat multibuy offers. Disclosure of emissions information has also been partial, with a focus on reduced emissions from stores and vehicles, when 95% of emissions come from sales
- Continued overconsumption of meat and dairy, which drives production reliant on feedstock imports and so places pressure on agriculture elsewhere, such as in the Amazon. Innovations in livestock rearing for low-carbon beef introduced in the UK need to be examined in terms of their impact on emissions and unintended consequences
- The globalised nature of livestock production emphasises the large carbon footprint of UK supermarkets; UK soy consumption has impacts on the Amazon
- Low-carbon requirements added to trade agreements impact less competitive farmers, for example family farms, as well as importers to the UK (as the NFU's demands highlight)
- Climate change impacts on farm practices: Calon Cymru Network oral evidence to the UK Parliament cites how ‘changing rainfall patterns are undermining the basic assumptions of hill farming…causing more damage from winds and floods.’ For instance, it matters to know how tree planting is carried out so as to avoid an increase in emissions from the environment
The Digital Twin is being developed on datasets from East Anglia. According to the UK Centre for Ecology and Hydrology (UKCEH), the East Anglian fens peatland produces vegetables worth £3 billion a year. While peatlands in their natural state capture CO2 through photosynthesis, once drained for agricultural purposes they become a net source of CO2 emissions. Excessive rewetting can also increase methane emissions.
England’s Agricultural Transition Plan aims to bring 40% of agricultural soil under sustainable management by 2028, and 60% by 2030. It also focuses on productivity and the transition to low-carbon farming systems for arable and livestock farming. This emphasises the creation of new woodland and its sustainable management, along with peatland restoration, agroforestry approaches on farms and the planting of energy crops.
In the Agricultural Transition Plan, policymakers aim ‘to target the right level of ambition, combination and scale, in the right places and in joined-up ways to deliver target outcomes.’ England’s Peatland Action Plan sets out the government’s long term vision for managing, protecting, and restoring peatlands. This includes ending the horticultural use of peat in the amateur sector. It also includes recognition of how Greenhouse Gas Removal feasibility is impacted by decisions on alternative land use as well as technological options for production.
Key takeaways
Innovation and technological measures in agriculture and land use alone are not enough to reduce emissions. The Climate Change Committee recognises that a robust policy framework is required for innovation and technology to be effective alongside behaviour change. The report further recognises that a UK food strategy that relies on innovation to decarbonise and increase productivity across food chains, without strong policies, puts emissions reduction at risk. It needs to be understood how productivity goals and emissions reduction targets can be met simultaneously or, if there are trade-offs, what exactly those trade-offs are.
In the Making Landscape Decisions (2022) report, it was highlighted that land use decisions, especially restoration, must be place-specific and across all scales. There is a need to identify co-benefits and risks of competing land demands, such as food security, agricultural productivity, nature conservation and housing, as well as attendant social implications.
Conclusions
There is an uneasy compromise between restoring natural peat functioning and the existing economic value and livelihoods of intensive commercial agriculture, such as in East Anglia. The long-term impacts of new measures for the production of low-carbon beef and dairy should be weighed against other environmental and health impacts. The Digital Twin aims to create an interactive platform that facilitates decision-making responsive to concerns such as these. However, there needs to be reflection on the limits or extent of its trustworthiness, given that it aims to simulate real-world conditions. Any shortfalls may unravel presumed emissions reductions from changes to farm practices and livestock rearing.
We would like to thank UK Research and Innovation (UKRI) and the Engineering and Physical Sciences Research Council (EPSRC) for funding this work. Grant: EP/Y00597X/1