The FLUXNET2015 dataset and the ONEFlux processing pipeline for eddy covariance data

2020 | Gilberto Pastorello et al.
The FLUXNET2015 dataset and the ONEFlux processing pipeline for eddy covariance data provide ecosystem-scale measurements of CO₂, water, and energy exchange between the biosphere and atmosphere, along with other meteorological and biological data from 212 sites worldwide (over 1500 site-years, up to 2014). These sites voluntarily contributed data, which were quality-controlled and processed using uniform methods to ensure consistency and intercomparability. The dataset includes derived products such as gap-filled time series, ecosystem respiration and photosynthetic uptake estimates, and metadata. 206 of these sites are now distributed under a Creative Commons license. The paper details this enhanced dataset and the processing methods, now available as open-source codes, making the dataset more accessible, transparent, and reproducible. Eddy covariance has been used for over 30 years to measure land-atmosphere exchanges of greenhouse gases and energy. Regional networks of sites were formed in Europe and the US, followed by similar initiatives in other continents. These networks enabled cross-site comparisons and regional-to-global studies. FLUXNET was created as a global network of networks, a joint effort among regional networks to harmonize and standardize data. The first gap-filled global FLUXNET dataset, the Marconi dataset, was created in 2000, followed by the LaThuile dataset in 2007 and the FLUXNET2015 dataset in 2015 with 1532 site-years of data. The FLUXNET2015 dataset includes sites with records over two decades long. The dataset was created through collaborations among many regional networks, with data preparation efforts happening at site, regional network, and global network levels. The global coordination of data preparation activities and data processing was done by a team from the AmeriFlux Management Project, the European Ecosystem Fluxes Database, and the ICOS Ecosystem Thematic Centre. 
This team was responsible for the coding efforts, quality checks, and execution of the data processing pipeline. These combined efforts produced a dataset that is standardized across sites with respect to data products, distribution formatting, and data quality. The processing pipeline uses well-established, published methods, with new code implemented for this release as well as code adapted from community implementations. The main products of this pipeline are: (1) thorough data quality control checks; (2) calculation of a range of friction velocity thresholds to filter low-turbulence periods, allowing an estimate of the uncertainty from this filtering along with the random uncertainty; (3) gap-filling of meteorological and flux measurements, including the use of a downscaled reanalysis data product to fill long gaps in meteorological variables; (4) partitioning of CO₂ fluxes into respiration and photosynthesis (gross primary productivity) components using three distinct methods; and (5) calculation of energy-balance-closure correction factors for the sensible and latent heat fluxes.
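To make step (4) concrete, the sketch below illustrates the general idea behind nighttime-based flux partitioning: at night photosynthesis is assumed to be zero, so nighttime net ecosystem exchange (NEE) equals ecosystem respiration (RECO); a temperature-response model fitted to nighttime data is then extrapolated to all periods, and GPP is obtained as the difference. This is a minimal illustration using a simple Q10 response, not the actual ONEFlux implementation, which uses more elaborate models (e.g. Lloyd–Taylor) fitted in moving windows; the function name and parameters here are hypothetical.

```python
import numpy as np

def partition_nee(nee, temp, is_night, q10=2.0, t_ref=15.0):
    """Sketch of nighttime-based partitioning of NEE into RECO and GPP.

    Assumes GPP = 0 at night, so nighttime NEE = RECO = r_base * f(T)
    with a Q10 temperature response f(T). r_base is fitted by least
    squares to nighttime data, RECO is extrapolated to all periods,
    and GPP = RECO - NEE (sign convention: positive GPP = uptake).
    """
    nee = np.asarray(nee, dtype=float)
    temp = np.asarray(temp, dtype=float)
    is_night = np.asarray(is_night, dtype=bool)

    # Q10 temperature response: f(T) = q10 ** ((T - t_ref) / 10)
    f = q10 ** ((temp - t_ref) / 10.0)

    # Least-squares fit of the base respiration rate to nighttime NEE.
    r_base = np.sum(nee[is_night] * f[is_night]) / np.sum(f[is_night] ** 2)

    reco = r_base * f      # respiration extrapolated to every time step
    gpp = reco - nee       # photosynthetic uptake
    return reco, gpp
```

In the real pipeline this estimation is repeated for many parameter sets and time windows, which is one source of the uncertainty estimates distributed with the dataset.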