Interpreting Macroscopic Charcoal Data

Week 4: Macroscopic Charcoal Data Analysis
After weeks of treating sediment and using a stereo microscope to count charcoal particles, we have sufficient raw data to process and analyse!

Macroscopic charcoal consists of the remains of burned wood and vegetation fragments larger than 150 µm, often visible to the naked eye. As mentioned in our previous post, macroscopic sedimentary charcoal is an ideal proxy for aridity. It is produced during paleo-wildfires, transported by water or wind, and then preserved as fossil charcoal in paleo-lakes like Lake Ayauchi. See the illustration below for more on how sedimentary charcoal is deposited into lakes.
An illustration showing the dispersion of charcoal particles from a wildfire into a nearby lake
Image source: National Centers for Environmental Information, Fire History Datasets
Once we have raw charcoal count data, we use a geo-statistical program called CharAnalysis to process and transform it so that it is easier to analyse. Data preparation is necessary because paleoclimate conditions result from a combination of dynamic natural processes, so there may not be a simple linear relationship between the number of charcoal particles and paleo-fire incidents. Geo-statistics allows us to remove statistical noise and ultimately reduce uncertainty.

i. What is statistical noise?
Statistical noise can arise from human error or poor instrument calibration during data collection. In the case of charcoal analysis, careless wet sieving can wash out most of the charcoal or break particles into multiple fragments, inflating the count. Natural processes, such as variable sediment accumulation rates, can also introduce noise and obscure trends or signals. When charcoal is used as a paleo-proxy for fire incidence, noise may lead one to inaccurately conclude that there were fire episodes when the charcoal was in fact transported from regional fire sources. For Lake Ayauchi, this means the charcoal particle counts we recorded form an inherently noisy data set: charcoal may have originated from local as well as regional fires before washing into the Ayauchi basin. To separate the noise generated by regional fires from the signal of local fire, which is the data we are looking for, we treat the raw counts with various statistical techniques. CharAnalysis implements multiple algorithms to aid this statistical treatment.
The blue curve shows annual water depth, with high-frequency variations; it is a noisy data set compared to the red 30-year climatic average.
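To make the smoothing idea concrete, here is a minimal sketch with entirely synthetic numbers (not Ayauchi data): a slow "climatic" signal is buried in annual noise, and a simple 30-year moving average, analogous to the red curve above, recovers the low-frequency trend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical noisy series: a slow trend plus random annual noise.
years = np.arange(300)
trend = 5 + 2 * np.sin(2 * np.pi * years / 150)        # slow signal
noisy = trend + rng.normal(0, 1.5, size=years.size)    # noisy annual values

# A simple centered 30-year moving average suppresses the high-frequency noise.
window = 30
smooth = np.convolve(noisy, np.ones(window) / window, mode="same")

# The smoothed series varies far less than the raw one.
print(round(float(np.std(noisy)), 2), round(float(np.std(smooth)), 2))
```

This is only one of many possible filters; the point is that averaging over a window trades temporal resolution for a clearer view of the underlying signal.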
ii. The Magic of CharAnalysis
CharAnalysis is a peak-detection tool that applies a series of mathematical techniques to reduce noise in the data. It uses a threshold value t to separate signal from statistical noise: values that exceed the threshold, C_peak > t, are identified as peaks and interpreted as fire episodes. These local peak events, C_peak, are shown in the diagrams as red plus signs (+), while statistically insignificant values are shown as grey dots. The model decomposes the input series into a low-frequency component, C_background, and a high-frequency component, C_peak; the graphs below, created with CharAnalysis, show both. C_peak and C_background can also be read as spatial variables, reflecting local and regional charcoal distribution respectively. Both values are expressed as charcoal accumulation rates (CHAR) in units of particles cm⁻² yr⁻¹. The software estimates these rates over long periods of time from the raw charcoal particle counts and the age model originally inputted into it.
The figure above, produced with CharAnalysis, shows the raw charcoal count data alongside the interpolated series, C_interpolated.
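The decompose-then-threshold idea can be sketched in Python with synthetic data. This is a simplification of what CharAnalysis does: the moving-median background, 21-sample window, and 95th-percentile threshold below are illustrative choices of mine, not the software's actual algorithm, and the three injected "fire" spikes are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical CHAR series (particles per cm^2 per yr): gamma-distributed
# background noise plus a few injected "local fire" spikes.
n = 200
char = rng.gamma(shape=2.0, scale=0.5, size=n)
char[[40, 95, 160]] += 6.0  # synthetic fire episodes

# C_background: low-frequency component, here a 21-sample moving median.
window = 21
pad = window // 2
padded = np.pad(char, pad, mode="edge")
c_background = np.array([np.median(padded[i:i + window]) for i in range(n)])

# C_peak: high-frequency residual after removing the background.
c_peak = char - c_background

# Threshold t: here simply the 95th percentile of C_peak. (CharAnalysis
# instead models the noise distribution to pick t; this is a stand-in.)
t = np.percentile(c_peak, 95)
fire_episodes = np.flatnonzero(c_peak > t)
print(fire_episodes)
```

Note that a percentile threshold flags a fixed fraction of samples, so a few noise values are flagged alongside the true spikes; choosing t well is exactly the problem the software's noise-modelling step addresses.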
But how can we be sure our data is accurate?
We can never have a perfect data set, because natural processes are incredibly dynamic and we can only work with a limited sample size. However, by removing as much noise as possible and using a multi-proxy approach, we can isolate statistically significant trends and draw conclusions from them. A multi-proxy approach combines elemental analysis, Loss On Ignition (LOI) for total organic content, magnetic susceptibility, and charcoal data. Satellite records also provide detailed chronological information on global environmental conditions, but they only extend back about 30 years. Correlation matrices and consistent trends help to support our inferred local fire episodes and tendencies. Causation is hard to establish, but this data can isolate specific time periods of heightened fire activity, which can then be compared with global precipitation and temperature records to infer paleo-droughts.
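As an illustration of the correlation-matrix step, here is how proxies measured at the same core depths can be compared with pandas. The numbers below are synthetic stand-ins, not real Ayauchi measurements, and the built-in positive link between CHAR and organic content is an assumption made purely for the demo.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Hypothetical multi-proxy table: one row per core depth.
n = 50
charcoal = rng.gamma(2.0, 0.5, size=n)
proxies = pd.DataFrame({
    "CHAR": charcoal,
    # LOI organic content deliberately constructed to co-vary with CHAR.
    "LOI_organic_pct": 20 + 5 * charcoal + rng.normal(0, 2, size=n),
    # Magnetic susceptibility left independent of the others.
    "mag_susceptibility": rng.normal(10, 3, size=n),
})

# The correlation matrix shows which proxies co-vary; a strong positive
# CHAR-LOI correlation would support, but not prove, a shared driver.
print(proxies.corr().round(2))
```

In practice the interesting question is whether independently measured proxies agree; correlated proxies strengthen an interpretation, while a lone proxy signal warrants caution.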

In the coming weeks, I will explore the utility of elemental analysis in paleoclimate reconstruction, particularly for the Amazon and Lake Ayauchi.

I hope you keep reading.
Kopo.
