Earlier this summer, climate change was once again thrust into the global limelight, this time by an unlikely source – the pope. In his 184-page On Care for Our Common Home, Pope Francis warned:
“A very solid scientific consensus indicates that we are presently witnessing a disturbing warming of the climatic system. […] A number of scientific studies indicate that most global warming in recent decades is due to the great concentration of greenhouse gases (carbon dioxide, methane, nitrogen oxides and others) released mainly as a result of human activity.”
By calling on his followers to take action, Pope Francis’ encyclical may represent a turning point in worldwide attitudes toward global warming. And with the UN Climate Change Conference coming to Paris this November, greenhouse gas regulation could soon become widespread.
The next question becomes: how will any regulations aiming to limit emissions be evaluated and enforced? Everyone wants reassurance that their neighboring countries are playing by the rules, but not all UN signatory nations can afford to provide detailed emissions reports using the conventional techniques.
Orbiting satellites can detect regional CO2 concentrations from space, but lack the resolution to capture finer, local patterns. Ground-level CO2 monitors can quantify emissions at a specific location, but the favored technology is still too expensive for common use. And yet, effective carbon regulation needs to take place on fine spatial and temporal scales – right down to the individual freeways and neighborhoods hiding in these methods’ blind spots.
To document the complex emission patterns from urban CO2 sources, it would take a veritable army of sensors, packed into the atmospheric nooks and crannies of a city.
That’s where we come in.
I work on a project called BEACO₂N, which stands for the Berkeley Atmospheric CO2 Observation Network. BEACO₂N is a web of about 30 low-cost CO2-sensing monitors, or “nodes,” installed at two-kilometer (1.24-mile) intervals across the city of Oakland, California.
At this scale, it is the densest collection of CO2 monitoring instruments in the world.
The nodes utilize popular open-source microcontrollers and computers that transmit their measurements wirelessly using a simple smartphone data plan. The data are then made publicly available in near-real time.
The sensors themselves measure changes in infrared light intensity to calculate CO2 concentrations in the air. Carbon dioxide absorbs infrared radiation, so more CO2 molecules floating across the light beam mean that less light reaches the detector on the other side. It’s the same operating principle behind more expensive CO2 sensors, but using lower-grade lamps and detectors. This small compromise in accuracy means an entire BEACO₂N node can be assembled for about US$5,500. That’s 10–20 times less than conventional monitors, and cheap enough to be bought in bulk. And in this big-data era, providing more measurements at lower cost has broad appeal.
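The math behind this measurement is the Beer-Lambert law, which relates the fraction of light absorbed to the concentration of the absorbing gas. Here is a minimal sketch of the inversion; the absorptivity and path-length values are illustrative placeholders (real sensors are calibrated empirically against known gas mixtures), and the function name is my own:

```python
import math

def co2_ppm_from_intensity(i_measured, i_reference, absorptivity=0.35, path_length_m=0.1):
    """Invert the Beer-Lambert law, I = I0 * exp(-a * c * L), to recover c.

    i_measured:   IR intensity at the detector with sample air present
    i_reference:  intensity with no absorbing gas (zero-CO2 baseline)
    absorptivity, path_length_m: illustrative constants, not real
                  instrument parameters.
    Returns the CO2 mole fraction in parts per million.
    """
    absorbance = -math.log(i_measured / i_reference)      # a * c * L
    mole_fraction = absorbance / (absorptivity * path_length_m)
    return mole_fraction * 1e6                            # fraction -> ppm
```

The key point the sketch captures: concentration depends only on the *ratio* of measured to reference intensity, which is why a dimmer, cheaper lamp can still yield a usable reading.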
By blanketing the city with a tight “grid” of sensors, potential CO2 sources can be identified and quantified simply by comparing signals from adjacent nodes. A higher CO2 level measured at the downwind node relative to its upwind neighbor indicates the presence of a CO2 emitter in between the two. This simple approach has already been used to assess the impact of the 2013 Labor Day Weekend bridge closures on Bay Area traffic emissions.
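In code, that comparison is just a difference between neighboring nodes, thresholded against the sensors’ detection limit. A hedged sketch (the function and threshold handling are my own illustration, not BEACO₂N’s analysis pipeline):

```python
def detect_source(upwind_ppm, downwind_ppm, noise_floor_ppm=8.0):
    """Flag a probable CO2 source between two neighboring nodes.

    A downwind reading exceeding its upwind neighbor by more than the
    single-node detection limit (~8 ppm, per the text) suggests an
    emitter in between. Returns the excess in ppm, or None.
    """
    excess = downwind_ppm - upwind_ppm
    return excess if excess > noise_floor_ppm else None

# e.g. a freeway sitting between two nodes:
detect_source(upwind_ppm=410.0, downwind_ppm=425.0)  # -> 15.0
```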
While individual nodes can detect changes in CO2 as small as eight molecules out of a million, when used together, they are even more sensitive. Collectively, BEACO₂N’s high-resolution data set can drive complex atmospheric models that zero in on still subtler emission phenomena, like the CO2 wafting from a congested freeway during rush hour or a drafty home in wintertime.
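A back-of-the-envelope way to see why the network outperforms any single node: if the nodes’ random errors were independent, averaging N of them would shrink the noise by a factor of √N. This is an idealized upper bound (real sensor errors are partly correlated), but it illustrates the scale of the gain:

```python
import math

node_precision_ppm = 8.0   # single-node detection limit, from the text
n_nodes = 30               # approximate network size, from the text

# Idealized assumption: independent random errors average down as sqrt(N).
network_precision_ppm = node_precision_ppm / math.sqrt(n_nodes)
print(round(network_precision_ppm, 1))  # -> 1.5
```

So under this simplification, the full network could resolve changes of roughly 1.5 ppm, several times finer than any one node.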
Such analyses will give local lawmakers the facts and figures to critically assess individual line items on California’s 73-part climate action plan. Communities can then focus resources on the most effective emissions-reducing initiatives.
Not just CO2
These days, BEACO₂N also serves as a pilot platform for testing other low-cost sensing technologies.
While CO2 generally isn’t harmful to breathe, it comes with co-emitted pollutants, like carbon monoxide or soot particles, that cause asthma and cardiovascular problems. As equipment for measuring these other species comes to market, BEACO₂N could provide citizens with their own highly localized, real-time air quality data as well.
The first generation of measurements is already being analyzed by research scientists and high schoolers alike, and the data set continues to grow. Since its inception in 2012, the active network has expanded into San Francisco and Sonoma County, with interested parties as far away as Sydney, Australia.
That’s not to say that the BEACO₂N approach is without its own unique challenges. With cheaper equipment, you get what you pay for – namely, drift in the accuracy of your data over time. Dust collects on lenses, mirrors get jostled out of alignment, and these small changes can gradually invalidate the original factory calibration.
High-end instruments combat this drift with bulky tanks of “reference” gases or pricey proprietary parts, both of which are incompatible with doing science on a budget. It can be just as expensive and time-consuming, however, to repeatedly cycle cheaper instruments back to the lab for recalibration.
To improve accuracy, the BEACO₂N team is currently investigating ways to remotely cross-reference veteran nodes against more recently calibrated ones. BEACO₂N may one day be able to use natural phenomena, such as large gusts of wind, to synchronize the signals across long distances.
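One simple version of that idea: during a strong, sustained gust, the whole network should be sampling roughly the same well-mixed background air, so each node’s deviation from the network median approximates its calibration drift. The sketch below is an illustrative simplification of that reasoning, not BEACO₂N’s published method:

```python
from statistics import median

def estimate_offsets(readings_ppm):
    """Estimate per-node calibration offsets during a well-mixed event.

    Assumes (simplification) that every node sees the same background
    concentration, so deviations from the network median are drift.
    readings_ppm: dict mapping node name -> CO2 reading in ppm.
    """
    baseline = median(readings_ppm.values())
    return {node: round(ppm - baseline, 1) for node, ppm in readings_ppm.items()}

# A drifted node stands out against its freshly calibrated neighbors:
estimate_offsets({"node_a": 402.0, "node_b": 401.5, "node_c": 409.0})
# -> {'node_a': 0.0, 'node_b': -0.5, 'node_c': 7.0}
```

The appeal of this approach is that it needs no reference gas tanks and no trips back to the lab, only weather events that the network observes anyway.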
We hope that by finding ways to do more with less, BEACO₂N can pave the way for similar strategies to be adopted by developing nations, or even by curious citizen groups here in the US.
Data produced affordably and accessibly means more knowledge for more people. And when it comes to managing emission reductions, knowledge is power.