Gaaahhhhhhhh. Note to self: avoid ExxonMobil at all costs.
[Scientist and intern] Knisely projected [in 1979] that unless fossil fuel use was constrained, there would be “noticeable temperature changes” and 400 parts per million of carbon dioxide (CO2) in the air by 2010, up from about 280 ppm before the Industrial Revolution. The summer intern’s predictions turned out to be very close to the mark.
[…]
The report, which circulated within the company through the early 1980s, reflected Exxon’s growing need to understand when the climate implications of increased CO2 emissions would begin to spur policy changes.
So Exxon (now ExxonMobil) shelved an ambitious but costly program that sampled carbon dioxide in the oceans—the centerpiece of its climate research in the 1970s—as it created its own computerized climate models. The models aimed to simulate how the planet’s climate system would react to rising CO2 levels, relying on a combination of mathematics, physics, and atmospheric science.
Through much of the 1980s, Exxon researchers worked alongside university and government scientists to generate objective climate models that yielded papers published in peer-reviewed journals. Their work confirmed the emerging scientific consensus on global warming’s risks.
Yet starting in 1989, Exxon leaders went down a different road. They repeatedly argued that the uncertainty inherent in computer models made them useless for important policy decisions. Even as the models grew more powerful and reliable, Exxon publicly derided the type of work its own scientists had done. The company continued its involvement with climate research, but its reputation for objectivity began to erode as it campaigned internationally to cast doubt on the science.