Abdul Hamid provides an insight into the historical discovery of anthropogenic climate change.
Photo Credit: Science Photo Library
In 1896, the Swedish physicist Svante Arrhenius proposed a radical idea. By calculating the effect of increasing atmospheric carbon dioxide on the Earth's temperature, Arrhenius concluded that human activity alone could cause significant global warming.
More than a century later, the world still isn't convinced. The people living in Arrhenius' time could be forgiven for doubting his findings; the temperature records of the day were not as alarming as they are today, and many could not accept that the seemingly insignificant effect of human activity could cause noticeable environmental damage. But despite the mountain of indisputable data gathered over the intervening decades, much of the world has yet to acknowledge the undeniable crisis.
The physics behind Arrhenius’ discovery
Arrhenius came to his conclusion by drawing on infrared spectroscopy, a relatively new technique at the time. He found a link between the concentration of CO2 in the atmosphere and the amount of infrared radiation it absorbed. It was also known at the time that the oceans held a vast store of carbon dioxide, locked in from years of absorption.
The radiation emitted by the Sun is partly reflected back into space and partly absorbed by the Earth's surface, which re-radiates it as infrared. Arrhenius realised that an increase in CO2 concentrations would cause more of this infrared radiation to remain trapped in the atmosphere, thereby increasing global temperatures. The effect came to be known as the ‘greenhouse effect’.
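To put a rough figure on the idea, a commonly used modern approximation (not Arrhenius' original hand calculation) relates the extra energy trapped near the surface to the CO2 concentration through a logarithm:

$$\Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}$$

where C is the present-day CO2 concentration and C0 a pre-industrial reference value. Doubling CO2 therefore adds roughly 5.35 × ln 2 ≈ 3.7 watts of trapped energy over every square metre of the planet, which climate models translate into a few degrees of surface warming, the same order of magnitude that Arrhenius estimated over a century ago.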
Evidence of this greenhouse effect can be found on other planets. Venus is an extreme example of a runaway greenhouse system: its atmosphere has become so thick with CO2 that the surface is a hellish environment, with temperatures exceeding 400°C. As a result, Venus claims the title of hottest planetary surface in the solar system, despite Mercury being the planet closest to the Sun.
The mounting evidence
A few decades after Svante Arrhenius’ death, further developments in scientific instrumentation meant that atmospheric CO2 concentrations could be measured directly. During the 1960s, however, temperature records for the period between 1940 and 1970 showed signs that global temperatures were decreasing, conflicting with Arrhenius’ argument. Some scientists began to worry that a new mini ice age was about to strike. These records, however, failed to portray the wider picture. The Earth experiences natural warm-cold cycles, each lasting anywhere from a few hundred to many thousands of years. These are due to many factors, a major one being slow fluctuations in the Earth’s tilt and orbit around the Sun. It wasn’t until the 1980s that scientists were in for a worrying shock…
In the 1980s, global temperatures began to rise rapidly. Years later, the rise showed no sign of slowing down, with temperatures increasing faster than ever recorded. It wasn’t until we entered the 21st century that the world began to listen. Temperature records continue to be broken, and the problem has reached such an alarming point that nearly every new year ranks among the hottest ever measured.
It is interesting to see that more than 100 years ago, with limited scientific data, a man was able to predict the global consequences of anthropogenic activity. Yet, even with huge technological advancement and ever-increasing accuracy in the present, many are too stubborn to accept an issue that is indisputable.
ABOUT THE AUTHOR
Abdul Hamid is a 3rd year Physics student and is interested in exploring a career in the nuclear energy industry, a source of energy that could help battle the effects of fossil fuel emissions. Abdul is also interested in the promising research being undertaken to perfect the nuclear fusion process, a possible game-changer in the future of energy.