Ever since the 1980s, hardly a week has passed without some new report warning about the apocalyptic effects of man-made climate change. However, these reports routinely exclude data that might cast doubt on their conclusions.
The hypothesis of man-made climate change states that when human beings burn fossil fuels, the resulting carbon dioxide insulates the planet to an unnatural extent, causing the planet to warm and the polar ice caps to melt. While few people deny that the planet seems to be about a degree and a half warmer than it was a century ago, the link between rising temperatures and rising carbon dioxide levels is not nearly as obvious as the climate doomsayers would have you believe.
According to NASA Earth Observatory, Earth’s average temperature increased by almost one degree Fahrenheit between 1918 and 1941. Over the same period, carbon dioxide levels increased 5 percent. Between 1941 and 1978, carbon dioxide levels increased another 5 percent, while temperatures decreased by about half a degree.
The media at the time were raging about global cooling, only to see temperatures increase another degree between 1978 and 1998. Since 1998, carbon dioxide levels have risen another 15 percent, but global temperatures have remained stable.
These facts cast serious doubt on the theory that burning fossil fuels is the primary cause of the planet’s one-degree temperature rise over the past century. Government-funded scientists are ignoring evidence that global warming does not correlate with carbon dioxide emissions, and that no warming has been recorded for the past nineteen years. They are also ignoring evidence that Earth has experienced higher temperatures than at present, long before man-made sources of carbon emissions even existed.
According to the Intergovernmental Panel on Climate Change, natural factors play an insignificant role in climate change compared with man-made factors. But to believe this, you have to ignore reams of data from ice cores, dripstones, tree rings, and ocean sediment cores. These vast storehouses of scientific evidence all indicate that Earth naturally alternates between warm and cold phases in roughly 1,000-year cycles. The warm phase prior to the one we are now in occurred some 800 years before the start of the Industrial Revolution and its accompanying rise in carbon dioxide.
In fact, the evidence demonstrates very clearly that carbon dioxide emissions, while certainly contributing to climate change, are insignificant compared with the effects of the sun. Solar activity increased from 1900 to 1940 and decreased from 1940 to 1970, a pattern that tracks global temperatures far more closely than carbon dioxide levels do.
Our rabid climate scientists (and we use the term “scientists” very hesitantly) still find it politically expedient to deny, bury, and discredit the research that proves them wrong. The climate change alarmists systematically ignore the evidence of naturally occurring, sun-driven climate change. Manipulating data to push a political agenda is not science, and it threatens both the economy and the environment. Attempting to shut down the fossil fuel industries over a climate crisis that does not exist is not just misguided; it is foolish, stupid, and dishonest. Let’s find good alternative energy sources instead of spending billions upon billions of dollars subsidizing unprofitable industries and supporting the UN’s panicky climate fund efforts.
[From an article by Andrew Miller in The Philadelphia Trumpet]
As always, posted for your edification and enlightenment by
NORM ‘n’ AL, Minneapolis