I ended last week’s post on the topic of Confirmation Bias* with these questions:

After all, who among us wants to be wrong about important matters on which we’ve staked no small amount of credibility?

But what if being wrong about those important matters winds up being the least of our problems?

It’s human nature to seek out information, evidence, and opinions that support positions we’ve taken on a wide variety of topics. Contentious political and social issues provide glaring examples of this on both the left and right sides of the various debates. Climate change is certainly one of the more noteworthy subjects.

So is peak oil. I’m of the clear opinion that our future energy needs are not going to be based on an endless, forever abundant, affordable, easily accessible fossil fuel supply. I’m not alone, of course. There is an equally vocal and more prominent contingent on the other side of this debate, claiming that we peak oil proponents are nothing more than doom-and-gloom messengers whose predictions have been consistently wrong.

That’s the starting point.

The conflicts arise in part because of what one relies upon to support his or her position. In some instances, there are actual facts in dispute [some shaded to suit one’s inclinations, of course]. But in too many other instances—peak oil and climate change among them—one side shows a clear tendency not only to restrict the facts it relies upon to a select and duly-massaged few, but also to completely ignore a more substantial and substantive body of evidence.

Offering statements with an assortment of qualifiers [“if”; “possible”; “could”; “potential”; etc.] may offer those proponents some assurance that they are essentially correct. But to ignore an entire body of evidence contradicting—or at least casting some reasonable doubt on—their staked positions calls into question their motivations for disseminating partial truths.