One episode of A History of Ancient Britain, the recent BBC series, is about the Ice Age. If you know there was an Ice Age, you should grasp that the Earth’s temperature varies a lot for reasons that have nothing to do with human activity. To measure the effect of recent human activity on global temperatures, you need to know what the Earth’s temperature would have been in the absence of human activity. Then you find the effect of humans by subtraction (actual temperature minus predicted temperature assuming no human activity).
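To make the subtraction concrete, here is a minimal sketch in Python. The numbers are made up for illustration; `observed_anomaly` and `predicted_no_human` are hypothetical names, not real measurements or the output of any particular model.

```python
# Hypothetical numbers, for illustration only.
observed_anomaly = 0.9      # deg C: what was actually measured (made up)
predicted_no_human = 0.3    # deg C: what a no-human model predicts (made up)

# The claimed human effect is whatever is left over after the subtraction.
human_effect = observed_anomaly - predicted_no_human
print(f"estimated human effect: {human_effect:.1f} deg C")

# The estimate is only as good as predicted_no_human: if the counterfactual
# model is off by 0.5 deg C, the attributed human effect is off by 0.5 deg C too.
```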
That’s hard to do. Because the non-human effects are so large, you need a really accurate model to “control” for them. No such model is available. No current climate model has been shown to accurately predict global temperatures; the IPCC chapter called “Climate Models and Their Evaluation” (informal title: “Why You Should Believe Them”) is the most humorous evidence of that. Without accurate predictions, there is no good reason to trust them. (That the models can fit past data means little, because they have many adjustable parameters. “With four parameters I can fit an elephant,” as John von Neumann is said to have put it.) The case against the view that humans have dangerously warmed the climate (sometimes called AGW, anthropogenic global warming) is that simple.
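The point about adjustable parameters can be shown with a toy curve-fitting sketch (Python, made-up numbers, nothing to do with any actual climate model): give a model enough free coefficients and it will match the record it was tuned to very closely, yet that close fit says nothing about how it behaves outside that record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up "past" record: a mild trend plus noise (not real data).
x_past = np.linspace(0.0, 1.0, 30)
y_past = 0.3 * x_past + rng.normal(0.0, 0.05, size=x_past.size)

# A model with many adjustable parameters (a degree-9 polynomial, ten
# coefficients) can be tuned to match the past record very closely...
coeffs = np.polyfit(x_past, y_past, deg=9)
in_sample_error = np.abs(np.polyval(coeffs, x_past) - y_past).mean()

# ...but that says little about what it does beyond the data it was tuned to.
x_future = np.linspace(1.05, 1.3, 10)
y_future = 0.3 * x_future
out_of_sample_error = np.abs(np.polyval(coeffs, x_future) - y_future).mean()

print(f"error on the record it was tuned to: {in_sample_error:.3f}")
print(f"error beyond that record:            {out_of_sample_error:.3f}")
```

The in-sample error is tiny; the error outside the fitted range is typically far larger, which is the elephant point: fitting the past with enough free parameters is not evidence of predictive skill.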
Because it is so simple, “the other side” consists of explaining why 2+2 really does equal 20, or whatever. Sure, many people say it; so what? When I was an undergrad, I gave a talk called “The Scientific ____”. I said that using the term scientific without explaining what it meant was a sign of incompetence, and that a reader could safely stop reading right there. That test isn’t terribly helpful, because few people use scientific that way. My grown-up version of the test is that when someone claims AGW is true, I stop taking them seriously as a thinker. I don’t mean they can’t do good work; Bill McKibben is an excellent journalist, for example. Just not original thought.