Positivism, Popper, and Climate Change

View 709 Monday, January 16, 2012


Huntsman withdraws and endorses Romney. The only surprise here is that it took so long after New Hampshire. That is probably because it took a few days for Huntsman’s father to decide he didn’t want to keep paying to keep the campaign going. The Huntsman Corporation is huge, and Jon Huntsman Jr.’s father is worth at least a billion; he could easily afford to finance more campaigning, but it is pretty clear that Huntsman could not win the nomination. I’m sure Huntsman Sr. took one last poll, confirmed that, and declined to fund the campaign any further.

Candidate Huntsman was CEO of the Huntsman Corporation before becoming a successful governor of Utah, so he has both private and public executive experience. He has connections to both the conservative and the establishment wings of the Republican Party, and it is worth noting that Huntsman was one of the few Reagan White House staffers promoted (to Assistant Secretary of Commerce) by George H. W. Bush. Bush did not much care for Reagan or Reagan’s people and systematically eliminated them from both the White House and other Executive Department positions. Bush I later appointed Huntsman Ambassador to Singapore; he was the youngest US ambassador in about a century.

He was very effective as Ambassador to Singapore and later to China, and he is an obvious candidate for Secretary of State no matter who wins the nomination. He is not so enamored of the country club Republicans as to be repugnant to the conservatives, and his diplomatic skills are great.

Huntsman is fairly representative of the younger generation of what is generally called the Establishment, holding positions considerably more conservative than the self-styled Liberal Republicans of Rockefeller’s day. He is not notably an opponent of the notion of “Big Government Conservatism.” We probably have not heard the last of him.

I note that Newt’s latest ads are back on the positive track, and I haven’t heard the highly negative anti-Romney ads lately; but then I don’t get local South Carolina radio and TV programs.


The Climate Change debate has opened again.

"New molecule could help cool planet" actual conclusion.

The important part of this whole article is contained here: "The molecules detected by the research team occur naturally in the presence of alkenes, chemical compounds which are mostly released by plants.

"Plants will release these compounds, make the biradicals and end up making sulphuric acid, so in effect the ecosystem can negate the warming effect by producing these cooling aerosols," Percival said."

Conclusion:

When there is more CO2, plants grow larger and faster (plant growth is up at least 7% right now worldwide). When there are more and bigger plants, there are more of these cooling molecules. Result: whatever warmth CO2 adds is offset by these molecules.

If CO2 produces more warmth, plants can grow at higher latitudes. Result: more total plants, more cooling molecules; see above.

If CO2 produces more warmth, there will be more evaporation, more rainfall, fewer deserts, and more plants; see above.

Final conclusion: it appears that the geoengineering we need to prevent global warming is already present. This rather explains how this planet has managed to maintain a relatively even temperature for so long, despite such things as the faint young sun paradox. The planet has many mechanisms that do this. There are the chemicals given off by plankton (dimethylsulfide, or DMS), which are stimulated by warmth and aid in cloud formation and thus shade, cooling things down again. There is the way warmth creates evaporation, which creates clouds that move heat from down here to up there where it can be radiated away, while they drop cooling rain, fan us with wind, and act as a sunshade (a band of thunderstorms constantly circles the equator doing just this right now). There are probably others. This can explain why the global warming prophesied by the computer models has not occurred, and why the label attached to it has had to change, first to "climate change" and then to "climate disruption"; both of the latter suffer from the problem of explaining exactly how CO2 can do anything but produce warming. In a world already supplied with free biradicals, DMS, and sunshade clouds, we do not need to spend trillions to offset something that nature seems to have plenty of mechanisms to handle already.
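
The feedback argument above is easy to restate as arithmetic. Here is a minimal sketch in Python; every coefficient in it is invented purely for illustration, and the "model" is nothing more than the letter’s verbal argument put in toy form:

```python
# Toy restatement of the letter's feedback argument. Every coefficient
# here is invented for illustration; none are measured values.

def equilibrium_temp(co2_forcing, feedback_gain, steps=500):
    """Iterate a toy energy balance until it settles.

    co2_forcing warms; plants respond to warmth with extra growth,
    and that growth yields cooling aerosols scaled by feedback_gain.
    """
    temp = 0.0
    for _ in range(steps):
        plant_growth = 0.07 * (1.0 + temp)        # more warmth, more plants
        aerosol_cooling = feedback_gain * plant_growth
        temp += 0.1 * (co2_forcing - aerosol_cooling - temp)
    return temp

print("no plant feedback:   %+.2f C" % equilibrium_temp(1.0, feedback_gain=0.0))
print("with plant feedback: %+.2f C" % equilibrium_temp(1.0, feedback_gain=5.0))
```

With the feedback switched on, the same forcing settles at roughly half the warming; whether the real coefficients are anywhere near large enough is exactly the open question.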

In other words, chill out, literally.

Oh, and throw another log on the fire.

D

==

“Pollution-gobbling molecules in global warming SMACKDOWN”:

http://www.theregister.co.uk/2012/01/16/criegee_biradicals/print.html

I saw a story on this a few days ago but it didn’t register – the thrust of the earlier story was the potential use of Criegee biradicals as climate change agents. Now the importance becomes clear. These Criegee biradicals are another important element of our atmosphere that was not fully known or appreciated, and thus not part of climate models. I guess the bishops of AGW must recast their catechisms — er — models.

Ed

I have said repeatedly that the proper approach to the Climate Change crisis is not financially disastrous limits to technology and economic growth, but the development of engineering methods to enhance natural forcing mechanisms. Admittedly the existence of proven means of changing climate would bring about enormous political pressures: while it is likely that most of us would be better off in a world a bit warmer with a bit more CO2, there are also those who would prefer a dead halt and stability, and a few who would prefer a rollback to the climate of the 1940s. The politics would get fierce – but at least there would be something to debate.

What we have now is uncertainty.

And on that score, Mike Flynn, the best statistician I have met since Tukey, says:

Death by Data: The End of Science as We Once Knew It?

There is a disturbing article in The Atlantic dealing with the steadily increasing mountains of data, the ease of storing them, the expense of reviewing and editing them, the ease of sharing them, and so on.

"To Know, but Not Understand," by David Weinberger

http://m.theatlantic.com/technology/archive/2012/01/to-know-but-not-understand-david-weinberger-on-science-and-big-data/250820/

Summarizing briefly:

Henri Poincaré famously said that just as a house is not simply a pile of bricks, science is not simply a pile of facts. It is the construction of those facts that makes a science. No fact is self-explaining. It is only when facts are joined together in the light of a theory that they have any meaning. The problem today is that there are too damn many bricks.

Weinberger writes:

"For Sir Francis Bacon 400 years ago, for Darwin 150 years ago, for Bernard Forscher 50 years ago, the aim of science was to construct theories that are both supported by and explain the facts. Facts are about particular things, whereas knowledge (it was thought) should be of universals. [bf added]

"We therefore stared at tables of numbers until their simple patterns became obvious to us. Johannes Kepler examined the star charts carefully constructed by his boss, Tycho Brahe, until he realized in 1605 that if the planets orbit the Sun in ellipses rather than perfect circles, it all makes simple sense. Three hundred fifty years later, James Watson and Francis Crick stared at x-rays of DNA until they realized that if the molecule were a double helix, the data about the distances among its atoms made simple sense. With these discoveries, the data went from being confoundingly random to revealing an order that we understand: Oh, the orbits are elliptical! Oh, the molecule is a double helix!

A theory is a narrative that "makes sense" of the data. From the theory we can predict the data and deduce the mathematical laws that describe their regularities. The laws are the cement between the bottom layer of data and the capstone of theory. When the theory predicts thus-far-unknown data, we have the opportunity to confirm or falsify the theory. It’s all great fun.

Starting years ago, instrumentation in the factory began delivering continuous data on strip recorders and the like. This overturned the old spot-checking at discrete time points and resulted in a heap of data and what I called "paralysis of analysis." This has been happening in science, in spades, and folks don’t always realize that they are applying statistical methods that were developed for sparser data streams, where the challenge was to extract meaning from meager samples. A t-test is useless for two large data sets because for large enough values of n even a trivially small difference between them will register as statistically significant.
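
Flynn’s point about the t-test is easy to demonstrate. A minimal sketch in Python; the sample size and the tiny shift are made up for illustration:

```python
# With large enough n, a t-test flags even a trivially small
# difference as "significant." Sample size and shift are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000
a = rng.normal(loc=0.00, scale=1.0, size=n)
b = rng.normal(loc=0.01, scale=1.0, size=n)   # shift: one percent of one sd

t_stat, p_value = stats.ttest_ind(a, b)
print(f"p-value:         {p_value:.2g}")               # tiny: "significant"
print(f"mean difference: {b.mean() - a.mean():+.4f}")  # practically nothing
```

The p-value comes out "significant" at any conventional threshold, while the effect itself is far too small to matter; significance testing built for meager samples says nothing useful here.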

Weinberger tells of a program, Eureqa, which will jump into the mass of data and noodle around until it constructs equations that predict the outcomes with tolerable accuracy. It sounds like a combination of orthogonal factor analysis and stepwise regression on steroids. (I assume it pays attention to functional coupling, covariance, and variance inflation factors.) What comes out are equations that accurately produce the Ys, but whose factors may not correspond to any physical quantity. The result is equations that work, but the researcher does not understand what they mean.
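
Eureqa itself is proprietary, but the flavor Flynn describes, greedy stepwise selection over a library of nonlinear terms, can be sketched in a few lines. The "hidden law," the term library, and all constants below are invented for illustration:

```python
# Toy version of the idea Flynn describes: stepwise regression over a
# library of nonlinear candidate terms. Everything here is invented
# for illustration; this is not Eureqa's actual algorithm.
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(0.1, 3.0, 500)
x2 = rng.uniform(0.1, 3.0, 500)
y = 3.0 * x1 * x2 + rng.normal(0.0, 0.1, 500)    # hidden law plus noise

library = {
    "x1": x1, "x2": x2, "x1*x2": x1 * x2, "x1**2": x1 ** 2,
    "sin(x1)": np.sin(x1), "exp(x2)": np.exp(x2), "log(x1)": np.log(x1),
}

def fit(names):
    """Least-squares fit of y on the named terms; returns (mse, coefs)."""
    cols = np.column_stack([library[n] for n in names])
    coefs, *_ = np.linalg.lstsq(cols, y, rcond=None)
    return np.mean((y - cols @ coefs) ** 2), coefs

chosen = []
for _ in range(3):                               # greedily add three terms
    best = min((n for n in library if n not in chosen),
               key=lambda n: fit(chosen + [n])[0])
    chosen.append(best)

mse, coefs = fit(chosen)
print("fitted:", " + ".join(f"{c:.2f}*{n}" for c, n in zip(coefs, chosen)),
      f"(mse {mse:.4f})")
```

With a well-chosen library the machine may stumble on the true term; with a richer library and messier data it will just as happily return an accurate pile of sines and exponentials that predicts the Ys while meaning nothing physically, which is Flynn’s point.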

One is reminded of Billy Ockham and his razor. He said we should keep the number of terms in our models as small as needed for them to work, because we would not otherwise understand the model. The real world, he added, could be as complex as God wished. Weinberger seems to be getting at the same issue. The modern way of science, which ran from Bacon and Descartes to our own time, may have to give way to some other way of knowing, just as the medieval way of science gave way to the modern. Weinberger calls that a different way of knowing things; but I am inclined to go with his title and say it replaces understanding with simple knowing. I did a typically discursive blog post on this at http://tofspot.blogspot.com/2012/01/autumn-of-modern-science.html

Now, if the factors churned up by Eureqa-like programs out of brickyards full of data are not explicable as physical entities, we would have to say that the important factors are "hidden." The researcher knows what his inputs need to be to get the outputs, but he doesn’t know how he gets them. "Hidden" is what "occult" means, and the use of occult powers of nature to manipulate nature was called "magic."

So it may be that Arthur C. Clarke was more right than he knew when he said that a sufficiently advanced science is indistinguishable from magic.

Mike Flynn

When I studied Philosophy of Science under Gustav Bergmann at the University of Iowa in the 1950s, I concluded that the scientific method was essential to knowing anything, and in keeping with young people of that time I thought that the relentless application of the scientific method would solve all problems. We knew how, now; all we needed was to learn the methods and apply them. Bergmann was a member of the Vienna Circle and thus an extreme positivist, and at the time I found that very attractive. I later learned to modify my logical positivist views to something closer to Karl Popper’s, but that’s a subject for a much longer essay. The point is that we were certain that there was nothing we could not understand by the relentless application of logic.

At the same time, academic psychology was divided between the behaviorists, who debated the distinction between hypothetical constructs and intervening variables and used pseudo-mathematical formulas with unknowable terms in them to appear “scientific,” and the Freudians and their orthodox and heretical descendants, who used case histories rather than data and postulated the Ego, the Id, and other such concepts. (One late descendant of Freud, through Jung, is L. Ron Hubbard with his Dianetics.)

Most of that nonsense is gone from academic science now, although it remains as “theory” in Modern Languages Departments and in some of the Voodoo Science departments; but the optimism of positivism as modified by Popper persists.

Now we have to wonder if we do have the tools we need to understand the data we have. Most Climate Scientists don’t really know how their models work; they postulate various feedback loops, but there are enough adjustable variables in there (give a physicist five manipulable constants and a couple of functions and he can explain anything) that can be tuned to – well, to what? What no model has yet done is start with the initial conditions of some distant time in the past – at least fifty years back – and run forward to generate the actual climate history since then.
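
The test described here is usually called hindcasting, and its logic fits in a few lines. A schematic in Python; the "model" and the "observed" record below are synthetic stand-ins, not real climate code or real data:

```python
# Schematic of the hindcast test: start a model at past initial
# conditions, run it forward, and compare with the observed record.
# Both the "model" and the "observations" here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1960, 2011)

# Invented "observed" temperature-anomaly record, not real data.
observed = 0.012 * (years - 1960) + rng.normal(0.0, 0.08, years.size)

def toy_model(initial_anomaly, trend_per_year):
    """Stand-in 'climate model': a bare linear trend from 1960 conditions."""
    return initial_anomaly + trend_per_year * (years - years[0])

hindcast = toy_model(initial_anomaly=observed[0], trend_per_year=0.012)
rmse = np.sqrt(np.mean((hindcast - observed) ** 2))
print(f"hindcast RMSE, {years[0]}-{years[-1]}: {rmse:.3f} C")
```

A model that cannot reproduce the known record within its stated uncertainty fails this test before its forecasts are worth debating; that is the Popperian standard the paragraph above is asking for.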

And perhaps that is the key here: the models are falsifiable propositions. They can only be tested by seeing whether their predictions come true. It is argued that the consequences of ignoring the disaster predictions are so severe that we just can’t wait: we have to start making trillion-dollar decisions now, because the models tell us that we have no choice.

Sometimes philosophy of science can be important. Pity that not very many modern students know anything about it. It used to be called Epistemology, and was one of the foundations of philosophy, but that, too, appears to be headed into extinction. And it’s lunch time, and I don’t want to spiral down into rambling about The Coming Dark Age. Despair is a sin.

Instead, rejoice: there may be an engineering solution to Global Warming, assuming that nature hasn’t already beaten us to it.

