View from Chaos Manor, Tuesday, January 27, 2015
I am trying to get everything to a new machine and I find myself in a complete state of confusion. I am trying to go from Word 7 to Word 10 because 7 has some incomprehensible bug that randomly kills autocomplete and sometimes spell checking. Drives me crazy.
Over time I will catch up despite the stroke. Bear with me. I will have more comments. Meanwhile, I will not bore you, I hope.
Perhaps we can sum up Global Warming.
1. Is the Earth getting warmer?
Yes. Of course. No one denies this. It has been warming at about a degree per century. This trend started in the early 19th century, ending the Little Ice Age. For a while in the last century there was a cooling trend that frightened some of those who watch such things, but after about twenty years the warming resumed. For the last twenty years the warming has been very slight, and some argue that it has stopped, but most observers believe the Earth will continue to warm at about a degree per century for another hundred years.
2. Does human activity contribute to this warming?
Yes. Of course. We add CO2 to the atmosphere, and that contributes to warming. The disagreement is over how much effect this has: how much warmer are we now than we would be if there were no human-released CO2 in the atmosphere? We do not know, because the warming is fairly slight, and while models tell us we have had 1.4 degrees of warming since 1880, we can't know, because we don't know to any tenth of a degree how warm the Earth was in 1880. We think we know to half a degree of accuracy, but even that is not certain, and depends on how we average a great many numbers, some of which we know come from instruments influenced by urbanization and other non-random factors unrelated to climate change.
We know that it was warmer in Viking times than now, and surely that was not due to Medieval human activities. It happened, and climate models do not explain it. There were other warm times in human history.
We may need to do something; but that will be expensive, and economy wrecking schemes to tax coal in the US will do little or nothing to halt the trends. We can say we tried and bankrupted ourselves in a good cause, but that isn’t true. Mostly we need to admit that we don’t know.
Climategate, the sequel: How we are STILL being tricked with flawed data on global warming – Telegraph
Although it has been emerging for seven years or more, one of the most extraordinary scandals of our time has never hit the headlines. Yet another little example of it lately caught my eye when, in the wake of those excited claims that 2014 was “the hottest year on record”, I saw the headline on a climate blog: “Massive tampering with temperatures in South America <https://notalotofpeopleknowthat.wordpress.com/2015/01/20/massive-tampering-with-temperatures-in-south-america/> ”. The evidence on Notalotofpeopleknowthat, uncovered by Paul Homewood, was indeed striking.
Puzzled by those “2014 hottest ever” claims, which were led by the most quoted of all the five official global temperature records – Nasa’s Goddard Institute for Space Studies (Giss) – Homewood examined a place in the world where Giss was showing temperatures to have risen faster than almost anywhere else: a large chunk of South America stretching from Brazil to Paraguay.
Noting that weather stations there were thin on the ground, he decided to focus on three rural stations covering a huge area of Paraguay. Giss showed it as having recorded, between 1950 and 2014, a particularly steep temperature rise of more than 1.5C: twice the accepted global increase for the whole of the 20th century.
For industrial and scientific instrumentation, calibration needs to be relatively frequent (usually annual, and some instruments more frequently; in the lab where I’ve sometimes worked, calibration was effectively daily).
This also gets into the rather deep subject of the accuracy of a measurement (how close the measurement is to the actual value being measured) versus the precision of the measurement (how closely a series of measurements of the same quantity match one another). A frequent illustration of the difference comes from target shooting. A shooter who fires ten shots, all of which hit the bullseye of the target, is both accurate and precise. A shooter who fires ten shots which are uniformly distributed within, say, the 8 ring of the target (taking the bullseye as the 10 ring) would be accurate (the average of the shots is on the bullseye) but not precise. A shooter who is aiming at the bullseye and puts ten shots into a bullseye-sized group to the right of the 8 ring boundary is precise (good grouping) but not accurate.
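The target-shooting analogy can be sketched numerically. The following is a toy simulation (all the numbers — the biases, scatter values, and shot counts — are invented for illustration, not taken from any real data): a systematic offset of the group's center from the bullseye is an accuracy error, while scatter of the shots about their own center is a precision error.

```python
import random

random.seed(42)

def shots(bias, spread, n=10):
    """Simulate n shots at a bullseye at (0, 0): `bias` is a systematic
    offset of the point of impact (an accuracy error), `spread` is random
    scatter about that point (a precision error)."""
    return [(bias[0] + random.gauss(0, spread),
             bias[1] + random.gauss(0, spread))
            for _ in range(n)]

def center(points):
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def mean_offset(points):
    """Distance of the group's center from the bullseye: inaccuracy."""
    cx, cy = center(points)
    return (cx ** 2 + cy ** 2) ** 0.5

def group_size(points):
    """Mean distance of each shot from the group's center: imprecision."""
    cx, cy = center(points)
    return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
               for x, y in points) / len(points)

accurate_not_precise = shots(bias=(0, 0), spread=2.0)  # centered but scattered
precise_not_accurate = shots(bias=(5, 0), spread=0.2)  # tight but off-center

print(f"centered/scattered: offset {mean_offset(accurate_not_precise):.2f}, "
      f"group size {group_size(accurate_not_precise):.2f}")
print(f"off-center/tight:   offset {mean_offset(precise_not_accurate):.2f}, "
      f"group size {group_size(precise_not_accurate):.2f}")
```

The first group has a small offset but a large group size; the second has it the other way around — which is exactly the accurate-but-not-precise versus precise-but-not-accurate distinction.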
So in the example, if the analyst compares measurements for one fixed temperature station with consistent calibration over a period of time, AND if there is nothing changing about the environment (e.g. no nearby human activity which adds heat to the environment), one can reasonably say that there has been a change in temperature (2 degrees, in the example) for that station between the measurement periods as a matter of precision.
If the same is true of every station in the ensemble of temperature stations, then again you could claim that the temperature change is real as a matter of precision.
However, where the example falls short is that you are measuring a two degree change in temperature over a one-year period with one degree precision. (This is, of course, ignoring the effects of weather, which will dominate the trend over daily to annual time frames. We will also assume in what follows, as appears to be the assumption of the example, that the ensemble of measurements which is being reported is actually being measured as deltas from the baseline temperature of each individual station; if the air temperatures are being reported and the temperature difference is later calculated from the air temperatures, then we can no longer ignore accuracy in the statement of the problem.)
The climate change "problem" is that the researchers are attempting to measure a 0.02 degree change in temperature over a one-year period (2 degrees Fahrenheit per century) with one degree precision, in the midst of "weather," which typically consists of: 20-30 degree daily variations of temperature; variations of the daily average temperature by roughly 5-15 degrees up and down, once or twice each week, as weather systems pass; and annual variations of the weekly mean temperature ranging from near zero at the equator, to (roughly) 20-40 degrees in most of the Continental US, to something in excess of 60 degrees at the poles.
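The scale mismatch can be made concrete with a toy simulation. Every number here is an assumption chosen for illustration, not real station data: daily temperature anomalies (seasonal cycle assumed already removed) with 5 degrees of day-to-day weather noise and half a degree of instrument noise, on top of a 0.02 degree-per-year trend. Fitting a straight line to one year of such readings, the run-to-run scatter of the fitted slopes dwarfs the trend being sought.

```python
import math
import random

random.seed(0)

TREND = 0.02 / 365.0      # signal: 0.02 deg/year, i.e. 2 deg F per century
WEATHER_SIGMA = 5.0       # assumed day-to-day weather noise, degrees
INSTRUMENT_SIGMA = 0.5    # assumed thermometer precision, degrees

def one_year_of_anomalies():
    """365 daily anomalies: a tiny trend buried in weather and
    instrument noise (seasonal cycle taken as already removed)."""
    return [TREND * d
            + random.gauss(0, WEATHER_SIGMA)
            + random.gauss(0, INSTRUMENT_SIGMA)
            for d in range(365)]

def ols_slope(ys):
    """Ordinary least-squares slope of ys against day number."""
    n = len(ys)
    xbar = (n - 1) / 2.0
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

# Fit the one-year trend in many simulated years and compare the
# scatter of the fitted slopes to the true slope.
slopes = [ols_slope(one_year_of_anomalies()) for _ in range(200)]
mean_slope = sum(slopes) / len(slopes)
spread = (sum((s - mean_slope) ** 2 for s in slopes) / len(slopes)) ** 0.5

print(f"true slope:           {TREND:.2e} deg/day")
print(f"fitted-slope scatter: {spread:.2e} deg/day")
```

With these assumed noise levels the scatter of the fitted slopes comes out dozens of times larger than the true slope — a single year of noisy daily readings simply cannot resolve a trend of that size.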
So for global averaging, it is necessary to maintain consistent precision of calibration as well as consistent accuracy of calibration (that is, if the thermometer was exactly 10 degrees off for the first measurement, it has to still be exactly 10 degrees off for the final measurement, in order for precision to be maintained) for all thermometers over the ensemble time. Over decades of time. And again assuming no systematic changes in the environment of each temperature station, such as construction of a 10-story building and associated parking lot.
Again, it comes down to the same bottom line: the perceived change in temperature is significantly less than the accuracy and precision errors of the instruments over annual time frames.
As I have said before, demonstrating the claims of the global warming alarmists over 100 year time frames would require input data accuracy (and analysis precision for their models) on the order of 10 parts per billion, or 8 significant digits, as measured by the contribution to the global heat balance.
Warning – It may raise your blood pressure. Take your pills before reading. <g>
“When I was young, everyone got measles; sometimes you might visit someone who had it so you’d get it over at a relatively convenient time, since you were going to get it. Now enough have inoculation that it’s not inevitable.”
When I was young, my parents saw two pock marks on me, and wondered if that was proof I had contracted Chicken Pox.
Fast forward decades later, I am now in my 40’s and am feeling run down. A trip to the doctor shows I have Chicken Pox and we have our answer.
Getting that as an adult, as many people will tell you, is pretty awful. I can agree from hard experience.
My mother exposed me to Mumps early in life, which is analogous to what you have stated in your narrative. That worked as expected, the Chicken Pox trick did not.
Me, I am all in favor of voluntary vaccination paid for at the state/local level with public funds.
For the police to start using the awesome power of the state to tell parents what to do with their children, ought, in my opinion, to be done only with a court order endorsed by a jury.
Dear Dr Pournelle,
You are wrong to think that measles is a relatively benign disease of childhood. There are significant complications at all ages, but particularly in those under 5 and over 20 or so.
According to the CDC, a generally reliable source – unless you are deeply paranoid – about 1 in 10 will get an ear infection, as many as 1 in 20 will get pneumonia, and about 1 in 1000 will get encephalitis leading to convulsions and possible deafness and retardation. As well, the mortality is about one or two children for every thousand cases, usually due to pneumonia.
In the longer term there is a condition called subacute sclerosing panencephalitis (SSPE) which is a progressive and fatal neurological disease seen about 10 years after measles infection in about 4 in 100000 cases. This is now almost unknown in the US, Australia, where I am, and other countries with high immunisation rates.
When compared with the risk of a serious complication (life threatening) from the MMR vaccine the balance of risk v benefit is very heavily in favour of vaccination.
Smallpox has been eradicated by vaccination. Polio was set to be eradicated until intelligence agencies idiotically interfered, using the vaccination program in Pakistan to try to identify Bin Laden's DNA — leading to the murder of numerous vaccination program workers and setting back the goal of elimination there and elsewhere.
MB, BS, FRACS
Measles is not trivial, but the risk from vaccination is lower
With respect to these links on your blog:
Scientists say destructive solar blasts narrowly missed Earth in 2012
Near Miss: The Solar Superstorm of July 2012 http://science.nasa.gov/science-news/science-at-nasa/2014/23jul_superstorm/
I just want to point out that much is being made of this, but in point of fact the flare and resulting CME went off nearly 180 degrees AWAY from Earth. (Yes, I went back and looked at the data and video awhile back, when I became aware of it — which wasn’t until sometime in the first half of 2014 if memory serves. And since I watch solar activity quite closely, aided and abetted by Jim, and have done for many years, that in itself says something, I think.) As it takes about 25 days and some fraction for the Sun to rotate on its axis, that’s something between 12 and 13 days out from being "aimed" at Earth. So I guess it kind of depends on your definition of a "near miss."
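The rotation arithmetic in the letter is easy to check; this one-liner assumes an equatorial rotation period of about 25.4 days (the "25 days and some fraction" mentioned above):

```python
# If the eruption site was ~180 degrees away from Earth, how many days of
# solar rotation separate that from an Earth-directed eruption?
EQUATORIAL_ROTATION_DAYS = 25.4  # assumed: "about 25 days and some fraction"
days_off = 180.0 / 360.0 * EQUATORIAL_ROTATION_DAYS
print(f"{days_off:.1f} days from being aimed at Earth")  # between 12 and 13
```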
But yes. As you and I have discussed in the recent past, there is most certainly a danger. There is one theory (or perhaps it is a hypothesis; I’m not sure how much data supports it as yet) that Carrington-level events tend to occur on either side of an extended solar minimum. As the Carrington event itself occurred on the upswing from an extended minimum, there is, I suppose, at least SOME evidence. And most definitely modern infrastructure is highly susceptible, and is ill-prepared, despite much urging of the appropriate politicians and bureaucrats.
Interstellar Woman of Mystery
Just to show that Congress is not completely asleep at the switch regarding EMPs and solar flares (considered equivalent threats by them), H.R. 3410 was introduced and passed during the 113th Congress. It died in Senate committee. It took three visits to my congresscritter's office and several emails and phone calls to find out about it. I've had no response to my communications from my senators regarding this issue.
I hope you might encourage your readers to reach out to their representatives to get this passed again.
H. R. 3410, Critical Infrastructure Protection Act (CIPA)
· Chairman Michael McCaul, Committee on Homeland Security
· Sponsor: Rep. Trent Franks (R-AZ)
· Cosponsors: (21) 2 D's and 19 R's
o Critical Infrastructure Protection Act or CIPA – Amends the Homeland Security Act of 2002 to require the Assistant Secretary of the National Protection and Programs Directorate to: (1) include in national planning scenarios the threat of electromagnetic pulse (EMP) events; and (2) conduct a campaign to proactively educate owners and operators of critical infrastructure, emergency planners, and emergency responders at all levels of government of the threat of EMP events.
o Directs the Under Secretary for Science and Technology to conduct research and development to mitigate the consequences of EMP events, including:
(1) an objective scientific analysis of the risks to critical infrastructures from a range of EMP events;
(2) determination of the critical national security assets and vital civic utilities and infrastructures that are at risk from EMP events;
(3) an evaluation of emergency planning and response technologies that would address the findings and recommendations of experts, including those of the Commission to Assess the Threat to the United States from Electromagnetic Pulse Attack;
(4) an analysis of available technology options to improve the resiliency of critical infrastructure to EMP; and
(5) the restoration and recovery capabilities of critical infrastructure under differing levels of damage and disruption from various EMP events.
o Includes among the responsibilities of the Secretary of Homeland Security (DHS) relating to intelligence and analysis and infrastructure protection to prepare and submit to specified congressional committees:
(1) a comprehensive plan to protect and prepare the critical infrastructure of the American homeland against EMP events, including from acts of terrorism; and
(2) biennial updates of such plan.
o Electromagnetic Pulse (EMP) is an instantaneous, intense energy field that can overload or disrupt at a distance numerous electrical systems and high technology microcircuits, which are especially sensitive to power surges.
- Large-scale EMP effects can be produced either through a single nuclear explosion detonated in the atmosphere, or through non-nuclear devices.
- Congress established the EMP commission in FY2001 to assess the threat of an EMP attack on U.S. infrastructure.
- The EMP commission's 2008 report determined that an EMP attack "creates the possibility of long-term catastrophic consequences for national security," but argued that U.S. vulnerability could be reasonably reduced by coordination between the public and private sectors.
- U.S. critical infrastructure remains vulnerable to an EMP event despite the warnings laid out by the EMP commission. H.R. 3410 would begin to address this vulnerability by creating planning scenarios in the event of an EMP attack, and by educating first responders on how to respond to an EMP attack.
· Note: EMP interference is generally damaging to electronic equipment, and at higher energy levels a powerful EMP event such as a lightning strike can damage physical objects such as buildings and aircraft structures. The damaging effects of high-energy EMP have been used to create EMP weapons.
Sincerely, Jan Stepka
Phil Tharp is questioning computer science and computer engineering math requirements. I’ve been seeing the same thing.
I graduated from CSU in 1984 with a B.Sc. in Computer Science. At that time, if you took 3-5 more classes, you could graduate with a secondary degree in Applied Mathematics. At the very least, you had a minor in Applied Mathematics. We spent a lot more time on theory and foundational math in CS back then, less on programming.
I spend a lot of time with recent grads in CS in the software company that I work for. In some ways, they’re better prepared than I was: more in-depth knowledge of practical hands-on topics like networking and programming. Not so much math or theory, though. It hampers them when technology moves on from what they were taught, or if they’re presented with a problem for which they don’t have a cook-book recipe. At that point, they’re frantically googling for answers.
Thankfully, there are cook-book recipes for most problems that programmers run into nowadays, so this isn’t a significant restriction. About once a year, though, someone comes to me with a problem that isn’t readily solved by google and for which I have to dust off my old skills. They’re just not taught much anymore.
Now, it's likely that other schools besides CSU or UC still teach more theory and/or mathematics alongside computer science. Top-tier schools like Caltech, Stanford or MIT come to mind. I don't come into contact with grads from places like that very much. But at the state schools in the West, I've been noticing this trend for the last 20 years or so.
All the best on your recovery!
It is a problem. Statistical inference is difficult and design of good experiments more so.
Freedom is not free. Free men are not equal. Equal men are not free.