Science; Quantum Neepery

Chaos Manor Mail, Tuesday, June 23, 2015


There is also a View for today.



Quite relevant to the claims that “the science is settled”, regardless of the subject.

Science, Now Under Scrutiny Itself

By BENEDICT CAREY, June 15, 2015      (NYT)

The crimes and misdemeanors of science used to be handled mostly in-house, with a private word at the faculty club, barbed questions at a conference, maybe a quiet dismissal. On the rare occasion when a journal publicly retracted a study, it typically did so in a cryptic footnote. Few were the wiser; many retracted studies have been cited as legitimate evidence by others years after the fact.

But that gentlemen’s world has all but evaporated, as a remarkable series of events last month demonstrated. In mid-May, after two graduate students raised questions about a widely reported study of the effect of political canvassing on opinions of same-sex marriage, editors at the journal Science, where the study was published, began to investigate. What followed was a frenzy of second-guessing, accusations and commentary from all corners of the Internet: “Retraction” as serial drama, rather than footnote. Science officially pulled the paper, by Michael LaCour of the University of California, Los Angeles, and Donald Green of Columbia, on May 28, because of concerns about Mr. LaCour’s data.

“Until recently it was unusual for us to report on studies that were not yet retracted,” said Dr. Ivan Oransky, an editor of the blog Retraction Watch, the first news media outlet to report that the study had been challenged. But new technology and a push for transparency from younger scientists have changed that, he said. “We have more tips than we can handle.”

The case has played out against an increase in retractions that has alarmed many journal editors and authors. Scientists in fields as diverse as neurobiology, anesthesia and economics are debating how to reduce misconduct, without creating a police-state mentality that undermines creativity and collaboration.

“It’s an extraordinary time,” said Brian Nosek, a professor of psychology at the University of Virginia, and a founder of the Center for Open Science, which provides a free service through which labs can share data and protocols. “We are now seeing a number of efforts to push for data repositories to facilitate direct replications of findings.”

But that push is not universally welcomed. Some senior scientists have argued that replication often wastes resources. “Isn’t reproducibility the bedrock of science? Yes, up to a point,” the cancer biologist Mina Bissell wrote in a widely circulated blog post. “But it is sometimes much easier not to replicate than to replicate studies,” especially when the group trying to replicate does not have the specialized knowledge or skill to do so.

The experience of Retraction Watch provides a rough guide to where this debate is going and why. Dr. Oransky, who has a medical degree from New York University, and Adam Marcus, both science journalists, discovered a mutual interest in retractions about five years ago and founded the blog as a side project. They had, and still have, day jobs: Mr. Marcus, 46, is the managing editor of Gastroenterology & Endoscopy News, and Dr. Oransky, 42, is the editorial director of MedPage Today (he will take a position as distinguished writer in residence at N.Y.U. later this year).

In its first year, the blog broke a couple of retraction stories that hit the mainstream news media — including a case involving data faked by an anesthesiologist who later served time for health care fraud. The site now has about 150,000 unique visitors a month, about half from outside the United States.

Dr. Oransky and Mr. Marcus are partisans who editorialize sharply against poor oversight and vague retraction notices. But their focus on evidence over accusations distinguishes them from watchdog forerunners who sometimes came off as ad-hominem cranks. Last year, their site won a $400,000 grant from the John D. and Catherine T. MacArthur Foundation, to build out their database, and they plan to work with Dr. Nosek to manage the data side.

Their data already tell a story.

The blog has charted a 20 to 25 percent increase in retractions across some 10,000 medical and science journals in the past five years: 500 to 600 a year today from 400 in 2010. (The number in 2001 was 40, according to previous research.) The primary causes of this surge are far from clear. The number of papers published is higher than ever, and journals have proliferated, Dr. Oransky and other experts said. New tools for detecting misconduct, like plagiarism-sifting software, are widely available, so there’s reason to suspect that the surge is a simple product of better detection and larger volume.


Still, the pressure to publish attention-grabbing findings is stronger than ever, these experts said — and so is the ability to “borrow” and digitally massage data. Retraction Watch’s records suggest that about a third of retractions are because of errors, like tainted samples or mistakes in statistics, and about two-thirds are because of misconduct or suspicions of misconduct.

The most common reason for retraction because of misconduct is image manipulation, usually of figures or diagrams, a form of deliberate data massaging or, in some cases, straight plagiarism. In their dissection of the LaCour-Green paper, the two graduate students — David Broockman, now an assistant professor at Stanford, and Joshua Kalla, at California-Berkeley — found that a central figure in Mr. LaCour’s analysis looked nearly identical to one from another study. This and other concerns led Dr. Green, who had not seen any original data, to request a retraction. (Mr. LaCour has denied borrowing anything.)

Data massaging can take many forms. It can mean simply excluding “outliers” — unusually high or low data points — from an analysis to generate findings that more strongly support the hypothesis. It also includes moving the goal posts: that is, mining the data for results first, and then writing the paper as if the experiment had been an attempt to find just those effects. “You have exploratory findings, and you’re pitching them as ‘I knew this all along,’ as confirmatory,” Dr. Nosek said.
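A toy numerical sketch (with made-up numbers, not anyone's actual data) shows how quietly dropping "outliers" can flip a weak result into a strong one:

```python
import statistics

# Made-up "effect" measurements: five supportive points and two
# inconvenient negative ones (the would-be "outliers").
scores = [2.1, 2.4, 2.2, 2.5, 2.3, -1.8, -2.0]

mean_all = statistics.mean(scores)

# The massaged version: silently drop the points that hurt the hypothesis.
trimmed = [s for s in scores if s > 0]
mean_trimmed = statistics.mean(trimmed)

print(f"mean, full data:    {mean_all:+.2f}")      # → +1.10
print(f"mean, outliers cut: {mean_trimmed:+.2f}")  # → +2.30
```

The honest analysis reports the first number, along with an exclusion rule decided in advance; the massaged one reports only the second.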

The second leading cause is plagiarizing text, followed by republishing — presenting the same results in two or more journals.

The fourth category is faked data. No one knows the rate of fraud with any certainty. In a 2011 survey of more than 2,000 psychologists, about 1 percent admitted to falsifying data. Other studies have estimated a rate of about 2 percent. Yet one offender can do a lot of damage. The Dutch social psychologist Diederik Stapel published dozens of studies in major journals for nearly a decade based on faked data, investigators at the universities where he had worked concluded in 2011. Suspicions were first raised by two of his graduate students.

“If I’m a scientist and I fabricate data and put that online, others are going to assume this is accurate data,” said John Budd, a professor at the University of Missouri and an author of one of the first exhaustive analyses of retractions, in 1999. “There’s no way to know” without inside information.

Here, too, Retraction Watch provides a possible solution. Many of the egregious cases that it posts come from tips. The tipsters are a growing cadre of scientists, specialized journalists and other experts who share the blog’s mission — and are usually not insiders, working directly with a suspected offender. One of the blog’s most effective allies has been Dr. Steven Shafer, the former editor of the journal Anesthesia & Analgesia and now at Stanford, whose aggressiveness in re-examining published papers has led to scores of retractions. The field of anesthesia is a leader in retractions, largely because of Dr. Shafer’s efforts, Mr. Marcus and Dr. Oransky said. (Psychology is another leader, largely because of Dr. Stapel.)

Other cases emerge from issues raised at postpublication sites, where scientists dig into papers, sometimes anonymously. Dr. Broockman, one of the two who challenged the LaCour-Green paper, had first made public some of his suspicions anonymously on a message board. Mr. Marcus said Retraction Watch closely followed a similar site, PubPeer. “When it first popped up, a lot of people assumed it would be an ax-grinding place,” he said. “But while some contributors have overstepped, I think it has had a positive impact on the literature.”

What these various tipsters, anonymous post-reviewers and whistle-blowers have in common is a nose for data that looks too good to be true, he said. Sites like Retraction Watch and PubPeer give them a place to discuss their concerns and flag fishy-looking data.

These, along with data repositories like Dr. Nosek’s, may render moot the debate over how to exhaustively replicate findings. That burden is likely to be eased by the community of bad-science bloodhounds who have more and more material to work with when they pick up a foul scent.

“At this point, we see ourselves as part of an ecosystem that is advocating for increased transparency,” Dr. Oransky said. “And that ecosystem is growing.”


Quantum Neepery

Two Cool Physics Findings
Andrew Truscott and his team showed that if you offer a speeding helium atom two possible paths, the route it takes appears to be retroactively determined by the act of measuring the atom at the end of its journey. The team reported the strange discovery in Nature Physics in May.
The second comes from a team at Griffith University in Australia, whose new theory is a lot closer to Einstein’s vision than Bohr’s. Gone are the probability clouds along with the other conundrums of wave-particle duality. In the new picture the electrons being fired at the slits are particles after all – tiny little spheres just as Newton would have imagined them. In our world the electron might pass through the bottom slit. But in a parallel world the electron passes through the top slit. As the two ghostly twins travel towards the detectors (one in our world, one in a parallel world), their paths could overlap. But according to the theory, a newly proposed repulsive force stops the electrons coming too close to one another. In effect, the electron in our world “collides” with its ghostly twin, like billiard balls knocking together as they roll across a pool table.

By restricting the worlds to be discrete or finite, Poirier adds, the Griffith team has developed equations that are much easier for a computer to solve. Quantum mechanical calculations that would usually take minutes were completed “in a matter of seconds,” says Michael Hall, lead author of the study. Hall hopes that eventually this will lead to applications in predicting real world chemical reactions.
And if the number of worlds is finite – as modelled in the team’s computer simulations – rather than infinite, then the predictions made by the new theory will deviate from standard quantum theory. Though the deviations are likely to be only slight they could be testable in the lab using experiments similar to the double slit. Tantalisingly, as the size of the deviations depends on the number of parallel worlds, these experiments could provide an effective measure of how many worlds are out there.
But… parallel worlds? Is this not all too absurd to take seriously? Not for the physicists, it seems. And as David Wallace points out in The Emergent Multiverse, our sense of absurdity evolved to help us scratch a living on the savannahs of Africa. “The Universe is not obliged to conform to it.”
These both should be near and dear to the heart of every SF writer and reader!
Shouldn’t the last two sentences of the 2nd page be tattooed on the hands and arms of every AGW warmist zealot to get them to remember this fact!

Peter Wityk


The First Ever Photograph Of Light As A Particle & A Wave Is Here


Scientists at EPFL have succeeded in capturing the first-ever snapshot of light behaving simultaneously as a particle and a wave.

Back when Einstein first popularized the idea that light actually behaves as both a particle and wave, scientists began the mighty endeavour of capturing this concept visually.

However, this is no easy task – the closest we have come is seeing either wave or particle, but always at different times.

But EPFL scientists have now come up with a clever way to counteract this issue.

The experiment is set up like this:

A pulse of laser light is fired at a tiny metallic nanowire. The laser adds energy to the charged particles in the nanowire, causing them to vibrate. Light travels along this tiny wire in two possible directions, like cars on a highway. When waves travelling in opposite directions meet each other they form a new wave that looks like it is standing in place. Here, this standing wave becomes the source of light for the experiment, radiating around the nanowire.
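The standing-wave step above can be checked with the textbook identity sin(kx − ωt) + sin(kx + ωt) = 2 sin(kx) cos(ωt); the sketch below uses arbitrary units, not the experiment's actual parameters:

```python
import math

# Idealized 1-D picture of the nanowire, in arbitrary units (these are
# not the experiment's actual parameters).
k = 2 * math.pi   # wavenumber
w = 3.0           # angular frequency

def counter_propagating(x, t):
    """Sum of two equal waves travelling in opposite directions."""
    return math.sin(k * x - w * t) + math.sin(k * x + w * t)

def standing_wave(x, t):
    """The same field in closed form: a pattern that stands in place."""
    return 2 * math.sin(k * x) * math.cos(w * t)

# The two expressions agree at every point and time; the nodes sit at
# fixed positions (where sin(kx) = 0) while only the amplitude
# oscillates in time.
for x in (0.1, 0.25, 0.7):
    for t in (0.0, 0.4, 1.3):
        assert math.isclose(counter_propagating(x, t), standing_wave(x, t),
                            abs_tol=1e-12)
print("identity verified: the superposition stands still")
```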

This is where the experiment’s trick comes in: The scientists shot a stream of electrons close to the nanowire, using them to image the standing wave of light.

As the electrons interacted with the confined light on the nanowire, they either sped up or slowed down. Using the ultrafast microscope to image the position where this change in speed occurred, Carbone’s team could now visualize the standing wave, which acts as a fingerprint of the wave-nature of light.

While this phenomenon shows the wave-like nature of light, it simultaneously demonstrates its particle aspect as well. As the electrons pass close to the standing wave of light, they “hit” the light’s particles, the photons.

As mentioned above, this affects their speed, making them move faster or slower. This change in speed appears as an exchange of energy “packets” (quanta) between electrons and photons. The very occurrence of these energy packets shows that the light on the nanowire behaves as a particle.
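The size of one such packet follows from E = hc/λ; the 800 nm below is an assumed, illustrative near-infrared wavelength, not necessarily the one used at EPFL:

```python
# One quantum of energy exchange: E = h*f = h*c / wavelength.
# The 800 nm wavelength is an assumed near-infrared value for
# illustration, not necessarily the one used in the EPFL experiment.
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light in vacuum, m/s
eV = 1.602176634e-19    # joules per electronvolt

E = h * c / 800e-9
print(f"one photon at 800 nm: {E:.3e} J = {E / eV:.2f} eV")  # → 1.55 eV
```

Electrons grazing the nanowire gain or lose energy only in whole multiples of this amount, which is what marks the light as particulate.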

Credit: Fabrizio Carbone/EPFL

“This experiment demonstrates that, for the first time ever, we can film quantum mechanics – and its paradoxical nature – directly,” says Fabrizio Carbone. In addition, the importance of this pioneering work can extend beyond fundamental science and to future technologies. As Carbone explains: “Being able to image and control quantum phenomena at the nanometer scale like this opens up a new route towards quantum computing.”

Related CE Article:

How Is This Possible? Scientists Observe ONE Particle Exist In MULTIPLE states (wave).

Actually, it makes quantum mechanics even more spooky.


Forget drones. The Army could soon be using hovering speeder bikes. (WP)

By Brian Fung June 22 at 4:03 PM

Star Wars fans will no doubt remember that epic scene from “Return of the Jedi” in which Luke chases down a pair of fleeing scout troopers on a speeder bike. Well, get ready, because the U.S. military is designing a real-life version of that hovering vehicle.

The prototype, which is being developed by Malloy Aeronautics and SURVICE Engineering, doesn’t come with blaster cannons. But the Defense Department is imagining the carbon-fiber Hoverbike as a “multi-role tactical reconnaissance” vehicle that can be used to support a variety of missions, such as carrying supplies or gathering intelligence, according to Reuters.

The two companies have a contract with the U.S. Army Research Laboratory, to do research and development on the Hoverbike, according to Malloy. Terms of the deal weren’t disclosed, but U.K.-based Malloy is setting up an office in Maryland just so that it can test the product closer to its customer.

The real selling point for the U.S. Army appears to be that hoverbikes offer a cheap, reliable alternative to traditional helicopters. The craft has fewer moving parts and is therefore easier to maintain, according to Malloy. A video of the quadcopter craft shows scale models pulling tight turns pretty low to the ground.

The Hoverbike comes at a time when the Defense Department is investing heavily in unmanned robotic technology. In September, Malloy successfully wrapped up a Kickstarter for the project, collecting more than $101,000 for it. The company is trying to raise another $1.1 million on its Web site.

This is the future of war. It won’t be long before the military adapts these things to become listening platforms, pack carriers or even floating bombs. Of course, as Luke and his friends quickly discovered, all it takes to defeat a human riding on one of these is a clothesline.


Who’s Watching Whom?

FCC, Cable Ops Ready to Rumble Over Internet Privacy

6/08/2015 8:00 AM Eastern

By: John Eggerton


The rules for Internet privacy, and who has the right to enforce them, are at the heart of one of the most contentious debates roiling the broadband industry today.

WASHINGTON — What, exactly, are the rules for Internet privacy, and who has the right to enforce them?

Those two issues are at the heart of one of the most contentious debates roiling the broadband industry today. The Federal Communications Commission’s reclassification of Internet access as a common-carrier service under Title II of the Communications Act gives the agency new powers to create rules for “protecting” broadband customer proprietary network information (CPNI).

That new authority could lead to creating “opt-in” methods for collecting online personal information that many public-interest groups have been clamoring for, and could take a bite out of targeted behavioral advertising. It is unclear just how the FCC will approach its self-given power to regulate in the space, which is the main dissenting issue that Internet-service providers have with much of the Title II order.

The new broadband CPNI oversight has also created a jurisdictional tug-of-war between the FCC and the Federal Trade Commission, which has been overseeing broadband privacy but must relinquish those duties to the agency under the new rules, unless Congress steps in.

“To have the FCC usurp the authority of the Federal Trade Commission is a very bad idea,” Rep. Bob Goodlatte (R-Va.), the House Judiciary Committee chairman, told C-SPAN in an interview. “It’s going to lead to regulation of the Internet in ways that some of the people who have been calling for that have not imagined.”


The fear of the FCC’s regulation of broadband privacy is similar to industry fears about the Internet conduct standard contained in the new Open Internet rules: in both cases, it is fear of the unknown.

The FCC tried to give Internet-service providers some guidance in an Enforcement Bureau advisory issued May 20, but that guidance was essentially a call for ISPs to make good-faith efforts to protect privacy (and if you are unsure, run it by us and we’ll try to advise you).

That is the sort of “you’ll know it when the FCC sees it” approach that has ISPs taking the agency to court over its Internet conduct standard, a plan to potentially take government action against a broad “catch-all” (the FCC’s term) standard to sweep up conduct not prevented specifically under its bright-line network neutrality rules but that could “harm internet openness.”

Among the Title II provisions the FCC decided to impose were the customer-privacy provisions in Section 222 of the Communications Act of 1934.

“Section 222 makes private a customer’s communications network information — i.e. with whom they communicate and from where they communicate — unless a user provides express consent for its commercial use,” said Scott Cleland, chairman of NetCompetition, a pro-competition online forum supported by broadband interests, who added that the FCC has some “big decisions” to make. (See sidebar)

The FCC opted to forbear, or choose not to apply, the specific telephone-centric language of the section, preferring to come up with some new definitions for broadband CPNI protection. Just what those new definitions are and what they might cover is at the heart of the debate.


In pushing to retain jurisdiction over online data security, Jessica Rich, director of the Federal Trade Commission’s Bureau of Consumer Protection, told Congress at a March hearing that the FCC’s decision to reclassify ISPs under Title II, which removes the issue from FTC purview, had made it harder to protect consumers.

A bill that passed out of the House Energy & Commerce Committee would move some of the CPNI authority the FCC has just given itself back to the Federal Trade Commission by giving the latter agency authority over data privacy when that privacy has been violated due to a breach. The bill would make not protecting personal information per se false and deceptive, empowering the FTC to sue any company — including a cable operator or telecom carrier — that fails to do so. The measure says companies must “implement and maintain reasonable security measures and practices” to protect that information, so the FTC would have to decide what would pass muster.

Rep. Frank Pallone (D-N.J.), ranking member of the House Energy & Commerce Committee, has expressed his concern that moving that oversight back to the FTC could be an “enormous problem” because it could allow those ISPs to get out from under FCC privacy oversight through self-regulatory mechanisms at the Federal Trade Commission.

While the FCC has rulemaking authority — and has signaled it could come up with broadband-specific rules — the FTC is limited to using its power to sue companies over false and deceptive conduct.

Under the proposed new legal regime, the FCC and the FTC would share jurisdiction over broadband personal information. The bill gives the FTC cybersecurity and breach oversight, but leaves privacy protections to the FCC, though FCC chief counsel for cybersecurity Clete Johnson has said that is a distinction without a difference.

Johnson told Congress that, given the way the bill divides up accountability and narrowly defines what information could be protected, the FCC would lose authority over protecting a subscriber’s viewing-history information, including the shows they watch and the movies they order. At present, what a Congressman watches in Las Vegas stays in Vegas, and under the protection of the ISP there.

“[W]hether a company (either by human error or technical glitch) mistakenly fails to secure customer data or deliberately divulges or uses information in ways that violate a customer’s privacy rights regarding that data, the transgression is at once a privacy violation and a security breach,” he said.

But getting Congress to pass a bill is a tall order, so unless the courts reject the FCC’s Open Internet rules for a second time, the agency is going to be coming up with some form of privacy-protection enforcement regime for broadband information.


At a panel at last month’s INTX in Chicago, National Cable & Telecommunications Association executive vice president James Assey said that folks trying to comply with the law are looking for help from the FCC as they try to figure out how to comply and get “some assurance” that what they are doing won’t run afoul of the law.

At a meeting of the Advanced Television Systems Committee in Washington, D.C., NCTA president and CEO Michael Powell warned against the government inserting itself into the role data can play in tailoring consumer experiences. He conceded that the use of personal data had troubling elements, but cautioned the government could “distort the market” if it acted prematurely.

The NCTA had no comment on the FCC’s Enforcement Bureau advisory, but it did not weigh in with thanks for the new guidance.

The NCTA and other ISPs outlined their concerns over the Section 222 issue in their May 13 request that the U.S. Court of Appeals for the D.C. Circuit stay the Title II reclassification and its attendant new broadband CPNI authority.

Telco AT&T estimated it would lose hundreds of millions of dollars in revenues if it had to stop using broadband-related CPNI while it implemented consent mechanisms based on having to “guess” what future FCC rules might be.

While broadband providers can, and do, lawfully use information about customers’ Internet service to develop customized marketing programs, the ISPs said they now can’t be sure what will be acceptable under the new rules and could be held liable if they guess wrong.

The FCC appears to have the votes to flex its muscle on privacy.

A month ago, the FCC held a workshop essentially launching the process of figuring out what it was going to do with its new privacy authority. FCC chairman Tom Wheeler framed the issue in historical terms, citing the Federalist Papers and intercepted telegraph messages during the Civil War.

“Consumers have the right to expect privacy in the information networks collect about them,” he said, adding that in a digital world, everybody is leaving digital footprints “all over the place.”

Privacy is unassailable, he said, just as the virtuous circle of innovation begetting innovation is essential.

Wheeler clearly views privacy — like competition and access — as one of those issues that must be viewed in the sweep of history and with the long view from the high hill. That could make it difficult for opponents of strong new FCC privacy regulations to dissuade him from that course with an argument that lies in the weeds of policy.

That’s the same view that helped move his position toward Title II in the first place.

At INTX, Democratic FCC member Jessica Rosenworcel signaled that there were a number of areas where the agency needed to be looking, including monetization of customer data and ad analytics. She said it would be important to align those obligations with the FCC’s traditional cable privacy oversight and suggested the agency needed to have a rulemaking — and that the chairman had acknowledged as much — because it was an area “where time and technology have made really significant changes and we are going to have to figure out how to protect consumer privacy and manage all those benefits from the broadband ecosystem at the same time.”

“You can dial a call, write an email, post an update on a social network and purchase something online, and you can be sure that there are specialists in advertising and data analytics who are interested in exactly where you are going and what you’re doing,” she said. “And then, finally, we all know that the monetization of data is big business, and that slicing and dicing is only going to continue.”

Commissioner Mignon Clyburn has said the public demands a “regulatory backstop” on broadband privacy and she is ready to use that power.


The FCC’s Republican minority is hardly convinced — but they are the minority.

Commissioner Ajit Pai told cable operators at INTX that one thing he gleaned from the FCC’s privacy workshop was that nobody really knows where the agency goes from here.

Commissioner Michael O’Rielly told an INTX crowd that the FCC’s understanding of privacy was “prehistoric” and “to now say that we are going to jump in the middle of this space is extremely problematic.” As to the impact on monetizing data, he pointed out that was why a lot of Internet content was free.

Privacy advocates definitely see a chance to push for tough privacy provisions.

Jeff Chester, executive director of the Washington, D.C.-based Center for Digital Democracy and a leading advocate for online privacy law and regulation, said the FCC has “long looked the other way as phone and cable companies, with their broadband partners, secretly grabbed customer data so they could do more precise set-top box and cross-device tracking and targeting.”

The FCC needs to use its new powers under Title II to force privacy protection on broadband giants, he said. But the FCC should also look at how “Google, Facebook and other data technology companies work alongside the Verizons and Comcasts, in order to develop effective safeguards for the public,” he added, suggesting his own sweeping change.

“The FCC should issue a new ‘Bill of Consumer Rights’ for the digital video era,” Chester said.

The public still has a strong expectation of privacy, said Harold Feld, senior vice president of Washington, D.C.-based public-interest group Public Knowledge. That point was supported by a recent Pew Research study that found that more than 90% of respondents said it was important for them to control who can access information about them online and what information is being collected.

Feld told the FCC at its privacy workshop that “rock solid” phone-network privacy protections need to move into the IP-delivered world. “This is not about, ‘Well, the universe is an awful place for privacy, so who cares anymore.’ ”

Clearly the FCC cares, but until it weighs in with a new regime — and starting June 12, unless the Title II reclassification is stayed by the courts — ISPs will have to trust their gut and likely verify with the FCC as well.

Privacy’s Big Three

If the Federal Communications Commission’s reclassification of broadband as a Title II telephone service is not stayed in court, the ISP industry’s business model could be dramatically affected by how the agency implements Section 222 “Privacy of Customer Information.”

Section 222 makes private a customer’s communications network information, i.e., with whom they communicate and from where they communicate — unless a user provides express consent for its commercial use.

The FCC has some big and telling decisions to make:

Privacy Protection Predictability: Does the FCC believe in a consumer-centric implementation of Section 222, where consumers enjoy privacy protection predictability because the FCC interprets that consumers own or legally control their Section 222 private-network information, and that anyone who wants to commercialize it, must first get the consumer’s express consent? If not, can everyone but an ISP use this legally private Section 222 information in any way they want, whenever they want for most any commercial purpose they want, without notifying or securing the affected consumer’s consent?

Competitive Privacy Policy Parity: Does the FCC want to promote competition, consumer choice and a level playing field by ensuring that all competitors compete based on the same consumer privacy protection rules? If not, will the FCC pick market winners and losers by allowing only FCC-favored competitors to earn revenues in targeted advertising?

FCC Do Not Track List: Will the FCC create a Section 222 Internet “Do Not Track” list like the FTC created the “Do Not Call” list enjoyed by three-quarters of Americans? Why would it not be in the public interest for the FCC to use Section 222 to make available a similarly simple and convenient mechanism for Americans to choose to opt out of unsolicited tracking of where they go on the Internet via a national FCC Do Not Track list that would protect consumers’ private information from commercialization without permission?

In short, how the FCC implements its newly asserted Section 222 “Privacy of Customer Information” authority will speak volumes about the FCC’s true priorities. Will the FCC choose to protect consumers’ privacy interests, or Silicon Valley’s advertising interests?

Scott Cleland is chairman of NetCompetition, an e-forum promoting broadband competition and backed by broadband providers.







Freedom is not free. Free men are not equal. Equal men are not free.




Warp Drives; Data vs. Big Science; Piracy; The Big Rain on Venus

Chaos Manor Mail Wednesday, April 29, 2015


I have many things to do, so this will be mail. I will try to deal with issues in a View. Got to run…


From my physicist friend:


Regarding the article and linked science blog posted today:

1.  One person on the blog makes the point: nobody knows for sure what is happening, so talking about warp drives is a bit premature.

2.  Other than that, most of the posts are technobabble that might do credit to a Star Trek episode but not to real scientific discourse.


But I suppose there’s hope:

NASA Warp Drive

Eh. This has been going viral on Facebook. I’ve answered a lotta questions from non-tech types. So what I’m saying is this: A couple guys do not equal NASA. And since it’s my understanding that the difference amounts to something like 10^-18 m/s, and that the tests were done in atmosphere, not in vacuum, I’m figuring it’s in the grass, and probably IS the grass.
Anyway, I’m going to just sit back and wait for some REAL evidence…which I expect to be at least as long in coming as a sustained fusion reactor, which as we all know is always “just 20 years away.”
Stephanie Osborn

“The Interstellar Woman of Mystery”

So, now they’re claiming the EmDrive is actually an FTL warp drive?


Roland Dobbins

But probably not.  This time.

Red Line

NASA EM Drive and FTL


Second-hand thanks for the tip – off to chasing links down to the original stuff again, sigh…
Quick observations just from a few clicks down the road.
1) The observations of “FTL” are apparently being inferred from anomalies in the interference patterns when sending a laser through the cavity of the EM device. Send a laser through *any* EM field and you will get a different interference pattern than your null-field control – I’ll have to dig much deeper to see how these results differ from what would normally be expected.

2) The big problem is that they have apparently *not*, so far as I am seeing, repeated their experiment in vacuo. We’ve been (knowingly) “exceeding the speed of light” in atmosphere at least since the days of Cherenkov – so I must color myself skeptical at this point. Actually, we can do it these days even in vacuo – just get two plates closer to each other than the wavelength of EM you are using. That effect has a well-described cause that does not violate Einstein in any way, however.

3) Side note, all too many of the commenters seem to think that FTL automatically implies time travel to the past, and is therefore impossible. An instantaneous trip to Alpha Centauri and back is *not* “time travel.” It is simply observing the past, just approximately four years *sooner* than you would by obeying the speed limit. We observe the past *constantly* – from the femtoseconds it takes light to reach your eyes from your monitor to the Hubble imaging the appearance of galaxies many billions of light years away. You – and the Universe – are still older than you were before. (You may end up somewhat less aged than the Universe, if your drive requires moving through the reference frame you started in – but you are still older than before.)
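The Cherenkov point in (2) is easy to put in numbers: in a medium of refractive index n, light propagates at c/n, so a particle can “exceed the speed of light” there whenever its speed exceeds c/n, with no violation of relativity. A minimal sketch (the refractive indices are approximate textbook values):

```python
# Sketch of the Cherenkov threshold: in a medium of refractive index n,
# light's phase velocity is c/n, so a particle at speed beta*c "exceeds
# the speed of light" there whenever beta > 1/n -- no relativity violated.
C = 299_792_458.0  # speed of light in vacuum, m/s

def phase_velocity(n: float) -> float:
    """Speed of light in a medium of refractive index n, in m/s."""
    return C / n

def exceeds_light_in_medium(beta: float, n: float) -> bool:
    """True if a particle moving at beta*c outruns light in the medium."""
    return beta * C > phase_velocity(n)

n_water = 1.33    # approximate index of water
n_air = 1.0003    # approximate index of air at sea level

# A 0.9c electron outruns light in water (the blue glow in reactor pools)
print(exceeds_light_in_medium(0.9, n_water))
# In air the threshold is only barely below c: beta must exceed 1/n
print(1 / n_air)
```

The threshold speed is just 1/n of c, which is why the effect is easy to see in water but demands nearly light-speed particles in thin air.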







No one in the media seems to understand that seizing a Marshall-flagged vessel is almost tantamount to seizing a US-flagged vessel.

I wonder if anyone in the Obama Administration understands this?



Roland Dobbins

And now it becomes clearer:

‘Iran’s seizure of the cargo vessel follows a maritime standoff between an Iranian cargo convoy apparently bound for Yemen and a group of American warships in the Arabian Sea. The U.S. is supporting a Saudi-led military campaign against Iranian-backed Houthi rebels in Yemen, and commanders did not want Iran to resupply the Houthis with weapons or other assistance.

After several tense days at sea that included the movement of the aircraft carrier USS Theodore Roosevelt from the Persian Gulf into the Arabian Sea, the Iranian convoy sailed east, in international waters, off the coast of Oman, according to defense officials.’


Absolute madness. The neocons are doing their best to get us into war with Iran.

Roland Dobbins

I will comment on this at length at another time.




By Christopher Booker

8:14PM BST 25 Apr 2015

Last month, we are told, the world enjoyed “its hottest March since records began in 1880”. This year, according to “US government scientists”, already bids to outrank 2014 as “the hottest ever”. The figures from the US National Oceanic and Atmospheric Administration (NOAA) were based, like all the other three official surface temperature records on which the world’s scientists and politicians rely, on data compiled from a network of weather stations by NOAA’s Global Historical Climate Network (GHCN).

But here there is a puzzle. These temperature records are not the only ones with official status. The other two, Remote Sensing Systems (RSS) and the University of Alabama (UAH), are based on a quite different method of measuring temperature data, by satellites. And these, as they have increasingly done in recent years, give a strikingly different picture. Neither shows last month as anything like the hottest March on record, any more than they showed 2014 as “the hottest year ever”.

An adjusted graph from the Goddard Institute for Space Studies


Back in January and February, two items in this column attracted more than 42,000 comments to the Telegraph website from all over the world. The provocative headings given to them were “Climategate the sequel: how we are still being tricked by flawed data on global warming” and “The fiddling with temperature data is the biggest scientific scandal”.

My cue for those pieces was the evidence multiplying from across the world that something very odd has been going on with those official surface temperature records, all of which ultimately rely on data compiled by NOAA’s GHCN. Careful analysts have come up with hundreds of examples of how the original data recorded by 3,000-odd weather stations has been “adjusted”, to exaggerate the degree to which the Earth has actually been warming. Figures from earlier decades have repeatedly been adjusted downwards and more recent data adjusted upwards, to show the Earth having warmed much more dramatically than the original data justified.

FINALLY. That’s a pretty decent board of investigators, and NOT composed entirely of climatologists — which is to say, it isn’t the foxes guarding the henhouse. IIRC from my reading, there’s actually only one professional climatologist on that investigative committee; the rest are experts in areas like data reduction and statistics.

This should get very interesting, and pretty fast.

Also the two graphs, composed of raw data and “adjusted” data, taken together are pretty damning. I went through ’em last night and ascertained that the earlier temps were shoved downward by some 1.25C, and the most recent temps have been pushed upward by the same amount. I’d need to sit down and dink (and preferably look at the actual numbers, not just charts) to figure out the “grading curve” they used to create the “adjusted” chart across the entire timeframe.
Stephanie Osborn

“The Interstellar Woman of Mystery”
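Ms. Osborn’s raw-versus-adjusted exercise can be sketched in a few lines of code. The series below are invented for illustration only — they are not actual GHCN or GISS numbers — but they show how differencing the two charts exposes the “grading curve” she describes:

```python
# Hypothetical sketch of comparing a raw temperature series against an
# "adjusted" one. These numbers are invented for illustration; they are
# NOT actual GHCN/GISS data.
raw      = [14.2, 14.1, 14.3, 14.4, 14.5, 14.6]    # raw annual means, deg C
adjusted = [13.0, 13.1, 13.4, 14.6, 15.7, 15.9]    # "adjusted" series, deg C

# Per-year adjustment: negative = early record cooled, positive = warmed.
deltas = [round(a - r, 2) for r, a in zip(raw, adjusted)]
print(deltas)

# Cooling the early years and warming the recent ones steepens the trend:
raw_trend = raw[-1] - raw[0]            # ~0.4 C over the span
adj_trend = adjusted[-1] - adjusted[0]  # ~2.9 C over the span
print(round(raw_trend, 2), round(adj_trend, 2))
```

Plotting `deltas` against year would reconstruct the adjustment curve directly, which is the check she proposes doing against the actual numbers rather than the charts.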

Perhaps there will be some adult supervision?


Space Solar Power Initiative (SSPI)
“PASADENA, Calif. – April 20, 2015 – Northrop Grumman Corporation (NYSE:NOC) has signed a sponsored research agreement with the California Institute of Technology (Caltech) for the development of the Space Solar Power Initiative (SSPI). Under the terms of the agreement, Northrop Grumman will provide up to $17.5 million to the initiative over three years.
Working together, the team will develop the scientific and technological innovations necessary to enable a space-based solar power system capable of generating electric power at cost parity with grid-connected fossil fuel power plants. SSPI responds to the engineering challenge of providing a cost-competitive source of sustainable energy. SSPI will develop technologies in three areas: high-efficiency ultralight photovoltaics; ultralight deployable space structures; and phased array and power transmission.”
Well, other than the fact that you and others have been promoting this for decades, it is a step in the right direction. When you can get a major corporation on board to start spending money to make this happen, someone has to be thinking this might actually work. Spending just $17.5 million might seem a bit small, but for a university it will help enable some very big experiments. I think most of the component technology already exists, so what we may see come out of this is a proof of concept that could be shipped to the space station for testing. Hopefully, it will lead to a full-sized station being built.
Braxton Cook

You do understand that there are tax subsidies at work here?


There Will Be War vol 1 & 2

Dr Pournelle

Thank you for making There Will Be War vol 1 & 2 available for Kindle. Bought ’em. Posted notice to the Heinlein Forum on Facebook.

Live long and prosper

h lynn keith

Thanks for giving me another excuse to promote them. I’ll slow down on that now…


Re: “Hinky” in Action


A quick little article in which Schneier makes the point quite well.

“This is what works. Not profiling. Not bulk surveillance. Not defending against any particular tactics or targets. In the end, this is what keeps us safe.”




Dear Dr. Pournelle,

It appears NASA has some interesting ideas for colonizing Venus:

NASA researchers have come up with a plan to send piloted, helium-filled airships cruising through the Venusian atmosphere. The idea, called the High Altitude Venus Operational Concept (HAVOC), could eventually lead to the permanent settlement of Earth’s hellishly hot sister planet, its developers say.

Venus is another potential target for human exploration, say Jones and his colleague Dale Arney, also of NASA Langley. At first blush, this assertion may seem surprising; the planet’s surface temperature is about 860 degrees Fahrenheit (460 degrees Celsius) — hot enough to melt lead — and its atmospheric pressure at ground level is a staggering 90 times that of Earth.

But HAVOC would avoid the surface, instead hovering about 30 miles (50 kilometers) up in Venus’ thick, carbon-dioxide-dominated air. Up there, conditions are much more manageable; atmospheric pressure is roughly what we’re used to, and the average temperature is 167 F (75 C).

Venus, which is about the same size as Earth, is also the closest planet to our own, making it the easiest (or at least the quickest) to get to.

I find the idea charming. If nothing else, it would make good novel fodder.


Brian P.

See Poul Anderson’s “The Big Rain” novelette from fifty years ago…  Or my speculation about terraforming Venus with catalysts and genetically engineered forms.


