Marvin Minsky, RIP; Recovering fairly rapidly, apologies for the delay; climate change; the end of Eastman Kodak; Robots; Extinct Aliens

Chaos Manor View, Tuesday, January 26, 2016

“This is the most transparent administration in history.”

Barack Obama

Liberalism is a philosophy of consolation for Western Civilization as it commits suicide.

bubbles

I have recovered from bronchitis, and Roberta is over pneumonia. This will be an intermittent journal today as I try to catch up on things, and get it done before I run completely out of energy.

First item, which I got yesterday but became aware of only a few minutes ago.

Marvin Minsky, RIP.

<http://www.nytimes.com/2016/01/26/business/marvin-minsky-pioneer-in-artificial-intelligence-dies-at-88.html>

—————————————

Roland Dobbins

I have known Marvin since the late ’60s, and we were good friends the entire time. I have always been an admirer, but like Marvin’s longtime friend the late John McCarthy, he treated me like a colleague rather than a junior associate. We were colleagues on several studies, in particular the NASA study on self-replicating systems during the Carter Administration, as well as others. We met whenever he came to Los Angeles over the years. For several years we met quarterly with John McCarthy and Dick Feynman; why Marvin invited me in that company is not entirely clear, but he continued to do so.

I’ve been a bit out of things for a year, and he had been retired for years; I last had lunch with him and Gloria at his home in 2014, I guess; too long ago. I’m rambling. There are hundreds of Marvin Minsky stories, particularly of the moments when a student or colleague was suddenly enlightened by a single phrase or action of Marvin’s; if it happened to you, I guarantee you would never forget it.

He would not have appreciated prayers while he was living, but he will have mine now. Rest In Peace.

bubbles

 

https://www.washingtonpost.com/news/speaking-of-science/wp/2016/01/25/marvin-minsky-1927-2016/

“The world has lost one of its greatest minds in science.” R.I.P. Marvin Minsky (WP)

 

By Joel Achenbach January 26 at 9:45 AM


Marvin Minsky, a legendary cognitive scientist who pioneered the field of artificial intelligence, died Sunday at the age of 88. His death was announced by Nicholas Negroponte, founder of the MIT Media Lab, who distributed an email to his colleagues:

With great sadness, I have to report that Marvin Minsky died last night. The world has lost one of its greatest minds in science. As a founding faculty member of the Media Lab he brought equal measures of humour and deep thinking, always seeing the world differently. He taught us that the difficult is often easy, but the easy can be really hard.

In 1956, when the very idea of a computer was only a couple of decades old, Minsky attended a two-month symposium at Dartmouth that is considered the founding event in the field of artificial intelligence. His 1960 paper, “Steps Toward Artificial Intelligence,” laid out many of the routes that researchers would take in the decades to come. He founded the Artificial Intelligence lab at MIT, and wrote seminal books — including “The Society of Mind” and “The Emotion Machine” — that colleagues consider essential to understanding the challenges in creating machine intelligence.

You get a sense of his storied and varied career from his home page at MIT:

In 1951 he built the SNARC, the first neural network simulator. His other inventions include mechanical arms, hands and other robotic devices, the Confocal Scanning Microscope, the “Muse” synthesizer for musical variations (with E. Fredkin), and one of the first LOGO “turtles”. A member of the NAS, NAE and Argentine NAS, he has received the ACM Turing Award, the MIT Killian Award, the Japan Prize, the IJCAI Research Excellence Award, the Rank Prize and the Robert Wood Prize for Optoelectronics, and the Benjamin Franklin Medal.

One of his former students, Patrick Winston, now a professor at M.I.T., wrote a brief tribute to his friend and mentor:

Many years ago, when I was a student casting about for what I wanted to do, I wandered into one of Marvin’s classes. Magic happened. I was awed and inspired. I left that class saying to myself, “I want to do what he does.”

M.I.T.’s obituary of Minsky explains some of the professor’s critical insights into the challenge facing anyone trying to replicate or in some way match human intelligence within the constraints of a machine:

Minsky viewed the brain as a machine whose functioning can be studied and replicated in a computer — which would teach us, in turn, to better understand the human brain and higher-level mental functions: How might we endow machines with common sense — the knowledge humans acquire every day through experience? How, for example, do we teach a sophisticated computer that to drag an object on a string, you need to pull, not push — a concept easily mastered by a two-year-old child?

His field went through some hard times, but Minsky thrived. Although he was an inventor, his great contributions were theoretical insights into how the human mind operates.

In a letter nominating Minsky for an award, Prof. Winston described a core concept in Minsky’s book “The Society of Mind”: “[I]ntelligence emerges from the cooperative behavior of myriad little agents, no one of which is intelligent by itself.” If a single word could encapsulate Minsky’s professional career, Winston said in a phone interview Tuesday, it would be “multiplicities.”

The word “intelligence,” Minsky believed, was a “suitcase word,” Winston said, because “you can stuff a lot of ideas into it.”

His colleagues knew Minsky as a man who was strikingly clever in conversation, with an ability to anticipate what others are thinking — and then conjure up an even more intriguing variation on those thoughts.

Journalist Joel Garreau on Tuesday recalled meeting Minsky in 2004 at a conference in Boston on the future evolution of the human race: “What a character!  Hawaiian shirt, smile as wide as a frog’s, waving his hands over his head, a telescope always in his pocket, a bag full of tools on his belt including what he said was a cutting laser, and a belt woven out of 8,000-pound-test Kevlar which he said he could unravel if he ever needed to pull his car out of a ravine.”

Minsky and his wife Gloria, a pediatrician, enjoyed a partnership that began with their marriage in 1952. Gloria recalled her first conversation with Marvin: “He said he wanted to know about how the brain worked. I thought he is either very wise or very dumb. Fortunately it turned out to be the former.”

Their home became a repository for all manner of artifacts and icons. The place could easily merit status as a national historical site. They welcomed a Post reporter into their home last spring.

They showed me the bongos that physicist Richard Feynman liked to play when he visited. Looming over the bongos was a 1950s-vintage robot, which was literally straight out of the imagination of novelist Isaac Asimov — he was another pal who would drop in for the Minsky parties back in the day. There was a trapeze hanging over the middle of the room, and over to one side there was a vintage jukebox. Their friends included science-fiction writers Arthur C. Clarke and Robert Heinlein and filmmaker Stanley Kubrick.

As a young scientist, Marvin Minsky lunched with Albert Einstein but couldn’t understand him because of his German accent. He had many conversations with the computer genius John von Neumann, of whom he said:

“He always welcomed me, and we’d start talking about something, automata theory, or computation theory. The phone would ring every now and then and he’d pick it up and say, several times, ‘I’m sorry, but I never discuss non-technical matters.’ I remember thinking, someday I’ll do that. And I don’t think I ever did.”

Minsky said it was Alan Turing who brought respectability to the idea that machines could someday think.

“There were science-fiction people who made similar predictions, but no one took them seriously because their machines became intelligent by magic. Whereas Turing explained how the machines would work,” he said.

There were institutions back in the day that were eager to invest in intelligent machines.

“The 1960s seems like a long time ago, but this miracle happened in which some little pocket of the U.S. naval research organization decided it would support research in artificial intelligence and did in a very autonomous way. Somebody would come around every couple of years and ask if we had enough money,” he said — and flashed an impish smile.

But money wasn’t enough.

“If you look at the big projects, they didn’t have any particular goals,” he said. “IBM had big staffs doing silly things.”

But what about IBM’s much-hyped Watson (cue the commercial with Bob Dylan)? Isn’t that artificial intelligence?

“I wouldn’t call it anything. An ad hoc question-answering machine.”

Was he disappointed at the progress so far?

“Yes. It’s interesting how few people understood what steps you’d have to go through. They aimed right for the top and they wasted everyone’s time,” he said.

Are machines going to become smarter than human beings, and if so, is that a good thing?

“Well, they’ll certainly become faster. And there’s so many stories of how things could go bad, but I don’t see any way of taking them seriously because it’s pretty hard to see why anybody would install them on a large scale without a lot of testing.”

There is a very good tribute to Marvin by Steve Levy at https://medium.com/backchannel/marvin-minsky-s-marvelous-meat-machine-f436aec02fdf#.40ex3d27d

 

bubbles

I am not really up to original work, but there is a very great deal worth commenting on.

bubbles

On that 2015 Record Warmest Claim | Roy Spencer, PhD.

http://www.drroyspencer.com/2016/01/on-that-2015-record-warmest-claim/

This should be definitive, but of course it will not be, as it explains. It is now clear that we do not know enough to guide multi-billion-dollar policies, and it is likely that attempts to do so will harm the economy and thus reduce the alternatives available when we do know more. The global warming actions are much like the endless California bullet train – enterprises which exist only for the interests of their crews.

bubbles

What is really interesting to me is how it drove change in satellite reconnaissance, or perhaps how digital imagery in recon bled to the civilian market and became ubiquitous.

Tracy

In just one hour, two Bell Labs scientists had a breakthrough that won the Nobel prize — and changed photography forever

Digital photography is everywhere. (Image: William Warby/Flickr)

At Bell Labs in 1969, two scientists were told they had to make progress on a key research project or they would lose their funding. After just an hour of work, they had a breakthrough.

This was a milestone in the invention of digital photography, one of the most exciting inventions of modern times. 

It has given mankind access to invaluable information about space and hugely advanced medical science. And it has completely transformed the daily life of millions around the globe. We can — and do — document our lives on a minute-by-minute basis.

Here’s how the story unfolded:

In the winter of 1975, Steven Sasson, a young engineer working in the Applied Research Lab at Kodak, tested out a new device for the first time. Now known as the first true digital camera, it was cobbled together using leftover parts he found in the lab. Thirty-five years later, President Obama awarded Sasson the National Medal of Technology and Innovation for his invention.

RE: In just one hour, two Bell Labs scientists had a breakthrough that won the Nobel prize — and changed photography forever (BI)

The article has most of the overall concepts correct while being wrong on the details. CCDs may have opened the door, but they were not the device that started the household digital-imaging revolution. And it was not clinging to film that caused Kodak’s downfall.

CCDs were expensive from the beginning and remain costly due to manufacturing limitations. The manufacturing method is obsolete and limited to 4” wafers, so they are made as custom jobs in old factories. Almost all cameras, except those for scientific or other specialized work, use CMOS-based chips, which are far cheaper to manufacture. Until the late ’90s, CMOS for imaging was considered at best a toy for very low-end consumer cameras, because its high noise levels produced very poor images. Kodak (and a lot of other companies) assumed that the high noise was inherent to the CMOS design and focused their research and manufacturing on CCD-based technologies. Then Canon discovered a method to overcome the CMOS noise problem, which prompted most other camera manufacturers to start researching this area, and soon cheap CMOS chips that could produce images just as good as film started flooding the market.

This last development was what doomed Kodak: they had written off CMOS for what were, at the time, very good reasons, but were caught off guard by a manufacturing breakthrough. It was this disruptive technology that devastated them, not ignoring the digital market. Even today, Kodak CCD chips are in high demand for very high-end specialized cameras.

Gene Horr
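To get a feel for why sensor noise mattered so much, here is a minimal sketch of the standard shot-noise-plus-read-noise model — my illustration, not Gene’s, and the read-noise figures are hypothetical, chosen only to show how a noisy chip ruins dim pixels while bright ones look fine:

    # Minimal pixel SNR sketch: photon shot noise plus sensor read noise.
    # Read-noise values below are hypothetical, for illustration only.
    import math

    def snr_db(photons, read_noise_e):
        # Shot noise variance equals the photon count (Poisson statistics);
        # read noise adds in quadrature.
        noise = math.sqrt(photons + read_noise_e ** 2)
        return 20 * math.log10(photons / noise)

    for photons in (50, 500, 5000):           # dim, medium, bright pixel
        quiet = snr_db(photons, 10)           # low-read-noise chip
        noisy = snr_db(photons, 100)          # noisy early-CMOS-like chip
        print(f"{photons:5d} photons: quiet chip {quiet:5.1f} dB, noisy chip {noisy:5.1f} dB")

At 50 photons the noisy chip is below unity SNR (about −6 dB versus +12 dB); at 5,000 photons the gap nearly closes — which is why early CMOS was dismissed as a toy until the read-noise problem was solved.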

bubbles

Just in case you thought climatology was a modern science

Dear Jerry:

Climate “scientists” claim very high precision in their knowledge of the temperature and other climate parameters from hundreds and thousands of years ago.

Yet today we learn they don’t even know how much snow falls in a snowstorm, especially if they lose their piece of plywood that they call a “snow board.”

(And I’ll bet most people think they measure snowfall with electronic accuracy using some kind of advanced instrumentation technology. Nope, they do it the same way you and I do, with a board in the snow.)

——————–

From:

http://www.dailymail.co.uk/news/article-3415135/Washington-s-official-snowfall-17-8-inches-way-weather-observers-LOST-measuring-device-blizzard.html

Washington’s official snowfall of 17.8 inches is way off because weather observers LOST their measuring device during the blizzard

‘Everyone has to understand that measuring snow in a blizzard is a tough thing to do,’ Richards said. ‘We would like it to be as accurate as possible,’ he said.

‘But it’s an inexact science.’

Susan Buchanan, a National Weather Service spokeswoman, said on Sunday a team of experts would conduct a ‘comprehensive assessment of how snow measurements are taken’ at other locations in order to make suggestions about how to better calculate numbers in the future.

Some residents are questioning why Washington’s official weather records are being measured in Virginia since it is not representative of the city.

‘People use National Airport as the weather centerpiece of the entire region, but it’s the warmest location in the entire region,’ said Bob Leffler, a retired National Weather Service climatologist, to The Washington Post.

‘It’s just not a good site.’

The National Weather Service measures the snow with a snow board which is oftentimes just made of plywood.

The measuring guidelines require the board to be placed on the ground before the storm so that it does not move.

The snow is meant to be measured every six hours and then the board is supposed to be wiped clear.

However, the board was buried in the heavy snowstorm and the observer could no longer find it so he took a few snow depth measurements and averaged them.

‘Snow boards are the standard to use – when you can use them,’ Richards said.

‘Snow boards are just not effective in a storm that has very strong winds; it’s just going to blow off.’

It was not snowfall that was reported to the National Weather Service, rather it was snow depth.

He added that the snow totals are ‘perishable’ if not measured by guidelines and that a snow board is necessary.

——————–

Best regards and I hope you will be well very soon, –Harry M.

Getting temperatures accurate to a tenth of a degree is possible but difficult and expensive; most precise measurements are. Those difficulties are ignored in most climate models. We know the Hudson and the Thames used to freeze solid in winter; now they do not. We have records of when spring thaws took place (to the day) for a hundred and fifty years. Getting more precise numbers requires averages, and that requires assumptions and adjustments.
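To make the point concrete, here is a toy sketch — all numbers invented — of how a “global average” is assembled from station anomalies. Every step (the baseline, the area weights, the adjustments for station moves and instrument changes) is an assumption:

    # Toy global-mean-anomaly calculation; every figure is invented.
    stations = [
        # (anomaly vs. chosen baseline in deg C, area weight, adjustment)
        (0.8, 0.2,  0.00),   # urban station, perhaps warm-biased
        (0.3, 0.5,  0.10),   # rural station, adjusted for an instrument change
        (0.5, 0.3, -0.05),   # coastal station
    ]

    def global_mean(data):
        total_weight = sum(w for _, w, _ in data)
        return sum((anom + adj) * w for anom, w, adj in data) / total_weight

    print(f"global anomaly: {global_mean(stations):+.2f} C")
    # Nudge any adjustment by a tenth of a degree and a "record" year can
    # appear or vanish; the assumptions do real work.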

 

bubbles

 

http://www.futuretimeline.net/blog/2016/01/22.htm#.VqTr78fTbto

Brain implant will connect a million neurons with superfast bandwidth

22nd January 2016


A neural interface being created by the United States military aims to greatly improve the resolution and connection speed between biological and non-biological matter.


The Defense Advanced Research Projects Agency (DARPA) – an agency of the U.S. Department of Defense – has announced a new research and development program known as Neural Engineering System Design (NESD). This aims to create a fully implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the human brain and the digital world.

The interface would serve as a translator, converting between the electrochemical language used by neurons in the brain and the ones and zeros that constitute the language of information technology. A communications link would be achieved in a biocompatible device no larger than a cubic centimetre. This could lead to breakthrough treatments for a number of brain-related illnesses, as well as providing new insights into possible future upgrades for aspiring transhumanists.

“Today’s best brain-computer interface systems are like two supercomputers trying to talk to each other using an old 300-baud modem,” says Phillip Alvelda, program manager. “Imagine what will become possible when we upgrade our tools to really open the channel between the human brain and modern electronics.”

Among NESD’s potential applications are devices that could help restore sight or hearing, by feeding digital auditory or visual information into the brain at a resolution and experiential quality far higher than is possible with current technology.


Neural interfaces currently approved for human use squeeze a tremendous amount of information through just 100 channels, with each channel aggregating signals from tens of thousands of neurons at a time. The result is noisy and imprecise. In contrast, the NESD program aims to develop systems that communicate clearly and individually with any of up to one million neurons in a given region of the brain.

To achieve these ambitious goals and ensure the technology is practical outside of a research setting, DARPA will integrate and work in parallel with numerous areas of science and technology – including neuroscience, synthetic biology, low-power electronics, photonics, medical device packaging and manufacturing, systems engineering, and clinical testing. In addition to the program’s hardware challenges, NESD researchers will be required to develop advanced mathematical and neuro-computation techniques, to transcode high-definition sensory information between electronic and cortical neuron representations and then compress and represent the data with minimal loss.

The NESD program aims to recruit a diverse roster of leading industry stakeholders willing to offer state-of-the-art prototyping, manufacturing services and intellectual property. In later phases of the program, these partners could help transition the resulting technologies into commercial applications. DARPA will invest up to $60 million in the NESD program between now and 2020.
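The raw numbers invite a back-of-envelope check. Assuming — my assumptions, not DARPA’s — an average firing rate of ten spikes per second and a few bytes per recorded event, a million individually resolved neurons implies a stream that makes the “300-baud modem” comparison vivid:

    # Back-of-envelope bandwidth for the NESD goal. Firing rate and event
    # size are assumptions for illustration, not DARPA specifications.
    neurons         = 1_000_000
    spikes_per_sec  = 10          # assumed mean firing rate per neuron
    bytes_per_event = 4           # assumed timestamp-plus-ID encoding

    raw_bytes_per_sec = neurons * spikes_per_sec * bytes_per_event
    print(f"raw event stream: {raw_bytes_per_sec / 1e6:.0f} MB/s")   # 40 MB/s
    print(f"300-baud modem:   ~{300 // 8} bytes/s")                  # ~37 bytes/s

Roughly a millionfold gap — which is presumably why the program calls for compressing and transcoding the data on the device rather than shipping raw spikes.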

Marvin would of course have found this interesting but not surprising. Whether Roger Penrose, who rejects “strong AI,” or Minsky, who saw no fundamental difference between “human” and “artificial” intelligence and cognition, will prevail we do not know; I have my own notions, which are somewhere in between.

bubbles

And as we contemplate AI

 

Manpower’s CEO just gave us an awesome solution to the ‘robots taking human jobs’ conundrum

Jonas Prising, CEO and Executive Chairman of Manpower, spoke to Business Insider in Davos for the WEF meeting.

Over 2,500 of the world’s most powerful people have talked about the risks and opportunities surrounding “The Fourth Industrial Revolution” this week at the World Economic Forum in Davos, Switzerland.

The biggest risk that has been pointed out time and time again when Business Insider spoke to the bosses of the largest corporations in the world is two-pronged:

1. The tech revolution will lead to a net loss of over five million jobs in 15 major developed and emerging economies by 2020, as identified by WEF in its report “The Future of Jobs.”

2. Unskilled workers will most likely be affected by the job cull in favour of robots and automation, but at the same time companies will struggle to meet the WEF’s estimate of 2.1 million new jobs created, mainly in more specialised areas such as computing, maths, architecture, and engineering, because of the lack of digital skills. Adecco’s CEO told us how 900,000 jobs in the EU might end up vacant due to a lack of digital skills, while UBS said in a white paper that income inequality is likely to grow.

But when Jonas Prising, CEO and executive chairman of one of the world’s biggest HR consultancy firms Manpower, sat down with Business Insider on the sidelines at the WEF conference, he told us about a pretty awesome solution to what companies can do, without relying on governments to step in:

“Going forward, because we believe in the notion of learnability, companies should use the concept of getting more people into iterative training. We believe we have the winning formula. You make sure you [in your company] allow staff to go to work, then take time out for training, then allow them to go back into the company immediately after. You can do this a few times. Not only does this help with staff retention but it allows you to skill-up your workforce.”


Basically, Prising is saying that companies should continually find ways to skill up their workforce by letting them take time out to acquire digital skills and then return to work. Not only will this keep people in employment but it will also greatly benefit the corporations because their workforces will develop more modern and cutting edge skills.

On top of that, it will also “take the strain off” the education system, where traditionally people just go to school, college, then university and believe that that is the end of their education.

Manpower is very well placed to make this observation, after all the group is one of the largest HR consultancies in the world with a market capitalisation of $5.2 billion (£3.6 billion).

As Prising pointed out to us, Manpower is “not only observing the transformation of the workforce during ‘The Fourth Industrial Revolution,’ we are actively participating in it.”

But while the WEF is warning of the risks of job losses resulting from greater use of robots and automation, Prising is “optimistic” that the digital revolution will not kill off as many jobs as estimated — provided companies change the way they develop their workforce and stop thinking that skilling up or educating stops at university.

“It’s a dangerous prediction to make about what jobs are going to be destroyed because with technology changes it doesn’t necessarily mean it eliminates a job completely, it can enhance it. We just need to make sure we train men and women to gain those extra skills throughout their careers, not just when they first start their jobs.”

I have estimated that by 2020, 50% of all gainful employment jobs can be replaced by robots costing not much more than the annual wage paid to the human holding that job; the robots will require no more than one human for every dozen robots, and will have a useful life of over seven years. Those numbers will change rapidly after 2025.
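A minimal sketch of the arithmetic behind that estimate — wage and price figures are placeholders, not data:

    # Rough robot-vs-human cost comparison; all figures are placeholders.
    wage        = 40_000      # assumed annual wage of the replaced worker
    robot_price = 45_000      # "not much more than the annual wage"
    robot_life  = 7           # years of useful service
    minders     = 1 / 12      # one human tends a dozen robots

    robot_per_year = robot_price / robot_life + minders * wage
    print(f"human: ${wage:,.0f}/year")
    print(f"robot: ${robot_per_year:,.0f}/year")   # about $9,760/year

On those placeholder numbers the robot does the job for roughly a quarter of the human’s wage, which is why the economics bite even before the machines get cheaper.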

 

bubbles


“The mystery of why we haven’t yet found signs of aliens may have less to do with the likelihood of the origin of life or intelligence and have more to do with the rarity of the rapid emergence of biological regulation of feedback cycles on planetary surfaces.”

<http://astronomy.com/news/2016/01/the-aliens-are-silent-because-they-are-extinct>

—————————————

Roland Dobbins 

An intriguing hypothesis.  Things have to be just right…
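The feedback-regulation idea is easy to demonstrate in miniature. Here is a toy Daisyworld-style sketch — every parameter invented, a caricature of the hypothesis rather than anyone’s model — in which reflective life spreads as the planet warms and cools it back down; with the feedback off, the same forcing just keeps warming:

    # Toy Daisyworld-style regulation; every parameter is invented.
    def run(feedback, steps=200):
        temp, cover = 40.0, 0.0     # deg C; fraction covered by reflective life
        for _ in range(steps):
            if feedback:
                # life spreads as it warms, raising albedo and cooling
                target = min(1.0, max(0.0, (temp - 10) / 40))
                cover += 0.1 * (target - cover)   # biota respond gradually
            temp += 0.5 - 1.5 * cover             # steady warming minus albedo cooling
        return temp

    print(f"with feedback:    {run(True):6.1f} C")   # settles near ~23 C
    print(f"without feedback: {run(False):6.1f} C")  # warms without limit

A planet whose biology never closes that loop fast enough cooks (or freezes) before anything evolves to send signals — which is the bottleneck the article proposes.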

 

bubbles

Microsoft Removes Research from the Ivory Tower


Microsoft CEO Satya Nadella speaks during the company’s annual shareholders meeting, Dec. 2, 2015 in Bellevue, Wash.

Good morning. Microsoft Corp., competing with Alphabet Inc.‘s Google and Facebook Inc. to establish “the strongest hold over people’s digital lives,” is removing its research group from the isolation of the Ivory Tower, Bloomberg reports.

The goal is to integrate research into product development and the rest of the business, and reflects the goals of CEO Satya Nadella, according to Bloomberg. The story offers an account of how Mr. Nadella was impressed two years ago with a demonstration of how artificial intelligence and speech recognition could be used to translate a live conversation into another language.

“Nadella told the team he wanted the tool combined with Skype and ready in time to show off at his first public speech three months later,” Bloomberg reports. “This is not how Microsoft typically works. As Nadella, a 24-year veteran of the company, would have known, the process of turning a Microsoft Research project into a product would often happen slowly, if at all.”

The old Microsoft research model reflected the ideals of an earlier era in American business, one that produced remarkable breakthroughs at Bell Labs, as well as at other companies. In those days, research labs could operate in a more academic fashion, and sometimes had the feel of national institutions. Mobile phone technology emerged from such a culture at Bell Labs. But it took many years of work and other companies to fully commercialize the technology. For better or worse, that culture of pure science largely has been supplanted with a more commercial mindset.

bubbles

Freedom is not free. Free men are not equal. Equal men are not free.

bubbles