Another Step Farther Out; Priorities; Moore’s Law Continues; and other matters.

Chaos Manor View Monday, March 30, 2015


Eric has got a publishable manuscript of Another Step Farther Out, a compendium of columns from Galaxy, InfoWorld, New Destinies, scientific magazines, and other places where I published in the old days. You’d think it ought to be dull, but it isn’t, and I’m working on it because I got a lot of it right; we just haven’t done some of that stuff yet. The difference is that when I wrote it we couldn’t quite do it yet; now we can, we just don’t.

One column shows the confusion of climate scientists, divided into “We’re warming!” and “The Ice is coming back!” groups. No Believers and Deniers. Just science. But that was before all that money went into building the Warming consensus…

I’ll have to write some comments on where I got it wrong, but the horror is that I got a lot right, and we still aren’t doing it.


I’m still experimenting with the Surface, and I still have to update a lot of the Apple equipment; stand by; but Another Step is still important, more than I thought, so it gets moved up a few notches. I’m also moving 2020 Visions up a bit in importance. That’s a lot to do, and I don’t work as fast as I used to, but we keep going, thanks to Eric and Peter and Rick and Brian and Alex and Dan and my other hard working advisors. And the readers: I still say I have the most interesting mail of anyone I know.

From Another Step Farther Out:

THE STUDY SYNDROME

Lincoln, Nebraska doesn’t sound like much of a place for changing human destiny, even though it is said to have the highest “quality of life” in the U.S. It’s a nice little city with a good convention center, where, this spring, a very important event took place.

It was a five-day formal report by the Department of Energy and NASA on the Solar Power Satellite (SSPS) concept. That should have changed the world.

In the late seventies, Stefan Possony and I presented a paper at the annual meeting of the American Association for the Advancement of Science. We tried, successfully I think, to show that there are very good reasons to believe the human race will be around for 100 billion years.

Roll that number around on your tongue a bit. One hundred billion years. That is our future. Compared to it, our past is minuscule, vanishing, a tiny drop in the bucket. We are so very young and so much lies ahead of us; our only certain doom is the end of the universe, and who knows, after a hundred billion years, perhaps we will know how to prevent that too. It may be that as a species we have no inevitable doom; certainly 100 billion years is, for those of us here and now, close enough to eternity.

But to realize anything like the potential, we must outlive our planet. We must outlive our sun. Eventually we will outlive our galaxy.

None of this is impossible. We can today conceive of interstellar ships, although it will be some time before we can build them; meanwhile, the first step is within our grasp right now. We can, if we will, make our home not Only One Earth, but in the solar system at large. In this generation, in this decade, we could put a settlement on the moon. Not a base, or an outpost; but a settlement, a colony; a home. We know how to do this now, with today’s technology, for about what we spend on cosmetics, less than we spend on tobacco.

It is an idea whose time has come; and SSPS gives us another reason to start.


Voices: A peek into the future of tech (USA Today)

Rick Jervis, USA TODAY 9:05 a.m. EDT March 29, 2015

MIAMI — Anyone interested in what the dot.com future may hold would have done well by strolling through the second floor of the InterContinental Hotel here recently.

There, mingling between the Disney World display and the CNN en Español booth, they would have found an intriguing mix of media titans, marketing gurus, start-up entrepreneurs and YouTube careerists — all part of and aimed at the country’s burgeoning Latino population.

They were there as part of Hispanicize 2015, an annual gathering of the nation’s top Latino media execs, journalists and new-media entrepreneurs for a week of workshops, networking and parties. I was invited to the conference to speak on a panel on race and media.

A few days prior, I had covered the SXSW Interactive conference in Austin. It was interesting traveling from SXSW, one of the premier tech gatherings in the country but one still struggling to be more diverse, to a similar, albeit smaller, gathering of techies flush with diversity. Hispanicize, in fact, is often referred to as the “Latino SXSW.”

In Austin, panel discussions explored the myriad reasons Silicon Valley firms — especially at the managerial level — aren’t more black, brown and female. In Miami, those very diverse faces that have eluded the upper echelons of Yahoo and Facebook shared ideas and unfurled their cyber strategies.

Hispanics make up just 4% of managerial positions at Yahoo and even fewer at Facebook and Google. That number drops even further for African Americans. Black and Hispanic professionals — such as lawyers, accountants and computer scientists — make up 5% of all professionals at Facebook, Google and Yahoo but 13% of similar professionals nationwide.

Meanwhile, Latinos are the nation’s largest minority, numbering 53 million in the USA, and its fastest growing. By 2060, they’re expected to make up one-third of the total population, with more than $1 trillion in spending power.

Attendees at Hispanicize didn’t seem overly concerned with those disparate stats. They appeared less anxious about climbing corporate ladders at Silicon Valley and more focused on starting their own empires.

Hispanicize is the brainchild of Manny Ruiz, whom I knew from our days working at the campus newspaper at Miami-Dade Community College two decades ago. Ruiz left journalism to start a Hispanic-focused public relations firm, sold that and used the proceeds to launch Hispanicize. The gathering has grown from 260 attendees at its inaugural event five years ago to more than 2,000 today.

“We’re in a new era where there’s so much opportunity for everyone,” Ruiz told me. “You don’t have to be in Silicon Valley anymore to succeed.”

It was a mantra repeated throughout the conference. Entrepreneurs shared stories of how they’ve cobbled careers out of blogs and YouTube channels with names like Rocking Mama and Crafty Chica, drawing hundreds of thousands of loyal online followers and the attention of major brands willing to pay handsomely for that coveted audience. There was very little talk of trying to break into Google.

Alejandra Ayala, 29, started her fashion/beauty blog and YouTube channel, known as Chulavision, two years ago. She began in English, with just 1,200 subscribers. But when Ayala, who’s Mexican-American, started posting videos in Spanish, her channel quickly swelled to 123,000 subscribers. Her YouTube channel has since captured 5 million views.

Ayala said she doesn’t know how far she’ll take her project. But the fact that brands are reaching out to her tells her something about the direction of online enterprises.

“Slowly, they’re starting to notice us,” Ayala said about both her loyal following and corporations willing to pay for a few seconds of their time. “They’re starting to realize the impact we can have.”

Asked about the lack of diversity in Silicon Valley, Ayala smiled.

“If someone doesn’t want to give it to us,” she said, “we’re going to find a way to get it.”

An interesting attitude…


Windows 10 Or OS X: Can Hardware Make The Difference? (Forbes)

The Surface Pro 3 is a design you won’t get from Apple.

Would you switch from Mac to Windows to get access to “better” hardware?

I resolved that dilemma long ago by becoming, more or less, operating system agnostic.

There is one stubborn, undeniable fact in favor of being agnostic: One side offers more choice. That would be Windows, of course. And that means that there are sometimes better hardware options. And with Windows 10 on the horizon, that becomes even more enticing.

Lots of businesses are already agnostic, i.e., either Macs or PCs are allowed.  Though that doesn’t necessarily favor Windows PCs (BYOD — Bring Your Own Device — policies are trending to non-Windows platforms), I’ve been moving in the other direction.

Barring job-specific platform requirements, the experience on Macs and PCs is increasingly the same for me, particularly if you spend much of your time inside Google’s Chrome browser, which I do.

(And the virus or malware argument against Windows isn’t that convincing anymore after both my MacBook and a friend’s recently got slammed with nasty malware.)

Let’s look briefly at laptops: On the Mac side, you’ve essentially got the MacBook Air, MacBook Pro, and the new 2-pound MacBook. Good choices but limited. While on Windows it’s almost limitless, if you throw in third-tier suppliers and the white box crowd.

But that’s stating a well-known fact, which is not my point.  What I’m getting at are unique products from top-tier suppliers that, because of the design, pull you off the Mac and over to Windows.

There’s a good bit to discuss in that.


http://www.eetimes.com/document.asp?doc_id=1326149&print=yes

How Will Deep Learning Change SoCs? (EE Times)

Junko Yoshida

3/30/2015 00:00 AM EDT

MADISON, Wis. – Deep Learning is already changing the way computers see, hear and identify objects in the real world.

However, the bigger — and perhaps more pertinent — issues for the semiconductor industry are: Will “deep learning” ever migrate into smartphones, wearable devices, or the tiny computer vision SoCs used in highly automated cars? Has anybody come up with SoC architecture optimized for neural networks? If so, what does it look like?

“There is no question that deep learning is a game-changer,” said Jeff Bier, a founder of the Embedded Vision Alliance. In computer vision, for example, deep learning is very powerful. “The caveat is that it’s still an empirical field. People are trying different things,” he said.

There’s ample evidence to support chip vendors’ growing enthusiasm for deep learning, and more specifically, convolutional neural networks (CNN). CNNs are widely used models for image and video recognition.
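For readers who haven’t met CNNs, the core operation is a small filter slid across an image. Here is a minimal, illustrative 2-D convolution in Python with NumPy; the 3×3 edge-style kernel is an arbitrary choice for the sketch, not anything from the article, and real CNN layers add learned filters, nonlinearities, and pooling on top of this.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and
    sum the elementwise products at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A perfectly flat image produces zero response from an edge-style kernel,
# which is exactly what you want from an edge detector.
image = np.ones((5, 5))
edge_kernel = np.array([[-1, 0, 1],
                        [-1, 0, 1],
                        [-1, 0, 1]])
print(conv2d(image, edge_kernel))  # 3x3 array of zeros
```

The point of building such operations into SoC hardware is that this sliding multiply-accumulate pattern is regular and highly parallel, which is why GPUs and dedicated vision cores handle it so well.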

Earlier this month, Qualcomm introduced its “Zeroth platform,” a cognitive-capable platform that’s said to “mimic the brain.” It will be used for future mobile chips, including its forthcoming Snapdragon 820, according to Qualcomm.

Cognivue is another company vocal about deep learning. The company claims that its new embedded vision SoC architecture, called Opus, will take advantage of deep learning advancements to increase detection rates dramatically. Cognivue is collaborating with the University of Ottawa.

If presentations at Nvidia’s recent GPU Technology Conference (GTC) were any indication, Nvidia is banking on all aspects of deep learning in which the GPU holds the key.

China’s Baidu, a giant in search technology, has been training deep neural network models to recognize general classes of objects at data centers. It plans to move such models into embedded systems.

Zeroing in on this topic during a recent interview with EE Times, Ren Wu, a distinguished scientist at Baidu’s Institute of Deep Learning, said, “Consider the dramatic increase of smartphones’ processing power. Super intelligent models — extracted from the deep learning at data centers — can be running inside our handset.” A handset so equipped can run models in place without having to send and retrieve data from the cloud. Wu, however, added, “The biggest challenge is if we can do it at very low power.”

AI to Deep learning

One thing is clear. Gone are the frustration and disillusion over artificial intelligence (AI) that marked the late 1980s and early ’90s. In the new “big data” era, larger sets of massive data and powerful computing have combined to train neural networks to distinguish objects. Deep learning is now considered a new field moving toward AI.

There’s a lot more, worth your attention. AI is coming; as Minsky says, when you finally get it working, they say, well, that wasn’t Artificial Intelligence at all…

http://www.zdnet.com/article/new-3d-nand-flash-memory-from-intel-micron-could-result-in-10-terabyte-ssds/

New 3D NAND flash memory from Intel, Micron could result in 10-terabyte SSDs (ZD)

Summary: The two companies claim their new technology offers up to three times the density of other 3D NAND competitors, with full production ramping up later this year.

By Sean Portnoy for Laptops & Desktops | March 30, 2015 — 13:07 GMT (06:07 PDT)

NAND flash memory isn’t the type of technology that might get your heart racing, but breakthroughs in making solid-state storage denser mean more storage can be squeezed into ever-smaller spaces. While Samsung has been the company most associated with making 3D NAND technology the latest trend in flash memory, longtime partners Intel and Micron have just announced the results of their collaboration that could yield equally impressive results.

As the term suggests, 3D NAND adds a new dimension to producing flash modules. By stacking cells vertically, density is improved, which allows for more capacity in the same dimensions. Intel and Micron have further refined this process by using a floating gate cell for the first time in 3D NAND production.

Moore’s Law isn’t dead yet…

Researchers Claim 44x Power Cuts (EE Times)

New on/off transceivers reduce power 80%

R. Colin Johnson

3/30/2015 00:01 AM EDT

PORTLAND, Ore.– Researchers sponsored by the Semiconductor Research Corp. (SRC, Research Triangle Park, N.C.) claim they have extended Moore’s Law by finding a way to cut serial link power by as much as 80 percent. The innovation at the University of Illinois (Urbana) is a new on/off transceiver to be used on chips, between chips, between boards and between servers at data centers.

The team estimates the technique can reduce power by up to a whopping 44 times for communications, extending Moore’s Law by increasing computational capacity without increasing power. “While this technique isn’t designed to push processors to go faster, it does, in the context of a datacenter, allow for power saved in the link budget to be used elsewhere,” David Yeh, SRC director of Integrated Circuits and Systems Sciences, told EETimes.

Today on-chip serial links consume about 20 percent of a microprocessor’s power and about seven percent of the total power budget of a data center. Transceivers that consume power only when actually transmitting would eliminate most of that standby consumption.

The reason the links are always on today is to maximize speed. The new architecture reduces their power-up time enough to make it worth turning them off when not in use. The team estimates that data centers alone would save $870 million per year by switching to this transceiver architecture.
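A rough back-of-the-envelope sketch, using only the figures quoted above, shows what the link-level savings mean at the data-center scale. (The 44x and $870 million numbers are the article’s separate best-case claims; this sketch only combines the 7 percent and 80 percent figures.)

```python
# Back-of-the-envelope using the figures quoted in the article.
link_share = 0.07      # serial links: about 7% of a data center's power budget
link_reduction = 0.80  # claimed cut in link power with on/off transceivers

# Cutting 80% of a 7% slice saves about 5.6% of total data-center power.
total_saved = link_share * link_reduction
print(f"{total_saved:.1%} of total data-center power saved")  # 5.6%
```

Small as 5.6 percent sounds, at data-center scale that freed-up power budget can be spent on additional computation, which is the sense in which the researchers say the technique “extends Moore’s Law.”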


Surface Pro 3 and Hyper-V

Dear Dr Pournelle,

I have been following your Surface Pro 3 observations with interest, as my Precious arrived last September. It’s the Core i7 model with the 512GB SSD. At the moment I am running Windows 8.1. I love it to bits but I have some observations that may be relevant to the ongoing discussion about waking up from sleep:

I installed Visual Studio 2013 on my Surface Pro 3 and it promptly switched on Hyper-V for Windows Mobile app development. Hyper-V is fantastic on a decently fast desktop PC but it really messes things up on an SP3. Mine really really did not like waking up from sleep and there were many incidents of having to hold the power button and reboot. Eventually I switched off Hyper-V again as I really didn’t need it.

WiFi does my head in. My home network uses an Apple AirPort and a Linksys WRT54GL as access points. The SP3 is unable to reconnect to them from sleep without some encouragement or sitting back and waiting for a few minutes. Newer access points or routers seem fine though, including a NetGear AirCard 762S that I use for 4G internet access on the go. It works a treat for everything I can throw at it, including live video streaming using UStream.

Finally, for those of you who haven’t bought one yet, go for one of the base models. The one I have is super fast but it runs hot and battery life is compromised. On the plus side, it easily replaces a full desktop PC, unless you are a gamer. I use mine for development work, which includes running Android emulators and Ubuntu VMs, all without performance problems.

Best wishes,

Simon Woodworth BSc MSc PhD.



Freedom is not free. Free men are not equal. Equal men are not free.
