


Mail96: April 10 - 16, 2000










The current page will always have the name currentmail.html and may be bookmarked. For previous weeks, go to the MAIL HOME PAGE.


If you are not paying for this place, click here...

IF YOU SEND MAIL it may be published; if you want it private SAY SO AT THE TOP of the mail. I try to respect confidences, but there is only me, and this is Chaos Manor. If you want a mail address other than the one from which you sent the mail to appear, PUT THAT AT THE END OF THE LETTER as a signature.

I try to answer mail, but mostly I can't get to all of it. I read it all, although not always the instant it comes in. I do have books to write too...  I am reminded of H. P. Lovecraft who slowly starved to death while answering fan mail. 

Monday -- Tuesday -- Wednesday -- Thursday -- Friday -- Saturday -- Sunday



Boiler Plate:

If you want to PAY FOR THIS there are problems, but I keep the latest HERE. I'm trying. MY THANKS to all of you who sent money.  Some of you went to a lot of trouble to send money from overseas. Thank you! There are also some new payment methods. I am preparing a special (electronic) mailing to all those who paid: there will be a couple of these. I am also toying with the notion of a subscriber section of the page. LET ME KNOW your thoughts.

If you subscribed:

CLICK HERE for a Special Request.

If you didn't and haven't, why not?

If this seems a lot about paying think of it as the Subscription Drive Nag. You'll see more.

Highlights this week:



This week:



Monday  April 10, 2000

re: peculiar story -

The [referenced above] site reminds me of the supermarket tabloids, without the pictures. It even has a page for columnists, some of whom one might respect, to promote an image of legitimacy, but all of whom are associated only by links. If the site really is a legitimate source of information, it ought to be rethought to reduce the appearance of sleaze.

What I haven't seen expressed anywhere is what kind of precedent keeping that boy here would establish. Should we do so, imagine the reaction of other countries the next time a disgruntled foreign spouse takes their child back home and we demand that the child be returned to its rightful mother/father. At best they might hurt themselves laughing at our presumption.

Paul Hampson

Actually, the usual accusation is that the Department of State is a bit lax in asking for the return of American born children with foreign fathers when the kids have been taken, often against US court orders, to the father's country of origin. It's usually an American girl trying to get her kids out of a Moslem country.

I have no idea whether the story that Elian was born over a year after the putative father and his mother were divorced is true or not, but it does seem the sort of thing that a family court should decide (unless the story is merely made up out of whole cloth; I don't know that one either). Assuming the father and mother were married at the time of conception, the legal presumption is that the child is legitimate and that the husband is the father, at least under the common law, and I would presume under Florida law. Cuba probably retains many elements of Roman Civil Law, which I think holds the same presumptions, but of course in Cuba law is what the Party says it is, there being no Rule of Law as we know it. Again, all this seems ripe for decision in a family court, not arbitrary decision by the Department of Justice, which doesn't seem to have any place here at all.  I must have missed something.


Thanks for ALL the wonderful SF! I swear that reading your work has helped to keep me sane...

I would like to postulate a new natural law of the Internet that seems to be absolutely inviolable. With hubris, I name it after myself.

Sloan's Law:

If an email is sent that includes more than one (1) question to be answered by the recipient, the reply to the sender will include only an answer to the FIRST question.

Note: There are no exceptions to this law.

Respectfully yours,

John N. Sloan

I need to reflect on that; I think I have dealt with multiple questions in the mail column. Sometimes. And I certainly have dealt with the LAST rather than the first sometimes... Still it is an interesting observation.

Dear Dr. Pournelle:

I ran across the article at the end of this URL on the Stardock Systems website. I imagine that you are far more familiar with these issues than I, but I found it a clear and crisp explanation of what until now had been for me "buzzbuzzbuzz open source buzzbuzzbuzz".

Very respectfully,

David G.D. Hecht 

Thanks. Anything to make it all clearer...

Dr. Pournelle,

Just wanted to compliment you and Larry Niven on "The Burning City." I am an avid reader and my interests vary considerably, from science fiction to mysteries to historical fiction and non-fiction. I have read most (can't be sure that I have gotten them all) of your books and have enjoyed them all, some more than others.

This latest, The Burning City, was perhaps the best of them all. Not only was it satisfying as science fiction, but the story line was complex and well written in a way that would compare to novels written by authors, how to say this, perhaps more widely recognized as great storytellers. After I finished the book, I felt like I had experienced it myself. Both you and Mr. Niven are skilled authors, but together, this time you have created a work that transcends your own individual efforts on other projects. There is a word that means the sum is greater than the parts; I think it aptly describes your efforts here. Congratulations.

Gib Akers

Subject: Macroevolution

Dear Jerry,

I appreciated the comments made by Erich Schwarz, especially the statement, coming from a noted molecular biologist, putting the genetic difference between a chimpanzee and Homo sapiens at something close to 350 bits.

Assuming this is correct, it creates another, maybe even bigger problem for macroevolution. Assuming tight coding, 350 bits cannot represent more than the equivalent of maybe 100 machine language instructions. The Darwinian model assumes fractions of those 100 instructions were randomly inserted into a human genome of something approaching 10 billion bits. Anybody who has ever worked on a large software project will appreciate the difficulties associated with such an undertaking.

The evolution from Australopithecus to Homo sapiens involved a design change which quadrupled the brain capacity. While this process is still poorly understood, significant structure was added to the brain. The changes provided the capabilities to speak, to compose Beethoven's 9th Symphony, to invent relativity, and to design the Apollo project.

To duplicate something like this is way beyond today's computer science capabilities. Even if we could do it, I assume it would require megabytes or terabytes of code. To compress it into a number of code pieces totalling 350 bits creates additional problems of understanding.

So Darwinian macroevolution requires an element of unknown characteristics, which may or may not exist, but, assuming it exists, is outside the existing realm of scientific knowledge. This does not make Darwinian macroevolution impossible, but rather speculative. It also leaves room for alternatives.

I would like to contact Erich Schwarz directly for further discussion and I wonder whether you could ask him to supply his e-mail addresses.

Thanks for a very stimulating discussion.

Regards, Wil Spruth, 

I have forwarded your letter. The point of opening that discussion was to get people thinking on what for all too many is a closed subject. With luck I have done that and I can turn my attention to other matters...

And Mr. Dobbins supplies yet another example of political correctness run riot...





This week:



Tuesday, April 11, 2000

Subject: Elian

Dear Dr. Pournelle, As I recall, only the INS can determine whether Elian can stay here. This is the result of a law passed by Congress limiting the rights of appeal of immigrants who are not legal residents. The law means that the Florida family court has no jurisdiction, nor do the Federal courts. It was intended to prevent illegals from staying for long periods of time while they appealed deportation orders. The law was supported by many of the same elected officials who now decry its results. The Law of Unintended Consequences is passed every legislative term.

Kit Case 

If Elian were an adult, he could in his own person ask to stay in the US or return to Cuba. The question becomes who represents him? That question will determine the outcome since the INS has no real choice but to grant him asylum if he asks: but of course he cannot himself ask. 

But there is some question about the marital status of his parents when he was conceived, let alone when he was born; if the putative father was no longer married to the mother at the time of conception, there is no legal presumption that he IS the father under US and Common law. Perhaps there is in Roman Civil Law, and almost certainly there is in Castro's Cuba, where what pleases the prince has the force of law, and it pleases Castro to declare Elian legitimate.

Those are questions of fact, and thus not to be determined by simple fiat; or so I would have thought. Whether Florida domestic court is the right place to determine who shall speak for him is itself debatable.

On the surface, the conservative position is that children belong with their parents, and the state (Federal or State) shouldn't be given expanded powers over family matters. Unfortunately, there are questions of fact as well as law here, and certainly there is no unambiguous moral position. Suppose the father were a subject of a nation that practiced ritual female circumcision and Elian were a girl? 

From: Stephen M. St. Onge

Subject: Bureaucratic weirdness

Dear Jerry:

That picture of the addition to your staircase is bizarre.

In 1975, I broke my leg very badly, and spent over a year in a cast. I lived in a second floor apartment, and got very familiar with crutches and stairs.

You write: "it's not clear to us or anyone else how this screwy thing is going to help handicapped people mount stairs they don't need to take -- there's an internal entrance to the apartment anyway." I find it hard to imagine how that square piece of metal is supposed to help someone handicapped go up or down stairs, even if they needed to take them. It certainly wouldn't have helped me negotiate them, on crutches or in a walking cast.

Perhaps if you were blind, it would alert you that the steps were about to end, but surely there are better ways? And as you say, it is a safety hazard.

Puzzled, St. Onge

The primary purpose of government is to hire and pay government workers with money extracted from the rest of us. The secondary purpose is to do what the government agency purports to do. In the case of Defense and the Administration of Justice and Law Enforcement the need for the service is apparent, and even when the government workers aren't actually doing anything they are perceived as necessary (and with the military they have to train constantly or they won't be worth anything anyway; and in the Bad Old Days the military was seen as just another blood-sucking useless bureaucracy good only for parades, but that's another story).

In this case the bureau involved doesn't serve much purpose to begin with, so there is ingenuity in Doing Things. 

Henry David Thoreau noted that when the officers all resign the revolution is accomplished.  He didn't himself address the question of how they can be induced to resign.

re: CD Drivers in a WIN98 DOS boot -- Use Startup Disk.

Michael Bell [] has been trying to boot into MSDOS without the Windows GUI, but WITH the CDROM drivers.

The simplest way to boot into DOS 7.1 (?) is to ask Win98 to build you a "startup disk" (look in Add/Remove Programs in the Control Panel). This disk tries the drivers for a dozen or so CD-ROM drives (the OAK driver is the one for all of my generic IDE drives; I don't remember what my Mitsumi soundcard drives use). If any of the drivers work, it assigns a drive letter to the drive. It unpacks the various drive preparation tools such as fdisk and format to a ramdrive. Then it gives you control in DOS.

If his drive is one of the dozen or so that the startup disk knows, then this would get him pretty much to where he wants to be. The startup disk is slow (because it does the trial and error every bootup and because he won't really need the ramdrive), but it's fast enough if he doesn't have any other quick answer.

Let it grind for the minute or so that it wants and it will give him a c:> prompt, with the drivers for any "normal" CD loaded.

And the config and autoexec on the startup disk will provide an example (though convoluted) of how to use the particular driver that works on his machine.
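For readers who want to see what that looks like, the two files the startup disk generates boil down to a pair of lines like these. This is a sketch only: the driver file, the /D: device name, and the drive letter are examples and vary by machine; OAKCDROM.SYS is the generic IDE driver mentioned above.

```
REM CONFIG.SYS -- load a real-mode CD-ROM device driver
DEVICE=OAKCDROM.SYS /D:MSCD001

REM AUTOEXEC.BAT -- MSCDEX binds a drive letter to that device
MSCDEX.EXE /D:MSCD001 /L:E
```

The /D: names must match between the two files; /L:E asks MSCDEX to assign drive letter E:.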

Some drives don't support DOS access. My 4-disk CD Changer from NEC assures me explicitly that no DOS drivers are available. I never looked any deeper. My generation-1 Creative (panasonic?) DVD assures me specifically that there are no DOS drivers. Again, I've never checked because I don't need either of these in standalone DOS.

Greg Goss (

Yes, of course. My 6-pack CD Changer won't support but one drive in DOS as another example. Thanks. That's the right approach.


I wonder if anyone who has downloaded Netscape 6 has any idea how to uninstall it? It does not appear in the control panel install/uninstall window. The start menu has no uninstall prompt.

When I clicked to download it, it installed some sort of program to access the netscape site to finally send the program to my machine. After that it just loaded. Period. It does not seem to be doing bad things, but I would like to get rid of it. Any help welcome.

Cheers, Peter

Interesting. I got it on a CD at Internet World but I have not installed it. I'll see if there is an uninstall on the CD.

Dear Jerry,

I did the full download, install and test. I am a very strong Netscape supporter, but cannot describe how disappointing this demo is. I did the test running Windows 2000 on a Digital Ultra 2000 notebook with a Pentium/266 and 96 MB RAM. I connect through a WinGate proxy server.

NS-6 crashed very regularly, and took Win2K with it twice. I was not able to get Instant Messenger, email or newsgroup access to work. The proxy settings are minimal. It would crash regularly when exiting from the preferences screens. It also does not implement Alt-LeftArrow and Alt-RightArrow to do page back and forward. As far as speed, it took about 50% longer to load and rendered pages at about the same speed as NS-4.72. The Quality Feedback Agent was not able to negotiate through the proxy server and their web page for feedback was under construction.

To remove it I deleted the directories, removed the start menu links and replaced the registry with the backup made before the install. Probably not correct, but it is gone.

Very disappointing.

John G. Ruff.

Ouch. I am beginning to wonder if I want to install it at all...  Is this Microsoft's fault?

Chris Ward-Johnson, computer columnist for the Times, has some interesting material posted that describes some of the effects of the Godfrey decision.

Robert Bruce Thompson


THE LAST MILE Problem: Connecting to the Internet


 Tom asked me to comment on this "last mile" problem based on some information I passed to him on the subject.

A little background: My day job is in Defense Department (DOD) procurement as a quality specialist. I also moonlight as a political activist for space issues. And I have been a serious military wargame enthusiast for over 20 years.

I have been following the "Digital Frontier," as it relates to commercial space development, for the last several years. Four years ago it finally looked like there would be a commercial market that would spur the development of reusable launch vehicles. The huge Teledesic constellation of low earth orbit satellites (LEOSATs) seemed poised to solve the last mile problem. It would go around local political roadblocks via satellite broadcast.

Then I watched in horror as the aerospace industry "usual suspects" became involved and the whole LEOSAT market went to financial hell. (Old NASA saying: it takes 10 years and $10 billion to do anything.) In response, the "digerati" moved on to terrestrial systems working within their 1-2 year product cycle.

The TeraBeam answer to the "last mile problem" is drawing on both DOD work on lasers and the cultural 2nd and 3rd order effects of the open digital frontier on the Federal employee workforce.

It has been apparent for some years in the DOD procurement community that laser weapon development was being slowed for political reasons. Lasers were too successful in their intended roles of blinding sensors and shooting down missiles or aircraft. Thus, they were a threat to arms control agreements. Add to that the usual bureaucratic turf wars of a shrinking defense budget and you had a lot of unhappy government scientists and engineers.

The motivations of government scientists and engineers are similar to those of Military officers. None of them are in it for the money. Military officers want to lead troops and live the dream of "Duty, Honor, and Country." Government scientists and engineers want to play with the biggest and best leading edge toys.

If they can't have what they want from their government job, they will leave and go someplace else to make money instead, which is something the "digital frontier" is offering in spades.

Terabeam's solutions to laser communication problems look to be straightforward applications of solutions found in DOD laser research, using the people who discovered them.

In defense literature I have read, a "pilot" or "targeting" laser beam is used to acquire the target (a receiver in Terabeam's case). Then it measures the atmospheric laser waveform degradation and provides information to "pre-degrade" the main laser beam's optical characteristics to ensure maximum energy transmission, in both lethal and communication applications. Some of the more advanced solid-state lasers weaponized for laser-blinding research were tunable through many optical frequencies and penetrated heavy smoke and rain out to 1.5 km.

All TeraBeam's technology needs is a fiber optic line close enough to their market to tap into. Which leads me to something I tripped over playing space development lobbyist in Washington D.C. this past March.

Level 3, Williams and Qwest, the firms mentioned in the Terabeam article, have all been laying fiber optic cable at a tremendous rate. "Three feet a second" was the figure of merit I was given by someone knowledgeable about the industry. (My hotel was three blocks up the street from the Qwest HQ.) Most fiber optic cable is being laid where current rights-of-way exist. Easements for railroads, pipelines, irrigation canals, interstate highways, and most especially utilities are all being used. And where there is electricity and water, there are people. (I have taken to calling this strategy "infrastructure co-location.")

The problem is that laying all this fiber optic cable has an impact. There was a major controversy in the local D.C. media that these firms were tearing up the roads so badly doing this that they should be regulated to a halt. The firms involved actually don't mind much: if that comes to pass, it will be a barrier to market entry that will lock them into a monopoly position. In the meantime they are going to lay as much fiber as they can.

The reason is that the economics of this market are strange. Level 3, Williams and Qwest are not making money from communications being run down their fiber optic cables. Most cables are not being used at all. They are being leased to speculators, who are betting that the last mile problem will be solved during their lease.

The implication is that there is a huge and growing pre-existing bandwidth out there waiting for whoever solves the last mile problem first. (A good analogy is that of a super-saturated solution waiting for a crystal to be dropped in.)

The early adopters will be able to do "killer" applications like a combination of ICQ chat and video telephony whenever they are on-line. What people will do with all this can only be guessed at, but current telephone use by teenage girls is a good indication. The implication of this kind of application for dating services and the like is really interesting. (An aside: the people pushing Internet video telephony to the limits of current technology are in the porn industry.)

It will also make on-line gaming a truly different experience. The model here is the U.S. Army's SIMNET with ICQ video telephony added for spice. As the ability to truly see and speak with your opponents and allies is added to the mix, people will be able to plan to game when their favorite opponents are available. Games will be 24/7, as people join and leave on-going campaigns with characters/empires/ships/tanks/whatever morphing between AI and human control.

The bottom line is that, whatever you do, the electronic game industry needs to add this possibility to its next 2-5 year planning horizon, or it will be overwhelmed by events. 

Trent Telenko

And thank you. 

Cisco, Lucent, and other vendors are working on this by running terrestrial (fibre, cable systems, whatever) to the neighborhood, then using spread-spectrum wireless technology in the 2.6 GHz range (no licenses required for 500 mW or under).

Cisco can achieve up to DS3 speeds (44 Mb/sec) up to 30 miles away, depending upon local topography and so on. Lucent can provide T1 speeds at up to 10 miles distance, quite a bit cheaper than Cisco.

Again, it depends totally upon the topography between the two (or more) sites in question, but in many cases, it's feasible. As a matter of fact, I'm looking into it for you.

Conversely, Cisco especially is marketing its technology to providers who want to go wireless to the neighborhood, and then DSL into the homes. This would greatly extend the reach of DSL lines, of course.

Roland Dobbins


Backup Problem?

I'm thinking that between you and all of your readers, someone must know the answer to the following

I'm trying to determine if there are software backup solutions for Windows NT 4.0 users such that the device one backs up TO is a fixed hard drive (or drives), as opposed to removable Zip, JAZ, or optical disks.

The scenario is to set aside a large (400-500GB) dedicated partition that the tape backup software sees as a device to stream full backup data to. Said full backups are thus intended to be used for short-term disaster recovery of workstations only. I know that Seagate (now Veritas) has backup drivers for emulating tape drives when using optical drives (BNTIDRV.EXE in the case of BENT 7.x), but they appear to require actual optical drives be present.

Any help would be appreciated.

Thanks, CTP

Chris Pierik Ballpeen Solutions 1801 Century Park West Los Angeles, CA 90067-6406 Email:

"My's full of spam"

P.S., thanks for pushing BTVS, that show's usually pretty cool. Until I ran across your recommendations I vaguely thought it must be more mindless tube trash.

I guess I hadn't thought about it. I tend to do backups with Canyon's Drag'n'file, which lets me copy later files only: important stuff at Chaos Manor is stored in about half a dozen places including Zip drives and my laptop, and once in a while we write a CDROM as well.  I confess I don't do formal "automated backups".  I probably should. And Mr. Dobbins says:

Check out this Veritas product: 






This week:



Wednesday, April 12, 2000

Jerry, Regarding Chris Pierik's need for the ability to back up files to a partition on an NT hard drive: perhaps I don't understand his needs completely (highly likely from the short email message posted), but it appears that what he needs to do is fairly simple.

I received a call from the local grocery store, which was having problems backing up its store database (pricing, item numbers, etc.) and its video rental software to the tape backup units. The backups couldn't be started until the store closed, took a long time, and tied up the computers while they were being done. Consequently, whoever was closing the store often just wouldn't do it. Sure enough, they lost the whole video rental database one day.

The store owner was sufficiently panicked, and simultaneously relieved that he didn't lose the store database, that he called me to devise an alternative. I like to think it is because I am a good systems consultant, but more likely in this case, it was because he and I graduated from high school together, and have remained close :) .

At any rate, I wrote a simple TurboBasic routine (the store computers inexplicably still run DOS, a common thing among point-of-sale devices) to back up the databases to a partition on a networked computer. I have it store the files for five days in five separate directories, then overwrite the first directory, rotating through them.

I also wrote an option to back up the files to a Zip disk with 'one-click' technology (I wonder if Jeff Bezos will sue me? Of course, I wrote this in '96, so I have prior claim). This gets done every week or so, and consequently, between the two methodologies, the store has a fairly secure backup process. I've tried to get them to do offsite storage of the Zip disks, but so far, that hurdle has just been too much to overcome.

Anyhow, it's just directory manipulation, and creating routines to do this is very simple. You could write something to do this in C or Visual Basic, and schedule it with any sort of task scheduling software you are running. The program I wrote was less than 200 lines of code; I cranked it out in 3 hours or so, installed and tested it in about 2 hours, then wrote a much-needed, very explicit guide (about 20 pages) in about 4 hours. You ought to be able to get it done in a day, and it should be easily modifiable to meet the changing requirements of your environment.


Windows users can do much the same thing with any form of BASIC, DELPHI, or for that matter a good batch file program: the only problem being files that change as you are backing them up. If you can make a file stand still long enough to be copied, you can of course copy it anywhere with a good batchfile. QuickBASIC is more than good enough for that... And as you say, Windows task scheduler will take care of the rest.
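For what it's worth, the five-directory rotation described in the letter above can be sketched in a few lines of Python. The function name, directory names, and slot count here are illustrative, not the TurboBasic original:

```python
import shutil
from pathlib import Path

def rotate_backup(source: Path, backup_root: Path,
                  day_index: int, slots: int = 5) -> Path:
    """Copy `source` into one of `slots` rotating directories.

    `day_index` is any increasing counter (days since some epoch,
    say); the destination slot is day_index modulo slots, so the
    oldest backup is overwritten once every slot has been used.
    """
    slot = backup_root / f"day{day_index % slots}"
    if slot.exists():
        shutil.rmtree(slot)       # overwrite the oldest copy in place
    shutil.copytree(source, slot)
    return slot
```

Run once a day from any task scheduler; after five runs the oldest slot is silently reused, just as the grocery-store routine does.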

Matt Volk asks:

I want to use a web cam to input photos into an Access database. Any ideas on how I could get this to work?

I figure I have to use some kind of ActiveX control to capture the image.


Matt Volk

and I am out of time...

To: Jerry Pournelle, Chaos Manor
Subject: Uninstalling Netscape 6 Preview Release 1

Dear Jerry,

I found the instructions for uninstalling Netscape 6 in the product's Release Notes document. There are more files to remove than were mentioned in CurrentMail on Tuesday, although that writer was certainly on the right track.

Release Notes -- Netscape 6 Preview Release: 

I made a summary of the Uninstall section, but decided against including it. Those who are interested can get to the Release Notes web page in a click or two of the mouse.


Tom King

P.S. Thanks for your Byte columns and, especially, for this website.

I actually got this as a press release as well as mail:


All below is a press release quoted from this URL. Not knowing copyright status, if you find it worth printing, just refer folks to the URL if you feel that would be proper.


< >

For Immediate Release

Contact --

Jeffrey Zeldman Group Leader  +1.212.725.0847

Simon St. Laurent  +1.607.277.0167

Dori Smith  +1.707.473.0398

WEB STANDARDS PROJECT BLASTS MICROSOFT'S "ARROGANT" BREAK WITH STANDARDS  -- 10 April 2000 -- The Web Standards Project (WaSP) today denounced Internet Explorer 5.5 Windows Edition for abandoning Web standards Microsoft has publicly committed to supporting, and focusing on proprietary technologies which are certain to fragment the already-troubled Web space.

"We are incensed by Microsoft's arrogance, and perplexed by its schizophrenic decision to support standards on one platform while undercutting them on another," said Web Standards Project group leader Jeffrey Zeldman.

The group is outraged by Microsoft's decision not to support key W3C standards, notably the DOM Level 1 core and portions of the CSS1 specification, in the market-leading Windows version of its Internet Explorer browser. Microsoft's reversal will make it nearly impossible for Web developers to create documents that adhere to Web standards. At the same time, the proprietary technology that Microsoft is providing may lure some developers deeper into functionality that is supported on only one browser and one operating system - Microsoft's.

"This approach mocks the dream of 'code once, read anywhere' that has driven so much of the Web's success," said WaSP Steering Committee member Simon St.Laurent. "By 'innovating' ahead of the W3C ( in areas like Cascading Style Sheets behaviors while leaving large chunks of standardized processing and styling unsupported, Microsoft risks creating even more complicated browser incompatibilities than already exist."

"The Web community has waited for more than four years for Microsoft to fulfill their long-standing pledge to fully adhere to W3C-issued Recommendations," said WaSP Steering Committee member Sally Khudairi. "The collective patience of both users and developers is running out: why should anyone settle for Web pages that work on only one browser, on one platform and on a limited set of devices?"

The group pointed out that Microsoft itself helped create many of the standards it appears to be abandoning in IE5.5/Windows, and noted with bitter irony that Microsoft's newly released IE5/Macintosh Edition does a masterful job of supporting key Web standards. "Do they want us to code for the standards-compliant Macintosh version, or the incomplete - but dominant - Windows version?" Zeldman demanded.

"By casting aside standards, Microsoft is making it more difficult, if not impossible, to create Web pages that would be accessible on a variety of devices and platforms," said WaSP steering committee member Dori Smith. "This hurts a wide variety of Web users, from the executive using a Web-enabled cell phone to a visually impaired senior citizen."

Added Zeldman: "Coming on the heels of Netscape's preview release, it's hard not to view this as exactly the kind of 'predatory' behavior the U.S. Justice Department laid at Microsoft's door. If Microsoft, as the dominant player, undercuts Web standards on its prevailing Windows platform, developers will be helplessly spun in Microsoft's direction, killing the dream of a Web that is accessible to everyone."


The Web Standards Project is an international grassroots coalition of Web developers and users fighting for standards on the Web, by calling attention to browser incompatibilities that fragment the medium, prevent many people from using the Web, and add 25% to the cost of developing all sites. The WaSP urges all browser manufacturers to support existing standards before incorporating proprietary innovations, and is working to educate Web authors and Web-related software developers so that we may create a Web that works for everyone. For more information on WaSP, please see

### --

I don't usually print press releases, but I'm strapped for time, and this one may be worth commenting on.


A sober dissection of the conflict between "fair use" and the DMCA:,1151,13533-0,00.html

--Erich Schwarz


From Trent Telenko:


Mr. Dobbins's technical description sounds like the ultra wide band (UWB) technology the USMC has been playing with in their recent urban combat experiments.

The FBI and NSA both showed up at those USMC experiments to monitor those UWB communications. Their best detectors did not pick up a thing. Both organizations *WERE NOT AMUSED*.

Now the punch line: The USMC's UWB communicators were off-the-shelf commercial items they purchased from a small California start-up.

There was a flurry of articles on ultra wide band sensors and communications in the defense press after the Cold War ended and in the aftermath of the Gulf War. The claims made about UWB technology ran from being able to detect and track stealth planes to being an undetectable and uninterceptable form of communications.

Since this technology had been living in the black world, the USAF brass shut it down as a threat to stealth platforms. Or so the story floating around in the procurement community goes.

A few articles came out in the usual places denying that UWB sensors could detect stealth planes a couple of years later. Whether they could or couldn't, the funding in these areas was cut.

Now it looks like those defense researchers got disgusted and found jobs elsewhere.

On a completely different note, the April 17, 2000 BUSINESS WEEK has an article on the coming deregulation of electrical utilities. Apparently some of the utilities are turning themselves into telecommunication companies. One of the technologies being pursued by these utilities is encoding digital information into your electrical power lines, which is called power-line communications, or "PLC."

PLC has limited usefulness in the USA because we have a transformer for every 5-15 houses: digital packets in the electrical current are destroyed by the power step-down, and bridges to circumvent that are expensive.

However it looks like a company in Dallas, TX may have beaten this limit.

Rather than putting digital information in the current, Media Fusion Inc. is encoding the digital packets in the magnetic fields surrounding the current. The bandwidth figure of merit for Media Fusion’s technology was 2.5 trillion bits per second.

The article did not say how this bandwidth is allocated (i.e., whether it is shared among large numbers of users, as with a cable modem), but the company intends to demonstrate simultaneous powerline transmission of HDTV and CD audio from Dallas to Washington, D.C.

And I have to write about Free Network and such like shortly: it's all of a piece.

This is relevant to the "Strong AI" theories, as well as interesting in its own right...

What most people refer to as consciousness seems to be a mechanism for implementing what Rosen (1985) calls an anticipatory system, using an internal model to predict the future as a basis for behavioral choices. The evidence seems to be that mammals are conscious (only a few, such as apes and people, are self-conscious), suggesting that consciousness in some form is ancient, going back to at least the Mesozoic. The bat data are particularly strange (see Griffin, 1958, Listening in the Dark): bats seem to memorize their environment and thereafter live mostly in their internal model. You can remove objects from their environment and they will continue to behave as though the objects are still there until forced by circumstances to pay attention. During this, they continue to make echolocation cries, simply ignoring the returns. A study by Moehres and von Oettingen (1946), in which a bat lived in a flight cage and was accustomed to sleeping in a smaller cage within it, is particularly interesting. They conducted a series of experiments ending with the removal of the cage. The bat flew up to where the cage had been, executed an acrobatic somersault to enter the non-existent door, and then tried to alight on the non-existent perch. Needless to say, it finally paid attention at that point. One blind person who had learned to use passive echolocation ('facial vision') to navigate indicated to me that he was also likely to behave this way. Apparently echolocation is an effortful task.

What sorts of brain mechanisms would support this mechanism? It would probably involve _fast_ match/mismatch processes to filter afference against reafference. The reafference would be generated by a dynamic world model of some sort. Motor actions would be set up using the model to actually take place in a second or so. Thinking about the real-time coordinate transformations necessary to link the two makes my head hurt. The world model can be simple as long as the mental states of other players are not important, but adding mental states--self consciousness--is nasty--think about bluffing games, which often have chaotic best strategies. The emergence of complex social behavior in advanced primates probably made self-consciousness adaptive.

Harry Erwin

And on that backup issue:

Dr. Pournelle;

Others might have suggested this but the backup program NovaDisk will backup to any disk either local or network. There are versions for OS/2, Win9x and WinNT. I have used both the OS/2 and Win9x versions and they work very well. The programs create a large backup file that appears to be an image of what would be written to tape. The programs support both compression and encryption.

Backing up over a network is slow! Backing up to my SCSI tape drive happens at better than 25Mbyte/sec, to a parallel port at 15Mbyte/sec. A 10Mbit/sec connection is closer to 1Mbyte/sec.

It takes almost as long to backup my OS/2 boot partition over the network as to backup the whole system to tape. I then write the image to a CD-RW which I can then restore from a set of recovery disks. The restore from CD is much faster than restore from tape when booting from floppies.

NovaStor's NovaBackup has both disk and tape backup programs. They now have a Windows backup program that will write directly to a CDR or CD-RW (I think.) They can be found at I can personally recommend the OS/2 disk and tape programs and the Win9x Disk program.

Rolf Grunsky 

Remember, what you see coming at you is coming from you. "Jungle" Jack Flanders 

Mr Dobbins reports this under the subject "What's good for the goose..." 

One of the rules in the magic universe of The Burning City is "Poetic Justice"...

Jerry, I read Trent Telenko's mail on UWB technology and NSA's consternation with it, and was interested. I'm going to do a search for more information, but any related sites or other sources of information readers can point me to would be appreciated. (As a side note, it's interesting that the FIRST place I always think to look now is the internet, as opposed to libraries or periodical resources).

For several years, the military has used spread spectrum and frequency hopping (still encrypted, of course) as a secure method of communication. Spread spectrum is secure because it transmits at low power across a wide frequency band, which lets its signal level sit down in the electronic background noise, while the effective radiated power across the whole bandwidth can be quite high. Additionally, because it transmits across a wide band, it is less susceptible to narrow band interference, and even some wide band interference, especially with frequency hopping.
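The "down in the noise" property can be put in numbers as processing gain: the ratio of the spread bandwidth to the information bandwidth. A back-of-envelope sketch (the bandwidth and data-rate figures below are invented for illustration, not taken from any particular system):

```python
import math

def processing_gain_db(spread_bw_hz, data_rate_bps):
    """Approximate processing gain of a spread spectrum link, in dB."""
    return 10 * math.log10(spread_bw_hz / data_rate_bps)

# e.g. a 10 kbps data stream spread over 20 MHz of RF bandwidth
gain = processing_gain_db(20e6, 10e3)  # about 33 dB
```

Roughly, a receiver that knows the spreading code can recover a signal sitting that many dB below the noise floor seen by a narrowband listener.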

I was Assistant Test Director years ago on a high level command and control system that used spread spectrum satellite communications. It was highly survivable, and was supposed to punch through almost anything. Of course, we couldn't set off any special weapons to test it, but the designers planned for minimum downtime in case of that type of attack. The system used some pretty cool DSP stuff and dynamically reconfigurable processors in the communications frames, too. Neat tech at the time.

Another type of system that is advertised as a "communication system" is the troposcatter system that Litton has installed for Saudi Arabia. I think it is advertised as a communication system to appeal to the delicate sensibilities of that region, but my understanding is that in reality its purpose is as an early warning detection system for airborne objects. It bounces RF energy off the tropospheric atmosphere layers and looks for changes. I've always wondered how stealth technology would fare against that type of system, as the frequencies vary widely from normal radar systems.

Tracy Walters







This week:


read book now


Thursday, April 13, 2000

Dear Mr. Pournelle

The original developer of so-called ultra-wideband radio is Larry Fullerton. He now owns and operates a company called Time Domain:

UWB is a simple concept: instead of modulating frequency (or amplitude or phase), the inverse domain is modulated: time. Short-duration white radio pulses, essentially spark-gap pulses, are sequenced with the information content carried in the time between the pulses. The clever bit is this: to distinguish the signal from random noise, the pulses are stuttered in a known pattern and the signal is modulated about that time pattern. To communicate, a sender sends a few thousand pulses or so in a known pattern. The receiving unit recognizes this as a "carrier pattern". The sender then modulates the signal about the "carrier pattern" and communication begins. The pulse frequency doesn't matter much to the radio.

As you can see, this is very analogous to FM radio, but in the time rather than the frequency domain. Security is naturally part of the transmission; it is very difficult for a listener to tell what the pattern is, when the signal starts, etc... This doesn't have a good analogy in the frequency domain. It is as if frequency hopping was being done for each bit of a digital FM signal.
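The scheme Mr. Hollebone describes can be caricatured in a few lines. This is a toy sketch of pulse-position modulation about a shared pseudorandom time pattern, not Time Domain's actual design; all of the names and numbers here are invented:

```python
import random

def make_pattern(seed, n_pulses, frame=1000):
    """Shared pseudorandom nominal pulse times, one pulse per time frame."""
    rng = random.Random(seed)
    return [i * frame + rng.randrange(frame // 2) for i in range(n_pulses)]

def transmit(bits, pattern, dither=5):
    """Nudge each pulse slightly early (bit 0) or late (bit 1) about its nominal time."""
    return [t - dither if bit == 0 else t + dither for t, bit in zip(pattern, bits)]

def receive(pulses, pattern):
    """A receiver that knows the pattern reads bits off the sign of each offset."""
    return [0 if p < t else 1 for p, t in zip(pulses, pattern)]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
pattern = make_pattern(seed=42, n_pulses=len(bits))
assert receive(transmit(bits, pattern), pattern) == bits
```

Without the seed (the shared "carrier pattern"), an eavesdropper sees only irregularly spaced impulses that look like noise; with it, demodulation is a trivial comparison.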

UWB radio has great characteristics. Spark-gap type pulses do not attenuate very much, so transmitter power can be much lower than conventional FM/AM radio. Time Domain uses picosecond time resolution devices, giving GHz bit rates.

The "personal radar" part of this comes from the insight that traditional SAR requires a lot of very expensive processing power to remove doppler and noise (sparkle) effects. Since pulse frequency artifacts don't affect UWB radio, time-modulated UWB SAR, as Time Domain calls it, is far less computationally intensive. Remember too that UWB pulses carry well. Unlike regular radar, there is no characteristic frequency. The result: hand-held radar sensors that can look through walls.

Put very secure communications and the potential to resolve the physical environment to a fare-thee-well into one breakthrough, and it is no surprise that the militaries are interested. Your marines can talk without being heard and have far fewer "surprises" in complex or urban combat areas. S&R and police are understandably interested too.

There are two reasons this technology has languished. Firstly, Fullerton was in a protracted patent fight with Lawrence Livermore Labs. This seems to be coming out mostly in Mr. Fullerton's favour, but is not over yet: 

Also, probably because of the patent fight, little development work had been done on the time-resolution components, and so they were laboratory demonstration level only: big and expensive. That changed about eighteen months ago, when Time Domain and IBM managed to put the time discriminator onto an IC. Now they can manufacture these devices cheaply and in volume. Time Domain has picked up a bunch of US military contracts and it looks like golden days ahead for them.

I hope you find this useful.

Kind Regards, Bruce Hollebone: hollebon (at) cyberus (dot) ca

Thanks! I may pull all this out into a special report.









This week:



Friday, April 14, 2000

Tax time. VERY short shrift.

A hole in FrontPage?

Dr. Pournelle,

You may want to check this link: 

The essence of the article is that there's a back door built into Microsoft's server products, giving people access to the server files. Bad idea.

Also, I need to thank you for a few things. I've recently added a search function from Atomz to a Web site I manage. Thanks for mentioning and using Atomz on your site.

Also, one of my machines was experiencing the "Win98 shutdown problem" you experienced last year. I came to your site, used the search function, and found the solution.

As much as I enjoy your articles on Byte and Intellectual Capital (I visit IC only when you post a new article), I'm really glad you decided to keep your site going after Byte reappeared. I'll renew my subscription this year; it's certainly worth it.

JA -- John Alexander Manager, Area Computing Services Capstone College of Nursing The University of Alabama

Thank you. With so much to do I don't normally see Slashdot; last time I looked, the noise to signal ratio was high and it had attracted people who hate Microsoft and will say almost anything bad about the company. Not that there are no problems with FrontPage.

In this case I don't use the Extensions, although the people at PAIR.COM may want to know about this. Those extensions are not turned on on my site, but they may be on others.

Thanks for the kind words.

And now in reply to my link in view: I leave off the name of the sender.

It was said previously:

If you go to 

you will find a rather technical article of considerable value in explaining "prejudice" and decisions. The arguments are mathematical. The social consequences are a bit hard to fathom.


To quote a paragraph from the site:

"In plain English, once a test score has been obtained, the best ability estimate will depend on the average ability of the candidate's group. Thus, if the goal is to select the best candidates, it will be necessary to consider group membership, and the mean ability of the candidate's group. The general effect is to move the ability estimate for each candidate towards his group's mean ability. When trying to select the best candidates (who will usually have evaluations above the mean for their group), the estimate for each candidate should be lowered by an amount that depends on mean and standard deviation for his group, and the estimate's precision."


When dealing with abstract mathematics, it is always nice to use real world examples when possible.

Let's apply this to a real world example: how far can you throw a baseball (on average)?

We want to choose a person to throw baseballs the farthest.

We allow for a reasonable amount of training to get a basic technique, with prior experience counting, etc. We have a Universal Standardized Baseball Field (tm). We generate a bunch of Fancy Charts (tm).

We then say that if player x belongs to a group that is a "traditionally" low performing group, we get to discount his actual results to a certain degree.

What is wrong with this?

The falsehood lies in the mixing of two types of data: estimates vs. actual measurements.

Because with Estimates, you can finagle anything.

The article is an excellent example of Garbage in, Garbage out.


Now you can, in a social context, take account of the damage caused by a destructive education system, etc., for a variety of purposes. The appropriate and correct fix is to repair the damage caused. Matters of social justice are a separate issue, and can also be addressed. These need to be addressed in addition to, not instead of, the previously required repairs.

Because otherwise, you wind up with more of the same conditions that you started out with in the first place.


We now get to discuss the social responsibility of calculating results like this using bad data. 

 "Work like you don't need the money" "Dance like nobody's watching" "Love like you've never been hurt." (Attributed to varying sources.) ====================================

This is the kind of glib statement that causes imbecile policies. You cannot ignore data, and simply because statistics can be misused does not mean you can ignore statistical data: in a sense everything we know is statistical, which is sort of the point of what Bayes was saying. Sometimes the probabilities are so high that we can ignore the statistical nature of the data and speak of "laws", but it's pretty hard to have an epistemology that includes absolute certainty.

If I know two individuals well and am asked to predict which is likely to score higher on an IQ test, then I'll take my impressions; but if I need to predict success in almost any activity, I'll predict the high IQ group will do better than the low. If you give me more data I'll use it, but I don't intend to throw away what I have because it is not "exact". If I want to predict who will run the track faster and I know one is East African and the other a white American, I will take the East African in the absence of any other data. Tell me that the East African is 80 years old and the American is 20, and I will take the American. Tell me they are both 30 and I'll take the African again. Tell me that the African has a broken leg and I'll take the American. And so forth.  Some data are not very exact.  It is impossible to predict what number will come up on a thrown pair of dice.  But I can predict that the most probable number is 7, and I can analyze the game of craps to show that the house has a 1.4% advantage if you put your money on the pass line -- and that the house will get rich on that 1.4% although some people do walk away winners.
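The craps figure is easy to verify by exact enumeration of the 36 two-dice outcomes; a quick sketch:

```python
from fractions import Fraction

# probability of each total on two fair dice
p = {s: Fraction(sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == s), 36)
     for s in range(2, 13)}

win = p[7] + p[11]  # naturals win on the come-out roll
for point in (4, 5, 6, 8, 9, 10):
    # once a point is set, you win if the point repeats before a 7
    win += p[point] * p[point] / (p[point] + p[7])

house_edge = 1 - 2 * win  # expected loss per unit bet at even money
# win works out to 244/495, so the house edge is 7/495, about 1.41%
```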

If group data (the set of 3's vs the set of 7's in dice as an example) is all you have, then you use what you have.

The urge to ignore data because it's not very good is a major intellectual mistake, and those who don't ignore it generally win.

After which I received from the same person:

 You said:

"The urge to ignore data because it's not very good is a major intellectual mistake, and those who don't ignore it generally win."

Responding from home:

Your response has all very excellent and valid points. Writing quickly from work, I did not have all of the details at my fingertips. However, I think the essence of my point could have been better communicated.

The point I was making was fundamentally in favor of more observation, and thus more facts, and more data.

The point on the page that has me alarmed is the idea that, based on observation of person X, we estimate a certain amount of performance. Then, because they are members of a certain group, we change the estimate up or down. Two bodies of data.

Now reading further in the page [from the comfort of my home I have the time to read and find] the following possibly positive note:

.... "Jensen (1980, p. 94) points out that the best estimate of true scores is obtained by regressing observed scores towards the mean, and that if there are two groups with different means, the downwards correction for the high scoring individuals will be greater for those from the low scoring group. Kelley (1947, p. 409) put it as follows "This is an interesting equation in that it expresses the estimate of true ability as a weighted sum of two separate estimates,- one based upon the individual's observed score. X1, and the other based upon the mean of the group to which he belongs, M1. If the test is highly reliable, much weight is given to the test score and little to the group mean, and vice versa.", although he may not have been thinking of demographic groups." ....
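Kelley's equation quoted above is just a weighted average of the observed score and the group mean, weighted by the test's reliability. A minimal numeric sketch (the scores, mean, and reliabilities are invented for illustration):

```python
def kelley_true_score(observed, group_mean, reliability):
    """Estimated true score = r*X + (1-r)*M (Kelley, 1947)."""
    return reliability * observed + (1 - reliability) * group_mean

# A highly reliable test leaves the observed score nearly untouched...
high = kelley_true_score(observed=130, group_mean=100, reliability=0.95)  # 128.5
# ...while an unreliable one regresses it most of the way to the group mean.
low = kelley_true_score(observed=130, group_mean=100, reliability=0.30)   # 109.0
```

Note that the same observed score of 130 yields different estimates for groups with different means, which is exactly the effect being argued over here.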

This means that we need to give careful consideration to the usefulness and accuracy of the tests applied. And we have to weight them appropriately. The danger lies in inappropriate weighting.

This is in harmony with your comment on the urge to ignore data. In fact I am sure there is some sort of spectrum of reliability of data, etc.

Part 2:

This brings up other thoughts:

Let's see ... making something up on the fly, rambling into the night based on the above :) I am including a quick matrix I just threw together as a 4kgif file.

There are two axes:

One has to do with Quality of data, from Positive to Negative. The other has to do with Quantity of Data, and includes negative values.

very off the cuff, there are likely better ways of doing this sort of thing.

rambling off to bed...


"Work like you don't need the money" "Dance like nobody is watching" "Love like you've never been hurt." (Attributed to varying sources.)

Again I remove the name because I have no wish to embarrass anyone; but it is alas clear that my correspondent does not understand the nature of statistics and Bayes Theorem.  First, ALL statistical data have a built-in measure of "quality": the standard deviation. Correlation coefficients have a built-in measure of "quality".  If by "quality" is meant "reliability", that too is rigorously defined. In statistical work, "reliability" refers to how much the score you obtained is likely to differ from the "true" score, the "true" score being defined as the one you would get if you gave the test to the same person an infinite number of times.  In IQ tests it is often examined by test/retest reliability, but also by the correlations among the scores obtained on different tests, such as Stanford-Binet, Wechsler, Raven's Progressive Matrices, and others. You might also correlate the times an individual makes in the hundred meter dash, or the 10 kilometer race, when run many times. Clearly they will not be identical, but they will cluster. Those correlations are high but not perfect, and the test reliabilities are built in to the predictive value of the test, whether you are predicting school success or track and field success.

Validity is the correlation between the predictor, such as a test, and what is predicted, such as rank order of "success" or academic grades or salary after ten years. Those correlations are never perfect either, and have a built-in "quality" measure. If variables A and B have correlation r, then I can explain a proportion r^2 of B's variance by knowing A.  If my correlation is .9 then I can explain 81% of B's variance knowing A. If it is only .4 then the corresponding figure is 16%, which is pretty low -- but it is NOT ZERO. And if my correlation is .2, then I can account for only 4% of the variance, and that too is low but not zero -- and if it's all I have, it is often better than nothing. Las Vegas relies on that.
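The proportion-of-variance arithmetic above, spelled out:

```python
# variance in B explained by A is the square of the correlation coefficient
for r in (0.9, 0.4, 0.2):
    print(f"r = {r}: r^2 = {r * r:.2f}, i.e. {r * r:.0%} of the variance")
```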

Statistics are denigrated by those who don't understand them. But Bayes Theorem can tell you just how much it is worth to have an additional x% accuracy of prediction. Sometimes that will be a lot; sometimes it will be worth almost nothing. In Las Vegas, if I can overcome the house's 1.4% edge -- say, with a .2 correlation predictor -- I will get extremely wealthy. There are many other situations in which a 4% edge over my competition is worth a very great deal...

And here is another:


You said:

> If I know two individuals well and am asked to predict which is > likely to score higher on an IQ test, then I'll take my impressions; > but if I need to predict success in almost any activity, I'll > predict the high IQ group will do better than the low.

Certainly not "almost any activity". I don't believe IQ is a good indication of success in activities like the arts, writing, sports, and the like. It is just a fair indication of job success, even in technical fields. IQ is an extremely limited measure of the human mind.

Your account of how to make decisions on the data you have is good and sensible. I should know; in my field of expertise, everything is probabilistic, and I have to design stuff and measure it and put it to work based on these probabilities.

However, the article in question seems to indicate that, once you have hard data (a score) on a given trait, you should still bias this score depending on the subject's group, and I think that in a social context that is wrong.

I am Mexican, and you can make a lot of fair guesses about me just knowing that. But once you know that I have taken the SAT exam 10 times and scored, let's say 750 average, would you still estimate my real SAT score at somewhere around 600, because the average SAT in Mexico is 500? (I'm just making up the numbers for discussion's sake).

I don't think that is correct.

But it is correct. First, if I know nothing else about two people but IQ, I will predict that the high IQ one will be the winner at anything they try, including sports. Clearly there are measures I would far rather have than IQ for predicting track and field abilities, including age group or sex.

And I would far rather know your SAT score if I must predict your grade or guess your occupation, than know that you are Mexican. I can think of predictions in which your ancestry is irrelevant; but I can also think of predictions in which I want to know a number of things before I am told your ancestry, but if ancestry is all I have, it will still help with prediction.

If I get to take account of ALL the information and you must ignore some of it, and we are betting even money on the outcomes, then over time I will bankrupt you. I don't need a very big edge, either. Bayes Theorem will in fact tell me just how much it is worth to me in money to find out that you are Mexican (assuming that I have the correlations). It may turn out to be worth nothing at all, but without having the numbers I can't say that.

Clearly if I am building regression equations (i.e. predictive formulae) I use the highest correlations first, and as I factor out the variance by using successive variables I will reach a point at which it does me no good to add more; but ideally I want the entire correlation matrix to start with, and if you give me that while you have only your favorite variable or two, once again I'll bankrupt you in betting on guesses over time. It may be unfair, but that's the way the universe is constructed...








This week:



Saturday, April 15, 2000

Finish taxes: a day of short shrift.

It is an honor writing to you. I have a tremendous amount of respect for you and your writings, most especially for your short story/essay compilations. Those books played a major role in the formation of my personal and political beliefs (your intention, I'm sure). I'm writing to ask about Mr. John Campbell's essays and editorial that appear in your books. Have his editorials ever appeared in a single volume? I can't speak for others, but I would treasure a book like that, especially edited and supplemented by you. Another question: Have you considered a year 2000 edition/revision of A Step Further Out ? I first read that book as a teenager, and it fired my interest in math and science. It is an incredible book. Thank you for your time, sir.

Don McLeod- an avid fan

There is at least one collection of Mr. Campbell's letters and some of the editorials, and other such in fanzines. Alas I haven't time to dig into it, but a dip into the on-line manifestation of science fiction fandom will probably find them. I know the Los Angeles Science Fantasy Society library has a pretty complete collection of materials and people there will know more (see ). I keep working on Step Farther Out and Two Steps Farther Out and I believe Jim Baen has them. I haven't heard from him for a while. You could bug him, and I probably should as well.

And more on UWB:

Various correspondents have written a lot about the advantages of ultra-wide-band radio. While I am not contesting those, they are not the whole story. The main disadvantage, at least for civilian applications, is that the devices have to be very low power; if they are not, they interfere with every other use of the radio spectrum, all at once. Thus UWB radar would have a maximum range in the tens of meters; UWB communications would have a maximum range in the hundreds or thousands of meters, depending on the data rate necessary. The site mentioned earlier has some numbers:

- 32 kbps for a 900-meter range (using 1.8 mW output power), and

- 5 Mbps for a 10-meter range (using 50 microwatts output power).

Those are measured numbers, so they will get better with time, not worse. What power they will be allowed to emit at is as yet unclear. There is a curious set of comments and replies filed by the company "Time Domain", in which they are among other things begging the FCC to let them radiate as much power intentionally as a Class B computer or a Furby does unintentionally, which is about 50 microwatts. (See

-- Norman Yarvin, 

Fascinating. Thanks!

Backup solution

> Backup Problem?
>
> I'm thinking that between you and all of your readers, someone must know the answer to the following:
>
> I'm trying to determine if there are software backup solutions for Windows NT 4.0 patrons such that the device one backs up TO is a fixed hard drive (or drives), as opposed to removable Zip, JAZ, or optical disks.

Piece of cake. Cheap, too.

Go to PkWare's site and buy PkZip for Command Line. Works w/ 9x, NT, 2k.

A batch file like this should do the job - use a scheduler or a cron clone to have it kick off whenever you want.

:PKZBAK.BAT a script for selective backup to hard drive.
if exist F:\pkzbak.4 del F:\pkzbak.4
if exist F:\pkzbak.3 ren F:\pkzbak.3 pkzbak.4
if exist F:\pkzbak.2 ren F:\pkzbak.2 pkzbak.3
if exist F:\ ren F:\ pkzbak.2
pkzip25 -add -max -dir=root -attr F:\ @C:\utils\list.txt

list.txt contains the file specifications for the files and directories you wish to back up. Here I assume your destination is the root of F: (which you are free to change) and the list of files you wish to back up is in a file C:\utils\list.txt. The PkZip25.exe file must be somewhere in your path (I add C:\utils to my path on all machines I service).

A bonus of this method is that you cascade back the previous backups, so if you need to retrieve something from the previous three backups, you've got it available.

The .ZIP file generated can be read, viewed and extracted by any archiving/extracting program, but since the PkWare folks made .ZIP the standard, there's no reason not to buy their PkZip for Windows to make retrieval easy.

Suggest you either exclude the PKZBAK files from your backup (since they are redundant if you back up the source data they come from), or, if you do want to back them up, turn OFF compression in your backup (as your backup server will waste a lot of CPU cycles trying to recompress an already-compressed file).

Been using this for a decade, dating back to early DOS days.

--- John Bartley, PC sysadmin, USBC/DO, Portland OR. Views expressed are mine own. "We should call this Day One of Year One." RAH to Walter Cronkite, 1969-07-20






This week:


read book now









birdline.gif (1428 bytes)