[Picture of Jerry Pournelle]

 

 
©
This week:
Monday
Tuesday
Wednesday
Thursday
Friday
Saturday
Sunday
 

 
Top

CHAOS MANOR MAIL

A SELECTION

Mail: June 7 - 13, 1999

mailto:jerryp@jerrypournelle.com

CLICK ON THE BLIMP TO SEND MAIL TO ME

The current page will always have the name currentmail.html and may be bookmarked. For previous weeks, go to the MAIL HOME PAGE.

 

Fair warning: some of those previous weeks can take a minute plus to download. After Mail 10, though, they're tamed down a bit.

IF YOU SEND MAIL it may be published; if you want it private SAY SO AT THE TOP of the mail. I try to respect confidences, but there is only me, and this is Chaos Manor. If you want a mail address other than the one from which you sent the mail to appear, PUT THAT AT THE END OF THE LETTER as a signature.

PLEASE DO NOT USE DEEP INDENTATION, INCLUDING LAYERS OF BLOCK QUOTES, IN MAIL. TABS in mail also produce deep indentation. Use them with care or not at all.

I try to answer mail, but mostly I can't get to all of it. I read it all, although not always the instant it comes in. I do have books to write too... I am reminded of H. P. Lovecraft, who slowly starved to death while answering fan mail.

If you want to send mail that will be published, you don't have to use the formatting instructions you will find when you click here, but it will make my life simpler, and your chances of being published better.

This week:
Monday -- Tuesday -- Wednesday -- Thursday -- Friday -- Saturday -- Sunday

HIGHLIGHTS:


Monday, June 7, 1999

There is a LOT of mail that needs commenting and this is column day.

 

 

Jerry,

A comment on your thoughts about D-Day and the Apollo Project being the most complex activities ever managed. A key word is missing, that being "CENTRALLY managed". Many far more complex activities go on all the time (e.g. the overall U.S. economy), but they operate under distributed, rather than centralized, decision making structures. I would speculate the Normandy invasion and the moon project are at the upper limit of what can be done centrally, at least without a lot better intelligence enhancement tools than PERT or Gantt charts. An interesting question is whether it makes more sense to continue to develop better tools to support bigger centralized projects, or to, instead, work on formal methodologies which acknowledge and take advantage of the power of decentralized, distributed management. While the former is the path of least resistance, the latter could pay off with big dividends, both in terms of productivity and in the way people think about how the world should operate.

By the way, with regard to your review of Alan Cooper’s new book (The Inmates Are Running the Asylum) last month, while I agree that Alan did not present as strong a case as I would have liked for how design engineering SHOULD be done, I nevertheless think that you have missed the point. As a product manager for many years, I have firsthand experience with most of the pathological behaviors he describes. Using a construction analogy, most software engineers are ‘carpenters’, while management (including product managers like myself) and customers are ‘clients’ who know what they want in the end product but not how to design it. The missing piece in software projects is usually the ‘architect’, who can translate the requirements into detailed blueprints (product specifications). Instead, the carpenters get written specifications, often of dubious quality, and try to translate that into a product, each according to his own past experience and talents. The best, craftsman-level carpenters can often bridge the gap, resulting in an acceptable product. But the process is still flawed, and craftsmen are few and far between. We have to find a better way, and Alan seems at least to have the beginnings of an approach. I am trying to get my company to use his consulting firm in a major product design. If we do, I’ll let you know the outcome.

Apropos your recent experience with the installation program that, to paraphrase your words, doesn’t give you enough control of the process, I would say that the problem is not lack of detailed controls but, rather, a flawed design and insufficient testing. The average user would not want or know how to use that added level of control anyway. A good analogy is the operation of early automobiles, which required the user to manually control engine settings such as spark advance and choke. Over time, these settings were placed under the control of automatic servomechanisms, removing the need for the user to understand or even be aware of them. I suggest that we are in a similar transition state today with our software, which is sufficiently complex that no one but an expert can troubleshoot problems manually, but not yet sufficiently smart in many cases that it can automatically work around these problems.

Things are getting better. My latest copy of Netscape Navigator collects crash information about itself automatically and sends it over the net to Netscape for analysis. Windows notifies me of updates and, more or less, automatically downloads and installs them. I just wish that more companies were putting their engineers onto functions like these, rather than pointless feature creep. How about a PC with a secondary monitor processor, whose only job is to watch over the main processor and OS and to run fixes, up to and including rebooting, whenever necessary? Surely a 486 or early Pentium could handle this task with little incremental system cost. Just an idea.

On an entirely different subject, I have a question I have been meaning to ask you or Larry for years, although I am sure I am not the first (probably not the 1001st) to ask. What happened to the Heinlein alter ego in FOOTFALL? Did his death accidentally get edited out, or did I miss something in three readings?

Thanks and keep up the great work,

John DeVries

Manager, Marketing and Business Development

VideoServer Connections

Regarding Cooper's book, you will notice that I made it the book of the month. That's not a pan.

Maybe you ought to hire me to consult on some of your problems...

I'm not sure what you mean with your last question. Footfall was written years ago, when Robert was still alive. We don't intend to change the book now... And I have never admitted that any of the characters in that book have any but coincidental resemblance to actual persons living or dead...

I like your notion of setting a machine to watch the machines, but given the speed of modern hardware, all that is needed is programs. InstallShield would be wonderful if it did what it is supposed to do. There are also all those uninstall programs. They're creeping toward working...
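The software watchdog this exchange describes can be sketched in a few lines of present-day Python. This is purely an illustration of the concept; the function and its parameters are mine, not anything from the letter or from any shipping product. A supervisor launches the main program, periodically polls a health check, and kills and relaunches the program whenever it dies or stops responding.

```python
import subprocess
import time

def supervise(cmd, heartbeat, interval=5.0, max_restarts=3):
    """Run cmd and restart it whenever it exits or heartbeat() reports
    it unresponsive.

    cmd          -- argument list for the supervised program
    heartbeat    -- callable returning True while the program is healthy
    interval     -- seconds between health checks
    max_restarts -- stop supervising after this many restarts

    Returns the number of restarts performed (max_restarts + 1 if the
    program kept failing every check).
    """
    proc = subprocess.Popen(cmd)
    restarts = 0
    while restarts <= max_restarts:
        time.sleep(interval)
        if proc.poll() is None and heartbeat():
            continue                     # alive and responsive
        proc.kill()                      # no-op if it already exited
        proc.wait()
        proc = subprocess.Popen(cmd)     # the "reboot"
        restarts += 1
    proc.kill()                          # clean up the last instance
    proc.wait()
    return restarts
```

A real monitor would use something stronger than a polling callback for the heartbeat (a pipe or shared-memory flag the supervised program must touch), but the control loop is the same.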

===

Dear Dr. Pournelle:

FYI, I was interested to see in the 26 May 1999 issue of Jane's Defence Weekly, in a report describing how the Defense Science Board was looking into new weapons systems, that one of the systems was recognizably Project Thor, which I first read about in your first _There Will Be War_ collection.

Regards,

Steven

[SDunn@logicon.com]

Yes: Thor and Thoth were both concepts a team of us at Boeing devised in the early 60's. I've been advocating them ever since. It looks like at least some of that will be built now. Perhaps.

 

 

 


Tuesday, June 8, 1999

Hi Jerry,

I very much enjoyed your article in Byte June 1999. You said that there is no longer any reason to run Win95, now that Win98 2nd Ed. will be available. My question is this: would you recommend running Win98 on a Pentium 133 with 32MB RAM? I’m currently running WFW 3.11 and was preparing to upgrade to Win95, but your strong Win98 recommendation has given me pause.

Any advice would be greatly appreciated.

Thanks very much,

Dave Kantor [dave.kantor@ardentsoftware.com]

Sharon, MA


I don't see any difficulties with that, but given that memory is so cheap, it would do no harm to add another 64 megs. Adding memory is probably the single most significant improvement you can make to a system. SEE ALSO Eric's notes. And a longer and better reply from a reader.

 

Jerry,

You can control power management in Windows 2000 (Beta 3). In the control panel, the Power Options tool lets you set several things about power management. You can set both the monitor and hard disk power down to Never, which I take it is what you want. As far as it goes, the power management features don’t cause any problems with my machine, although I did turn them off (as you’ve said, they are mostly pointless). The hibernate option (in the same place) works very well.

-Jon Dowell

You can turn them off all right, but that doesn't stop the problems I seem to be having. See the column. I generally turn off power management when I can. Unfortunately they are trying to make that harder to do.

 


Wednesday June 9, 1999

We are experiencing Front Page problems. Boy are we.

Letter from Bob Thompson, who has done a LOT of work trying to get this site working on Pair.com which has a UNIX server and the FP Extensions:

 

While attempting to relocate your site to another web hosting provider, we've encountered two severe FrontPage-related problems. I'm hoping that some of your readers are experienced with using FP Extensions on UNIX servers and will have workarounds for one or both of these problems:

1. One of the most aggravating problems has been caused by the fact that Microsoft operating systems and applications are not case-sensitive. According to Microsoft, Windows NT is "not case-sensitive, but preserves case." Perhaps so, but that has not been my experience. These problems do not manifest as long as the web server is running NT, but attempting to migrate a web that uses mixed case file and directory names to a web server running UNIX is an exercise in frustration. I wonder if anyone has any suggestions about how to avoid or fix these problems. Obviously, one answer is to use all lower-case file and directory names from the start, but that's not helpful when working with an existing web that uses mixed case.

2. Another very aggravating problem is that the FrontPage publishing process times out on large sites. This occurs during the final phase, "processing web updates for ...". After a minute or so of no activity, FrontPage displays a message that the server has timed out. The same thing occurs if I open the remote web directly in FrontPage and attempt a recalculate hyperlinks. The answer is perhaps buried in the FP Extensions documentation, which I have not read, but I can find nothing in the FP documentation, TechNet, or on the Microsoft web site that explains exactly what occurs during this phase. I suspect that text indices and other internal housekeeping are rebuilt during it, and it disturbs me that that housekeeping is not getting done.

I know that both these timeouts happen because the CGI Timeout period is set too short, but commercial web hosting companies are loath to increase that period because it serves as a safety measure to prevent a rogue CGI process from sucking down an entire server, which may be hosting 25 or 250 other web sites. Short of convincing the service provider to increase the CGI Timeout period, I wonder if there is anything else that can be done. We have telnet access to our web sites, and I'm hoping that there's a process we can spawn directly on the server that will run to completion without timing out.

Any suggestions will be greatly appreciated.

Robert Bruce Thompson thompson@ttgnet.com http://www.ttgnet.com
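Thompson's first problem, migrating mixed-case file and directory names to a case-sensitive UNIX server, is at least partly mechanical. A sketch in modern Python (written today as an illustration; the function name is mine, and nothing like it was available inside FrontPage at the time): walk the web bottom-up and rename everything to lower case. Links inside the pages would need the same treatment, and collisions such as Index.html alongside index.html must be checked for before renaming.

```python
import os

def lowercase_tree(root):
    """Rename every file and directory under root to lower case.

    Walks bottom-up (topdown=False) so each directory is renamed only
    after its contents have been handled. Returns the number of renames
    performed. Caution: if two names differ only by case, the second
    rename will overwrite or fail; a real migration should detect such
    collisions first.
    """
    renamed = 0
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        for name in filenames + dirnames:
            lower = name.lower()
            if lower != name:
                os.rename(os.path.join(dirpath, name),
                          os.path.join(dirpath, lower))
                renamed += 1
    return renamed
```

After the renames, a second pass with the same walk would rewrite href and src attributes in the HTML files to their lower-case forms.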

 

As it turns out, getting Pair to change the CGI timeout period didn't help a lot. I then wrote to PAIR.COM support:

Attempting to publish to 216.92.73.36 takes forever; after which I get the message that it can’t complete the process.

Details are 500 internal processor error.

I am supposed to tell my administrator, jerryp@pair.com, that the FrontPage extensions aren't working properly and get something done. Since that is me, and I can't do anything, I fear this won't do much good. Robert Bruce Thompson and I have wasted days on this, and it does not work. Apparently your site doesn't really do FrontPage webs? I have done everything. We downloaded the entire site to here, opened it, and published without changes. It insisted on uploading everything. Hours went by. Got that message finally. Tried again. Minutes go by. Get that message: "Premature end of script headers /usr/local/frontpage/version3.0/apache/fp/_vti_bin/" (boldface in the error message). Do you HAVE working FrontPage sites at Pair?

 

I got this answer:

Yes, we do have sites here that successfully use FrontPage. Our servers do support FrontPage extensions, though the quality of those extensions is not our direct responsibility, of course.

If you do not feel that our services meet your needs, it is within your right to cancel your account.

Our 30-day guarantee would give you until June 24 to do so. The situation you are seeing is actually due to FrontPage itself having problems. There are over 29 MB worth of files in this single domain’s directory. FrontPage, being somewhat inefficient, loads *all* files for the web site at once when publishing. This becomes something of a memory hog and also makes you wait for a long time. Eventually this hits the web server’s timeout value or gets killed by the system "reaper" script for being an extreme memory hog user.

I’ve doubled the server timeout value so this may help you some, but your only real long-term solutions to this problem would be:

* break up the web site into several smaller sites. Generally FrontPage starts to choke when the site has more than 20 MB of content or so. This is not a limitation on our servers but rather one in FrontPage itself.

* simply make your site smaller (smaller graphics, etc.)

* use an alternate uploading method instead of FrontPage.

I believe you’ll likely find similar server memory usage for large sites on an NT server, though I can’t confirm this personally. Regards, Eric

-- pair Networks’ Support Department

support@pair.com

The Support Forum, online resource center: http://support.pair.com/
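Thompson's hope above, "a process we can spawn directly on the server that will run to completion without timing out," is the classic nohup trick. A sketch in modern Python (illustrative only; the function name and arguments are mine): start the job in its own session with output to a log file, so it is detached from the telnet session and never subject to the CGI timeout.

```python
import subprocess

def spawn_detached(cmd, logfile):
    """Start cmd in its own session, detached from the terminal, with
    all output appended to logfile. Launched this way from a telnet
    shell, the process keeps running after the session ends, the same
    effect as `nohup cmd >log 2>&1 &`. Returns the new process id.
    (POSIX only: start_new_session calls setsid() in the child.)
    """
    with open(logfile, "ab") as log:
        proc = subprocess.Popen(
            cmd,
            stdout=log,
            stderr=subprocess.STDOUT,
            stdin=subprocess.DEVNULL,
            start_new_session=True,   # detach from the controlling session
        )
    return proc.pid
```

The long-running job (a link recalculation, an index rebuild) then finishes at its own pace, and the log file records what happened.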

 

The difficulty here is that on Darnell's site (http://www.jerrypournelle.com) (207.218.51.2) the identical stuff works, except that we don't have the FP extensions running.

PLEASE DO NOT SEND ME SPECULATIONS on what you think is going on, or comments about how Micro$oft $uck$, or to get a horse. However, if you are familiar with Front Page and know something about this situation, I would appreciate advice.

See VIEW for more discussion.

========

Now for something on Operating systems:

Check out http://www.cryptonomicon.com/beginning.html and download the entire essay to read printed out. It's fairly long but worth the time as it is quite thought-provoking. I'd love to hear your opinion on it, and more detail of your experiences with Win2K.

Based on what I've heard of BeOS from various trusted sources (not just N. Stephenson), and my continued frustration with Microsoft's O/S paradigm, I'll probably put that O/S on the next PC I get for personal use, instead of an eventual gold version of Win2K. Here at our firm, a migration to Win2K will not happen anytime soon (i.e., not for years), even if Win2K is more of an improvement over NT 4.0 than NT 4.0 was over NT 3.51. This is because it would involve replacement of most of our current PC hardware (mostly Pentium Pro 200 MHz HP Vectras bought for use w/NT 3.51) for the best results, not a cheap thing to do.

Chris Pierik

Email: ChrisP@ZBBF.com

It is indeed an interesting article and I commend it to all. Thanks.

Years and years ago I was asked to be part of a Panel with David Bunnell who had just started a magazine to be called "WINDOWS"; Microsoft was introducing Windows 1.0 and I was asked my opinion. I said in effect that I could live with a GUI but didn't have to, and I preferred to keep a command line interface available just in case. Later I found Norton Commander a good compromise; frankly, I could have lived with Commander forever, without anyone's GUI. Ah well.

 

 

 

 

 


Thursday June 10, 1999

In regard to Dave Kantor's query: Racing Cow is running Win98 Second Edition (Beta) very well with a 'mere' 48 megabytes and an Evergreen WinChip 200 upgrade. The Evergreen upgrade can be found for well under $100 now. Although for a little more money you could get a lot more CPU. For instance, Fry's is currently offering a K6-2 running at 400 MHz with an AGP motherboard for $170. This is a considerably more intensive installation job but the investment in effort has a payoff comparable to the monetary expenditure.

Even more importantly, a newer motherboard and Win98 will get you into the world of USB, which I give my highest recommendation. The half dozen USB devices I've tried so far all worked flawlessly. If 3D graphics don't get you excited then USB has to be among the top five best innovations added to the PC platform in the last several years.

Unless there is some dread limitation in the hardware or a severe limit on drive space (although with drives heading toward $10 a gig that is a poor excuse), there simply is no good reason to use Win95 instead of Win98. The difference isn't as dramatic as that between Win3.x and Win9x, but it is very compelling just the same. The same could be said if one goes the Linux route. All the major distributions have recently seen tremendous improvement in usability for regular folks. Given a choice between a year-old Linux for free and one of the newest releases at retail, I'd say it was worth spending the money.

Eric Pobirs [nbrazil@ix.netcom.com]

Thanks. Racing Cow is a Gateway 2000 Pentium 150, fast for its day. With considerable modification, but still working like a trooper. Eric has it at his place as his main experimental engine. This tells us two things: Gateway certainly built a solid machine, and since Eric updated to Windows 98 before I did, then to 98 2nd Edition, the age of the machine isn't all that important.

Jerry,

I think you overlooked some points in your response to Dave Kantor about putting Win98 2nd Ed. on a Pentium 133 with 32MB RAM, currently running WFW 3.11.

A Pentium 133 is old enough and slow enough that getting a new machine, rather than upgrading, should be considered. A few points to consider in deciding which way to go:

CPU Speed:

Does your current system seem fast enough that it could take a performance hit and still keep you happy? Based on my Win 3.x to Win9x experiences, you’ll find some operations faster, some slower. Overall, expect a 10%-15% reduction in performance for (CPU-intensive) DOS and Win16 applications as compared to Win 3.x. Screen I/O intensive applications will run at about the same speed, disk I/O intensive applications a bit faster. You’ll find that late-model Win9x applications will be substantially slower than the Win16 applications you’re used to.

Active Desktop is the default shell for Win98. The bad news is that using it will slow your machine down still more, especially if you decide not to add more RAM. The good news is that you can choose to use the original Win95 desktop shell instead, avoiding this overhead at the expense of losing the (dubious, in my opinion) advantages of Active Desktop.

An overdrive processor, around $100, would make your CPU 1½ to 2x faster. More than enough to cover the Win32 performance penalty, but that increases the cost of the upgrade.

Also, for most Win32 applications, adding more RAM would give you almost as much of a performance boost and cost less.

On the other hand, even a cheap new PC will be at least 2-3 times faster than what you have now. Enough that even with the extra overhead of Win98 and Win32 applications, you’ll notice a nice performance boost.

Disk Space:

How much free disk space do you have? Windows 9x is bigger, a lot bigger, than WFW 3.11. It will probably take at least 100 MB more space to get Win98 on your box, even though you’ll be getting rid of the WFW files. Upgrading to Win9x applications could easily add several hundred more megabytes to that total. Are you going to be able to do that without a new hard drive?

For new PCs, 3-4 GB is considered a "small" hard drive. Many come with an 8 GB or larger drive. Enough to keep you in free space even given the size of Win98 and Win32 applications.

Memory Type:

Jerry says that adding more RAM is the most cost-effective way to improve your system performance under Win98, which is true. He also mentions RAM being cheap, which may or may not be true for an older computer such as yours. Before making a final decision on an upgrade, check how much memory that fits your machine costs. Prices for old-style (30-pin SIMM) memory are much higher than for late-model memory. Also, make certain your system has memory slots available for a RAM upgrade; otherwise you’ll need to throw away some of the memory modules currently in the machine and replace them with larger ones. Of course, any memory upgrade increases the overall cost of the update, making the cost differential between an upgrade and a new machine smaller.

Applications:

Are you planning to replace your current 16 bit applications as part of, or after, the upgrade? Some of the best features of Win32, such as long file names, really require 32 bit applications to be usable. If you’re going to be sticking with 16 bit applications, then aside from the utilities bundled with Win98, and applications such as browsers that can be downloaded free from the web, the update isn’t going to do a whole lot for you other than consume disk space and make your machine run slower.

On the other hand, almost all new PCs come with bundled software. Even if the software bundle doesn’t have everything you need, the number of new programs you’ll want to buy will be a lot smaller with a new machine than with an upgrade. So considering the 32 bit software you’ll want or need is likely to reduce the cost differential between an update and a new machine.

Your time, data, and nerves:

If everything goes smoothly, it’ll take about the same amount of time to upgrade your current machine to Win98 as to unpack and set up a new machine. You’ll probably spend about the same amount of time in either case tinkering and learning after you have a working Win98 system to play with, and the time spent backing up your old system before an upgrade would likely be about the same as the time needed to move your stuff to a new machine.

The question is what happens if something goes wrong. With a new machine the odds of anything going wrong are lower, you’ve got a support staff who, at least in theory, knows exactly what kind of machine they’re dealing with to call on, and you always have the option of sending it back. You also have the comfort of knowing that your old machine is still there and still working just fine. If you upgrade and something goes wrong, you may know more about your machine than anyone else in the world, and the survival of all your data may depend on the quality of your backup software and your backup procedures. Having been there and done that, I can assure you that neither thought is particularly comforting when the box is dead and the jury is still out as to whether you’re going to be able to bring it back to life.

So, is an upgrade a reasonable choice for you?

If your current system has enough disk space for Win98 and whatever 32 bit applications you decide you want. If you don’t think the performance drop associated with moving to Win98 will be a problem.

If the cost of Win98, whatever 32 bit applications you decide to get, and any hardware updates you decide you need is well short of the cost of a new machine.

And you are confident that you, or someone you can call on, can put things back together if something goes wrong.

Then a Win98 update is the way to go. The machine will be marginal for the newest games and Windows applications, but if you’re not too demanding, it will be able to handle pretty much anything currently available for Win9x.

Buying a new machine is a lower-risk, albeit more expensive, choice. As I pointed out, when you consider the big picture, the cost difference probably won’t be as much as it might seem at first glance. You’ll have noticeably better performance now, and be in much better shape as far as the hardware demands of software a year and a half or two years from now are concerned.

You’ll also have your old machine, which you can donate, or use for something else.

The third alternative is to do nothing. If you buy a new machine, or upgrade your current one, the one thing you can be certain of is that five years from now, whatever you’ve got is going to be obsolete and virtually worthless. If what you’ve got is working fine, and you don’t have a clearly defined need for anything from the Win32 world, then just using what you’ve got for as long as it lasts is a viable option. Just be sure to put something aside for a new computer.

Eventually, something is going to break or wear out and it won’t be cost effective, it may not even be possible, to fix it.

[userchj@conterra.com]

Well, I can't really disagree since I have said all that in print many times and will have it in the book Thompson and I are doing. The best way to upgrade is not to do it: to get a new machine and network to the old one. But I have said that often, and I was in a hurry, and I gave the short answer.

Thanks for reminding me.

 


Friday June 11, 1999

 

This is a question that calls for either a long or a short answer, and I haven't time for either this morning. The very short answer is, be sure you have the latest upgrade of your browser, and that you are using a major browser; if you insist on using one of the older and less popular web browsers, this is going to happen, and there is little that can be done other than to avoid those sites. ("Doctor, it hurts when I do this…" "OK, so don't do that.")

 

I don't really expect you to answer this, but I thought you could have someone else answer it. I have been thrown off of several websites because plugins didn't initialize properly. How do I find out what plugins are and where they are and what I did to cause them? Please answer this, thank you in advance, James C. Lasater

James C. Lasater [jamesclasater@mediaone.net]

===

 

Cheap SMP Boards:

http://www6.tomshardware.com/releases/99q2/990609/computex-99-05.html

You could easily build a dual CPU 466 MHz NT system for under $1000.

Eric Pobirs

Tom's Hardware is a good place. And I need a good server...

===

A Linux That Makes Sense

Dr. Pournelle,

I am experienced with running Red Hat Linux as a server of various sorts, but as a workstation desktop OS it is a tad demanding.

I just bought Caldera OpenLinux version 2.2. It cost $30, and came with a $10 rebate.

It loaded the first time, plug and play. It automatically configured itself to run the KDE desktop, a Windows-like GUI, on boot. It comes with WordPerfect 8 and Netscape Communicator, plus all the usual Linux tools. It also comes with a version of Partition Magic and Boot Magic so you can set up a Windows box as dual boot.

You can actually set up a dial up account and make it work within minutes. Ditto the networking.

A huge step forward in moving Linux to the mainstream. And a big improvement over Red Hat.

Donald W. McArthur

http://www.mcarthurweb.com

 

***********************************

"Rehab - just another term for quitter."

variously ascribed

Thanks. I met the Caldera people at COMDEX, but I think I have not heard from them again. Alas, things FLOW here and I get busy and forget.

===

Front Page:

For your Netscape, go directly to the FTP area:

ftp://ftp.netscape.com/pub/communicator/4.6/english/windows/windows95_or_nt/

 

Here are some Microsoft links that may or may not be relevant:

"Search on UNIX Returns No Documents Found"

http://support.microsoft.com/support/kb/articles/q219/2/60.asp

 

Sorry to say that I can’t help you on FrontPage otherwise.

I personally prefer other editors and site managers where editing the HTML is more straightforward.

Without going into the usual rants, check out, if you have the time and interest:

http://www.sharat.co.il/fpage/

The author points out certain items where FrontPage code is far more "expansive" than non-FrontPage code, with an example of how 395 bytes of standard HTML is expanded into 1,968 bytes by FrontPage, etc.

This all goes along the lines of: the more moving parts a thing has, the more easily it breaks.

I try to follow the "KISS" principle...

[shrug]

Mike "Z" michaelz@alphasoftware.com

I do too. Using Front Page was in part an act of faith: Microsoft bought the company from people I knew; they kept most of the Vermeer team that built the product; and the goal was to integrate with Office and NT. A year ago this seemed like a good idea.

Now I am not so sure, but I have a lot of sunk cost here; I would hate to rebuild this site page at a time. If I were to change I would need a program that can import Front Page webs.

Well, I keep trying… Thanks. Meanwhile, I'd like to try sub webs and child webs and like that, but the documentation I have on how to do that is tantalizing: it says it can be done but it doesn't say HOW to do it. Alas.

======

THE OPERATING SYSTEM WARS

Paul Schindler wrote an editorial column at BYTE.com after having dinner here, largely around my remark that it wasn't that Microsoft was so clever or greedy as that Microsoft's only possible competitors threw in the towel without ever fighting. It generated a fair amount of mail, including this from someone whom I would guess was a leader of Team OS/2. TEAM OS/2 had a practice of intimidating press who wrote anything other than fulsome praise for OS/2, so that many press people found it so painful to write about that they ignored it entirely. I recall one of my BYTE colleagues who was denounced for what was in fact a column recommending OS/2, but which didn't say enough bad things about its rivals. In any event we received this letter:

 

Paul,

Your and Pournelle’s views are those of a blind man, a drunk, or an idiot. Choose the one that fits you best. The point is that in the vacuum-shrunk, competition-free environment which its unregulated monopoly provides and allows MS to extend, it doesn’t matter what the "competition" does. They won’t get anywhere.

One example. If IBM cannot get OS/2 preloaded, the maximum market share they can achieve is ~10 percent: that being the share capable of installing their own OS. Don’t burn out any unused or broken brain cells trying to puzzle out why, just take my word for it.

Now if the cap on their share is 10%, it doesn’t matter how they wiggle on the hook, they can’t win. There isn’t any way for them to win. MS wins. Not by marketing, which you bone-headed idiots in the tres duh press have long given the credit for MS’s "victories," not by trying harder, not by pricing. But by virtue of their Mafiaesque death-grip on the primary means of distribution of the OS.

Read the frigging transcripts, stupid. Or get a copy editor to read it to you. Whatever it takes, get this through that dense vault of stone that surrounds your pea-sized smaller than baud gave an animal cracker brain. Nobody but nobody but nobody could get preloaded on major OEMs but MS. Even IBM was threatened with the death penalty if they dared to compete.

  E. E. Cummings had you in mind when he wrote:

  "All ignorance toboggans

into know. Then trudges

up the hill again."

  Yours,

etc

  Joe Barr

The Dweebspeak Primer   

***********************************************************************
* Joe Barr                                  The Dweebspeak Primer     *
* joe@pjprimer.com                          http://www.pjprimer.com   *
* "The hottest places in hell are reserved for those who, in a       *
*  period of moral crisis, maintain their neutrality."  Dante         *
***********************************************************************

I wrote a reply which I sent to him:

 

Thank you for your kind and well reasoned letter. Alas it turns out not to be the case.

IBM didn't even ship OS/2 on their own PCs, and yet they had a rather large market share in the PC world in those days. You could buy an IBM machine configured with OS/2, but it was difficult because they often shipped with a CDROM drive that had no OS/2 drivers, and in general there was no sound card; and getting a sound card to work properly with OS/2 was very difficult.

I know because I was trying to configure an IBM-supplied ValuePoint with CDROM and sound card, and I was on the telephone to the most senior OS/2 programmers and product managers. Getting that machine working properly and wired into my network was a nightmare (at first there wasn't any network capability at all: Microsoft brought out Windows for Workgroups months before OS/2 had any networking capability).

You probably don’t know this, but John Dvorak and I both were strong supporters of OS/2 and stuck with it long after everyone else abandoned the product; and I can tell you from quite bitter experience that the big problem with OS/2 was IBM management. They had brilliant people working on the product, but any time they got any intelligent people in the Executive class they fired them or moved them to another product line.

At IBM product managers had to be Executives, and only Executives were supposed to talk to the press. The Executives generally did not know anything about the products, and the programmers and engineers who talked to me—many did, for they loved their products—risked their jobs by doing so.

I could write a book about the mistakes that IBM made in those days; meanwhile Apple was being run for quick profit rather than market share.

Microsoft's aggressive discounts had a lot to do with their success: you got a big break on the cost of Windows if you did a "per processor" agreement with them, and while that "per processor" deal was a discount, not a contract, the discounts were so high that many manufacturers went with it. They went for market share rather than immediate profit. Unlike Apple. Meanwhile they had Excel and Word for the Mac and made about as much profit off each Mac sold as they did off each PC sold with Windows.

Gates used to say "In 1989 I went to every software developer in the world and asked them to write applications for Microsoft Windows. They wouldn’t do it. So I went to the Microsoft Applications Group, and they didn’t have that option."

I know firsthand that Gates personally tried to get major applications developers to write for Windows; the result was to force Microsoft into suites and aggressive applications marketing in order to have something to run on Windows.

At the same time, IBM was proud of charging "only" $640 for their Driver Developer Kit (one CDROM), while Microsoft was dropping Driver Developer Kits from airplanes; you could not get near a Microsoft booth at a show without being handed a Driver Developer Kit CD. The result was that most CDROM drives and sound cards worked with Windows but not with OS/2. And since most people wanted sound and a CDROM drive in their system, the result was inevitable.

If you believe I am making this up, check with David Barnes, who was fired from IBM for working closely with Dvorak and me in trying to get OS/2 to take off. There are others still at IBM I can’t name because they broke company policy by talking with press when no Executive was present.

I have never heard of a Microsoft software engineer of any rank, from junior to senior, getting in trouble for talking to the press, and some of them used to come here, with or without WaggEd account executives.

I have written at this length because it might as well be written up. I trust you have no objection to our publishing your kind and thoughtful letter with my reply?

To which I received this reply:

 

Yes, publish the letter if you like. I am going to print a complete rebuttal to Paul’s editorial and to your letter. I will copy other members of the press who are currently swept up in the rewrite of history vis a vis MS and IBM.

I know David Barnes. He was here in Austin when he wasn't on the road. I know some of the IBM execs, like the first one to testify in the current court case. I've written about how IBM marketing was wholesale while Redmond was retail.

But it was never IBM management that made OS/2 popular in the first place, popular enough to outsell Windows at retail, by the way. And Jerry, as I recall, you not only had IBM people come to your house to help you install OS/2 (wasn’t my friend Dave Whittle one of them?), you had to have MS assistance to get Windows 3.1 installed?

You did not refute my point about the deathgrip MS held (continues to hold) on the preload market. Here is another nail in your argument’s coffin: it was IBM, not MS, who made the most brilliant marketing move of the decade. And it had to do with OS/2.

But you’ll have to wait for my formal rebuttal to find out what it was.

See ya,

Joe Barr

The Dweebspeak Primer

I wait with abated breath for the formal refutation, which I hope is as polite and well reasoned as the first letter we received.

And it continues to be the case that IBM did not preload OS/2 even on IBM systems for quite a while into the operating systems war, even though IBM declared war on Microsoft at the big OS/2 Charting the Course for the Future meeting in Redmond (the first press gathering at the new Redmond campus). The real fact is that of the possible rivals to Microsoft, none acted with even minimal competence if their goal was to make inroads in the operating system business.

IBM thought it was a hardware company, and never thought that computers would become commodity items with tiny profit margins on the hardware. Thus it concentrated on owning the memory business, and used its political clout to build cartels to keep memory prices high. That worked, for a while, but like all cartels, even those put together by the US government and enforced by US import regulations and anti-dumping actions, it failed; and computer components and then computers became high volume small profit margin items. As classical economics would dictate.

Apple wanted to be IBM rather than Microsoft. Of course at the time it made that decision, no one realized that setting standards and becoming the operating system company would be such a fantastic win.

I am always gratified when people leap to the conclusion that I can't read (and need a copy editor to read to me) and that I haven't been around long enough to know what is happening. Incidentally, yes, I had real problems with Windows 3.1; the difference between Windows and OS/2 was that when I reported a serious Windows problem, it would be fixed almost instantly if it had widespread implications; if it was of interest only to a few specialty users, it would take a while. With IBM all errors were taken seriously, but they also took a long time (months) to fix. Microsoft was moving on a much shorter cycle than IBM. The result showed.

And IBM continued to charge money for the Driver Development Kit until I convinced one of the few company Executives that this was a bad idea. Most said "Why should we give it away? It cost us a lot of money to write that." I wish I were making that up.

In any event, this gives a good flavor of what it was like back in the days when IBM was still a player. (To be fair, IBM is still the premier mainframe maker, and is about to announce a major Linux initiative at PC Expo. There's a dance in the old girl yet.) If I denounced Microsoft they sent people down here to try to figure out what the problem was. IBM and Team OS/2 had a somewhat different strategy. I believe Guy Kawasaki calls it guerrilla marketing.

But I don't understand what, precisely, the point is here: is it that as a member of the press it is all my fault? Or that IBM did all things well, and Microsoft, a much smaller company, bullied Big Blue? Ah well.

 

Jerry:

I agree with what you write about IBM's inability to build a market, owing to relative incompetence at the top. I do feel, however, that MS has, or had, put into place several severe barriers to anyone who wanted to pre-load other operating systems. I don't claim that these are "illegal," but they do qualify as a rather sneaky and underhanded way to make the PC manufacturer stick to MS. In my time working for one of the top 3 direct manufacturers, I was aware of two contractual items that were, in my mind, designed to keep any other OS from shipping on a PC.

1: If you were contracted to ship the MS OS on every PC, you had free site licensing for every PC in your company. This could, and did, amount to a substantial savings for the company in question. This sounds great, doesn't it? Until you read the fine print: if you sold a PC with any other OS pre-installed (just one would do), you would have to pay for all of that software you had been freely loading and using. Removing it from use was not enough; you would still owe for all that previous use. That means the first OS/2 system to ship could end up costing the company anywhere from tens of thousands of dollars to who knows how much (in our case, about 1,500 copies of Office and the OS). That one line in the contract was the death knell for any thoughts of putting OS/2 on a PC.

2: The better-known clause of having to pay MS for every PC shipped, whether it shipped with the MS OS or not. This was not nearly as damaging to other efforts as the above, but it does lead one to believe that MS had a good team of lawyers working on those contracts. Must have been some sort of new tactic of "Marketing by Legal Expertise."

I believe that when these items came to light in the major press, MS allowed companies to re-negotiate the contracts and these items were removed. But the damage was, in all likelihood, already done.

I too found OS/2 to be better in most ways, and tried to stand by it. But in the end I had to give up and smile knowingly while several friends stuck to the bandwagon and suffered through more months and years of getting slapped around by Big Blue. In the meantime my soundcard worked, my games played, my video card had all the right colors. Oh well.....

PS Note to the Linux community... Tactics like the letter from Joe Barr were far too typical of the kind of venom OS/2 "Teamsters" were putting out. I am not sure whom they were trying to impress, but the fact is, I liked OS/2 but thought little of them. They would have been better served helping people get it running than running their mouths.

Name withheld by request.

I don't always publish name-withheld letters, and never anonymous ones. In this case there are reasons.

Of course Microsoft was making it hard on competitors. That is the nature of business in a free enterprise system. The market determines winners, and sometimes that is luck, but sometimes people make their own luck.

The real point here is that while Microsoft was giving aggressive discounts to gain market share, IBM was trying to make their Driver Development Kit a profit item. Apple went for profit rather than market share. So did IBM. In both cases they went for hardware profit first; hardware market share second; software profit third; and only finally for software market share. The Microsoft strategy was to reverse those: go for software market share, and do hardware only in peripherals, and then only where there was a substantial software element (such as with mice, and briefly with sound cards, although few remember the 8-bit audio Microsoft Business Sound Card with no joystick port…).

Clearly it worked. Nor was such a strategy illegal per se, nor can I see that their particular tactics were illegal. Microsoft stepped far over the line with the Stac situation, but the irony is that Stac probably made more out of the settlement than they would have made in profits had all that not happened.

As to "sneaky and underhanded," those are odd terms to apply to businesses, but if you begin applying them there are many places to stick them. Netscape's games have been so described, for that matter.

And Netscape certainly had a big head start, and used the wrong tactics in dealing with Microsoft.

No one I know claims that Microsoft will ever be confused with the Order of St. Francis of Assisi, but that is hardly the same as saying that what they did was illegal. And let me repeat: IBM started the war at Microsoft's own conference, OS/2, Charting the Course for the Future. The briefing books we got were full of OS/2 and Presentation Manager, but after IBM essentially walked out of the show, Gates began telling people that if you wanted things to work with OS/2, you should get them running on Windows first. That line never appears in the briefing books: it came only after IBM walked out of the conference with some lame excuse.

I have often thought that one requirement for elevation to Executive status in IBM was a lobotomy: at least in the PC ("Entry Systems," which tells you volumes about their understanding of the future) Division that was true. They had two executives, one male and one female, involved in OS/2 during its whole history who were intelligent people. The others radiated misunderstanding while proclaiming that only they could speak for the company.

Many of us tried for years to keep viable some alternatives to Microsoft. In every case Microsoft's enemies clawed and stabbed each other -- recall the furor over pull-down as opposed to drop-down menus, which clobbered Atari's DOS? Recall the Amiga fiascos? And it remains the case that it was much, much harder to get an IBM PC with OS/2 than an IBM PC with Windows; and when you did get an OS/2 system you were relegated to applications software that just didn't keep up with Windows. Bugs, crashes, and all, it was in most cases just easier to get your work done with Windows than with OS/2. I know. I kept both going here and worked with both, and I was rooting for OS/2, which I thought had more potential.

IBM's executives made it difficult, and Team OS/2 often made it unpleasant to write about OS/2; and eventually almost no one did.

Pity.

It does seem to me that IBM had a bit of marketing clout back in those days; so how did this small software outfit get such a stranglehold if IBM acted competently?

====

And now for something completely different--

 

Hi Jerry,

I really enjoy your site. It’s one of the few on the web that actually seems to have real content.

I thought you and your readers might be interested in this posting on the Los Alamos web site. It's a new theory on producing a microscopic warp bubble.

The link is http://xxx.lanl.gov/abs/gr-qc/9905084

Take care,

 

Al Carnali (acarnali@tiac.net)

Thanks. Not sure what to make of that...

===

Mark Huth, physician and reader, has a problem. I sure can't help. Can anyone?

 

Jerry, perhaps I can ask you to ask your readers to help me with a horrid ISDN problem.

I’ve had a stable ISDN circuit running over a Cisco router for almost 3 years.

Two weeks ago, we had a power failure which lasted 7 hours. When the power came back on, my ISDN circuit was dead. Now my equipment is all on protected circuits, so I felt the problem was likely a US West problem. However, I tested the router anyway and it routed on my network but couldn't find the ISDN lines. (It failed the loopback test and couldn't find any B channel.)

Called US West and they sent a tech out to check the line. He hooked up a blue box (a CPI) to it and both circuits worked. Problem must be the router. Called Cisco (stunning tech support!!!) and they spent hours on the phone with me and walked me through an extensive check. Conclusion was that the router worked. OK: US West says good circuit, Cisco says good router. So I picked up a new router (a Netgear). Same error. US West came back to test the circuit: good circuit.

Dragged the routers to my ISP's site and found that both routers worked (hooked to the same phone company switch I use). The ISP gave me a working Motorola Bitsurfer.

Tested each router and the Motorola on my home ISDN circuit. All equipment fails to find the B channel.

US West's senior fella in Seattle looks at my circuit long distance and finds that there may be a problem in the "circuit pack". They replace 2 circuit packs. They are sure I've got a good circuit. All routers continue to fail. Cisco baffled, US West baffled, Netgear baffled. US West basically tells me that it isn't their problem and I should bother someone else. Cisco says that they are positive that it is a US West problem, but can't put their finger on what the problem could be. I'm somewhere below baffled. I sit down and work on the problem for several hours, and by looking at dumps of the Netgear router log files discover that in fact the router is connecting to the US West switch, but that for some reason it can't seem to maintain the connection. Discuss the situation with the senior US West tech guy and he can't think of any way that it could be their problem. Sigh. I go to bed.

Wake up the next day, don’t change anything on the Netgear router and try it again and guess what, it works. It works for 3 days without problem. Passes logon tests, connects to ISP, etc.

The saga is almost over. Attempted to log on today and I can't get on again. The router again tells me it can't find a B channel, and it fails the loopback test (which it had passed). But as a lark I set the router to multilink (use both channels all the time), and it logs on without fail.

My suspicion is that US West has something flaky, but what? I don't understand anything about how the phone company makes ISDN lines work.

Does anyone have any clues?

Mark Huth [mhuth@mind.net]

This is certainly well beyond my level of competence. Does anyone have a clue?

 

 

 

 


Saturday, June 12, 1999

 

Russell Kay, one of my colleagues at the old BYTE, reports:

 

I thought you might find this amusing:

http://www.computerworld.com/home/print.nsf/all/990531rk

Russell Kay [Russell_Kay@cw.com]

Russell has his own section on this Chaos Manor site.

Thanks, Russ. It's always good to hear from you. "Think of it as evolution in action" was of course a slogan used in the New York Times bestseller Oath of Fealty by Larry Niven and Jerry Pournelle. It's still a pretty good book...

===

 

Solid State Hard Drives

Your long-ago prediction about hard drives going solid state may be close to coming true.

From Forbes:

Hitachi, long a sleeping giant, may be about to wake up.

http://www.forbes.com/forbes/99/0614/6312064a.htm

 

"Further on the horizon, Hitachi engineers are excited by a new type of memory they think will have many multimedia applications, notably in devices like digital cameras. It may eventually be able to hold 4 gigabytes of memory on a single chip, be as fast as a dynamic random access chip, use less power and retain information when turned off. If (a big if) no technical hitches pop up, commercial production will begin in 2005."

Jim

Jvalenti@telerama.com

 

-= If at first you don’t succeed, then don’t try sky diving =-

-= bad music imitates; great music steals =-

Windows95 (noun)

32 bit extensions for a 16 bit patch to an 8 bit operating system originally coded for a 4 bit microprocessor, written by a 2 bit company that can’t stand 1 bit of competition.

Telerama [jvalenti@telerama.com]

You remind me of the worst prediction I ever made: back in 1980 I said "silicon is cheaper than iron" and solid state mass storage would replace spinning metal as the main mass storage devices. Hard disks were a passing fancy.

I was wrong for interesting reasons, but of course over the long haul I was right…

====

 

Here is an announcement and its cover letter; it should be self-explanatory. If you are or ever were part of a Science Talent Search, National Science Fair, or other such, READ THIS!

 

Dear Jerry--

Maybe you can help out here. I know over the years you've been a big supporter of all the Science Education Programs that Science Service and NSF did and are doing to help do a better job in Science Education. Well, it appears Science Service is in a small pickle. (Here's the real skinny on what's happening.)

I volunteered to help Science Service in locating literally hundreds (if not thousands) of missing Alumni who were Finalists, Winners, or Participants of the International Science Fair, the National Science Fair, and/or the Science Talent Search (once known as the Westinghouse, now sponsored by Intel). From my conversations with the folks at Science Service in the past couple of months, Science Service has completely lost contact with over 90% of the people who have participated or won over the years. That includes a couple of Nobel Laureates as well. <grin>

What Science Service wants to do is form an Alumni Association of previous participants in order to foster some networking and mentoring to the schools today. But most important is to re-establish a contact with these people.

So, what do I need you to do?

I need the following note, or some variation, to be posted on your website. Please. From my perspective, it's possible that there are a few "Missing in Action" Alumni hanging out as regular readers on your site. If we get lucky, a note on your website just might round up a few strays we would miss elsewhere.

Plus, your group of readers is fairly intelligent and knows how to move a message out to other sites on the Net. (I call this the Grass Roots Approach to Distributive Computing. ;-)

In my conversations with Tzeitel Fetter, the PR person for Science Service, we agreed that a volunteer effort will get them a better response (coming from a grass-roots level) than a massive professional PR campaign.

Do what you can. Anything positive would be appreciated. If you have any questions, I know you won't be shy about asking. Thanks a bunch. I owe you on this one.

Mary Lu Wehmeier

------- Message Text we've been posting. Feel Free to do your own variations-----

Hello there!

We're looking for some folks who appear to have disappeared off the planet. These people were once, in their youth, Finalists, Winners, or Participants of the International Science Fair (formerly known as the National Science Fair) and/or the Science Talent Search Scholarship Program. If you fit this description, or know of anyone who does, please read on.

Science Service is looking for its missing Alumni!

Can you answer yes to any of the following questions:

* Did you ever attend the International Science and Engineering Fair as a student?

* Or the National Science Fair?

* Were you ever a Science Talent Search semi-finalist or finalist?

* Do you know someone who could answer yes to any of the above questions?

If you answered yes to any of these questions, we need to hear from you. Science Service, which administered these programs over the years, has lost contact with many of our Alumni. Now Science Service is trying to reunite all past participants of these programs and form an Alumni Association of our group.

If you, or someone you know, qualifies, please have them personally contact Tzeitel Fetter at Science Service, Tel: (202) 785-2255, or email develop@sciserv.org.

Mary Wehmeier, ISEF '71 & '72, STS '72
Wizop Broadcast Professionals Forum on CompuServe
Email: MWehmeier@csi.com

(Reposted with permission.)

Please feel free to pass this information along to anyone or group that could assist us in finding these people. Thanks!

How to lose people….

But it happens.

====

Niven forwarded this to me. I have no idea where it came from:

 

INSTRUCTIONS FOR GIVING YOUR CAT A PILL

1. Pick cat up and cradle it in the crook of your left arm as if holding a baby. Position right forefinger and thumb on either side of cat’s mouth and gently apply pressure to cheeks while holding pill in right hand. As cat opens mouth, pop pill into mouth. Allow cat to close mouth and swallow.

2. Retrieve pill from floor and cat from behind sofa. Cradle cat in left arm and repeat process.

3. Retrieve cat from bedroom, and throw soggy pill away. Take new pill from foil wrap, cradle cat in left arm holding rear paws tightly with left hand. Force jaws open and push pill to back of mouth with right forefinger. Hold mouth shut for a count of 10.

5. Retrieve pill from goldfish bowl and cat from top of wardrobe. Call spouse from garden.

6. Kneel on floor with cat wedged firmly between knees, holding front and rear paws. Ignore low growls emitted by cat. Get spouse to hold cat’s head firmly with one hand while forcing wooden ruler into mouth. Drop pill down ruler and rub cat’s throat vigorously.

7. Retrieve cat from curtain rail, get another pill from foil wrap.

Make note to buy new ruler and repair curtains. Carefully sweep shattered figurines from hearth and set to one side for gluing later.

8. Wrap cat in large towel and get spouse to lie on cat with its head just visible from below spouse’s armpit. Put pill in end of drinking straw, force cat’s mouth open with pencil and blow down drinking straw.

9. Check label to make sure pill not harmful to humans, drink glass of water to take taste away. Apply band-aid to spouse’s forearm and remove blood from carpet with cold water and soap.

10. Retrieve cat from neighbor’s shed. Get another pill. Place cat in cupboard and close door onto neck to leave head showing. Force mouth open with dessert spoon. Flick pill down throat with elastic band.

11. Fetch screwdriver from garage and put door back on hinges. Apply cold compress to cheek and check records for date of last tetanus shot. Throw T-shirt away and fetch new one from bedroom.

12. Ring fire brigade to retrieve cat from tree across the road.

Apologize to neighbor who crashed into fence while swerving to avoid cat. Take last pill from foil wrap.

13. Tie cat’s front paws to rear paws with garden twine and bind tightly to leg of dining table. Find heavy duty pruning gloves from shed. Force cat’s mouth open with small spanner. Push pill into mouth followed by large piece of fillet steak. Hold head vertically and pour ½ pint of water down throat to wash pill down.

14. Get spouse to drive you to emergency room; sit quietly while doctor stitches fingers and forearm and removes pill remnants from right eye.

Stop by furniture shop on way home to order new table.

15. Arrange for vet to make a housecall.

(See also Cat Bathing as a Martial Art.)

===

A minor quibble

Glad to help. I hope you’ll let me know how it all works out (or at least write it up in View).

One quibble: In View today you write <<but since NT isn’t capable of the server efficiency of UNIX systems>> [along with complaints about Unix and case-sensitivity of filenames].

I'm not sure that's true, since at the Motley Fool we support hundreds of thousands of users daily on NT-based web and database servers. The real problem is that Unix (and Linux) filenames are case-sensitive. This is a problem on all Unix web servers, not just those that use FrontPage extensions.

What a concept: case insensitive filenames! VMS does it, DOS did it, Windows does it, even the Amiga does it. THE CASE OF FILENAMES SHOULD NOT MATTER! (sorry to shout :) ).

Unix purists may disagree; to me saying that "Myfile", "MyFile", and "myfile" can be three completely different files (any of which can be executables, documents, or graphics) is simply madness. That’s like saying that carl, Carl and CARL are three different people!

Just My Humble Opinion.

-Carl

carll@fool.com

Actually I agree. And I guess I have been brainwashed by the UNIX people about its efficiency vs. NT. I get that lecture daily…

Boy do I agree that the case of file names ought not matter!

Thanks. I'll keep plugging.
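For what it's worth, the ambiguity Carl complains about is easy to demonstrate in code. Here is a minimal sketch in Python (a hypothetical helper, not anything from the Motley Fool's servers): on a case-sensitive filesystem, a program that wants DOS-style behavior has to hunt through the directory itself, and if "Myfile", "MyFile", and "myfile" all exist it can only pick one of them.

```python
import os

def find_case_insensitive(directory, name):
    """Return the first on-disk entry in `directory` that matches `name`
    ignoring case, or None if nothing matches. On a case-sensitive
    filesystem several entries may match; this returns whichever one
    os.listdir() happens to yield first, which is exactly the ambiguity
    case-insensitive systems like DOS, VMS, and Windows avoid."""
    target = name.lower()
    for entry in os.listdir(directory):
        if entry.lower() == target:
            return entry
    return None
```

A case-insensitive filesystem resolves this once, in the driver; on Unix every application that wants the behavior has to do its own hunting.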

 

 

 


Sunday, June 13, 1999

A dialogue on CD/R

The following dialogue is based on private mail exchanged between Thompson and me, with irrelevant parts removed:

 

[RBT]: Incidentally, I've now joined the ARRRR! brigade. I ordered a Smart & Friendly 4X2X24 burner from PC Connection yesterday afternoon, and installed it this morning. I've spent the day (and a bunch of blanks) trying to burn a good copy of the Office 2000 CD that I begged from my friend to hold me until WaggEd sends me a real copy.

My friend's "master" is itself a CD-R. We've tried burning copies of it in his HP 7100, his HP 8100, and now my S&F. We can't get a good copy to burn from his CD-R "master", although we can copy the files freely from it to the hard drive or install directly from the "master." I've tried everything: CD-to-CD/R at 4X, CD-to-CD/R at 2X, CD-to-CD/R at 1X, CD-to-disk image-to-CD/R, etc. Nothing works.

I'm using S&F branded blanks certified for 4X. The drive is installed and configured properly. I called S&F tech support. He ran through a couple of things with me to verify that the drive itself was okay, which it is. I just can't copy that damned CD-R disk. He says he's never had much success copying CD-R to CD-R, and always uses a disk image as an intermediate. Have you ever encountered something like this? We need to tell readers if this is a common thing.

* * * * *

[JEP]: In the old days we had problems with CDs, but not recently. I don't know Smart and Friendly, but I'd say you have a bad one. I presume you are using Adaptec's CD Creator program?

There was a time when it was smarter to make a disk image and then burn, but I haven't had that difficulty in a long time. But I do stick with the big-name CD-R systems.

What error messages are you getting? If underflow, then the thing you read from is too slow.

* * * * *

 

[RBT]: It's not the drive. I used Easy CD Creator to drag some files off the hard disk and drop them to a new CD-R blank. That worked fine, and I'm able to access that CD-R disk and its files on other systems. There's apparently something wrong with the source CD-R disk that refuses to let it be copied. I'm not getting any messages at all: CD Creator tells me the disk was copied successfully. It just won't read afterwards. I think I'll do a test copy of a real distribution CD or something, but I'm pretty sure that'll work fine.

* * * * *

[JEP]: I've had that happen once or twice too, making me wonder if there is some trick involved to make for copy protection. But if so I don't know it.

* * * * *

 

 

[RBT]: Well, from what Smart & Friendly tech support tells me, not being able to copy from a CD-R source is not at all unusual. I've just wrung out the drive a bit more, and it seems to be working fine.

I xcopied 500 MB/5,400 files of archive data from the server down to the system with the CD-R drive, assuming that there's no way I wouldn't get an underrun if I attempted to use files on a network drive as a source. I dragged and dropped those 500 MB onto the CD-R disk I'd used yesterday while I was on the phone with S&F tech support. He'd had me copy about 10 MB to that disk, so I decided to see what would happen if I tried to add to it. Everything went fine. Once the CD finished burning, I deleted the archive directory from the hard drive, and used Windows to do a copy and paste from the CD-R to the hard drive. All files copied normally.

I'm going to knock off a couple of audio CDs and a data CD or two just to see what happens. It's a whole lot easier to experiment with $0.97 blanks than it was back when I first tried this technology and blanks were $12 each.

* * * * *

[JEP]: Making a copy of a copy depends on the reading CD drive, of course; if the lens is at all dirty or if the electronics are at all slow, you will get lots of retries, and underflow results.

I have never been able to copy a CD over a network. I get underflow every time. I gave up trying.

In fact I usually have SCSI source CDROM and SCSI burner. Probably they're better now, and I have used a DVD IDE source with SCSI burner in the same machine. But I never try over a net. If I have to do that I make a disk image first.

* * * * *

 

[RBT]: I have been doing a fair amount of research on this issue. Did you know that a CD-R is *never* an exact copy of the source? I had been under the impression that burning a CD-R was a straight digital copy like using diskcopy to copy a diskette. Not so. In fact, it more resembles an analog copy like a photocopy or an audiotape copy, with all the generational degradation that implies. Each generation introduces additional errors. Those are not visible at the gross level, because error correction allows succeeding generations to continue to deliver perfect digital data, up to a point. But when the error correction stuff is swamped by generational copy errors, the disk itself or some files on it may be unreadable.

My conclusion is that a CD-R source disk may be marginal but within limits for direct reading. But using that CD-R as a source disk with the inevitable errors that arise during the copy operation may render the target CD-R unusable. The only real solution is to use a bit-by-bit compare or CRC compare utility to compare the source against the destination. Rule of thumb: copy only from original CDs.

Obviously, all this has serious implications for people who depend on CD-R for backing up or archiving data.
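For readers who want to check their own backups the way Bob suggests, a CRC-style compare is easy to script. Here is a minimal sketch in Python; the file names are placeholders, and MD5 is just one convenient checksum (any strong checksum would do):

```python
import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so a full CD image need not fit in RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()

def verify_copy(source, copy):
    """Return True only if the two files are byte-for-byte identical."""
    return md5_of(source) == md5_of(copy)

# Hypothetical paths for illustration:
# verify_copy(r"C:\archive\backup.zip", r"D:\backup.zip")
```

Run this against each file on the freshly burned disk before deleting the originals; a mismatch means the burn is not trustworthy even if the software reported success.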

* * * * *

[JEP]: You clearly know more than I do, and this is good, but I thought bits is bits. If I make a disk copy of all the files on a CDROM, surely I can then burn an exact copy of that?

* * * * *

 

[RBT]: No. I cleaned up what I sent you and will post it tomorrow on my journal page.

Basically, there is an underlying proprietary data structure on a CD-ROM or CD-R disk. They don't use bytes natively, for example. They encode 8-to-14 bits (as I recall) and add a bunch of CRC information. The drive electronics themselves convert to and from that underlying data structure. During a copy operation, some of the underlying data doesn't get copied exactly. Although the intermediate stages are digital and error-free, both the initial read operation and the final write operation are basically analog processes. That's okay, because there's so much redundancy and CRC information that the drive electronics can reconstruct the byte properly.

But with succeeding generations of copies, the cumulative errors can cause some data to become so damaged that the CRC information can't reconstruct the original byte. On the first generation copy, if you've copied 650 MB of binary data to the CD-R, chances are that every single byte will be readable. If you then copy that copy, errors that occur during the second copy process accumulate on top of the original errors. Eventually (in as little as two copy operations), one or more bytes may become unreadable. That may easily go unnoticed, but if it's in a binary file, a directory file, or some similar critical file, one or more files may be unusable.

My understanding of this whole issue is and is likely to remain imperfect. Perhaps some of your readers can comment on it.
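Bob's generational argument can be illustrated with a toy model. The sketch below is Python, and the numbers are invented for illustration; they are not the real CIRC parameters of a CD. It treats each raw (uncorrected) copy as adding a few random symbol errors to each block, while the drive's error correction can fix at most a fixed number of errors per block. Early generations read back perfectly; once any block's accumulated errors exceed that budget, data is lost:

```python
import random

BLOCK_ERR_BUDGET = 4      # toy stand-in for the t errors per block ECC can fix
ERRORS_PER_COPY = (0, 2)  # each raw copy adds this many new errors to a block

def copy_generations(n_blocks=1000, generations=10, seed=42):
    """Simulate successive raw copies; return the first generation with loss."""
    rng = random.Random(seed)
    errors = [0] * n_blocks           # accumulated uncorrected errors per block
    for gen in range(1, generations + 1):
        for i in range(n_blocks):
            errors[i] += rng.randint(*ERRORS_PER_COPY)
        if any(e > BLOCK_ERR_BUDGET for e in errors):
            return gen                # some block can no longer be corrected
    return None                       # every block survived every generation
```

The key point of the model: if each copy instead went through error correction (resetting the per-block count to zero each generation), the accumulation disappears, which is exactly the bits-is-bits case for copying corrected files.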

* * * * *

[JEP]: Write up what you have found, and I will post it with a request for information. We'll learn enough to write a book chapter...

* * * * *

 

[RBT]: Oops. I didn't read your message carefully enough.

Yes, you can burn an exact copy of a disk file to a CD-R. The issue rears its head only when you're trying to burn directly from a CD-R source to another CD-R disk.

* * * * *

[JEP]: As I suspected. If it creates a digital file, then bits is bits... But it is well worth writing up, and getting comments.

Bob Thompson also reports:

This from Richard Michael Todd [rmtodd@mailhost.ecn.ou.edu <mailto:rmtodd@mailhost.ecn.ou.edu>], referring to yesterday’s notes:

Today on your notes page you wrote:

Here’s an amazing true fact. Did you know that when you copy a CD to a CD-R, the result is *never* an exact copy of the source? I had been under the impression that the process of burning a CD-R was a straight digital copy operation like using diskcopy to copy a diskette. Not so. In fact, it more resembles an analog copy like photocopying a photocopy or duping a VCR tape, with all the generational degradation that implies. Each generation introduces additional errors.

Um, you’re kidding, right? All a CD-R drive gets fed is the user-level data; the Reed/Solomon ECC codes and the 8/14 (IIRC) low-level channel codes get done by the drive. I seriously doubt CD-ROM drives even have a way of letting the user see the low-level coded datastream. Well, that’s what I thought. It seemed reasonable to me that when copying a data CD, the information would be supplied byte-wise from the source CD. I mean, that’s what has to happen when you list a directory or run a program directly from a CD, right?

Apparently, that’s not the case when copying CDs, however. See Section [3-18] at <http://technology.niagarac.on.ca/courses/comp465/cdrfaq1.htm> for more details. Unless I misunderstand this entirely, it appears that when doing a CD-R copy of a CD source, the CD source is supplying a raw bitstream rather than formatted and corrected data. If I’m understanding this wrong, please tell me.

None of this makes a lot of sense to me (JEP); I thought bits is bits, and digitals is digitals, and I am astonished to find that CDROM isn't quite like anything else. I suppose I shouldn't be; it always was a strange technology. For more:

=============

 

Hi Jerry,

I enjoyed your page on the rollout of Rotary’s ATV. The Roton is one of the most exciting developments in space technology and it’s great to see them making steady progress.

Thanks for your active support over the years in this area, especially for helping get the DC-X off the ground, so to speak. When I first read about the DC-X a couple of years before it flew, I knew immediately that it was going to bring about a paradigm shift in thinking about space launch. I don’t believe we would be seeing projects ranging from the Roton to the X-33 without it.

I thought you might like to check out the RLV web pages I’ve developed. See

www.hobbyspace.com/Links/RLVCountdown.html

for links and information on RLV projects. I’ve added several links there to your site.

In addition,

www.hobbyspace.com/Links/RLVNews.html

provides news items that I’ve picked up from various sources concerning RLVs.

These pages are part of the HobbySpace site that is devoted to space related hobbies and activities. It has about 30 sections ranging from activism to collecting to history to satellite building to space tourism.

I’ve been developing a "Solar Sci-fi" section but am not sure how to proceed with it. The theme is to support sci-fi that concerns near-term developments within reasonable extrapolations of current technology.

I’ve been disappointed by the lack of support for space exploration among many sci-fi fans, who seem to be bored by anything slower than the Enterprise. I would like this page to show that sci-fi about space exploration within our own lifetimes (at least if we live a long time!) can be as exciting as any other sci-fi. If you have any suggestions I would, of course, greatly appreciate it.

-Clark

Thanks. As I said years ago, the way to go to space is the way Buck Rogers did. It may be time to remind readers there's a space section here, including some papers on Reusable Launch Vehicles.

===

Jerry,

FP keeps working copies in a cache file. I found that running a batch file to dump that cache made it perform better. Try this instead of re-booting your system.

deltree /y "C:\Program Files\Microsoft FrontPage\temp\"

Regards,

Jim

James Cooley [miwok@att.net]

Thanks. I'll try that. Sometimes it slows to glue...

====

Jerry -

With regard to your ISDN question: You are correct in stating that two devices cannot simultaneously share an ISDN line. This doesn’t work with the ISDN protocol standards, which are still built around the idea of a two-ended circuit.

However, you should have absolutely no problems turning off your Telos ISDN device, removing the ISDN connection, attaching it to the Ascend router, and turning the Ascend on.

I’ve gone back and forth with my ISDN line between an Ascend router that I borrowed from my work to the Motorola ISDN modem that I have now. There is no problem at all.

One thing you will notice is that when a new device is powered up on the ISDN line, it will take sometimes close to a minute before the device negotiates with the switch on the telco side and becomes ready for use.

Roger Weeks [roger69@hotmail.com]

Thanks. Since I am not drawing dead, I'll give this a try, then. I presume it's going to require adventures in TCP/IP, which we have not previously done here (this is a Microsoft network running NetBEUI, which is good enough for what I do and has some security aspects I like).

New adventures…

===

Dr. Pournelle

As far as I know, you are correct in your bits is bits approach to CD Rom contents. While there are various subcodes involved in defining headers and other things, these are not read directly and the data that is read can be copied. If there was a problem with the lower level data it would show up as a read error (if it could not be corrected.) When the data is written this low level data (subcodes etc.) will be recreated on the new disk. But the data will be a byte for byte copy if there are no data errors (or hardware failures.)

I have had little experience with Adaptec’s software, except for some very poor experiences with an earlier version (which would not create Joliet disks that could be read by OS/2; newer versions have fixed this). I have used Jeff Goldhawk’s CDRWIN a lot. This is an excellent program! It has never failed me. One of the options is a raw sector copy that appears to make an identical copy of a disk. A file compare across both disks reveals no differences, right down to the volume label.

Audio CDs are a different matter. Tracks that are copied, especially if they have been ripped to wave files and then recorded as audio, can have null bytes (silence) either prepended or appended.

A site that can tell you almost everything you could want to know about CD and CD-r is www.fadden.com.

 

Rolf Grunsky

-----------------------------------------------------------

rgrunsk@ibm.net

 

Thanks. I would imagine that the problem with copying a copy of a copy is that there are enough retry errors to generate underflow; CD/R wants a steady stream of data, and sometimes the buffering isn't sufficient if there are lots of soft read errors. I still believe bits is bits, but I am willing to be convinced otherwise.
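The underflow mechanism can be sketched as a simple producer/consumer model. This is an illustrative Python toy with made-up rates, not the behavior of any particular drive: the burner drains its buffer at a fixed rate, the reader refills it, and each soft read error stalls the reader for a retry tick. A few scattered retries are absorbed by the buffer; a long run of them drains it, and on a 1999-era burner an empty buffer ruins the disk:

```python
def burn_succeeds(buffer_size=64, read_rate=6, write_rate=4,
                  retry_stalls=(), total=1000):
    """Return True if the buffer never empties during the simulated burn.

    read_rate / write_rate are sectors per tick; retry_stalls lists the
    ticks at which a soft read error stalls the reader for one tick.
    """
    buffered = buffer_size            # start with a full buffer
    written = 0
    tick = 0
    stalls = set(retry_stalls)
    while written < total:
        if tick not in stalls:        # reader delivers data this tick
            buffered = min(buffer_size, buffered + read_rate)
        burn = min(write_rate, buffered)
        if burn < write_rate:         # buffer ran dry: underrun, disk ruined
            return False
        buffered -= burn
        written += burn
        tick += 1
    return True
```

With these numbers a clean read always succeeds, occasional retries are harmless, and a long retry storm (as from a dirty lens or a slow network source) fails the burn; that is the whole case for reading from a fast local copy.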

And now this:

I too was having problems making a CD-R. (I’ve started making my backups of a couple computers, like my Win98 laptop, to CD-R since disks got cheap.)

The other day, I decided to compare my backup copy to the source on hard disk before deleting it, and it miscompared. Not encouraging. Further experimentation: I created another disk and compared the two CDs (I have a 12X SCSI CD-ROM drive and an IDE CD-RW drive). Still miscompared. I found, though, that the disk in the CD-RW drive compared correctly to the hard disk.

This stuff used to work, but I remembered I used to have a different SCSI card. I experimented, and disabled SYNC. It then worked.

Probably explains a case about a month ago where a CD to CD copy failed.

FWIW, I had a CD-R recently that didn’t work in an older computer, though it worked in another computer. I did a diskcopy to yet another CD-R that then worked. I blame the media in this case. Reports are that, in some drives, the different brands work differently.

As to whether copies degrade with every generation, I would think this depends on whether a digital copy copies the underlying bits, or the bits after error correction has been applied. If error correction is disabled during the copy, and the bare bits are copied, then I can see errors accumulating. Otherwise, of course, the more copies you make, the greater the chance that too many errors will occur during any given write to be corrected; but that would apply even if you made all your copies from the same source.

Kevin Krieser [kkrieser@delphi.com]

Thanks for the additional data. I have several CDs of everything I ever wrote digitally, plus the editors I wrote them on, and I can read them all. I keep them in various places and make new copies every now and then.

I still think bits is bits…

 

 

 

Chaos Manor home

Entire contents copyright 1999 by Jerry E. Pournelle. All rights reserved.
Comments and discussion welcome.
