
CHAOS MANOR MAIL

Mail 131 December 11 - 17, 2000

Send mail to jerryp@jerrypournelle.com

The current page will always have the name currentmail.html and may be bookmarked. For previous weeks, go to the MAIL HOME PAGE.


If you are not paying for this place, click here...

IF YOU SEND MAIL it may be published; if you want it private SAY SO AT THE TOP of the mail. I try to respect confidences, but there is only me, and this is Chaos Manor. If you want a mail address other than the one from which you sent the mail to appear, PUT THAT AT THE END OF THE LETTER as a signature.

I try to answer mail, but mostly I can't get to all of it. I read it all, although not always the instant it comes in. I do have books to write too...  I am reminded of H. P. Lovecraft who slowly starved to death while answering fan mail. 

Day-by-day...
Monday -- Tuesday -- Wednesday -- Thursday -- Friday -- Saturday -- Sunday
 

Boiler Plate:

If you want to PAY FOR THIS there are problems, but I keep the latest HERE. I'm trying. MY THANKS to all of you who sent money.  Some of you went to a lot of trouble to send money from overseas. Thank you! There are also some new payment methods. I am preparing a special (electronic) mailing to all those who paid: there will be a couple of these. I am also toying with the notion of a subscriber section of the page. LET ME KNOW your thoughts.

If you subscribed:

CLICK HERE for a Special Request.

If you didn't and haven't, why not?

If this seems a lot about paying think of it as the Subscription Drive Nag. You'll see more.


 

Monday  

Today was devoured by locusts

 

 


Tuesday

Another day lost.


Wednesday, December 13, 2000

ON LINUX and LINUCES:

Hi Dr. Pournelle,

I just read your column on installing Slackware, in which you thanked the Linux Documentation Project for their HOWTOs. As an LDP project volunteer, I want to say "you're welcome," and thank *you* for the thoughtful mention in your column.

Regards,

David

-- Dr. David C. Merrill                 http://www.lupercalia.net
Linux Documentation Project             dmerrill@lupercalia.net
Collection Editor & Coordinator         http://www.linuxdoc.org
Finger me for my public key

If you tell the truth you don't have to remember anything. -- Mark Twain

Thank you.

In your Byte column, you recently listed your experiences with installing Linux. I just did this myself, and have some notes to compare. With regards to:

"Red Hat 7.0 includes XFree86 4.01, and that's what's installed if you do an original installation..."

For some people, yes. But on my original (non-upgrade) installation, Red Hat 7.0 installed XFree86 version 3.3.6. Looking in their RPM packages directory on the installation CD-ROM, I noticed some 4.0.1 packages, and some 3.3.6 packages. Which package (and thus which version) is installed probably depends on which graphics card you have.

About your further comment:

"This is a major oversight, because scrolling in the older version of X-Windows is very slow."

That is highly dependent on which graphics card you have. There are plenty of graphics cards that work very fast with older versions of XFree86; for instance mine (Matrox G400) works very fast (basically everything I want to do, including scrolling, is completely instantaneous) with the version of XFree86 that Red Hat 7.0 installed. The question is whether or not accelerated drivers for that particular graphics card are included in that version of XFree86. Newer cards tend not to be supported by older versions.

With regards to:

"...compiling and installing a network card driver is a daunting task -- and one that Red Hat spares you. It's one of the reasons Red Hat takes so long to install. "

Well, Red Hat didn't spare me. I got a Linksys 10/100 card, which Red Hat's "Hardware Compatibility List" lists as compatible, but which the driver distributed with their kernel didn't recognize. The manufacturer's driver disk included source code for a Linux driver, which I had to compile and install, and which seems to recognize the card, and probably will work, though I haven't gotten around to sending any actual data through it yet.

I don't think Red Hat versus Slackware has much to do with this; I think it is because Linus Torvalds has been quarrelling in a minor way with Donald Becker, who wrote most of the network card drivers, about the development policy for those drivers, with the result that the official series of drivers has to some extent forked off from the series that Becker maintains. Red Hat distributes the official series; the driver that came with the card seems to be one from Becker's series.

My biggest problem with Red Hat 7.0 was that the installation script didn't install LILO (the Linux Loader) properly. I was setting up a dual-boot machine (Windows 98 SE + Linux), and put Linux on the second partition, which starts 10G into the disk. The install script not only failed to install LILO, but failed to tell me that it had failed. This I regard as unacceptable. I had to guess what went wrong (without the boot loader installed, it looked like the installation had failed completely -- the computer just booted into Windows), then I had to boot off the installation CD, in "rescue" mode, mount the correct hard disk partition, find the LILO documentation, read it, run LILO with the correct options, read the error message it gave, make another guess as to what that error message meant about how to change the LILO configuration file (it meant that the word "linear" in the file had to be changed to "lba32"), edit the configuration file, and reboot. I am not exactly new to Linux, but this took me over an hour, even though both my guesses were correct the first time. And my situation is one that Red Hat should have tested; plenty of people have dual-boot machines with large hard disks. For it to err is bad; for it to not even give an error message is too much. Next time I'll try a different distribution.

Norman Yarvin <yarvin@cs.yale.edu>

Well, there are enough religious wars going on that I don't need to do more on this one than be a war correspondent...

Hi Jerry,

While I know I am entering an area of everlasting religious debates, I'd still like to offer a differing opinion on the Linux distribution issue you brought up.

First of all, you made the statement that if you install any software on an RPM system without using the package manager, you invalidate the whole RPM system. That is, well, not quite true. Any locally installed software usually installs in the /usr/local tree, while software installed from RPMs goes to the /usr tree (e.g. the binaries go to /usr/local/bin or to /usr/bin). That way, you may freely add manually installed software to your system, but it does *not* harm the RPM structure of your installation. I agree there are cases where life is not quite that simple, but even then the deviations are very local in nature.

Downplaying the importance of package managers is still astonishingly common inside the Linux 'community'. I really do not know why, since I find them the single most important feature of Linux distributions. The need for them becomes obvious if you really have to do some work on the system, instead of just playing around. For an administrator, the lack of a package management system would just be unacceptable. RPMs do not prevent you from doing anything; they just help you keep things in order.

I would also like to note that RPM really cannot be compared to the Windows registry. While the registry is a hierarchical system and application configuration database, RPM is a software management system and database. There is no counterpart for it in the Windows world, although InstallShield (or similar) tries to do it for each application separately.

Finally, I'd really like to encourage you to try out GNOME or KDE seriously. Either of them should give you all the customizability and features you might wish for beyond WindowMaker (except simplicity, of course). Personally, I would recommend Helix Code's GNOME setup, available at http://www.helixcode.com/desktop/download.php3. It installs very easily on systems using RPM.

Best regards,

-- Matti Airas mairas@iki.fi +358 50 34 64 256 http://www.iki.fi/mairas/

I completely agree that without package managers Linux will remain a fairly obscure system used only by gurus. That may be no bad thing, of course. But for the masses, things have got to be a little simpler than "just recompile the kernel, and then patch in the usual way"...

I liked your review at http://www.byte.com/column/BYT20001205S0005. But I noticed that your experience with ATA/66 resulted in turning everything back to UDMA/33 just to install. You may not have noticed, but in Slackware we include the ata66.i boot disk image for installing on machines with ATA/66 drives.

-- David Cantrell | david@slackware.com * KG6CII | Slackware Linux Project

Sir, I just wanted to let you know I thought your column article "Building A Slackware System" was excellent! In fact, I intend on eventually building a Slack system as you described. I just recently got into Linux, and just the other day installed Slackware 7.0 (which I purchased from Walnut Creek). Anyway, there is a question I have for you, if you would not mind taking the time to answer it, which really wasn't explained much in the article.

I also have a Netgear Ethernet card (Netgear FA310-TX) and for some reason Slack won't recognize it, much like your problem. I searched pretty much everywhere on the net I could for the Netgear Linux drivers you said you obtained, and even tried the Netgear site, but I cannot find them. I have been having trouble getting my Verizon DSL set up in Slack with Roaring Penguin's PPPoE; it worked fine in Mandrake. Basically, I'm just wondering if you would mind telling me where you found the Netgear Linux drivers, so I could install those onto my system and surf the net in Slack; I would really hate not to use such an excellent distro simply because I cannot get on the net. I look forward to a response when you have time, sir. Thank you.

Sincerely, Daniel Stehm Editor-in-Chief @ GA-Linux.Com

Will someone please help him? I don't feel entirely qualified. But I am sure there are many who are.

Jerry,

You were asking about Fortran compilers on Linux a while back.

1. g77:

This is a free (GNU) Fortran compiler provided with Red Hat Linux. Instructions come with "man g77" or "info g77".

2. VAST/f90:

http://www.psrv.com/lnxf90.html

The version for personal use is free, and works for Fortran 90.

--Erich Schwarz

Thanks!


Jerry, in case you're interested, there's a new version of Irfanview out. It has a nice install, and lets you choose the graphics, movie, and sound files that you want it to open automatically.

http://www.irfanview.com/english.htm  

I have always been a fan of Irfan, and thanks. I'll go get it.


THE LANGUAGE WARS RESUME

Hello, Jerry,

You wrote recently that "I really ought to start the language debates again. I never thought C was the proper language for writing complex programs."

Please do it. However, development language is not all, or most, of the problem.

I love C, and I love assembly language even more. In fact, it is clear that those too young to have written in assembly language have not tasted the true sweetness of computer life.

However, the larger the C-language program the more likely you are to make a mistake that only shows up at run time as a reference to a piece of memory that has been freed, or as a reference to an array element that does not exist, etc etc. I have learned, painfully, that you are right.

My own experience suggests, though, that poor quality software results mostly from the business demands that control development. Business people define the business need that software fills, and pay for the software. In all serious projects, the programming is the easy part. Yes, the larger the system, the more likely you are in C to accidentally tie your own shoe-strings.

But the chief problem is that requirements and specs change. The payers have a conception of a system, and a business window through which the system has to fit. As the system gets built, they discover other features that, if added, would give their system an edge over the competition.

They pay the bills. They call the tune. ("...who pays the piper...").

Across nearly twenty years (yes, when I started reading your Byte articles, your kids were in college) I have consistently found that big crashes come from the business/management side changing a spec during development, trying to add features the market will demand, while trying to keep to a delivery that the market equally demands.

Books, articles, methodologies, IEEE Software magazine, all write as if developers exist in some isolated lab, in some place -- like a college campus -- where users and markets do not exist, and where the only commandment to developers is to "do your best to deliver a clean product no matter how long it takes, even if the time has long passed when anyone would need to use it".

The fundamental problem is the translation of a real-world need -- to me, a business need -- into a set of instructions that make a computer and network fill the need.

Off-shore development has the same problem. If anything, it is much harder to develop a decent solution if your programmers are in India, and your users are in New York. Someone still has to translate user-demand-language into programmer-requirement-language. It is harder, much harder, if you want to use programmers who have never met the users -- programmers who do not share the business process assumptions of the people defining the requirements.

In fact, the translation problem exists whether your developers are in India, or Germany, or Pennsylvania, as long as they are physically and mentally separated from the users. If anything, it is worse when the development model assumes an early-twentieth-century assembly-line model -- a kind of software Taylorism (cf. Frederick Winslow Taylor). Usually (again, my painful experience) the cheap off-shore development company can be so cheap because about 10% of their developers are really good, and the rest are stumbling rookies. The very good 10% are paid well, and the rest are paid very poorly.

Without this split, there is no way, short of slavery, to keep the average software developer's wage so low in Mexico, the Philippines, or India. And no way for an overseas programming factory to bid so low. Our companies would hire their developers...as we often hire from that pool of 10% experienced and talented programmers.

(And incidentally, overseas development is often not so inexpensive, but its full cost is often disguised by the mid-level managers who choose to push business that way. Usually the day rates are cheaper overseas. Managers pat themselves on the back as they hand a PowerPoint slide up to senior management -- "Our programmers would have billed $1,500 per day, but the Elbonian Development Factory is charging only $500 per day". However, the Elbonian factory will use three times as many people, plus a full-time US project manager, plus a full-time project manager in Elbonia. This "all-in" cost shows up somewhere else, under another budget line, in next year's report. By then, the manager who cut the deal has been promoted, and is no longer responsible for the "problem" that has cropped up in the project.)

OK, that's enough...

Regards,

John Welch

The real quality control problem comes when the original programmer leaves, and his successor leaves, and someone who has never been told about the really clever hack the original programmer used to save resources in a resource-short environment now has to maintain this C program. Only now the hardware has so improved that no one in his right mind would resort to a clever hack to bum a few bytes of code. But it was done, and the "documentation" is usually on the order of "Here beginneth the clever hack" and not much else, and you can't find even that unless you page through hundreds of pages of code.
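By way of illustration (the sketch below is mine, not Mr. Welch's), here is the sort of C that compiles without a murmur and fails only at run time: the two cases his letter names, a reference to an array element that does not exist and a reference to memory that has already been freed.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    int scores[10];

    /* Off-by-one: valid indices are 0..9, but the loop also writes
       scores[10]. The compiler accepts it without complaint; the
       damage shows up only at run time, if it shows up at all. */
    for (int i = 0; i <= 10; i++)
        scores[i] = i * i;

    /* Use-after-free: the pointer still looks valid, but the memory
       has been handed back to the allocator. This, too, compiles
       cleanly. */
    char *name = malloc(32);
    strcpy(name, "fresh");
    free(name);
    strcpy(name, "stale");   /* undefined behavior at run time */

    printf("%s %d\n", name, scores[5]);
    return 0;
}
```

A Pascal or Modula-2 compiler with range checking turned on would at least trap the bad index the moment it happened, instead of quietly scribbling past the end of the array.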

Jerry,

We know at least one way to engineer software, and that is by using the Carnegie Mellon Software Engineering Institute's software maturity model. Last time I paid attention to such things, there were two organizations doing Level Five software: NASA's contractor for shuttle software and the contractor for the Japanese bullet train system. Cheap it is not.

I expect you are correct about C and complex systems, but programmers like C because they can jump right into a problem and whale away producing code.

I would be interested to see if there is a cheap and cheerful process that will yield reliable code in timely fashion, code that is neither bloatware nor painfully slow.

jim dodd

Microsoft has made a fortune -- literally -- in trusting that the hardware will rescue the software. I recall calling Office 97 "bloatware" because it wanted a couple of hundred megabytes of disk space. That's about $2.00 (Two Dollars) worth of space now.


Hi Jerry

> I think it may be time to open the language debates again. C will compile nonsense. Pascal and its derivatives will not, and do range checking; it's harder to get a Delphi or Pascal or Modula-2 program to compile, but when it does, it will generally do what you expected it to do. With C you have to simulate the compiler in your head, and for many that's a formula for making your head explode...
>
> I think it is time for readability.

Unfortunately, most of the language debate is conducted by people who are less than expert in languages. I don't know whether you fall into that category, but from the few comments above I must confess to suspicions that you might.

The biggest single problem in software development is that too many who work as software developers lack fundamental skills.

Anybody with basic carpentry skills can design and construct a simple footbridge over a 5' wide creek. If the same person tries to design and build (single-handedly) the Golden Gate bridge we expect problems. Software development is a bit like that. With a six-month crash course you are amply equipped to write 50-line Excel macros, but all too many people with the same skill level are trying to build large, complex software applications.

This year, I was involved with interviewing applicants for a "Senior Software Engineer" position where the job advert asked for expert C/C++ developers with 5 to 10 years development experience. Most applicants, despite large and impressive resumes, had no real grasp of basic software engineering principles and only limited grasp of the languages in question.

I have developed software in around 40 or 50 varieties of C and C++, as well as FORTRAN 77, Fortran 90, COBOL, Basic, Visual Basic, Pascal, Prolog, Awk, Bourne Shell, Java and three different types of assembler (PDP 11, 68000 and 80x86). I've taught programming courses in C, C++, Pascal, COBOL and Java. One thing that has become eminently clear is that you can write ugly, un-maintainable code in any of those languages, just as you can write clean and clear code.

Modern C compilers are considerably more sophisticated than those developed at Bell Labs in the 70s. If the compiler is set up correctly it will complain loudly if you mix your data types. There is a mechanism to tell the compiler "Don't complain, I know what I'm doing", but only a fool would use that unless they truly do know. (And yes, there are many fools developing software.)

Programming languages have different strengths and weaknesses, and you need to match the right tool to the right job.

Modula 2 is fine for a novice developer to build small to medium size systems. Many experienced developers find it cumbersome for complex problems.

Visual Basic (which is much closer to Pascal than to Basic) or Delphi are great for data processing apps and come with excellent database functionality.

Fortran 90 is without peer for number crunching.

C++ is, I find, a good language for complex systems, especially those which must interact with the system at a low level.

C is great for ultra-lightweight apps, such as embedded systems.

There is a fair bit of overlap between the languages, and I think that language choice is only a small part of a bigger problem. More critical is an understanding of sound development techniques, of software architecture, of algorithms, of reviews and other QA techniques, and of testing. If you don't understand these, then you cannot build a "Golden Gate" software project without it turning into a house of cards.

Software development is an engineering discipline, and without good engineering skills, swapping C for Modula will achieve very little.

As you might suspect, the above are my own views. If you want my employer's views then you'll have to ask them.

-- Michael Smith, Senior Software Engineer, Aurema
michaels@aurema.com

 

Well, less than expert is a fair description of my status; "rusty and out of date" might be applicable as well. But I have read most of the major books. Kernighan, for instance, wrote his long piece about Pascal, and my son, 15 at the time, said "Well, he doesn't like Pascal because it isn't C."  That was true then and I think remains so. Having spent some time with Wirth I understand what his programming philosophy is: it's expressed in his book "Algorithms + Data Structures = Programs", which is a pretty descriptive title for a book. It's a different approach from the "C" approach.

Minsky many years ago accused me of liking structured languages because I was less than expert. "Some people like to put themselves into a straitjacket, then see what they can do while in there."  He was, I suppose, right; certainly Pascal and Modula-2 were straitjackets compared to LISP, or APL, both of which Minsky could do in his head. But then Marvin is not at all like anyone else I have ever known, with the possible exception of McCarthy, and John has his own unique features. When you are in the realm of high genius, general observations do not apply.

But in the more mundane world in which programmers are more like IQ 130 than 180, we find ourselves in a situation in which our lives are controlled by programs that no one understands, and worse, that no one CAN understand. Back when the hardware was greatly limited, using tricky languages that would let you do clever hacks and wonderful kludges was fine; but now the hardware far outstrips the software, and we can afford programs that are "larger" and "slower", which may be the price of understanding -- although better code generator libraries for readable languages can take care of much of that difficulty. And for that matter, hand optimization of loops in assembler should not be a lost art, and a well commented assembler routine is no bar to understanding.

It has been a while since I did serious programming, but I did enough to have some right to an opinion: and what I found was that with C, a couple of months away from the program was enough to make it incomprehensible even if I wrote it. That was not true for Pascal or Modula-2. In those days, computers were slow enough and disk and memory space were limited enough that there were real advantages to using a lower level language like C; but I also noted that it was no great trick to get C programs to compile, and you could write them pretty fast, but getting them to do what you wanted was much harder, and "proving" the program by testing extreme cases could take a long time -- and you still weren't sure.

Now true, "Lint" and other pre-compilers helped a lot, just as "RATFOR" (Rational Fortran) made FORTRAN programs considerably more comprehensible. And to this day I know of large programs in Modula-2, and even in, God save us, Ada, that work, and are maintained by people who never met the original authors, and which are pretty well comprehended by those who work with them; I also know of large C programs (including I think Windows NT) which are simply incomprehensible to any human being.

But that's my take and I am sure there are many others.
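To make Mr. Smith's point about compiler setup concrete, here is a small sketch of my own; I am assuming a gcc-style compiler with its conversion warnings (for instance -Wconversion) turned on, which is an illustrative choice on my part and not something from his letter. The implicit conversions draw complaints, and the explicit cast is the "Don't complain, I know what I'm doing" mechanism he describes.

```c
#include <stdio.h>

int main(void)
{
    double    ratio = 0.6180339887;
    long long big   = 5000000000LL;

    /* Implicit narrowing conversions: with conversion warnings
       enabled, the compiler complains that these assignments may
       lose information. */
    int a = ratio;
    int b = big;

    /* Explicit casts: the standard way of telling the compiler
       "don't complain, I know what I'm doing." The warnings go
       away; the loss of information does not. */
    int c = (int)ratio;
    int d = (int)big;

    printf("%d %d %d %d\n", a, b, c, d);
    return 0;
}
```

With the warnings left off, all four assignments go through in silence, which is rather the point: the language will let you, and the discipline has to come from how the tools are set up and how the programmer uses them.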


From: Steve Setzer <setzer@backfenceDOTnet>

 Subject: Language debates--SuperTalk et al

Jerry,

Let me put in a vote for the xTalk family of languages--HyperTalk, SuperTalk, MetaTalk, and various others (see www.metacard.com  for genealogical data). These are interpreted languages/toolkits suitable for rapid prototyping, free-form data organization, and other useful programming tasks. The syntax is extremely clear, if verbose, and quite English-like. No pointers, no low-memory access, lots of high-level functions. The MetaCard/MetaTalk version even runs identical scripts on Mac, Windows, and several flavors of Unix without any recompiling.

Given the speed of today's processors, the slowness of the xTalk interpreter is hardly noticeable. The prototyping and development speed is quite remarkable. Great stuff for anyone with little or no formal programming background. Especially suitable in education.

Steve Setzer

Good point. I recall not too long ago when Smalltalk and SuperTalk were considered the great hope of the future. But then, PROLOG was popular then too (and I still like it for solving logic problems, but it's hard to find a good copy now).


Dr. Pournelle:

Martin Heller wrote in a recent column that the C# download took up about 2 Gig of disk space.

The early 1980s versions of Turbo Pascal, also developed by Anders Hejlsberg, ran about 48 kilobytes (yes, kilobytes) for a single executable that did all the work.

I am not complaining about bulk on my fixed disk drive as gigabytes are cheap. I am worried about complexity and issues of reliability.

My inference about Turbo Pascal was that it had about 13 K worth of run-time library. My guess is that was the part that was hand-coded in assembler -- mainly a lot of DOS calls. The sweet thing about that system was that the run-time library got copied out of the Turbo executable into your executable -- no linker overhead, no worries about finding the right libraries. My other guess is that while the product was billed as written in assembler, they only hand-coded that run-time library; Turbo Pascal and its WordStar-like editor (Ctrl-KB, Ctrl-KK for BlocK select still works in Delphi) were written in Turbo Pascal itself and brought up with a bootstrap process.

This means the Pascal compiler and IDE (yes, it was the pioneering IDE) ran to about 4000-6000 lines of Pascal code. No more than about 100 pages of source listing. Yes, the user interface was primitive by today's standards, but I found it extremely easy to use and was programming within an evening of reviewing the manual.

Does this mean C# and its associated run-time comprise about 200 million lines of code (3 million pages of source listing)? That is simply the Turbo Pascal figure scaled up: 2 gigabytes is roughly forty thousand times 48 kilobytes, and forty thousand times 5000-odd lines is about 200 million. What the heck do you do with 200 million lines of code? Does Mr. Hejlsberg understand what 200 million lines of code even does? Can we rely on anything written with that system?

What began as Turbo Pascal has evolved from the most elegant compiler ever developed into the most egregious example of bloatware. I had programmed in FORTRAN, COBOL, Algol 60, Ada, FORTH, and a bunch of other stuff, but the original Turbo Pascal was so powerful that I have been programming in Pascal for 16 years now. I have been willing to follow Anders Hejlsberg to the ends of the earth, and would gladly learn C# if that were the way to go, but I am beginning to worry.

Paul Milenkovic Madison, Wisconsin

Good points all. For those who don't know, although Philippe Kahn is the name usually associated with Borland Turbo Pascal, it came from Denmark. (Philippe is French.) 

And Delphi is not only still available, but works quite well. And is not two million pages long...

And sort of on topic:

Dr. Pournelle,

I've read your online journal for several months now and will be subscribing shortly. I appreciate you doing these things so we don't have to.

I'm a freelance computer consultant/VAR/IT Director - whatever you wish to call me today, and I have some experiences to share with you.

Bad Software Department:

Quickbooks Pro 2000

I don't know if you ever use this type of product, but with all the concerns over privacy and code bloat this is a good example. QB2000 *requires* you to be online and subscribe to their service to use their payroll function now. With prior versions you used to be able to order tax updates on floppy - now they are *forcing* people online - what a wonderful thing (heavy sarcasm). I have many customers that just have no need to go online - or are concerned with security. This particular thing isn't that terrible, at least they warn you about it directly.

I noticed something I really dislike the other day. I run Windows 2000 Professional on my main workstation (honestly, what an improvement over 98, 95, and 3.1 in stability) and I have my LAN connected to the Internet and shared through this. I run QB on a number of machines here in my office. I came back one day to find a message box on a screen claiming, "QuickBooks couldn't reach the Internet; would you like to dial a connection?" This was on a machine without a modem (the server had been shut down - alas, no Internet across the LAN). So here QB was trying to update itself without even asking me! Very rude, very very rude - almost as bad as Microsoft's Windows ME AutoUpdate (at least they have a readily apparent way to turn that feature off).

**** Athlon Woes ****

OK, I'll admit I've always been attracted to the Dark Side :-). If not by their pricing in the past, how about their wonderful performance now? The old AMD K6-2 and its brethren were very nice inexpensive chips, with very few compatibility or stability problems in my experience (I've probably worked with 100 or so over the years). Along comes the Athlon, which delivers on many promises -- including both speed and price. But wait, there's trouble in paradise! I've built and purchased a number of Athlons now - at least 10 or so, varying from the first Slot A's to Durons to socketed Thunderbirds. All have been with name brand (Microstar, Epox) mainboards and good quality RAM, and a few were even purchased prebuilt from the distributor I buy from (supposedly thoroughly tested and burned in). Not a single one of them has proved to be a stable machine! Blue screens, unpredictable errors, no explanation for most of this behavior. Hardware compatibility seems to be at an all-time low for this product, at least for system builders using off-the-shelf components. There may be a magic combination here that the big guys are using, but us small folk sure don't know about it!

**** Trouble in Paradise ****

Here's even worse news! I'm having similar problems with Intel products as of late. I've always found them reliable in the past. I have an Intel Pentium III 733 - 128 Meg - IDE DVD & CDRW - Microstar MB - Inwin MidTower - Maxtor 20 Gig. It came prebuilt from the supplier with Windows 98 SE, which the customer promptly attempted to upgrade to Windows ME. Before the upgrade they experienced maybe one blue screen a week (heavy user of Office and the Internet, IE 5.0). Upon installing ME, they would now receive a set of 5 VxD errors at boot! The system seemed to work fine, and if you logged off and back on you didn't receive these errors (if you didn't log on and off you didn't have the icons in your system tray, because of Explorer crashing). This was unacceptable of course, so a clean install of ME was in order. Did that - everything seems hunky-dory, right? Wrong - the machine was working somewhat well, but it would slow down quickly over time - the mouse would stop responding and so would the keyboard - WTF, as Tom Syroid would say. Now the problem has progressed to the machine not shutting down, blue screening all the way.

**** One last Rant ****

One other warning for your readers: the ATI Rage Fury Maxx blows! The card, if it works in your system, seems to do a wonderful job; however, it's extremely unlikely to work! I've installed it (or tried to) in a number of systems (I've yet to get it into an Athlon). It requires two IRQs!!! It has two processors, so I guess this is understandable - but on a few systems where it would boot I couldn't get it installed due to lack of IRQs. They should definitely be more up front about this. To top it off, the card only works with Windows 98 - directly supported, that is. I suppose if you wanted to run in 16-color 640x480 then you could use it with a generic VGA driver under other OSes. They don't even directly support it in Millennium, although it does seem to work.

**** The End ****

Well, I think I've rambled on enough - I hope some of my info is useful to you. I really enjoy working in the tech industry, especially when things work like advertised. Product testing takes up a good deal of my time due to the terrible products that are on the market now. Often tech support requires more time than buying a turnkey solution. I guess if it weren't for these issues I wouldn't have a job though! :-)

Keep up the good work on your web site, and I enjoy the Byte.com column as well.

In Technology We Bust,

Lorentz W. Hinrichsen

Thanks. And of course I won't be at all unhappy if you do subscribe...

Tech support is a rapidly rising cost of doing business. And while some of it is of the sort satirized in USER FRIENDLY and EVIL GENIUSES, much is absolutely required.

Dr. Pournelle,

Today (Wed) slashdot linked to this article at upside.com about software quality. I thought it might be fodder for your language wars.

http://www.upside.com/Open_Season/3a3661271.html 

Sean Long seanlong@micron.net


From: Gen. Robert E. Lee Commander in Chief Army of the Confederacy

To: President Bill Clinton The White House

November 26th, 2000

Mr. President,

In light of the recent actions by your Vice President, Mr. Al Gore, I hereby retract my concession of the Civil War and subsequent surrender to Lt. Gen. Ulysses S. Grant on April 9th, eighteen hundred and sixty-five. It appears that the overwhelming presence of blue unit markers on my battle map proved exceptionally confusing, leading me to the false assumption that we were actually outnumbered by Union forces. Moreover, the arrows used to indicate location and heading of your troops left my subordinates, most notably a colonel from Florida, with the mistaken belief that we were surrounded. In addition, body count numbers provided to me have been deemed suspect. Some bodies having only one arm or leg were arbitrarily under-counted. This was in direct contradiction to the "one head, one body" standard generally accepted as the norm by military historians. I believe that dangling limbs were overlooked and should be considered in a final reckoning.

Given these gross violations of my civil rights and upon the grounds first defined by Al Gore, I hereby request the following:

1. A "Do Over" of the following battles: Elkin's Ferry, Vicksburg, Gettysburg, Ft Sumter, Shiloh, Manassas Gap, Antietam, Chickamauga, and of course, Appomattox.

2. A manual recount of the casualties at four battles where overwhelming victories of our Confederate forces were recorded. We can provide a Confederate assessment team to confidentially manage the recount process.

3. Return my damned sword

If my demands are not met immediately I'll see your damned yankee ass in court.

Sincerely,

Bob

 

 

 

 

 


Thursday, December 14, 2000

 

Off shopping.

 

 


Friday, December 15, 2000

From Roland, labeled "I told you so."

http://www.zdnet.com/zdnn/stories/news/0,4586,2665020,00.html?chkpt=zdhpnews01

--- Roland Dobbins <roland_dobbins@yahoo.com> /


This is one of several letters I have received complaining that CURRENTMAIL isn't working. I have no problems with it, and apparently not many do. Anyone know more?

When I open a Netscape 6 window, it appears complete. With IE 5.5 it is still truncated. Could this be a 'feature' of AOL 6.0?

(Also, at work thru proxy server on IE 4, the page is complete.)

Ray Solomon


You might consider using a program called VNC. It might save you some of your trips to the cable room to check the monitor of your system.

It's free, it's open source, and it's very useful. It's just like PCAnywhere except that it's cross platform and free.

Also, regarding the power issue, not only do you have to worry about your UPS putting out minimal power, you also need to worry about your power supply putting out minimal power. The last time my power supply died, it took some other parts with it.

The thing is, most motherboards have hardware monitoring onboard. I've been using Motherboard Monitor -- http://mbm.livewiredev.com/ -- and it lets me know if the power that it is receiving is not enough, which means either the UPS or the power supply is toast.

BTW -- If you check out http://www.nasawatch.com/ , there is some discussion of you as a Dream NASA Administrator. I actually think that it makes sense. Is this something you are considering or not?

Ken "Wirehead" Wronkiewicz -- wh@wirewd.com  http://www.wirewd.com/wh/  [Error: Humorous remark not found]

Thanks. As to NASA Administrator, it was never in the cards. Dan Quayle recommended me, but Bush Senior did not much care for Reagan people and I never seriously thought it might happen; and Dan Goldin has been as effective as anyone could be given the bureaucratic structure at NASA. I doubt much will ever be accomplished there now. We have failed Mars probes; a Space Station that cost more than Apollo (even though a better way to go to the Moon, one that certainly would not have cost more, would have involved building a manned space station in permanent orbit); no Skylab; an "X-33" that is not an X program and can't work (and worse, it grew out of my DC/X, which DID work, but then that was a real X program, not a shitepoke designed to subsidize a big company) -- and that despite Goldin being about as good as you could have given the present White House.

Let Bush put in his own man, one younger than me. If they want to promote from within the NASA ranks, Dr. Yoji Kondo, a genuine space scientist with an international reputation and with administrative experience in bringing a project in on time and on budget, would be the right man. Then put some decent people in USAF and US Navy space commands, let them compete, and let a hundred flowers bloom.

The US needs a whole series of small X projects. Not big ones. Small. Two more DC/X follow-ons, preferably simultaneous and preferably reporting to different parent agencies: the goal of each project is to build reusable multi-engine rocket ships that can fly out of sight, land at a distant place (say White Sands to Edwards), refuel, and fly back within a day. 

  • Reusable
  • Savable
  • Higher 
  • Faster
  • Cheaper

in that order. Of course we won't do it. But what's needed isn't another enormous project that is labeled "X" but in fact has nothing to do with X projects.


Dr. Pournelle, The Miami Herald is reporting that some Florida counties didn't do any of the recounts, including the "automatic" machine recount. I guess the "Talking Heads & Lawyers Full Employment Act" will continue.

 http://www.herald.com/content/today/news/florida/digdocs/025232.htm 

Kit Case kitcase@starpower.net

I cannot think anything will come of any of this.


 

 

 

 

 


Saturday, December 16, 2000

I am Filling The Dumpster, and Throwing Things Away. 

 

 


Sunday, December 17, 2000

Much mail on public schools; I will get that up tomorrow.

And The Dumpster Continues To Fill.

 

 

 
