[Gllug] Newbie asks: how to learn?
Pete Ryland
pdr at pdr.cx
Mon Dec 23 07:45:41 UTC 2002
On Sun, 2002-12-22 at 12:37, Chris Ball wrote:
> >> On 22 Dec 2002 07:15:08, Pete Ryland <pdr at pdr.cx> said:
>
> >> Read Neal Stephenson's "In the beginning was the command line".
>
> > *however*, it's quite clear he doesn't fully get some of the
> > technical subtleties and draws some pretty bad conclusions from
> > these misunderstandings too, which is why I find the article as
> > a whole a bit misleading.
>
> Can you think of anything in particular?
A few things (although it's hard to actually find things to quote
directly):
Firstly, the whole concept that a GUI desktop and its applications are
executing _on top of_ the 'telegram-based' command line (a concept which
the whole essay revolves around) doesn't quite sit well with me. I view
one of the benefits of Unix as being that GUI applications run
_alongside_ other processes and are no more special than, say, the crond
process. In a way he's making a good point, because at the end of the
day a GUI MUA, for example, still has to talk 'telegram-based' SMTP.
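To make that concrete, here's a minimal sketch of the line-oriented SMTP
dialogue (per RFC 821) that sits underneath any point-and-click mail
client. The helper name and the addresses are made up for illustration:

```python
# Sketch of the client side of a one-message SMTP session.
# The function name and addresses below are hypothetical.

def smtp_commands(helo, sender, rcpt, body_lines):
    """Return the sequence of lines a client sends for one message."""
    cmds = [
        "HELO " + helo,
        "MAIL FROM:<%s>" % sender,
        "RCPT TO:<%s>" % rcpt,
        "DATA",
    ]
    cmds.extend(body_lines)
    cmds.append(".")      # a lone dot on its own line ends the body
    cmds.append("QUIT")
    return cmds

session = smtp_commands("client.example.org", "pdr@pdr.cx",
                        "gllug@linux.co.uk",
                        ["Subject: hello", "", "Hi there."])
print("\r\n".join(session))
```

However shiny the GUI, this exchange of plain-text 'telegrams' is what
actually goes over the wire.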
But he misses the really big technical point with regard to memory and
process management when comparing the OSes. The article was written
when Win95/8 probably held the vast majority of the market, and yet he
fails to mention the differences explicitly - the ones that mean that
under Windows, one is reliant on *all* application software being
totally bug-free in order to have a stable system - pretty important, I
would have thought. This is simply too important not to explain; much
more important than the fact that Linux didn't have a graphical bootup.
(Microsoft is slowly catching up, but remember, the article was written
before Win2k.) He does come frustratingly close to this when talking
about how Unix is multi-user (which he seems to confuse with
multi-tasking), but he just doesn't quite get there.
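The memory-protection point above can be demonstrated in a few lines.
This is a Unix-only sketch (it uses fork, and deliberately crashes the
child by reading address 0): the kernel kills the buggy process with
SIGSEGV, while the parent carries on unharmed - exactly the isolation a
Win9x-era system couldn't guarantee:

```python
import os
import signal
import ctypes

# Fork a child that dereferences an invalid pointer. Only the child
# dies; the parent observes the crash via waitpid and keeps running.
pid = os.fork()
if pid == 0:
    # Child: read memory at address 0 -> kernel delivers SIGSEGV.
    ctypes.string_at(0)
    os._exit(0)               # never reached
else:
    # Parent: reap the child and report how it died.
    _, status = os.waitpid(pid, 0)
    killed_by = os.WTERMSIG(status) if os.WIFSIGNALED(status) else None
    print("child killed by signal:", killed_by)
    print("parent still running fine")
```

The same wild pointer in a system without per-process address spaces
corrupts whatever happens to be at that address, taking the whole
machine down with it.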
Ooh, I've found something explicit (but pretty minor), where he's
talking about why the Mac's CRT and CPU were in the same unit:
| Apple wanted to make the personal computer into an appliance, like a
| toaster. But it also reflected the purely technical demands of running
| a graphical user interface. In a GUI machine, the chips that draw
| things on the screen have to be integrated with the computer's central
| processing unit, or CPU, to a far greater extent than is the case with
| command-line interfaces, which until recently didn't even know that
| they weren't just talking to teletypes.
|...
| Anyway, a Macintosh had to switch individual bits in the memory chips
| on the video card, and it had to do it very fast, and in arbitrarily
| complicated patterns. Nowadays this is cheap and easy, but in the
| technological regime that prevailed in the early 1980s, the only
| realistic way to do it was to build the motherboard (which contained
| the CPU) and the video system (which contained the memory that was
| mapped onto the screen) as a tightly integrated whole--hence the
| single, hermetically sealed case that made the Macintosh so
| distinctive.
AFAIK, this is factually wrong, and he also fails to point out how silly
Apple were to do this considering the shelf-life of CRTs. Although,
this may have been planned obsolescence on Apple's part - who knows.
Ok, here's another quote, when he's talking about how OSes have no value
any more:
| Applications create possibilities for millions of credulous users,
| whereas OSes impose limitations on thousands of grumpy coders, and so
| OS-makers will forever be on the shit-list of anyone who counts for
| anything in the high-tech world.
I'm really not sure where he gets the idea that coders get grumpy with
the so-called imposed limitations of OSes. If an OS has a really bad
programmers' interface (like BIOS/DOS/Windows :) then sure this will
make the coders grumpy, but these are not imposed limitations, these are
simply messy APIs. From my pov, as a coder on Linux, I'm only ever full
of praise for the kernel hacker, for the glibc developer, for the
glib/gtk/gnome developer whose well-thought-out APIs I use daily.
And surely when he says limitations he's not talking about security
features, one of the most important parts of modern OSes that (when well
implemented) make *everyone's* lives easier?
A pretty major historical fallacy he makes is crediting Microsoft with
creating the 'parts bazaar', the credit for which should really go to
IBM. This is stated a few times.
Also, the way he describes Linux doesn't fit my vision of Linux's
future, which is to be potentially as user-friendly as BeOS or OS/2 (if
you want it to be).
Anyway, I could probably find more things if you like, but I've written
enough already, haven't I? :)
Actually, one more: he uses the incorrect nomenclature (sic) 'X Windows
System' (*shudder*) - it's the X Window System.
> I didn't get that impression
> at all, and the feeling I picked up from books like _Cryptonomicon_ is
> that he knows a great deal about Unix.
Well, I guess you always have to lie a bit in order to keep things
simple enough to be understood, so I'm happy to give him the benefit of
the doubt since he's so good at writing about technical stuff in a way
that even the technically illiterate understand. And in the other 98%
of the essay he makes some excellent (and often witty) observations, so
don't get me wrong - I don't want to imply that it's a bad thing to
read, I just think that there are a few minor technicalities, and me
being me, I like to pick them out. :)
I'm actually about 3/4 of the way through Cryptonomicon at the moment.
Ooh, and I have a flight this arvo... :)
> > The car dealership analogy is cool though, and often quoted/copied.
>
> And the Eloi/Morlock _The Time Machine_ analogy. :-)
Oath. Although I haven't read H. G. Wells, just seen the movies.
Pete
--
Because Linux is not commercial, it does not have to maintain any
pretences as to its reliability. Consequently, it is much more reliable.
-Neal Stephenson (paraphrased)
--
Gllug mailing list - Gllug at linux.co.uk
http://list.ftech.net/mailman/listinfo/gllug