- Newsgroups: comp.misc
- Path: sparky!uunet!usc!sol.ctr.columbia.edu!caen!hellgate.utah.edu!lanl!cochiti.lanl.gov!jlg
- From: jlg@cochiti.lanl.gov (Jim Giles)
- Subject: Re: Giles' Manual Mania (Was - Re: About the 'F' in RTFM)
- Message-ID: <1992Aug28.005710.1989@newshost.lanl.gov>
- Sender: news@newshost.lanl.gov
- Organization: Los Alamos National Laboratory
- References: <1992Aug26.214319.14738@mksol.dseg.ti.com> <1992Aug27.020832.23988@newshost.lanl.gov> <1992Aug27.192610.12441@wixer.cactus.org>
- Distribution: world
- Date: Fri, 28 Aug 1992 00:57:10 GMT
- Lines: 158
-
- In article <1992Aug27.192610.12441@wixer.cactus.org>, rhodesia@wixer.cactus.org (Felix S. Gallo) writes:
- |> [...]
- |> Okay, give some concrete examples of a better 'sed', a better 'yacc',
- |> and a better 'awk'.
-
- GNU has a better yacc (they may even call it yacc, I forget - I use LR)
- which is full LR and not LALR like normal yacc. LR (a public domain
- parser generator) is also better than yacc. However, if your research
- and development is scientific/numerical software, a parser generator
- is not really a central concern.
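-
- As a concrete (if purely illustrative) example, the classic textbook
- grammar below - written in yacc syntax - is handled by a full LR(1)
- generator, but an LALR generator such as yacc merges the parser states
- containing x : 'c' and y : 'c' and reports a reduce/reduce conflict:
-
-     %%
-     /* Whether 'c' reduces to x or y depends on both the first token */
-     /* ('a' or 'b') and the lookahead ('d' or 'e').  LALR merges the */
-     /* states reached after "a c" and "b c", pooling their lookahead */
-     /* sets, so the reduce decision becomes ambiguous.               */
-     s : 'a' x 'd'
-       | 'b' y 'd'
-       | 'a' y 'e'
-       | 'b' x 'e'
-       ;
-     x : 'c' ;
-     y : 'c' ;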
-
- Awk is a poor imitation of SNOBOL. The very fact that you mentioned both
- sed and awk is interesting also. They do the same thing (or, to be more
- precise, there's a *considerable* overlap in functionality). Yet they
- do so differently - incompatibility! If UNIX were true to its creed
- (simple orthogonal tools each doing one thing *well* and piping them
- together), there would be one tool - or at least the two would have
- identical syntaxes for the overlapping parts of their operations.
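-
- For example, two jobs squarely in that overlap - stripping comment lines
- and doing a global substitution - look completely different in the two
- tools (the file name is just for illustration):
-
-     # delete lines starting with '#', then replace every foo with bar
-     sed -e '/^#/d' -e 's/foo/bar/g' input.txt
-
-     # same result, different language: the pattern selects the lines to
-     # keep, gsub() does the replacement
-     awk '!/^#/ { gsub(/foo/, "bar"); print }' input.txt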
-
- |> [...]
- |> >*NOTHING*!! With the alternatives I've seen, you *gain*: speed, features,
- |> >user productivity, etc..
- |>
- |> Err, these 'GUI's you keep referring to are faster than /bin/rc?
-
- The /bin/rc I have access to is part of the system boot sequence and is
- not a part of the naive user environment at all. What relevance does
- it have to a discussion about user-level tools? I'm talking about tools
- which are useful in writing and maintaining software for applications
- *other* than system design or system maintenance.
-
- |> >[...]. The "pipe together simple tools" approach
- |> >has *some* merit. Too bad UNIX didn't stick to it. More precisely, too
- |> >bad UNIX didn't periodically winnow out all but the best tools and utilities
- |> >and rewrite them to be compatible and orthogonal. Might be an interesting
- |> >system.
- |>
- |> But they are. sed, awk, perl, tr, pr, nroff, troff, ls, tbl, pic, eqn,
- |> rm, and all the other programs in my /[usr/]bin perform completely correctly
- |> and predictably when placed in pipes. Maybe there's something integral
- |> to the process which you're failing to understand.
-
- Yes, I fail to understand why your list contains several tools (nroff,
- troff, tbl, eqn, etc.), each of which is as difficult to learn as a
- single integrated tool that does *all* those word processing operations -
- and that includes context-sensitive help (even though it's already easier
- to learn than the UNIX stuff). That's especially true since the integrated
- tools actually have MORE features than the ones you mentioned. If the
- UNIX tools were each *very* simple to learn, or if they each actually
- *did* more, the technique might be an acceptable alternative to integrated
- tools. However, the low quality and long learning curve of these tools
- is probably the reason that the world is moving away from the piped-tools
- model and toward integrated environments - for word processing, anyway.
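-
- To make that concrete: even a one-page document with a single table and
- a single equation needs three separate input languages (tbl, eqn, and
- troff with a macro package) plus a pipeline to glue them together -
- something along these lines (file name and contents purely illustrative):
-
-     .\" tbl input: a two-column table
-     .TS
-     tab(:);
-     l n.
-     run:seconds
-     first:42
-     .TE
-     .\" eqn input: the quadratic formula
-     .EQ
-     x = {-b +- sqrt {b sup 2 - 4ac}} over {2a}
-     .EN
-
-     $ tbl paper.ms | eqn | troff -ms | lpr
-
- An integrated word processor puts all of that behind one interface, with
- one syntax (or none) to learn.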
-
- |> [...]
- |> The failure you're experiencing with UNIX is in thinking that it's
- |> a business productivity tool like Windows 3.1 or Quicken, to be compared
- |> with MSDOS color GUI programs. UNIX and C are research and development
- |> tools.
-
- Yet, they're being promoted for use in production environments. If they
- are research and development tools, let the researchers and developers
- keep them - hidden away and out of the rest of the public's way. In
- any case, UNIX and C are for research and development of systems and
- system maintenance tools - not for, say, X-ray imaging or groundwater
- flow or stock market analysis tools.
-
- |> >[...] AT&T *also* selected MS/DOS for the
- |> >computer services contract it has with AMTRAC: because AT&T (the
- |> >inventor of UNIX) determined that training and system administration
- |> >overhead for a UNIX installation were unacceptable.
- |>
- |> AT&T probably correctly determined that UNIX was not the correct
- |> operating system for a non-research-and-development site. Indeed,
- |> UNIX makes a poor operating system for cash registers and business
- |> management. This doesn't make it a poor operating system. Rather,
- |> people who claim it's a poor operating system because it doesn't
- |> have The Feature They Want are barking up the wrong tree.
-
- It also makes a poor system in research and development environments
- if what's being researched and developed is not more system tools.
- Supercomputers, parallel systems, etc. are not well served by
- minicomputer-level development systems.
-
- |> [...]
- |> I've spent 13 years on computers now, and I must confess I have no idea
- |> what these secret tools you're referring to are. Could you provide
- |> concrete examples of a tool that is as easy to use as, say, sed, that
- |> provides just as much power, but whose documentation is 'better laid
- |> out'?
-
- Sed? Easy to use? I don't use it much because it's *not* easy to
- use (and because most of what I do doesn't involve text filtering).
- In the 25 years I've been programming, the only system I can remember
- that *compares* with UNIX in difficulty of learning and use is OS/360.
- And for the same reasons: *lots* of arcane trivia and poorly organized
- manuals. The terms `arcane' and `user-friendly' are mutually exclusive.
- UNIX is arcane.
-
- |> [...]
- |> >[...] Many businesses would never have even considered using UNIX
- |> >without something *like* windows (or alternative utility sets,
- |> >or alternative shells - anything to improve the crummy environment)
- |> >because UNIX is not cost-effective without them (too much training
- |> >and "guru" overhead).
- |>
- |> Right, there's your problem again -- assuming that businesses have a
- |> need for UNIX, or that UNIX was designed for use in the commercial
- |> marketplace. I think Roell Piper has this misnotion too.
-
- No, I've been claiming exactly what you just said: that there are
- environments where UNIX is not appropriate. I'm arguing *against*
- the assumption you accuse me of making. Since you, unlike the person I
- was responding to, are not recommending UNIX for environments it is not
- suited to, my comments don't apply to you. I was responding to claims
- that UNIX *was* suited to these application domains - which is not so.
-
- |> [...]
- |> Again, you miss the point of UNIX entirely. UNIX is designed for
- |> a networked environment. It is designed for multi-user machines.
- |> It is designed to be modular, and text-based; there are few needs
- |> for 'off-the-shelf GUIs', whatever that's supposed to mean. The
- |> toolset is integrated. In fact, you can even *make your own
- |> tools.* Try that under Windows. :)
-
- No, UNIX was basically designed and fixed in its present form before
- networks even existed. The network protocol which is presently used by
- most UNIXes is a system independent protocol (deliberately so) designed
- by DARPA. UNIX was designed at AT&T as a testbed for trying out experimental
- ideas of system design. It was not intended, when first designed, to be
- seen or used by any but the researchers involved and their colleagues - all
- of whom were willing to endure the idiosyncrasies of an ad-hoc system.
- It became widespread because it was free and open (neither of which is a
- technical quality - schools wanted some unimportant system which was
- simplistic enough to teach from and could be sacrificed to student hacks
- and such;
- no one was foolhardy enough - in those days - to do anything really
- important on UNIX) and it is popular now because the students who
- learned under it are not willing to learn anything new (as is evidenced
- by the strong resistance and abuse from many on the net when it is suggested
- that there *might* be better ways). UNIX retains its experimental, testbed
- character - as well as many of the experimental features (even those which
- are now thought to be bad ideas).
-
- |> [...]
- |> I'm currently running a UNIX clone on a 386/25 with 4M of memory.
- |> This is hardly a massive workhorse, and, since the kernel is less
- |> than 1.2M large, I can also run X. The cost to me? $0.00.
-
- Gee. I wouldn't want to use up a third of my memory on *system* overhead!
- And I'll bet it uses 50M or more of your disk as well - at least if you've
- got all the bundled tools installed. For a single-user PC clone, that's a
- *lot* of overhead. Further, X, TCP/IP, and several other things have
- *nothing* to do with UNIX or the bundled UNIX environment I've been
- talking about - they were designed to be system/hardware independent.
-
- --
- J. Giles
-