Text File | 1993-02-14 | 6KB | 107 lines
How Fast is Fast Enough?
By: Curtis M. Trout, Twin Cities PCUG
I've recently heard many comments like: "No single user can use the
capabilities of an 80486 computer." Sometimes the line has an 80386-33
as the target. Quite frankly, I'm getting sick of this type of
comment.
To see the absurdity of this statement, at least from my viewpoint,
consider the following similar statement: "No single user can fully
use the capabilities of a pencil." Based upon observed utilizations,
it is easy to conclude that pencils should be shared. Simply look
around the typical office. If there are ten people in the office, there
probably will be no more than two or three who are actually writing
simultaneously; the peak utilization is probably no more than fifty
percent. Therefore, pencils should be pooled and shared, thus reducing
the investment made in pencils. (If you don't like the analogy because
pencils are consumable, use telephones instead.)
I hope the point is obvious. We don't buy equipment to fully utilize
it; we buy it simply to use it! Except for pacemakers, very few
non-consumable products are purchased with the intention of using them
100% of the time. Instead, we make such purchases with peak
utilization in mind. Computers should be no different! We buy
computers to use them effectively.
You may accuse me of taking this position because my new computer is
80486 based. It won't stick. I'd also argue that my old '286 based
machine is effectively used by my daughter, even running Windows.
Don't, however, presume that I'd recommend purchase of a '286, or
less, for anything other than a dedicated controller type function or
a very specific application that will never use Windows or otherwise
benefit from a '386.
There is little argument here: a '486 system isn't fully utilized by a
single user; like other computers, it spends most of its time waiting
for the user to enter the next character. However, that isn't why we
buy the machine. Depending upon the user, we buy computers to write
reports, compute spreadsheets, keep track of inventory, or any of
thousands of other real-world applications. Each user's computer
should be sized to those tasks and be capable of meeting the user's
expectations without impairing the user's productivity. Indeed, it
should always improve that productivity, ideally maximizing it.
How fast, then, should a computer be? I'd like to suggest (and it's
probably not an original thought) that the machine be fast enough to
complete each task without the user being aware of the time it took.
In other words, response time should be imperceptible to the user.
Obviously even the fastest computers fail when measured by this
standard.
Many readers are probably questioning my sanity about now. A few
others are nodding their heads in agreement.
Those skeptical readers deserve further explanation. I remember
attending a presentation (I think it was by an IBMer at a SHARE
meeting in San Francisco) where the speaker justified sub-second
mainframe response time, something akin to the standard noted
above. IBM had studied user productivity under varying response times
and had found that when response time exceeded about a second, user
productivity decreased disproportionately to the increase in response
time. The explanation is that users tend to
create a series of operations to do the task at hand. They then ask
the computer to perform sequential steps in completing that task. When
the computer responds quickly, they very rapidly progress to the next
step. If the computer's response is delayed or slow, they have to
reconstruct their plan, or find their place again, between steps. It's sort
of like me when I go into a department store; I know I wanted
something, but what was it?
The difficulty lies in determining how fast a machine each user needs
to provide response times that enhance productivity without
interrupting the thought process. If we wanted to statistically analyze this (I
really don't, but it may aid the discussion) we might look at the
number of transactions that are performed that meet our "imperceptible
response time" criteria. Since a PC transaction is based upon each
keystroke, this number should be very, very high. I suspect that this
type of analysis would fail because we need the most computing power
for tasks that are relatively infrequent. Even a lowly 8088 will
generally process text data input fast enough to keep up with the
user's keystrokes. We need to look at more complex tasks that, if
they take long enough, will impair the user's
productivity. Examples might include spreadsheet recalculation, or, if
you use a print enhancement program like FaceLift or Adobe Type
Manager (as I do), printing. For example, I just "printed" this
article (up to this point) on my '486 system, under Windows, using
FaceLift. It took about 15 seconds before the computer had all the
data formatted and spooled for the printer. On my '286 system this
would have taken about 5 minutes. Now 15 seconds is short enough that
I didn't even bother to switch to another task; 5 minutes would be
long enough to go get a cup of coffee and probably get sidetracked on
another activity.
The time it takes to do a task is frequently perceived to be
associated with the difficulty of that task. Backing up data using
floppy disks generally takes about 30 seconds per megabyte. During
this time the user is tied to the computer feeding it diskettes. The
common perception is that this makes frequent backups hard to do.
Using a tape backup unit speeds the process considerably and frees the
user from attending to floppies. Backups become easy to perform. In a
recent column, Steve Gibson stated "Gibson's First Law of Life: Easy
things happen and hard things don't." Fast machines make many more
things "easy."
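The floppy-versus-tape arithmetic above can be made concrete with a
short sketch. The 30-seconds-per-megabyte floppy figure comes from the
article; the 40 MB disk size and the 3-seconds-per-megabyte tape rate
are illustrative assumptions only, not figures from the text.

```python
# Rough backup-time estimates. The 30 s/MB floppy rate is the
# article's figure; the disk size and tape rate are assumptions
# chosen purely for illustration.
def backup_minutes(megabytes, seconds_per_mb):
    """Return total backup time in minutes."""
    return megabytes * seconds_per_mb / 60.0

disk_mb = 40  # assumed hard disk size, typical for the era
print(f"Floppies: {backup_minutes(disk_mb, 30):.0f} min")  # 20 min
print(f"Tape:     {backup_minutes(disk_mb, 3):.0f} min")   # 2 min
```

At 20 minutes of hand-feeding diskettes versus 2 unattended minutes on
tape, the "easy things happen" observation follows directly.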