Newsgroups: comp.benchmarks
Path: sparky!uunet!zaphod.mps.ohio-state.edu!saimiri.primate.wisc.edu!ames!data.nas.nasa.gov!amelia.nas.nasa.gov!eugene
From: eugene@amelia.nas.nasa.gov (Eugene N. Miya)
Subject: [l/m 3/17/92] Measurement environments (12/28) c.be FAQ
Keywords: who, what, where, when, why, how
Sender: news@nas.nasa.gov (News Administrator)
Organization: NAS Program, NASA Ames Research Center, Moffett Field, CA
Date: Tue, 12 Jan 93 12:25:18 GMT
Message-ID: <1993Jan12.122518.203@nas.nasa.gov>
Reply-To: eugene@amelia.nas.nasa.gov (Eugene N. Miya)
Lines: 81

12 Measurement environments <This panel>
13 SLALOM
14
15 12 Ways to Fool the Masses with Benchmarks
16 SPEC
17 Benchmark invalidation methods
18
19 WPI Benchmark
20 Equivalence
21 TPC
22
23
24
25 Ridiculously short benchmarks
26 Other miscellaneous benchmarks
27
28 References
 1 Introduction to FAQ chain and netiquette
 2
 3 PERFECT
 4
 5 Performance Metrics
 6 Temporary scaffold of New FAQ material
 7 Music to benchmark by
 8 Benchmark types
 9 Linpack
10
11 NIST source and .orgs

Benchmarking environments
Taxonomy described by: A, B, C, D, E
Examples:
    A   Cray Y-MP or better with integrated hardware performance monitor
    B   Cray-2, Convex C-1/C-2...
    C   VAX-11/780, IBM PC, IBM 370-class
    D   Apple II, Timex Sinclair
    E   Incomplete hardware: chip-, module-, or board-level

An A environment should have state-of-the-art hardware and
software to aid benchmarking.  The measurement hardware should be
minimally intrusive or non-intrusive.  This category should change
with time: the SS-1 should have a more programmable HPM, which would
push the Y-MP down into the B category (it can stay in A if no other
HPM-type hardware becomes predominant).  This is akin to
high-precision Cesium clocks.

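One reason intrusiveness matters: a software timer has a cost of its
own, and that probe effect has to be small compared with the code
being measured; an integrated hardware monitor largely avoids it.
Below is a minimal sketch (not part of the FAQ proper), assuming a
Unix-style gettimeofday(), that estimates the cost of one timer call
by letting the timer measure itself.

    /* Sketch: estimate the probe effect of a software timer by
     * having it time back-to-back calls to itself. */
    #include <stdio.h>
    #include <sys/time.h>

    static double now_seconds(void)
    {
        struct timeval tv;
        gettimeofday(&tv, 0);
        return (double) tv.tv_sec + (double) tv.tv_usec * 1.0e-6;
    }

    int main(void)
    {
        enum { CALLS = 100000 };
        double start, stop;
        int i;

        start = now_seconds();
        for (i = 0; i < CALLS; i++)
            (void) now_seconds();        /* the timer measuring itself */
        stop = now_seconds();

        printf("approximate cost per timer call: %g seconds\n",
               (stop - start) / (double) CALLS);
        return 0;
    }

If the per-call cost is comparable to the kernel being timed, the
measurement is perturbing the experiment.
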
A B environment is a "good" environment in which to do benchmarking.
It has, at a minimum, a near cycle-time clock, profiling software,
etc.  Environments at this level or higher may also be more hostile
to measurement: good performance is likely to be an issue, so the
compilers will have good optimizers.

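One practical consequence of good optimizers: a timed kernel whose
result is never used can be deleted outright, so the loop times
nothing.  A minimal sketch (mine, not the FAQ's), using only the
standard C library clock(), that keeps the result observable:

    /* Sketch: keep an optimizing compiler from discarding the timed
     * kernel by making its result visible outside the loop. */
    #include <stdio.h>
    #include <time.h>

    volatile double sink;                /* forces the result to be kept */

    int main(void)
    {
        const long n = 1000000L;
        double sum = 0.0;
        clock_t t0, t1;
        long i;

        t0 = clock();
        for (i = 0; i < n; i++)
            sum += (double) i * 0.5;     /* stand-in for the real kernel */
        t1 = clock();

        sink = sum;                      /* the result is used */
        printf("kernel: %g CPU seconds (checksum %g)\n",
               (double) (t1 - t0) / CLOCKS_PER_SEC, sink);
        return 0;
    }
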
A C environment is an "average" benchmarking environment.
This is just typical: a 60 or 100 Hz clock and some profiling software.

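Which kind of clock you actually have can be checked directly: spin
until the reported time changes and look at the step size.  A rough
sketch, assuming only the standard C library clock(); in a C
environment the steps will typically be 1/60 or 1/100 of a second.

    /* Sketch: estimate the granularity (tick) of the system clock by
     * watching for changes between successive readings. */
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        clock_t prev, cur;
        int i;

        prev = clock();
        for (i = 0; i < 10; i++) {
            do {
                cur = clock();           /* spin until the clock ticks */
            } while (cur == prev);
            printf("tick %d: %g seconds\n",
                   i, (double) (cur - prev) / CLOCKS_PER_SEC);
            prev = cur;
        }
        return 0;
    }
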
A D environment is a difficult environment in which to do benchmarking.
This environment may not have a clock or software to help the poor
benchmarker.  Timing is by wrist watch or maybe an oscilloscope.
Software?  What software?

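The usual defense in a D environment is repetition: run the kernel
enough times that even a wrist watch resolves the total elapsed time,
then divide by the repetition count.  A sketch along those lines (the
repetition count and the dummy kernel are placeholders only):

    /* Sketch: make a short kernel measurable with a coarse clock or a
     * stopwatch by repeating it and dividing the elapsed time. */
    #include <stdio.h>

    #define REPS 10000000L      /* chosen so the whole run lasts seconds */

    int main(void)
    {
        double x = 0.0;
        long i;

        printf("start the watch, then press return\n");
        (void) getchar();

        for (i = 0; i < REPS; i++)
            x += 1.0e-7;                 /* stand-in for the real kernel */

        printf("stop the watch (checksum %g)\n", x);
        printf("per-iteration time = elapsed seconds / %ld\n", REPS);
        return 0;
    }
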
An E environment is another difficult environment in which to do
benchmarking.  It might have software.  It might be an incomplete
computer, such as one processing unit of a larger multiprocessor;
it touches on the issue of composition.

If dealing with a simulator, there may be an F category for
simulated timing.

            ^ A
         s / \ r
        m /   \ c
       h /     \ h
      t /       \ i
     i /         \ t
    r /           \ e
   o /             \ c
  g /               \ t
 l /                 \ u
A /                   \ r
 <_____________________> e
        Language