From: tmilner@hprpcd.rose.hp.com (Tom Milner)
Date: Tue, 11 Aug 1992 23:40:53 GMT
Subject: Re: MP CPU-time phenomena
Message-ID: <15250002@hprpcd.rose.hp.com>
Organization: Performance Technology Center, Roseville
Path: sparky!uunet!zaphod.mps.ohio-state.edu!darwin.sura.net!mips!sdd.hp.com!hpscdc!hplextra!hpcc05!hpyhde4!hpycla!hpergfg2!hprdash!hprpcd!tmilner
Newsgroups: comp.unix.wizards
References: <1992Aug6.214740.164@socrates.umd.edu>
Lines: 30

In comp.unix.wizards, boyd@prl.dec.com (Boyd Roberts) writes:

|  In article <1992Aug6.214740.164@socrates.umd.edu>, steves@socrates.umd.edu (Steven Sonnenberg) writes:
|  > The cpu-bound application is:
|  > 
|  >     while (1) i++;
|  > 
|  > I am measuring process CPU utilization based on u.u_utime + u.u_stime
|  > over the elapsed time (CPU seconds).  For example, in 10 seconds
|  > there are 20 seconds of CPU time (assuming 2 processors).
|  > 
|
|  And just how is your while loop going to run on both processors at the same time?
|
|
|  Boyd Roberts            boyd@prl.dec.com
|

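For concreteness, here's a minimal sketch of the measurement being
described above, using the portable times() interface instead of
reading u.u_utime/u.u_stime out of the u-area (the 10 second window
matches the example):

    #include <stdio.h>
    #include <sys/times.h>
    #include <unistd.h>

    /* Run the CPU-bound loop for ~10 s of wall-clock time, then report
     * the process's user+system CPU time against the elapsed time. */
    int main(void)
    {
        struct tms t;
        long hz = sysconf(_SC_CLK_TCK);      /* clock ticks per second */
        clock_t start = times(&t), now;
        volatile long i = 0;

        while ((now = times(&t)) - start < 10 * hz)
            i++;

        printf("%.2f CPU seconds over %.2f elapsed seconds\n",
               (double)(t.tms_utime + t.tms_stime) / hz,
               (double)(now - start) / hz);
        return 0;
    }

The question in this thread is whether that ratio can legitimately
come out near 2.0 for a single process on a 2-processor machine.
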
It doesn't have to run on both at the same time: there's no reason a
processor switch couldn't occur anywhere in the code. Over a 10 second
window, every processor could get a chance to execute the code stream.
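
That kind of migration is easy to watch on a modern machine. A
Linux-specific sketch (sched_getcpu() postdates this thread by many
years, so take it as illustration only) that reports each time the
loop turns up on a different processor:

    #define _GNU_SOURCE
    #include <stdio.h>
    #include <sched.h>

    /* Print a line whenever the busy loop is observed running on a
     * different CPU than before; the scheduler is free to move it
     * between any two iterations of i++. */
    int main(void)
    {
        int last = -1;
        unsigned long i;

        for (i = 0; i < 2000000000UL; i++) {
            int cpu = sched_getcpu();
            if (cpu != last) {
                printf("iteration %lu: on CPU %d\n", i, cpu);
                last = cpu;
            }
        }
        return 0;
    }

How often it moves depends on the scheduler and the load, but nothing
ties the process to the processor it started on.
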
                                  _____________________________________
                                 /_______________  ___________________/
                                                / /
                                               / /
                                              / / Tom Milner (Tn: 785-5637)
                                             / /  HP Performance Tech Center
                                            / /   tom@hpptc16.rose.hp.com
                                           /_/    __________________________