Path: sparky!uunet!cs.utexas.edu!sun-barr!news2me.EBay.Sun.COM!seven-up.East.Sun.COM!newscan!sixgun.East.Sun.COM!matthew
From: matthew@sunpix.East.Sun.COM (Matthew Stier - Sun NC Development Center)
Newsgroups: comp.lang.perl
Subject: Parallelizing data collection
Date: 11 Jan 1993 13:40:32 GMT
Organization: Sun Microsystems, Research Triangle Park, NC
Lines: 46
Distribution: world
Message-ID: <1irtcgINNh5@sixgun.East.Sun.COM>
NNTP-Posting-Host: klondike.east.sun.com

Okay folks, here is what I'm doing, and what I'd like to do.

First, I'm using perl to collect and process data on file servers at
my worksite.  I currently do this by grabbing a list of hostnames and
then stepping through each host in turn, rsh'ing to it and saving the
result; then I do my processing.  Here's some sample code.

#
# Compile a sorted list of hosts to check.  gethostent() returns the
# host's name and its alias string; keep only hosts with a "serverNN"
# alias.
#
@hostnames = ();
while (($hostname, $aliases) = gethostent) {
    push(@hostnames, $hostname) if $aliases =~ /\bserver\d+\b/;
}
@chk_hosts = sort @hostnames;

#
# Run $task on each host, and save the result.  (The hash %task is a
# separate variable from the scalar $task holding the command.)
#
foreach $host (@chk_hosts) {
    # Rsh to $host, and execute df for local filesystems only
    open(RSH, "($rsh -n $host '$task') 2>&1 |") || next;
    undef $/;                   # slurp the whole stream at once
    $task{$host} = <RSH>;
    $/ = "\n";                  # restore (double quotes, not '\n')
    close(RSH);
}

Notice that each host is processed sequentially.  Nothing precludes
running $task on all the hosts at the same time, except my ability to
write the code.

I'd prefer to keep all the data keyed in memory; I don't want to
juggle log files around any more than I have to.  If someone has
written a subroutine to do this, I'd like to get a copy of it.  I'm
looking for something along the lines of:

    %task_out = &rsh($task, @hosts);
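
To make the request concrete, here is a rough, untested sketch of the
sort of thing I have in mind.  It leans on the fact that a piped open
forks right away, so if every open is issued before any pipe is read,
all the rsh'es run at once.  It assumes the same global $rsh as above,
and the filehandle-naming trick and error handling are guesses on my
part, not working code.

#
# Untested sketch: start every rsh up front (each piped open forks a
# child immediately, so the remote commands overlap), then slurp the
# pipes back into a hash keyed by hostname.
#
sub rsh {
    local($task, @hosts) = @_;
    local(%out, $host, $fh);

    # Phase 1: launch all the remote commands.
    foreach $host (@hosts) {
        ($fh = "RSH_$host") =~ s/\W/_/g;   # make a legal filehandle name
        open($fh, "($rsh -n $host '$task') 2>&1 |") ||
            ($out{$host} = "rsh failed: $!\n", next);
    }

    # Phase 2: collect the output, one pipe at a time.
    foreach $host (@hosts) {
        next if defined $out{$host};       # open failed above
        ($fh = "RSH_$host") =~ s/\W/_/g;
        undef $/;                          # slurp the whole stream
        $out{$host} = <$fh>;
        $/ = "\n";
        close($fh);
    }
    %out;
}

Even if that works, it has obvious limits: every pipe ties up a file
descriptor, so a long host list could hit the per-process descriptor
limit, and a host that hangs still stalls its read (output beyond the
pipe buffer also stalls that child until the parent gets to it).
Something smarter, with select() or a cap on the number of children
in flight, would be nicer -- which is part of why I'm hoping someone
has already written this.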

Thanks in advance
--
Matthew Lee Stier                          |
Sun Microsystems --- RTP, NC 27709-3447    |  "Wisconsin Escapee"
Email: matthew.stier@East.Sun.COM          |
Phone: (919) 469-8300  fax: (919) 460-8355 |