Newsgroups: comp.sys.mac.hardware
Path: sparky!uunet!nwnexus!kanefsky
From: kanefsky@halcyon.com (Steve Kanefsky)
Subject: Re: Use of external HD to setup multiple new Mac's/PC's ?
Message-ID: <1992Aug27.175820.17622@nwnexus.WA.COM>
Sender: sso@nwnexus.WA.COM (System Security Officer)
Organization: The 23:00 News and Mail Service
References: <1992Aug25.223247.22031@nrc.com> <87026@netnews.upenn.edu>
Date: Thu, 27 Aug 1992 17:58:20 GMT
Lines: 99

In article <87026@netnews.upenn.edu> jensen@ben.dev.upenn.edu (Christopher Jensen) writes:
>Hi,
>
>For many of our new installations of Macs and PCs, we install
>the same core system software (System 7.x, DOS 5, Windows 3.1, etc.)
>and the same fonts, and the same network card drivers for the respective
>machines.
>
>Be it Mac or PC, we end up installing multiple megabytes of the same
>software from floppies, answering the same install questions, and then
>juggling floppies until the basic installation is complete.
>
>We then do some minor "tweaking" of config.sys, autoexec.bat, and System 7
>control panel settings.
>
>This process seems grossly inefficient - especially when setting up
>labs with multiple identical machines. As an example, let's say you have
>a lab to set up with 10 Quadras and/or 10 486 PCs.
>
>It would be ideal if we could custom-configure an external hard drive
>and then use that hard drive to set up the internal hard drives
>of the individual machines (no floppy aerobics, no installer questions, etc.).
>
>Questions:
>
>Can anyone suggest some techniques to improve the efficiency
>of setting up multiple machines?

Since you mention network cards, your machines must be on a network.
That makes your job a lot easier.

Back when I ran some Mac labs, I used a shareware utility called
VolumeImage, which would take a "slave" hard drive and make it look
just like a "master" hard drive of your choosing.  Furthermore, it did
this intelligently -- only adding files to the slave drive which were
missing, deleting extra files which were on the slave drive but not the
master drive, moving misplaced files back to the folder they belonged
in, closing open windows, etc.  Thus, it was not only a great way to
configure the hard drives in the first place, but it was a great way to
keep them looking identical.  I just created some startup floppies with
the VolumeImage program on them and set VolumeImage as the startup
application.  This way, all I needed to do to bring a machine back in
line with the master was restart with one of these floppies.  The
master hard disk was mounted as an AppleShare volume and the local hard
disk was brought in line with it (I imagine you could use System 7 file
sharing instead of an AppleShare file server, but this was before
System 7).  If I wanted to install new software on every machine, all I
had to do was install it on the master disk and then reboot all the
other machines with the VolumeImage floppy.
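
(If it helps to see the idea spelled out, the core of what VolumeImage
did is the same incremental-mirroring algorithm that later sync tools
use.  Here is a rough sketch of that logic in Python, purely to
illustrate the algorithm; it is not how VolumeImage actually worked,
and the mount-point paths are made up.)

# Illustration only: make `slave` mirror `master` by copying missing or
# changed files onto the slave and deleting files the master doesn't have.
import filecmp
import shutil
from pathlib import Path

def bring_in_line(master: Path, slave: Path) -> None:
    master_files = {p.relative_to(master) for p in master.rglob("*") if p.is_file()}
    slave_files = {p.relative_to(slave) for p in slave.rglob("*") if p.is_file()}

    # Add files missing from the slave and refresh files whose contents differ.
    for rel in sorted(master_files):
        src, dst = master / rel, slave / rel
        if rel not in slave_files or not filecmp.cmp(src, dst, shallow=False):
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)

    # Delete files that exist only on the slave.
    for rel in slave_files - master_files:
        (slave / rel).unlink()

# Hypothetical mount points: the AppleShare master and the local hard disk.
bring_in_line(Path("/Volumes/Master"), Path("/Volumes/LocalHD"))

(VolumeImage also moved misplaced files back to their proper folders and
closed open windows; the sketch skips that part.)
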
The one problem with VolumeImage arose when very large changes had to be
made to a large number of machines on a LocalTalk network.  The network
got so bogged down that it took forever to bring all the machines up to
date.  This was because each machine was requesting the same information
from the master disk, so the same information was being sent over and
over again unnecessarily.  I don't run Mac labs anymore, but I remember
seeing a utility recently (also shareware) that claimed it could
broadcast the data to all machines simultaneously and thus update any
number of machines in the same time it would take to do a single
machine with VolumeImage.  I don't know if it has the intelligence of
VolumeImage when it comes to incrementally updating a hard disk, but
there's no reason you couldn't use them both.  It might even be faster
to completely replace the contents of all the hard disks on a network
rather than incrementally update each one, because of the parallelism.
The incremental approach will still be faster if you need to quickly
update one machine, however.  Whenever one of the lab users had a
problem with a Mac, I would just reboot with a VolumeImage disk, let it
do the update, and chances were the problem would be fixed.  It only
took a couple of minutes and was usually faster than trying to track
down the particular problem.
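
(To put rough numbers on why broadcasting wins: LocalTalk signals at
about 230.4 kbit/s, and that bandwidth is shared by every machine
pulling from the master.  The update size and machine count below are
made up; the ratio between the two results is the point.)

# Rough arithmetic only: sending the same update to each machine in turn
# versus broadcasting it once on a shared LocalTalk segment.
# LocalTalk's raw rate is about 230.4 kbit/s; real throughput is lower.
localtalk_bytes_per_sec = 230_400 / 8   # ~28.8 KB/s, shared by all nodes
update_bytes = 20 * 1_000_000           # hypothetical 20 MB of changes
machines = 10

one_at_a_time = machines * update_bytes / localtalk_bytes_per_sec
broadcast_once = update_bytes / localtalk_bytes_per_sec

print(f"one copy per machine: ~{one_at_a_time / 3600:.1f} hours")
print(f"single broadcast:     ~{broadcast_once / 60:.0f} minutes")

(Even allowing for protocol overhead, one pass over the wire beats ten
passes by an order of magnitude when every machine needs the same data.)
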
On the PC side, there may be similar utilities, but I wasn't aware of
any and our PCs weren't networked most of the time anyway.  What I
would do was make a backup of the master disk (using FastBack or whatever
was available) onto HD floppies, then restore from those floppies onto
the other machines.  The trick I used to speed things up was to form a
pipeline.  I would start a restore on all the machines, then insert the
first disk into the first machine.  When the first machine was done
with the first disk, I'd put the second disk into the first machine and
the first disk into the second machine.  When that was done, the third
disk would go into the first machine, the second disk into the second
machine, and the first disk into the third machine.  And so on.  This
involved lots of running around and swapping floppies, but because of the
pipeline it was very efficient.  I even used this method on the Macs sometimes
when I needed to do a complete replacement of the hard drive's contents
(because of the aforementioned problem with massive updates on LocalTalk
networks).  Depending on the number of machines and their physical locations,
you can make two or three sets of backup floppies and have multiple
pipelines going if you wish.
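
(The pipeline is easy to see if you number the disks and machines: disk
d goes into machine m during step d + m, so D disks across M machines
take D + M - 1 swap steps instead of the D * M steps you'd need doing
one machine at a time.  A little sketch, with made-up machine and disk
counts:)

# Illustration of the floppy pipeline: disk d is in machine m during step
# d + m, so D disks over M machines finish in D + M - 1 steps, not D * M.
def pipeline(num_disks: int, num_machines: int) -> None:
    for step in range(num_disks + num_machines - 1):
        moves = [f"disk {d + 1} -> machine {m + 1}"
                 for d in range(num_disks)
                 for m in range(num_machines)
                 if d + m == step]
        print(f"step {step + 1}: " + ", ".join(moves))

pipeline(num_disks=4, num_machines=3)
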
Hope this helps,

--
Steve Kanefsky