Xref: sparky sci.math:17181 rec.puzzles:8053
Newsgroups: sci.math,rec.puzzles
Path: sparky!uunet!zaphod.mps.ohio-state.edu!caen!destroyer!gumby!kzoo!k044477
From: k044477@hobbes.kzoo.edu (Jamie R. McCarthy)
Subject: Re: Naming Large Numbers (Re: Negative Zero)
Message-ID: <1992Dec18.212004.10921@hobbes.kzoo.edu>
Followup-To: poster
Organization: Kalamazoo College
References: <1992Dec12.010711.15778@leela.cs.orst.edu> <1992Dec15.210004.2556@hobbes.kzoo.edu> <1992Dec17.144306.10885@vax.oxford.ac.uk>
Date: Fri, 18 Dec 1992 21:20:04 GMT
Lines: 21

wilcox@vax.oxford.ac.uk writes:
>k044477@hobbes.kzoo.edu (Jamie R. McCarthy) writes:
>> I submit that one cannot write down, in scientific notation, the number
>> x, such that [blah blah blah]...
>
>Practically speaking, one cannot. Theoretically speaking, one can.

Because it's so trivial to write the number down in theory (you just
pull out your theoretical pencil and write down all 10^126 digits), I
didn't feel it necessary to point out that I was speaking of "writing"
in the practical, rather than the ideal, sense.
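
To put "practical" in perspective, here is a rough back-of-the-envelope
sketch (not from the original thread; the billion-digits-per-second rate
and the age-of-the-universe figure are assumed round numbers):

    # Rough sketch: time needed to physically write out 10^126 digits.
    # The digit rate and the universe's age are assumed round figures.
    digits = 10**126                   # digits to be written
    rate = 10**9                       # assume 10^9 digits written per second
    seconds_needed = digits // rate    # 10^117 seconds
    universe_age = 4 * 10**17          # rough age of the universe, in seconds
    print(seconds_needed // universe_age)  # ~2.5*10^99 universe-lifetimes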

The original article asserted that there was no rational number that
could not be "written" using standard symbols. Since this is so
trivially true if "write" is used in the ideal sense, I assumed the
assertion intended the practical sense.
--
Jamie McCarthy Internet: k044477@kzoo.edu AppleLink: j.mccarthy
"As you should know by now, we're strong believers in the Apple II
and always will be. But we can't ignore reality forever."
 - Tom Weishaar