Xref: sparky comp.os.misc:1008 comp.theory:2805 comp.lang.misc:4089 alt.lang.asm:533
Newsgroups: comp.os.misc,comp.theory,comp.lang.misc,alt.lang.asm
Path: sparky!uunet!zaphod.mps.ohio-state.edu!cs.utexas.edu!torn!nott!uotcsi2!news
From: cbbrowne@csi.uottawa.ca (Christopher Browne)
Subject: Re: PROTOLO A FUTURISTIC OBJECT-CODE FORMAT
Message-ID: <1993Jan6.185513.15335@csi.uottawa.ca>
Keywords: languages,objectcode
Sender: news@csi.uottawa.ca
Nntp-Posting-Host: prgf
Organization: Dept. of Computer Science, University of Ottawa
References: <1993Jan6.172707.14435@ifi.uio.no>
Date: Wed, 6 Jan 93 18:55:13 GMT
Lines: 71

This sounds very much like some languages that already exist:

1) The GNU Project internal language called RTL

The compiler front end generates code in RTL. An internal optimizer
then wrenches it apart, puts it back together again, and translates it
into the assembly language of the target computer. There's then some
further peephole optimization, and eventually you get executable
object code.

There's this theory that they're going to eventually have a whole
family of front ends. Right now, there are front ends for C, C++ and
Objective C. There may be an alpha version of a FORTRAN front end.
There has been talk about Pascal, Modula-2, COBOL (!) and Ada front
ends, but this seems to be just talk at this point.

RTL seems to be fairly LISP-like.
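For flavor, here's a Python sketch of how a LISP-like register-transfer
form might be modelled and lowered to target assembly. The tuple shapes
and mnemonics are invented for illustration only; this is NOT actual
GCC RTL syntax:

```python
# A register-transfer expression modelled as nested tuples -- an
# invented, RTL-flavored notation, not real GCC RTL.
# Meaning: reg0 = reg1 + 4
insn = ("set", ("reg", 0), ("plus", ("reg", 1), ("const_int", 4)))

def emit(expr):
    """Lower one register-transfer expression to a line of pretend
    target assembly (mnemonics are made up for this sketch)."""
    if expr[0] == "set":
        dest, src = expr[1], expr[2]
        if src[0] == "plus" and src[1][0] == "reg" and src[2][0] == "const_int":
            return "add r%d, r%d, #%d" % (dest[1], src[1][1], src[2][1])
    raise ValueError("unhandled expression: %r" % (expr,))

print(emit(insn))   # add r0, r1, #4
```

A real back end would pattern-match many such shapes against machine
descriptions; the point is just that a tree notation like this is easy
to take apart and put back together, which is what the optimizer does.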

2) A European compiler project involving A. Tanenbaum

In an old SIGPLAN, I've got an article on a project that Tanenbaum (of
Minix/Amoeba fame) worked on a few years ago.

The basic idea was that they generated compiler front ends (very
similar to the GCC idea described above) that would produce as output
a "machine independent" language. In this case, it was a "stack
oriented" language. (FORTH and PostScript are examples of stack
oriented languages, but have many more op-codes than Tanenbaum's
system.)

A second program would then generate code for the actual CPU.

Much of the paper talked about how they pipelined the compiler,
combining multiple passes into one in order to improve the speed of
compilation.
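To make the "stack oriented" idea concrete, here's a minimal Python
interpreter for a made-up stack language. The op names are invented
for this sketch, not the ops of Tanenbaum's actual system:

```python
# Minimal stack-machine interpreter: each instruction is a tuple of
# an op name plus optional operand. Invented ops, for illustration.
def run(program):
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError("unknown op: %s" % op)
    return stack.pop()

# Compute (2 + 3) * 4 with no named registers at all.
prog = [("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]
print(run(prog))   # 20
```

Because every operand lives on an implicit stack, a front end can emit
this code without knowing anything about the target's registers --
which is exactly what makes it "machine independent."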

For BOTH of these systems, the advantages to the "internal language"
were:

a) It makes it easy to use "multiple front ends," which means that you
can more easily produce compilers for SEVERAL languages, targeted
towards SEVERAL platforms.

I.e.: If you want compilers for 3 languages that will run on 6
different computers, you only need to write 9 programs (3 front ends
plus 6 back ends). The conventional "monolithic" method would require
18 compilers (3 x 6).

It gets REALLY worthwhile if you look at having a dozen computers and
a dozen languages: 24 programs instead of 144, cutting the work by a
factor of 6.
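The arithmetic behind that claim can be spelled out directly:

```python
# With a shared intermediate language you write one front end per
# language plus one back end per machine; the monolithic approach
# needs one full compiler per (language, machine) pair.
def with_intermediate(languages, machines):
    return languages + machines

def monolithic(languages, machines):
    return languages * machines

print(with_intermediate(3, 6), monolithic(3, 6))    # 9 18
print(monolithic(12, 12) // with_intermediate(12, 12))  # 6
```

The savings grow with scale: M + N programs instead of M * N.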

b) Optimization can take place at THREE levels:
1) Language level
2) Internal pseudo-machine level
3) Object code level

which can be very helpful. It may be possible to do MORE
optimizations, and at the least, it means that each optimizer is
simpler, and is targeted at ONE of those levels. You're not doing
variable folding simultaneously with peephole optimization, which cuts
down on bugs.
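As an illustration of how a single-level optimizer stays simple,
here's a tiny Python peephole pass over invented stack ops: it scans a
short window of instructions and rewrites one wasteful pattern,
knowing nothing about source-level variables or target machines:

```python
# Peephole pass over made-up stack ops: delete "push x" immediately
# followed by "pop", since the pushed value is never used.
def peephole(code):
    out = []
    for insn in code:
        if insn == ("pop",) and out and out[-1][0] == "push":
            out.pop()   # cancel the preceding push
        else:
            out.append(insn)
    return out

code = [("push", 1), ("pop",), ("push", 2)]
print(peephole(code))   # [('push', 2)]
```

Each level's optimizer can be this narrowly focused, which is the
"cuts down on bugs" point above.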

Your idea is fairly well understood; I suggest you check out the way
one of these implementations does things, and try to find ways of
making it BETTER.

--
Christopher Browne               | PGP 2.0 key available
cbbrowne@csi.uottawa.ca          |======================================
University of Ottawa             | Genius may have its limitations, but
Master of System Science Program | stupidity is not thus handicapped.