Newsgroups: comp.lang.verilog
Path: sparky!uunet!cs.utexas.edu!usc!elroy.jpl.nasa.gov!swrinde!emory!europa.asd.contel.com!darwin.sura.net!nntp.msstate.edu!saimiri.primate.wisc.edu!relay!relay2!afterlife!smb
From: smb@afterlife.ncsc.mil (Steve M. Burinsky)
Subject: Avoiding inertial delay in delay lines
Message-ID: <1992Nov12.053306.12799@afterlife.ncsc.mil>
Organization: The Great Beyond
Date: Thu, 12 Nov 1992 05:33:06 GMT
Lines: 28

I need to simulate delay lines in order to model a bus interface. The
solution seems obvious -- use a net delay. However, the signal I need
to delay has pulse widths (w) which are shorter than the delay time (d);
that is, w < d. The problem is that inertial delay causes the short pulse
to disappear within the delay.
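
To make the problem concrete, here's a minimal sketch of the net-delay
approach (the module name and the 100-unit delay are placeholders, not
from my actual design):

    module delay_line (out, in);
        output out;
        input  in;
        // Delay on a continuous assignment is inertial: any pulse
        // on 'in' narrower than 100 time units never reaches 'out'.
        assign #100 out = in;
    endmodule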
So, how should this be implemented? I have two options, neither of which
I like:
1. Implement the delay line as a string of (for example) 10 buffers, each
with 10% of the total required delay. This has drawbacks, as it requires
that the user know the number of internal sub-delay units in order to
guarantee accurate delay generation (see the sketch after this list).
Also, given that 10 buffers are used, 10w > d is required.
2. Implement a PLI routine which calls (for example) tf_strdelputp, which
allows me to specify transport (versus inertial) delay.
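
A rough sketch of option 1 (a 100-unit total delay split into ten
10-unit buffers; the names and delay values are placeholders):

    module delay_line_chain (out, in);
        output out;
        input  in;
        wire t1, t2, t3, t4, t5, t6, t7, t8, t9;
        // Each buf's #10 is still inertial, so each stage filters
        // pulses narrower than 10 units -- hence the 10w > d limit.
        buf #10 b1  (t1,  in);
        buf #10 b2  (t2,  t1);
        buf #10 b3  (t3,  t2);
        buf #10 b4  (t4,  t3);
        buf #10 b5  (t5,  t4);
        buf #10 b6  (t6,  t5);
        buf #10 b7  (t7,  t6);
        buf #10 b8  (t8,  t7);
        buf #10 b9  (t9,  t8);
        buf #10 b10 (out, t9);
    endmodule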
Is there any way I can choose transport delay using a standard language
construct? Is there another (easy, portable, general) way of implementing
delay lines that I'm missing?

Thanks,
Steve Burinsky
smb@afterlife.ncsc.mil

--

Steve M. Burinsky
smb@afterlife.ncsc.mil