- Comments: Gated by NETNEWS@AUVM.AMERICAN.EDU
- Path: sparky!uunet!paladin.american.edu!auvm!QMRELAY.MAIL.CORNELL.EDU!DICK_DARLINGTON
- Message-ID: <STAT-L%92081713153575@VM1.MCGILL.CA>
- Newsgroups: bit.listserv.stat-l
- Date: Mon, 17 Aug 1992 13:20:04 U
- Sender: "STATISTICAL CONSULTING" <STAT-L@MCGILL1.BITNET>
- From: dick darlington <dick_darlington@QMRELAY.MAIL.CORNELL.EDU>
- Subject: SE(b) without inverse
- Lines: 22
-
- OFFICE MEMO
- Time: 1:35 AM
- Date: 08-04-92
- Subject: SE(b) without inverse
-
- This is in answer to a recent question about how to compute the standard error
- of a regression slope without inverting a correlation or covariance matrix. We
- predict a dependent variable Y from a set of P predictors in a sample of N
- cases. As usual, define MSE = SS(errors)/(N-P-1). To find the standard error
- of one regression slope b(j), run a second regression predicting that one
- predictor X(j) from the other P-1 predictors. Let SSU(j) denote the sum of
- squared errors in this second regression; the "U" stands for the "unique"
- portion of X(j). Then the standard error of b(j) is sqrt(MSE/SSU(j)).
- I don't know the motivation behind the original question, but sometimes when
- rounding error is a problem due to near-singularity of the X matrix, this
- method calculates standard errors more accurately than they can be calculated
- with an inverse of R. In GAUSS, which I use, this method finds standard
- errors accurately even when the inversion routine fails because of
- near-singularity.
- Dick Darlington, Psychology, Cornell
-