The computation of a resultant of $r$ forms in $r$ variables was already solved by Bézout as an instance of this general approach.
An alternative proposal was put forward by Cayley. He assumes that there are $m_1$ variables connected by $m_2$ linear equations, which are not all independent but are themselves connected by $m_3$ linear equations, again not necessarily linearly independent, and so on; we thus obtain $s$ matrices
the number of quantities [$m_1$] will be equal to the number of really independent equations connecting them, and we may obtain by the elimination of these quantities a result $\Delta = 0$.
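Cayley's counting can be made explicit in modern terms; the alternating-sum condition below is a standard reading of his construction rather than a formula appearing in the quotation.

```latex
% If m_1 quantities satisfy m_2 linear equations, the equations satisfy
% m_3 linear relations, and so on down to m_s, then the number of really
% independent equations is m_2 - (m_3 - (m_4 - ...)), and elimination of
% all the quantities is possible precisely when this count matches them:
\[
  m_1 \;=\; m_2 - m_3 + m_4 - \cdots + (-1)^{s} m_s .
\]
```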
The approach,
denoting
The application considers a set of forms
I am not in possession of any method of arriving at once at the final result in its more simplified form; my process, on the contrary, leads me to a result encumbered by an extraneous factor, which is only got rid of by a number of successive divisions.
The first solution, apart from Bézout's, for computing the resultant of more than two polynomials is due to A. L. Dixon
Given three polynomials
$$\varphi(X_1, X_2) = \sum_{i=0}^{m}\sum_{j=0}^{n} a_{ij} X_1^i X_2^j,\qquad
\psi(X_1, X_2) = \sum_{i=0}^{m}\sum_{j=0}^{n} b_{ij} X_1^i X_2^j,\qquad
\chi(X_1, X_2) = \sum_{i=0}^{m}\sum_{j=0}^{n} c_{ij} X_1^i X_2^j,$$
of degree $m$ in $X_1$ and degree $n$ in $X_2$, Dixon considers the determinant
$$D(X_1, X_2, Y_1, Y_2) := \frac{1}{(X_1 - Y_1)(X_2 - Y_2)}
\begin{vmatrix}
\varphi(X_1, X_2) & \psi(X_1, X_2) & \chi(X_1, X_2)\\
\varphi(Y_1, X_2) & \psi(Y_1, X_2) & \chi(Y_1, X_2)\\
\varphi(Y_1, Y_2) & \psi(Y_1, Y_2) & \chi(Y_1, Y_2)
\end{vmatrix}.$$
equating to zero the coefficients of $Y_1^r Y_2^s$, for all values of $r$ and $s$, $D = 0$ is equivalent to $2mn$ equations in $X_1$, $X_2$, and the number of terms in these equations is also $2mn$. Thus the eliminant can be at once written down as a determinant of order $2mn$, each constituent of which is the sum of determinants of the third order of the type
$$\Delta := \begin{vmatrix}
a_{i_1 j_1} & b_{i_1 j_1} & c_{i_1 j_1}\\
a_{i_2 j_2} & b_{i_2 j_2} & c_{i_2 j_2}\\
a_{i_3 j_3} & b_{i_3 j_3} & c_{i_3 j_3}
\end{vmatrix},$$
where $a_{ij}$, $b_{ij}$, $c_{ij}$ denote the coefficients of $\varphi$, $\psi$, $\chi$ respectively.
In other words, denoting
Clearly the vanishing of the determinant of the matrix is a necessary condition for the three polynomials to have a common root.
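Dixon's construction for three bivariate polynomials can be carried out mechanically. The following is an illustrative sketch using sympy; the helper names and the sample system are assumptions chosen for demonstration, not Dixon's own notation.

```python
# Illustrative sketch of Dixon's construction for three polynomials in
# two variables (sympy); the sample system and helper names are
# assumptions for demonstration, not notation from the text.
from sympy import symbols, Matrix, Poly, cancel, expand

x1, x2, y1, y2 = symbols('x1 x2 y1 y2')

def dixon_polynomial(phi, psi, chi):
    """The quotient delta(x1, x2, y1, y2): the 3x3 determinant obtained
    by successively replacing x1 by y1, then x2 by y2, divided by the
    factor (x1 - y1)*(x2 - y2), which always divides it exactly."""
    M = Matrix([
        [phi, psi, chi],
        [phi.subs(x1, y1), psi.subs(x1, y1), chi.subs(x1, y1)],
        [phi.subs({x1: y1, x2: y2}), psi.subs({x1: y1, x2: y2}),
         chi.subs({x1: y1, x2: y2})],
    ])
    return cancel(M.det() / ((x1 - y1) * (x2 - y2)))

def dixon_matrix(delta):
    """Coefficient matrix of delta: rows indexed by the monomials
    y1^r * y2^s of delta, columns by the monomials x1^i * x2^j that
    occur in its coefficients."""
    rows = [Poly(c, x1, x2).as_dict()
            for c in Poly(expand(delta), y1, y2).coeffs()]
    cols = sorted({mon for row in rows for mon in row})
    return Matrix([[row.get(mon, 0) for mon in cols] for row in rows])

# Bilinear example (m = n = 1, so the matrix has order 2mn = 2) with
# the common root (1, 2): the Dixon determinant must vanish.
phi = x1*x2 - 2
psi = x1 + x2 - 3
chi = 2*x1 - x2
print(dixon_matrix(dixon_polynomial(phi, psi, chi)).det())  # prints 0
```

For this generic bilinear system the matrix comes out square of order $2mn$; in degenerate cases the construction can fail to yield a square matrix, which is one source of the extraneous factors mentioned above.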
Finally, Dixon remarks that such a method is
applicable to the problem of elimination when the number of variables is greater than two.
Denote, for each
Given $n + 1$ polynomials
$$\Delta := \cdots \qquad (5)$$
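The generalization that Dixon's remark points to is usually written as follows in modern notation; the names $f_0,\dots,f_n$ and the exact shape of the determinant are assumptions of this sketch, not taken from the text.

```latex
% Cayley-Dixon polynomial for n+1 polynomials f_0,...,f_n in the
% variables X_1,...,X_n: row i replaces the first i variables by
% auxiliary variables Y_1,...,Y_i; the product of the factors X_i - Y_i
% always divides the determinant exactly.
\[
  \delta(X_1,\dots,X_n,Y_1,\dots,Y_n) :=
  \frac{\det\bigl( f_j(Y_1,\dots,Y_i,X_{i+1},\dots,X_n) \bigr)_{0\le i,\,j\le n}}
       {\prod_{i=1}^{n}\,(X_i - Y_i)} ,
\]
% and the Dixon matrix is the coefficient matrix of delta with respect
% to the monomials in the Y variables (rows) and X variables (columns).
```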
In a previous paper
He then fixes ``two sets of arbitrary quantities''