The Resultant, Episode 5 (The Finale)
Recap: The setting is an integral domain R, with fraction field K, and extension field L of K in which E(x) and F(x) split completely. E(x) and F(x) have coefficients in R. E(x) has degree m, F(x) degree n; we assume m,n>0. The main special case for us: R=k[y], K=k(y), so R[x]=k[x,y], and E and F are polynomials in x and y. As always, we assume k is algebraically closed.
The formulas:
$\mathrm{res}(E,F) = a_m^n\, b_n^m \prod_{i=1}^{m}\prod_{j=1}^{n}(u_i - v_j)$ (1)
$\mathrm{res}(E,F) = \det(S)$, the determinant of the Sylvester matrix of $E$ and $F$ (2)
$E(x) = a_m x^m + \cdots + a_0 = a_m(x - u_1)\cdots(x - u_m)$ (3a)
$F(x) = b_n x^n + \cdots + b_0 = b_n(x - v_1)\cdots(x - v_n)$ (3b)
$PE + QF = \det(S)$ (4)
for some $P \in R_n[x]$, $Q \in R_m[x]$,
i.e., $\deg(P) < n$, $\deg(Q) < m$, coefficients in $R$
$\Phi: R_n[x] \oplus R_m[x] \to R_{m+n}[x]$ (5)
$\Phi(P,Q) = PE + QF$
ditto with $K$ in place of $R$
Matrix of $\Phi$ is the Sylvester matrix $S$, e.g. for $m=3$, $n=2$:

$$S = \begin{pmatrix} a_3 & a_2 & a_1 & a_0 & 0 \\ 0 & a_3 & a_2 & a_1 & a_0 \\ b_2 & b_1 & b_0 & 0 & 0 \\ 0 & b_2 & b_1 & b_0 & 0 \\ 0 & 0 & b_2 & b_1 & b_0 \end{pmatrix}$$
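If you like to see the bookkeeping done by machine, here is a minimal sympy sketch (my own addition, with $E$ and $F$ chosen purely for illustration): it builds the Sylvester matrix in the layout above and checks that its determinant matches both sympy's built-in resultant and the double product of eq. (1).

```python
# A minimal numerical sketch (E, F chosen only for illustration):
# build the Sylvester matrix row by row and check eqs. (1) and (2).
import sympy as sp

x = sp.symbols('x')
E = 2*(x - 1)*(x - 2)   # m = 2, a_m = 2, roots u_1 = 1, u_2 = 2
F = 3*(x - 4)           # n = 1, b_n = 3, root  v_1 = 4

def sylvester(E, F, x):
    """Sylvester matrix: n shifted rows of E's coefficients, then m rows of F's."""
    a = sp.Poly(E, x).all_coeffs()   # [a_m, ..., a_0]
    b = sp.Poly(F, x).all_coeffs()   # [b_n, ..., b_0]
    m, n = len(a) - 1, len(b) - 1
    S = sp.zeros(m + n, m + n)
    for i in range(n):               # n shifted copies of E's coefficients
        for j, c in enumerate(a):
            S[i, i + j] = c
    for i in range(m):               # m shifted copies of F's coefficients
        for j, c in enumerate(b):
            S[n + i, i + j] = c
    return S

S = sylvester(E, F, x)
print(S.det())                          # 108 = det(S), as in eq. (2)
print(sp.resultant(E, F, x))            # 108, sympy's built-in resultant agrees
print(2**1 * 3**2 * (1 - 4)*(2 - 4))    # 108 = a_m^n b_n^m (u_1-v_1)(u_2-v_1), eq. (1)
```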
In our main special case, eq. (3a) now reads, in part: $E(x,y) = a_m(y)x^m + \cdots + a_0(y)$, likewise for (3b). So we have polynomials E(x,y) and F(x,y) defining curves E and F. The key facts:
- Fact 1: The resultant $\mathrm{res}_x(E,F)$ is a polynomial in $y$ alone, call it $\mathrm{res}_x(y)$.
- Fact 2: $\mathrm{res}_x(y)$ is identically 0 ⇔ $\Phi$ is singular ⇔ E and F have a common component ⇔ E and F have a common nonconstant factor.
- Fact 3: $\mathrm{res}_x(y_0) = 0$ ⇔ [the line $y=y_0$ passes through an intersection of E and F, or $a_m(y_0) = b_n(y_0) = 0$].
- Fact 4: The order of $\mathrm{res}_x(y)$ (i.e., its order of vanishing at $y=0$) is the sum of the multiplicities of the intersections on the x-axis, provided $a_m(y)$ is a constant.
- Fact 5: The degree of $\mathrm{res}_x(y)$ is the sum of the multiplicities of the intersections in the affine plane, provided $a_m(y)$ is a constant.
Also, x and y can trade places in all of these. (Facts 1, 3, and 5 get a quick concrete check in the sketch below.)
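Here is that promised check, my own and purely for illustration: the circle $E = x^2 + y^2 - 1$ and the line $F = x - y$. The resultant in $y$ has the intersection y-coordinates as its roots, and its degree counts the intersections.

```python
# A concrete check of Facts 1, 3, and 5 (curves chosen just for illustration).
import sympy as sp

x, y = sp.symbols('x y')
E = x**2 + y**2 - 1
F = x - y

r = sp.resultant(E, F, x)   # eliminate x: a polynomial in y alone (Fact 1)
print(r)                    # 2*y**2 - 1
print(sp.solve(r, y))       # [-sqrt(2)/2, sqrt(2)/2]: the y-coordinates of the
                            # two intersection points (Fact 3)
# deg(r) = 2 = total number of intersections, counted with multiplicity,
# in the affine plane (Fact 5); note a_2(y) = 1 is constant.
```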
Now to tie up some loose ends, namely proving some of this. Fact 1 holds because $\det(S)$ belongs to $R = k[y]$. Episodes 1 and 2 spent most of their running time justifying Fact 2.
One direction of Fact 3 falls out immediately from eq. (4): if the line $y=y_0$ passes through an intersection of E and F, then there exists an $x_0$ such that $E(x_0,y_0) = F(x_0,y_0) = 0$, whence
$\mathrm{res}_x(y_0) = P(x_0,y_0)E(x_0,y_0) + Q(x_0,y_0)F(x_0,y_0) = 0.$
On the other hand, if $a_m(y_0) = b_n(y_0) = 0$, then plugging this value of $y$ into the Sylvester determinant gives a first column that is entirely 0. So the resultant is 0 at $y = y_0$.
In the other direction, we stated Fact 2 for the polynomial $\mathrm{res}_x(y)$, implicitly assuming $R = k[y]$. But Fact 2 holds just as well for the more boring case of $R = k$. It says that if $\mathrm{res}_x$ (an element of $k$) equals 0, then $E(x,y_0)$ and $F(x,y_0)$ have a common nonconstant factor in an extension field. Since $k$ is algebraically closed, the polynomials factor completely over $k$ and so must have a common factor $(x - x_0)$. Thus $E(x_0,y_0) = F(x_0,y_0) = 0$, and E and F have an intersection on the line $y = y_0$.
“Hmm,” you’re probably wondering, “where did you use the assumption that $a_m(y_0) \neq 0$ or $b_n(y_0) \neq 0$?” If you trace through the proof of Fact 2 in Episode 2, you’ll find this step: $pE = -qF$, so the degree $m$ polynomial $E$ divides $qF$. But $\deg(q) < m$, so E and F share a nonconstant factor. Well, $E(x,y)$ as a polynomial in $x$ has degree $m$, and $E(x,y_0)$ will still have degree $m$ when $a_m(y_0) \neq 0$. We can use the same reasoning if $b_n(y_0) \neq 0$. But if $a_m(y_0) = b_n(y_0) = 0$, then the argument breaks down.
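To see the breakdown happen, here is a tiny example of my own (not from the earlier episodes): take $E = yx + 1$ and $F = yx + 2$, so $a_1(y) = b_1(y) = y$. Then $\mathrm{res}_x(y) = y$ vanishes at $y_0 = 0$, even though $E(x,0) = 1$ and $F(x,0) = 2$ have no common zero; it's the second alternative in Fact 3 that saves the equivalence.

```python
# The degenerate case in action: a_1(0) = b_1(0) = 0, the resultant vanishes
# at y0 = 0, yet there is no intersection on the line y = 0.
import sympy as sp

x, y = sp.symbols('x y')
E = y*x + 1   # a_1(y) = y
F = y*x + 2   # b_1(y) = y

print(sp.resultant(E, F, x))        # y: vanishes at y0 = 0
print(E.subs(y, 0), F.subs(y, 0))   # 1 2: never zero, so no intersection
                                    # on the line y = 0
```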
A somewhat subtle point: we’ve tacitly assumed that we obtain the resultant of $E(x,y_0)$ and $F(x,y_0)$ by plugging $y = y_0$ into the resultant of $E(x,y)$ and $F(x,y)$. If you want to get all fancy about it, we’re applying an evaluation homomorphism from $k[y]$ to $k$, converting the two-variable polynomials $E(x,y)$ and $F(x,y)$ into the one-variable $E(x,y_0)$ and $F(x,y_0)$, and likewise converting $\det(S)$, with entries from $k[y]$, into a determinant with entries from $k$. I’ve belabored this because it’s at the heart of the other direction for Fact 3.
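A quick sympy illustration of this commuting-evaluation point, reusing the circle/line pair from the earlier sketch (where $a_2(y) = 1$ never vanishes, so no degrees drop):

```python
# Evaluation commutes with taking the resultant here: "resultant, then
# evaluate" agrees with "evaluate, then resultant" at y0 = 0.
import sympy as sp

x, y = sp.symbols('x y')
E = x**2 + y**2 - 1
F = x - y
y0 = 0

r = sp.resultant(E, F, x)                             # 2*y**2 - 1
print(r.subs(y, y0))                                  # -1: resultant, then evaluate
print(sp.resultant(E.subs(y, y0), F.subs(y, y0), x))  # -1: evaluate, then resultant
```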
If you ponder the double-product form for the resultant, on the right hand side of eq.(1), you might see an alternate proof of Fact 3. I’ll talk about this in “Inside the Episode”.
As for the proofs of Facts 4 and 5, I will feature these as the initial steps of Kendig’s proof of Bézout’s Theorem.
[Closing theme music, credits—
“These episodes owe much to Anthony Knapp’s treatment in his Advanced Algebra.”]