# SM-3d-ACloserLookAtAFamilyOfPolynomials

## 4309 days ago by MathFest

As in the last worksheet, our first order of business is to set the stage: in what ring will we be working? Well, the ring of single-variable polynomials over the integers, of course.
```
R.<x> = ZZ[]
```
Next, let's define a recursive formula. The following recurrence relations generate the independence polynomials of a family of path-like graphs. We'll define the relations and spit out the first several polynomials.
```
max_n = 10
b(x) = (3*x+1)*(2*x+1)*(x+1)
c(x) = 2*x+1

def p(n):
    if n == 1:
        return b(x) + x*c(x)
    elif n == 2:
        return b(x)*p(1) + x*c(x)*b(x)
    else:
        return b(x)*(p(n-1) + x*c(x)*p(n-2))

first_few_ps = [p(n).expand() for n in range(1, max_n+1)]
first_few_ps
```
Our hope is to show that these polynomials are logarithmically concave by showing that every root of each is real (this is sufficient by Newton's inequalities: a polynomial with positive coefficients and only real roots has a log-concave coefficient sequence). To prove this fact, we'll first need to eliminate all roots of nontrivial multiplicity. To do this, we define a method that finds the exact power of one polynomial which divides a second.
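Concretely, a coefficient sequence $a_0, a_1, \ldots, a_n$ is log-concave if $a_k^2 \ge a_{k-1}a_{k+1}$ for every interior index $k$. Here's a quick plain-Python checker for that condition (a sketch, not part of the worksheet's Sage code; the sample polynomials are just illustrations):

```python
def is_log_concave(coeffs):
    # a_k^2 >= a_{k-1} * a_{k+1} for every interior index k
    return all(coeffs[k]**2 >= coeffs[k-1] * coeffs[k+1]
               for k in range(1, len(coeffs) - 1))

# coefficients of (x+1)^4 = 1 + 4x + 6x^2 + 4x^3 + x^4, low degree first
print(is_log_concave([1, 4, 6, 4, 1]))   # True
print(is_log_concave([1, 1, 5, 1, 1]))   # False: 1^2 < 1*5 at k = 1
```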
```
def div_pow(R, p, q):
    # returns the largest k such that q^k divides p in R
    current_pow = -1
    current_poly = 1
    while p/current_poly in R:
        current_poly = current_poly * q
        current_pow = current_pow + 1
    return current_pow
```
Does it work? Let's check:
```
div_pow(R, (x+1)^3, x+1)
```
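For readers without Sage at hand, the same idea can be sketched in plain Python over coefficient lists (low degree first). The helpers `poly_divmod` and `div_pow_py` below are illustrative stand-ins for the worksheet's `div_pow`, not part of it:

```python
from fractions import Fraction

def poly_divmod(num, den):
    # long division of polynomials given as low-to-high coefficient lists
    num = [Fraction(c) for c in num]
    out = [Fraction(0)] * max(len(num) - len(den) + 1, 0)
    for i in range(len(out) - 1, -1, -1):
        out[i] = num[i + len(den) - 1] / den[-1]
        for j, d in enumerate(den):
            num[i + j] -= out[i] * d
    return out, num[:len(den) - 1]   # quotient, remainder

def div_pow_py(p, q):
    # exact power of q dividing p, mirroring div_pow above
    k = 0
    while True:
        quo, rem = poly_divmod(p, q)
        if len(quo) == 0 or any(rem):
            return k
        p, k = quo, k + 1

# (x+1)^3 = 1 + 3x + 3x^2 + x^3 divided by (x+1)
print(div_pow_py([1, 3, 3, 1], [1, 1]))   # 3
```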
All right, now let's use our new method to find out how many times $b(x)$ divides into each of our polynomials:
```
mult_of_b = [div_pow(R, first_few_ps[i], b(x)) for i in range(0, max_n)]
mult_of_b
```
Knowing how many we've found, we can safely divide them out and check that the quotients are indeed still polynomials:
```
first_few_qs = [first_few_ps[i]/(b(x)^mult_of_b[i]) for i in range(0, max_n)]
just_for_fun = [first_few_qs[i]/b(x) for i in range(0, max_n)]
in_R = [q in R for q in first_few_qs]
print(in_R)
```
Just to be on the safe side (not that we don't trust ourselves...), let's check that we have indeed divided out exactly the right power of $b(x)$:
```
not_in_R = [q in R for q in just_for_fun]
print(not_in_R)
```
Now to get rid of the powers of $c(x)$. First, let's find how many we can factor out...
```
mult_of_c = [div_pow(R, first_few_qs[i], c(x)) for i in range(0, max_n)]
mult_of_c
```
...and then we'll divide them out:
```
first_few_rs = [first_few_qs[i]/(c(x)^mult_of_c[i]) for i in range(0, max_n)]
first_few_rs
```
Good? Good. This is all fine and dandy...but it's not hard to show that this operation, of ridding ourselves of $b$s and then of $c$s, can be applied to the original defining recurrence relation, yielding a new recurrence relation which gives the family of reduced polynomials we just obtained above:
```
def r(n):
    if n == 1:
        return b(x)/c(x) + x
    elif n == 2:
        return b(x)/c(x) + 2*x
    elif n % 2 == 1:
        return b(x)*r(n-1)/c(x) + x*r(n-2)
    else:
        return r(n-1) + x*r(n-2)

first_few_rs_v2 = [r(n).expand() for n in range(1, max_n+1)]
first_few_rs_v2
```
These don't look exactly like the polynomials $r_n(x)$ we got above...are they the same?:
```
first_few_rs == first_few_rs_v2
```
Whew! Okay, now that we've eliminated multiple roots (each of which is real, by design, by the way), let's try to understand why the remaining roots are also real. The trick is to examine the two summands which come together to define $r_n(x)$ recursively. Let's plot these two summands, for a fixed $r_n(x)$, on the same axes:
```
def our_plots(n, min_x, max_x):
    if n % 2 == 0:
        p1 = plot(r(n-1), (min_x, max_x), rgbcolor=hue(0.3))
        p2 = plot(x*r(n-2), (min_x, max_x), rgbcolor=hue(0.8))
    else:
        p1 = plot(b(x)*r(n-1)/c(x), (min_x, max_x), rgbcolor=hue(0.3))
        p2 = plot(x*r(n-2), (min_x, max_x), rgbcolor=hue(0.8))
    return p1 + p2

show(our_plots(max_n, -3, 0), axes=true)
```
Pay attention, as we zoom in, to the locations of the roots of these polynomials:
```
show(our_plots(max_n, -2.5, 0), axes=true)
```
Notice anything? Keep looking:
```
show(our_plots(max_n, -2, 0), axes=true)
```
The roots appear to be...
```
show(our_plots(max_n, -1.5, 0), axes=true)
```
...alternating...even as we approach $x = 0$:
```
show(our_plots(max_n, -0.4, -0.1), axes=true)
```
Let's use some of Sage's built-in root-finding methods to make our observations a bit more precise. The following block of code finds the roots of the two summands we've been plotting above:
```
def roots_of_summand1(n):
    if n % 2 == 0:
        return solve(r(n-1) == 0, x)
    else:
        return solve(b(x)*r(n-1)/c(x) == 0, x)

def roots_of_summand2(n):
    return solve(x*r(n-2) == 0, x)

n = 20
r1 = roots_of_summand1(n)
r2 = roots_of_summand2(n)
print(r1)
print(r2)
```
Let's plot those roots:
```
root_ys1 = [r1[i].right() for i in [0..len(r1)-1]]
root_ys2 = [r2[i].right() for i in [0..len(r2)-1]]
plot1 = list_plot([(root_ys1[i], 0) for i in [0..len(root_ys1)-1]], rgbcolor=(1,0,0))
plot2 = list_plot([(root_ys2[i], 0) for i in [0..len(root_ys2)-1]], rgbcolor=(0,0,1))
show(plot1 + plot2)
```
Even when they're scrunched up together near the origin, the alternating pattern holds:
```
n = 20
root_ys1 = [r1[i].right() for i in [0..len(r1)-1]]
root_ys2 = [r2[i].right() for i in [0..len(r2)-1]]
root_ys1.sort()
root_ys2.sort()
trunc_plot1 = list_plot([(root_ys1[i], 0) for i in [floor(len(r1)/2)..len(r1)-1]], rgbcolor=(1,0,0))
trunc_plot2 = list_plot([(root_ys2[i], 0) for i in [floor(len(r2)/2)..len(r2)-1]], rgbcolor=(0,0,1))
show(trunc_plot1 + trunc_plot2)
```
When we plot the roots of the sum of these two polynomials (that is, of $r_n(x)$), we find, not surprisingly, that the roots of this sum lie within the intervals determined by the summands' roots:
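The mechanism is easy to see in a toy case with quadratics (the summands below are made up for illustration, not taken from our recurrence): if $f(x) = (x+1)(x+3)$ and $g(x) = x(x+2)$ have alternating roots, then $f+g$ changes sign on each interval between consecutive roots of the summands, trapping a real root there:

```python
import math

# f(x) = (x+1)(x+3) = x^2 + 4x + 3, roots -3 and -1
# g(x) = x(x+2)     = x^2 + 2x,     roots -2 and  0
# sum:  f + g       = 2x^2 + 6x + 3
a, b, c = 2, 6, 3
disc = math.sqrt(b*b - 4*a*c)
roots = sorted([(-b - disc) / (2*a), (-b + disc) / (2*a)])
print(roots)
# one root lands in (-3, -2), the other in (-1, 0)
```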
```
roots3 = solve(r(n) == 0, x)
root_ys3 = [roots3[i].right() for i in [0..len(roots3)-1]]
plot3 = list_plot([(root_ys3[i], 0) for i in [0..len(root_ys3)-1]], rgbcolor=(0,1,0))
show(plot1 + plot2 + plot3)
```
Of course, this behavior obtains even for those roots that are bunched up near the origin:
```
root_ys3.sort()
trunc_plot3 = list_plot([(root_ys3[i], 0) for i in [floor(len(root_ys3)/2)..len(root_ys3)-1]], rgbcolor=(0,1,0))
show(trunc_plot1 + trunc_plot2 + trunc_plot3)
```
The upshot: the roots of $r_n(x)$ must all be real, since the alternating real roots of the two summands carve out enough sign-change intervals, each of which pins down a real root of their sum, to account for the entire degree of $r_n(x)$.
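The alternation we eyeballed in the plots can also be checked mechanically. A plain-Python sketch (the helper `alternating` and the sample root lists are illustrative, not part of the worksheet):

```python
def alternating(roots_a, roots_b):
    # merge the two root lists, tagging each root by its origin,
    # then confirm the tags strictly alternate along the real line
    tagged = sorted([(t, 'a') for t in roots_a] + [(t, 'b') for t in roots_b])
    return all(u[1] != v[1] for u, v in zip(tagged, tagged[1:]))

print(alternating([-3, -1], [-2, 0]))     # True:  -3 < -2 < -1 < 0
print(alternating([-3, -2.5], [-2, 0]))   # False: -3 and -2.5 are adjacent
```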