Indian Statistical Institute B.Math & B.Stat Solved Problems, Vinod Singh ~ Kolkata
Let $c$ be a fixed real number. Show that a root of the equation $x(x+1)(x+2)\cdots(x+2009)=c$ can have multiplicity at most $2$.
Let
\[ f(x) = x(x+1)(x+2)\cdots(x+2009) - c. \]
First we compute the derivative of $f(x)$ and see that
\[ f'(x) = (x+1)(x+2)\cdots(x+2009) + x(x+2)\cdots(x+2009) + \cdots + x(x+1)\cdots(x+r-1)(x+r+1)\cdots(x+2009) + \cdots + x(x+1)(x+2)\cdots(x+2008), \]
where $r$ is a positive integer less than $2009$: each term omits exactly one factor $(x+r)$ from the product.
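To see the pattern of this expansion, consider a small analogue (an illustration, not part of the problem): for $g(x) = x(x+1)(x+2)$, the product rule gives
\[ g'(x) = (x+1)(x+2) + x(x+2) + x(x+1), \]
one term for each omitted factor, which is exactly the structure above with $2009$ replaced by $2$.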
Now, for $r \in \{0,1,2,\ldots,2009\}$,
\[ f'(-r) = (-r)(-r+1)\cdots(-1)\cdot(1)(2)\cdots(2009-r) = (-1)^r\, r!\,(2009-r)!, \]
which is $>0$ if $r$ is even and $<0$ if $r$ is odd.
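The evaluation is straightforward because at $x=-r$ every term of $f'(x)$ except the one omitting $(x+r)$ contains the factor $(x+r)=0$. For instance (a worked instance, added for clarity), with $r=1$:
\[ f'(-1) = (-1)\cdot(1)(2)\cdots(2008) = (-1)^1\,1!\,2008! < 0. \]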
Thus we have the following inequalities:
\[ f'(0)>0,\quad f'(-1)<0,\quad f'(-2)>0,\quad \ldots,\quad f'(-2008)>0,\quad f'(-2009)<0. \]
By the intermediate value theorem, this shows that $f'(x)=0$ has a real root in each of the $2009$ intervals $(-1,0), (-2,-1), \ldots, (-2009,-2008)$. Since the degree of $f'(x)$ is $2009$, these are all of its roots, so all the roots of $f'(x)=0$ are real and simple. Thus a root of $f'(x)=0$ cannot also be a root of the equation $f''(x)=0$. Now, if a root of $f(x)=0$ had multiplicity at least $3$, it would be a common root of $f'(x)=0$ and $f''(x)=0$, which is impossible. So a root of $f(x)=0$ can have multiplicity at most $2$.
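For completeness, here is the standard fact used in the last step, written out explicitly: if $\alpha$ is a root of $f(x)=0$ of multiplicity $m$, write $f(x)=(x-\alpha)^m g(x)$ with $g(\alpha)\neq 0$. Then
\[ f'(x) = (x-\alpha)^{m-1}\bigl(m\,g(x) + (x-\alpha)\,g'(x)\bigr), \]
and the second factor is nonzero at $x=\alpha$, so $\alpha$ is a root of $f'$ of multiplicity exactly $m-1$. Hence $m \ge 3$ would make $\alpha$ a multiple root of $f'(x)=0$, contradicting the simplicity of the roots of $f'$.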
Comment: $f(x) = x^2 + ax + b + 1 = 0$ is a quadratic polynomial with integral coefficients; show that if $f(x)$ factorizes into factors with integral coefficients, then there are integers $d$, $e$ such that $d + e = b$ and $de = ac$.