Sunday, September 24, 2006

 

Coverage problem with algebraic integers

I came across yet another way to show the coverage problem of the ring of algebraic integers, again using mostly simple algebra.

Use the polynomial

x^2 - (3+t)x + 2 = 0

since when t is an algebraic integer both roots must be algebraic integers, and when t = 0 the roots are quite simply 1 and 2.
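The t = 0 case is easy to sanity-check numerically with the quadratic formula; here is a minimal sketch (the variable names are mine, not part of the argument):

```python
import math

# At t = 0 the polynomial x^2 - (3+t)x + 2 reduces to x^2 - 3x + 2,
# whose roots should come out as exactly 1 and 2.
t = 0
b, c = -(3 + t), 2              # coefficients of x^2 + b*x + c
disc = b * b - 4 * c            # discriminant: 9 - 8 = 1
r1 = (-b - math.sqrt(disc)) / 2
r2 = (-b + math.sqrt(disc)) / 2
print(r1, r2)                   # 1.0 2.0
```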

So express the roots r_1 and r_2 as

r_1 = 1 + t^k*f_1(t)

r_2 = 2 + t^k*f_2(t)

where k is some non-zero rational number, and f_1(t) and f_2(t) are algebraic integer functions of t.

I know I have t^k as a factor because those terms zero out when t=0, as then you just have the trivial

x^2 - 3x + 2 = 0.

Notice then that

r_1 + r_2 = 3 + t = 3 + t^k*(f_1(t) + f_2(t))

so

t^(1-k) = f_1(t) + f_2(t)

so abs(k) is less than or equal to 1, as the exponent 1 - k cannot be negative if the right side is to stay an algebraic integer function of t.
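That sum relation can be checked numerically. Here is a sketch using the concrete values t = 2 and k = 1 that come up later in the post, reading f_1 and f_2 off the actual roots:

```python
import math

t, k = 2, 1
# Roots of x^2 - (3+t)x + 2, i.e. x^2 - 5x + 2 when t = 2.
disc = (3 + t) ** 2 - 8
r1 = ((3 + t) - math.sqrt(disc)) / 2   # the root near 1 for small t
r2 = ((3 + t) + math.sqrt(disc)) / 2   # the root near 2 for small t
f1 = (r1 - 1) / t ** k                 # from r_1 = 1 + t^k * f_1(t)
f2 = (r2 - 2) / t ** k                 # from r_2 = 2 + t^k * f_2(t)
print(f1 + f2, t ** (1 - k))           # both approximately 1.0
```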

Multiplying the roots together you have

r_1 * r_2 = 2 = 2 + t^k*f_2(t) + 2*t^k*f_1(t) + t^(2k)*f_1(t)*f_2(t)

so

t^k*f_2(t) + 2*t^k*f_1(t) + t^(2k)*f_1(t)*f_2(t) = 0

so

f_2(t) + 2*f_1(t) + t^k*f_1(t)*f_2(t) = 0

which tells you that f_1(t) must give unit results.
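The product relation can be checked the same way; again a numeric sketch with t = 2 and k = 1 (the values the post uses below), with f_1 and f_2 read off the actual roots:

```python
import math

t, k = 2, 1
disc = (3 + t) ** 2 - 8
r1 = ((3 + t) - math.sqrt(disc)) / 2
r2 = ((3 + t) + math.sqrt(disc)) / 2
f1 = (r1 - 1) / t ** k
f2 = (r2 - 2) / t ** k
# The relation f_2 + 2*f_1 + t^k * f_1 * f_2 = 0 should hold
# up to floating-point error.
residual = f2 + 2 * f1 + t ** k * f1 * f2
print(residual)                 # approximately 0
```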

Now it's trivial to show the coverage problem by using t = 2, as then I have

x^2 - 5x + 2 = 0

and the roots are

r_1 = 1 + 2^k*f_1(2)

r_2 = 2 + 2^k*f_2(2)

Since r_1 * r_2 = 2, the roots cannot both have 2 as a factor, so one root is coprime to 2, which forces the other to have 2 as a factor, meaning k = 1.

But let x = 2y, and I have

4y^2 - 10y + 2 = 0

and dividing both sides by 2 gives

2y^2 - 5y + 1 = 0

but that is a non-monic polynomial with integer coefficients that is irreducible over Q, that is, it has no rational roots, so it cannot have algebraic integer roots, proving that neither root of x^2 - 5x + 2 = 0 can have 2 as a factor!
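The no-rational-roots claim is easy to verify: by the rational root theorem, any rational root p/q of 2y^2 - 5y + 1 must have p dividing 1 and q dividing 2, leaving only four candidates. A quick sketch:

```python
from fractions import Fraction

def poly(y):
    """Evaluate 2y^2 - 5y + 1 exactly."""
    return 2 * y * y - 5 * y + 1

# Rational root theorem candidates: numerator divides 1, denominator divides 2.
candidates = [Fraction(1), Fraction(-1), Fraction(1, 2), Fraction(-1, 2)]
print([(c, poly(c)) for c in candidates])
print(any(poly(c) == 0 for c in candidates))   # False: no rational roots
```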

The seeming contradiction is resolved by realizing that the ring of algebraic integers excludes some numbers, which isn't so remarkable if you consider the evens as a ring, and note that 2 is coprime to 6 in that ring because 3 is not even.
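The evens analogy can be made concrete with a brute-force search (a sketch; `even_factorizations` is my own helper name): in the even integers, 6 cannot be written as a product of two evens at all, so 2 is not a factor of 6 there, even though it is in the full integers.

```python
def even_factorizations(n, bound=100):
    """All ways to write n as a product of two even integers,
    with the first factor searched over [-bound, bound]."""
    return [(a, n // a) for a in range(-bound, bound + 1)
            if a != 0 and a % 2 == 0 and n % a == 0 and (n // a) % 2 == 0]

print(even_factorizations(6))    # [] -- 2 is not a factor of 6 in evens
print(even_factorizations(12))   # includes (2, 6): 2 is a factor of 12
```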

But mathematicians have used the ring of algebraic integers for over a hundred years without noticing this problem, so there are mathematical arguments believed to be proofs which are not.

Continuing the evens analogy, imagine someone had what they thought was a proof that relied on 2 being coprime to 6 in evens. But it's not a proof, because 3, while not even, is a factor of 6; they just don't know it somehow.

That's kind of what happened over the last hundred plus years in number theory.

It's easy to prove the problem, but quite a few people around the world have careers built upon it. It's as if many people had arguments depending on 2 being coprime to 6 in evens, and someone finally pointed out that 3, while not even, is a factor of 6.

The sad thing is that modern mathematicians have chosen to ignore the error and keep teaching the wrong mathematics, which they are doing to this day!!!

Like, college students around the world, even at premier schools like Harvard or Yale, will tomorrow get homework on or be tested on bogus mathematics that doesn't work because of this weird error, when the information is out there, but mathematicians are avoiding the truth, maybe because it's easier!




