### Wednesday, December 10, 2008

## JSH: Why algebraic integers do not really work

Before I came up with my special construction, any one of you, if presented with the following, would call it easily:

7*P(x) = (f_1(x) + 7)(f_2(x) + 2), where f_1(0) = f_2(0) = 0

I'm sure you'd also call it trivial that f_1(x) is 7 times some unknown function, that is

f_1(x) = 7*g_1(x)

so you must have

7*P(x) = (7g_1(x) + 7)(f_2(x) + 2), where g_1(0) = f_1(0) = f_2(0) = 0

by the distributive property, and then you would probably angrily ask why you were being bothered with a triviality, without even needing to know exactly what P(x) is.
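To see the point concretely, here is a minimal sketch with hypothetical choices g_1(x) = x and f_2(x) = 3x (my own illustrative picks, chosen only so that g_1(0) = f_2(0) = 0): the 7 on the left factors out exactly as the distributive property demands.

```python
# Hypothetical example: g_1(x) = x and f_2(x) = 3x are illustrative
# picks (not from the construction), both equal to 0 at x = 0.
def lhs_over_7(x):
    g1, f2 = x, 3 * x
    val = (7 * g1 + 7) * (f2 + 2)
    assert val % 7 == 0          # the 7 factors out cleanly
    return val // 7              # P(x) = (g_1(x) + 1)(f_2(x) + 2)

# P(0) = 2, P(1) = 10, P(2) = 24, P(3) = 44
assert [lhs_over_7(x) for x in range(4)] == [2, 10, 24, 44]
```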

But I came up with a special construction and you're asked to throw all that out the window to believe that suddenly the 7 behaves bizarrely:

7*(175x^2 - 15x + 2) = (5a_1(x) + 7)(5a_2(x)+ 7)

where the a's are roots of

a^2 - (7x-1)a + (49x^2 - 14x) = 0.
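You can check this identity directly. By Vieta's formulas, a_1 + a_2 = 7x - 1 and a_1*a_2 = 49x^2 - 14x, so (5a_1 + 7)(5a_2 + 7) = 25(49x^2 - 14x) + 35(7x - 1) + 49 = 1225x^2 - 105x + 14 = 7*(175x^2 - 15x + 2). A quick numerical sketch of the same check (assuming nothing beyond the quadratic itself):

```python
import cmath

# Verify 7*(175x^2 - 15x + 2) = (5a_1 + 7)(5a_2 + 7), where a_1, a_2
# are the roots of a^2 - (7x-1)a + (49x^2 - 14x) = 0.
def check(x):
    b = -(7 * x - 1)                 # quadratic is a^2 + b*a + c = 0
    c = 49 * x * x - 14 * x
    d = cmath.sqrt(b * b - 4 * c)    # discriminant may be negative
    a1, a2 = (-b + d) / 2, (-b - d) / 2
    lhs = 7 * (175 * x * x - 15 * x + 2)
    rhs = (5 * a1 + 7) * (5 * a2 + 7)
    return abs(lhs - rhs) < 1e-6

assert all(check(x) for x in range(-5, 6))
```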

And all because NOW, if you think it's simple, you have to believe that only one of the roots of

a^2 - (7x-1)a + (49x^2 - 14x) = 0

can have 7 as a factor, and mathematicians will tell you that you're wrong, that NEITHER of the roots can have 7 as a factor, and you will be overwhelmed by the authority figures who are the world's presumed experts in this area. But why do they make that claim?

Turns out they are relying on mathematics associated with what they call the ring of algebraic integers, and you can prove that neither of the roots has 7 as a factor in that ring.

But watch an interesting example with integers where I'll show you how their arguments work, except here you'll be able to see what is happening directly:

x^2 + 3x + 2 = 0

but let's say only ONE of the roots has 2 as a factor (of course only one does), and substitute with x=2y, then

4y^2 + 6y + 2 = 0, so 2y^2 + 3y + 1 = 0, and aha! You have a non-monic polynomial!

Of course the roots in y are now -1 and -1/2, because your substitution divided BOTH of the original roots, -2 and -1, by 2.
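The sketch below just applies the quadratic formula to both polynomials, confirming that the substitution divides both roots by 2:

```python
import math

# Roots of a*t^2 + b*t + c = 0 via the quadratic formula (real case).
def roots_quadratic(a, b, c):
    d = math.sqrt(b * b - 4 * a * c)
    return sorted([(-b + d) / (2 * a), (-b - d) / (2 * a)])

# x^2 + 3x + 2 = 0 has roots -2 and -1 ...
assert roots_quadratic(1, 3, 2) == [-2.0, -1.0]
# ... and after x = 2y, the non-monic 2y^2 + 3y + 1 = 0 has both halved.
assert roots_quadratic(2, 3, 1) == [-1.0, -0.5]
```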

Well, the ring of algebraic integers has a very special feature: NONE of its members can be a root of a non-monic polynomial with integer coefficients that is irreducible over the rationals. (That is VERY important; consider it carefully, as everything revolves around this point!)

It can be proven that this feature results from the definition of algebraic integers as roots of monic polynomials with integer coefficients.

You could say the ring of algebraic integers has a monic prejudice: it prefers polynomials with a leading coefficient of 1 (or -1, which amounts to the same thing), which is what monic means.

And THAT is the basis for why mathematicians will tell you in general you cannot say that only one of the roots of

a^2 - (7x-1)a + (49x^2 - 14x) = 0

has 7 as a factor. At x=0 you can see that's not true, as one root is 0, and 0 has everything as a factor. But at x=1 you end up with a = 3 +/- sqrt(-26), and you cannot resolve that square root, and you can prove that in the ring of algebraic integers neither root can have 7 as a factor, because if one did, then an algebraic integer would be a root of a non-monic polynomial with integer coefficients irreducible over the rationals!
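Those two evaluations are easy to spot-check (a sketch; cmath handles the negative discriminant at x = 1):

```python
import cmath

# Roots of a^2 - (7x-1)a + (49x^2 - 14x) = 0 for a given x.
def roots_at(x):
    b, c = -(7 * x - 1), 49 * x * x - 14 * x
    d = cmath.sqrt(b * b - 4 * c)
    return (-b + d) / 2, (-b - d) / 2

# At x = 0 the roots are 0 and -1: one root trivially has 7 as a factor.
assert {round(r.real) for r in roots_at(0)} == {0, -1}
# At x = 1 the roots are 3 +/- sqrt(-26).
r1, _ = roots_at(1)
assert abs(r1 - (3 + cmath.sqrt(-26))) < 1e-9
```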

It's circular. The definition creates the requirement that reinforces the definition.

And it's wrong: one of the roots DOES have 7 as a factor, trivially. So the ring of algebraic integers is wrong. But the ring of algebraic integers was invented in the late 1800s.

Get it then? Incredibly, the mistake in thinking occurred over a hundred years ago!

So there is a lot of inertia behind the wrong result, and what you thought of as obvious before my research, as I demonstrated at the start (and which is correct), is challenged by the people the world has designated as experts, because their field has over a hundred years of practitioners believing something that turns out to be wrong.

While **what** they believe is based on something circular: a definition creates a rule, which reinforces the definition that spawned it.

Now if mathematicians could be convinced to just follow mathematical proof, then they'd realize their definition creates problems, but instead they've chosen to mostly ignore me, while you can see some math people arguing with me, continually dodging the simple reality that with something like

7*P(x) = (f_1(x) + 7)(f_2(x) + 2), where f_1(0) = f_2(0) = 0

the thing being multiplied by 7 cannot tell the 7 where to go. The tail does not wag the dog!!!

Worse, once you know the problem you can figure out that a lot of what number theorists do must be wrong, and enough of it is wrong that it challenges the worth of Ph.D's themselves. As in: if a professor's research over decades is completely invalidated by a quirky math error, did he really do anything to deserve the Ph.D? Or the social standing based on that research?

Thorny question.

Anyone willing to answer?

If a quirky math error invalidates the entire thesis on which a professor **got** his Ph.D, and it turns out that all of his research over his career is now known to be wrong, is he still really a professor? Does he still have a Ph.D?

I'd guess that enough math people fear that question that the impasse continues and they challenge very basic algebra to hold on to a world where those questions aren't being seriously asked, at least, not by anyone not maligned globally as a "crackpot".
