Monday, September 01, 2008


JSH: Explaining the huge math error

For years now I've tried to raise the alarm about a huge problem in an esoteric branch of number theory, one that may now amount to a huge case of academic fraud. The problem for me is that the mathematical community itself is so impacted by the size of the error that I haven't found a way to get mathematical proof of it accepted.

Luckily the error is easy to explain, and it crucially depends on the distributive property. My hope is that if I can convince mathematically experienced members of the physics community that the error exists, they can help with the daunting task of handling the non-response from the math community and the issue of possible widespread academic fraud.

It's simple to explain the error.

Trivially, if I have a polynomial factorization like x^2 + 3x + 2 = (x+1)*(x+2), I can multiply both sides by 7, like

7*(x^2 + 3x + 2) = (7x+7)*(x+2)

and divide that 7 off just as easily, without a trace, and I could do the same multiplying by anything.
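As a quick sanity check, here is a minimal Python sketch of my own (not part of the original argument) verifying for integer x that the 7 multiplies in and divides off cleanly:

```python
# Minimal sketch: multiply the factorization by 7, then divide it back off.

def p(x):
    return x**2 + 3*x + 2

for x in range(-10, 11):
    lhs = 7 * p(x)
    rhs = (7*x + 7) * (x + 2)   # the 7 absorbed into the first factor
    assert lhs == rhs           # 7*(x^2 + 3x + 2) = (7x+7)*(x+2)
    assert rhs // 7 == p(x)     # and the 7 divides off without a trace
print("check passed")
```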

But over six years ago I came up with a technique where you have a polynomial multiplied by a constant, but factored using non-linear functions:

P(x) = 175x^2 - 15x + 2

and

7*P(x) = (5a_1(x) + 7)*(5a_2(x) + 7)

where the a's are roots of

a^2 - (7x-1)a + (49x^2 - 14x) = 0.

So now you have something more complicated, where I've enmeshed the quadratic P(x) with a quadratic generator. And it turns out that in areas where mathematicians routinely operate to try to prove things, you cannot divide the 7 off when, with an integer x,

a^2 - (7x-1)a + (49x^2 - 14x) = 0

has non-rational roots.
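The factorization itself can be checked numerically. The following is a sketch of mine (my own illustration, using nothing beyond the quadratic formula): it computes the roots a_1, a_2 and confirms that (5a_1 + 7)*(5a_2 + 7) really equals 7*P(x):

```python
import cmath

def P(x):
    return 175*x**2 - 15*x + 2

def a_roots(x):
    # roots of a^2 - (7x - 1)a + (49x^2 - 14x) = 0 by the quadratic formula
    b = -(7*x - 1)
    c = 49*x**2 - 14*x
    disc = cmath.sqrt(b*b - 4*c)
    return (-b + disc) / 2, (-b - disc) / 2

for x in range(1, 11):
    a1, a2 = a_roots(x)
    # the factorization 7*P(x) = (5a_1 + 7)*(5a_2 + 7) holds to float precision
    assert abs((5*a1 + 7) * (5*a2 + 7) - 7 * P(x)) < 1e-6
print("factorization checks out numerically")
```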

For instance, with x=1, the a's are the roots of

a^2 - 6a + 35 = 0

and if you use the quadratic formula you get a = (6 +/- sqrt(-104))/2. If 7 divides just one of those roots, there is a problem in SEEING it, because sqrt(-104) is imaginary, and in fact you can rigorously prove that in something mathematicians call the ring of algebraic integers it is IMPOSSIBLE for either of the roots to have 7 as a factor.
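You can look at that case numerically with a small sketch of mine (again just the quadratic formula, nothing from the original post): the roots come out as 3 +/- sqrt(-26), and while their product is 35 = 5*7, neither root shows any visible factor of 7:

```python
import cmath

# x = 1: the a's are the roots of a^2 - 6a + 35 = 0
disc = cmath.sqrt(6*6 - 4*35)          # sqrt(-104)
a1 = (6 + disc) / 2                    # 3 + sqrt(-26)
a2 = (6 - disc) / 2                    # 3 - sqrt(-26)

assert abs(a1 + a2 - 6) < 1e-9         # sum of roots is 6
assert abs(a1 * a2 - 35) < 1e-9        # product of roots is 35 = 5 * 7
print(a1, a2)                          # complex conjugates; no factor of 7 in sight
```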

So I can prove that one of the roots has 7 as a factor by the distributive property AND prove that it cannot have 7 as a factor in the ring of algebraic integers. So there is an apparent contradiction.

But the ring of algebraic integers is the ring mathematicians have used for over a hundred years for arguments they think are proofs in number theory, and it just betrayed a huge problem. Think back to

7*(x^2 + 3x + 2) = (7x+7)*(x+2).

I can use functions f_1(x) = 7x and f_2(x) = x - 5, and have

7*(x^2 + 3x + 2) = (f_1(x) + 7)*(f_2(x) + 7)

and that doesn't change how things work. So what has changed with

7*P(x) = (5a_1(x) + 7)*(5a_2(x) + 7)

where the a's are roots of

a^2 - (7x-1)a + (49x^2 - 14x) = 0?

The TYPE of function has changed. I've gone from linear functions to non-linear ones that are the roots of a quadratic generator.
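The linear case behaves just as described. Here is a small sketch of mine confirming that with f_1(x) = 7x and f_2(x) = x - 5 the factorization still holds and the 7 is still plainly there to divide off:

```python
def f1(x):
    return 7*x

def f2(x):
    return x - 5

for x in range(-10, 11):
    lhs = 7 * (x**2 + 3*x + 2)
    rhs = (f1(x) + 7) * (f2(x) + 7)   # = (7x + 7)*(x + 2)
    assert lhs == rhs                 # the factorization holds
    assert rhs % 7 == 0               # the 7 divides off cleanly
print("linear-function version checks out")
```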

But if

a*(f(x) + b) = a*f(x) + a*b

by the distributive property, how can the TYPE of function change the behavior?


The distributive property does not say, yeah, but only if f(x) is linear, now does it?

Sure, maybe if one of the functions has 1/7 in it the value can change, but the distributive property remains the same. And the ring of algebraic integers cannot have 1/7 in it. It does not.

So something weirder is happening.

It turns out that, no matter how trivial all this sounds, the problem is huge enough to invalidate mathematical arguments thought to be proofs across the entire 20th century and up to now, since the ring of algebraic integers became widely used by mathematicians in the late 1800s.

I have gone to the mathematical community with this problem, and even got a paper bringing attention to it published in the now-dead math journal Southwest Journal of Pure and Applied Mathematics, or SWJPAM for short.

Members of the sci.math newsgroup mounted an email campaign against the paper.

The journal pulled it after publication.

It managed one more edition and then quietly shut down.



If the simple mathematical argument I gave above is correct, then it implies that ALL number theorists today who use the ring of algebraic integers may have flawed results, because that ring has a completeness problem. So the very people who are tasked with accepting this error are the ones who can be completely invalidated by it.

It may remove some of the arguments considered great works over a period of a hundred years, and render useless a tremendous amount of what has passed for mathematical knowledge.

At this point in time, rather than acknowledge the error, mathematicians are continuing to TEACH the flawed mathematical ideas, and have shut all doors, like publication.

The Catch-22 here is that the professors who would be tasked with acknowledging this error are the very people who would face tremendous invalidation from it, so much so, in fact, that many of them may have no real mathematical accomplishments at all, from their doctoral theses to their latest research.

Admitting the truth would remove the very basis for their positions.

So then, what is the solution here? How do you handle an error of this magnitude?
