Saturday, December 09, 2006


JSH: A short explanation of why the error is so big

I want to give a short post so that you can quickly understand what all the latest arguing is about. It looks like covering everything will take a bit long, but you need enough information to understand why this seemingly minor result could have such a huge impact, such that, for instance, it can tell you immediately that Andrew Wiles did not prove Fermat's Last Theorem, which then explains my concern for him and for other leading mathematicians seriously impacted by the error.

I start with a factorization:

175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)

multiply by 7

7*(175x^2 - 15x + 2) = 7*(f(x) + 2)*(g(x) + 1)

re-order the left side in a special way and pick one way to multiply through by 7 on the right:

(49x^2 - 14x)5^2 + (7x-1)(7)(5) + 7^2 = (f(x) + 2)*(7g(x) + 7)
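
If you want to check that re-ordering mechanically, here is a quick Python sketch using sympy; this is just an illustration I'm adding for readers who like to verify with a computer, not part of the argument itself:

from sympy import symbols, expand

x = symbols('x')

original = 7*(175*x**2 - 15*x + 2)
reordered = (49*x**2 - 14*x)*5**2 + (7*x - 1)*7*5 + 7**2

# Both sides expand to 1225*x**2 - 105*x + 14, so the difference is zero.
assert expand(original - reordered) == 0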

Next I introduce new functions that I call the a's, where

f(x) = 5a_1(x) + 5

and

7g(x) = 5a_2(x)

substitute to get

(49x^2 - 14x)5^2 + (7x-1)(7)(5) + 49 = (5a_1(x) + 7)*(5a_2(x) + 7)

allowing me to find a solution for the a's, where they are roots of

a^2 - (7x-1)a + (49x^2 - 14x) = 0.
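
Again you can let a computer do the algebra; here is a sympy sketch, purely illustrative, confirming that the roots of that quadratic really do satisfy the factorization above:

from sympy import symbols, solve, expand

x, a = symbols('x a')

# Solve the quadratic whose roots are the a's.
a1, a2 = solve(a**2 - (7*x - 1)*a + (49*x**2 - 14*x), a)

# Their product (5a_1 + 7)*(5a_2 + 7) reproduces the left side above.
left = (49*x**2 - 14*x)*5**2 + (7*x - 1)*7*5 + 49
assert expand((5*a1 + 7)*(5*a2 + 7) - left) == 0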

Well, you can clearly see

7g(x) = 5a_2(x)

so it seems reasonable to think that 7 is a factor of a_2(x), but if you stick in some numbers, like x=1, you find that NEITHER of the a's can have 7 as a factor in the ring of algebraic integers.

The ring of algebraic integers is defined to be made up of numbers that are roots of monic polynomials with integer coefficients, and it will not allow just one of the a's to have 7 as a factor.
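
Given that definition, you can check the x=1 case directly; here is another illustrative sympy sketch of mine. At x=1 the quadratic is a^2 - 6a + 35 = 0, and if a root had 7 as a factor among algebraic integers, dividing it by 7 would have to give a root of a monic polynomial with integer coefficients:

from sympy import symbols, solve, minimal_polynomial

a, t = symbols('a t')

# At x = 1 the quadratic is a^2 - 6a + 35 = 0.
for r in solve(a**2 - 6*a + 35, a):
    # The minimal polynomial of r/7 over the integers comes out as
    # 7*t**2 - 6*t + 5, which is not monic, so neither root divided
    # by 7 is an algebraic integer.
    print(minimal_polynomial(r/7, t))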

I say that is a flaw in the ring, and that may seem odd, but think about odds and evens: if you only take evens, then 2 is not a factor of 6, right? It is not, because 3 is not even, so if you have a rule to only take evens, then you cannot divide 2 out of 6, because that would give you an odd number.
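
Here is a toy Python illustration of that evens analogy, again something I'm adding just to make the point concrete:

def is_even(n):
    return n % 2 == 0

def factor_within_evens(n, d):
    # d counts as a factor of n *within the evens* only if the
    # cofactor n // d is itself even.
    return n % d == 0 and is_even(n // d)

print(factor_within_evens(6, 2))   # False: 6/2 = 3 is odd, so excluded
print(factor_within_evens(12, 2))  # True:  12/2 = 6 is even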

Similarly, the ring of algebraic integers excludes some numbers, the way the odds are excluded when you keep only evens, so you can get weird results. But a lot of modern mathematics was built on the idea that this result means neither of the a's actually has 7 as a factor, which means that with

7g(x) = 5a_2(x)

some factors of 7 would be divided off by g(x). For instance, if w_1(x)*w_2(x) = 7 and you have

g(x) = c(x)/w_1(x)

then

7c(x)/w_1(x) = 5a_2(x)

would explain how a_2(x) does not have 7 as a factor in the ring of algebraic integers.

But now g(x) is like a fraction, with g(x) = c(x)/w_1(x). Going back to

175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)

that would mean g(x) is different from f(x), because

f(x) = 5a_1(x) + 5

so f(x) cannot be like a fraction, which means anyone arguing with this approach now needs f(x) and g(x) to be distinct in this special way.

But I noticed that you can just move a factor of 2 around with

175x^2 - 15x + 2 = (f(x)/2 + 1)*(2*g(x) + 2)

and re-do the same analysis to get the same answer, which would now force f(x)/2 to be like a fraction with some factor of 7 in the denominator, so don't let that 2 in the denominator throw you off.
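
That the 2 can be moved around without changing anything is itself easy to confirm mechanically; one more illustrative sympy check:

from sympy import symbols, expand

f, g = symbols('f g')

# (f/2 + 1)*(2*g + 2) and (f + 2)*(g + 1) are the same product.
assert expand((f/2 + 1)*(2*g + 2) - (f + 2)*(g + 1)) == 0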

Now that should end all debate, as it's clear that the 2 can be moved around as described, but you can look in recent threads and see posters still arguing about it, and me talking about protecting leading mathematicians from the impact of the result.

So how could something this trivial-looking be so big?

Well, over a hundred years ago, when algebraic integers were discovered, mathematicians thought that what was true with algebraic integers was true in general. So to them 7 is NOT a factor of a_2(x), and somehow the 7 is getting split up, and they built mathematical ideas they thought were proofs on that flawed belief.

It may not seem like a big deal, but that was over a hundred years ago.

Math people built on what came before, so the error has had over a hundred years to snowball. Now it shadows much of number theory, and it is more than big enough to take away the techniques used by Andrew Wiles in his attempt to prove Taniyama-Shimura.

Math is a special discipline in that even seemingly small mistakes can take everything away.

So the arguing is about denying the basic math you see in this post, in order to deny the huge impact on a lot of famous mathematical works, and on famous mathematical people.

It would be unfair to just let this sock people like Wiles at random. At first the impact could be very huge, though time would help to heal the wounds, so leading mathematicians need to be around people who care about them and understand what is going on in the early phases, so that they get through the hardest period ok.

Understand now?

I know it's a lot, but in the modern age, with so many incredible things happening, I'd hope that some of you can just follow the math, realize the implications, and understand that all of this is for real.

[A reply to someone who asked James to provide a reference for this claim: “what is true with algebraic integers is true in general”.]

Ok, maybe I'm wrong. I've been told this by posters on math newsgroups, so let's check.

For the sake of argument, suppose that whether or not an algebraic integer has some number as a factor in the ring of algebraic integers means nothing important mathematically.

Just as it doesn't matter that 2 is not a factor of 6 among the evens, because we know it is a factor of 6, since 6 = 2*3. So let's say now, for the sake of argument, that the ring of algebraic integers, and results in that ring, don't matter.

Would that impact any arguments accepted as proofs in the mathematical field?




