Monday, October 20, 2003

 

Simple principle in core error proof

What makes my situation especially frustrating is how simple the argument is that proves there's this problem with algebraic integers, yet I still have to keep explaining it.

The basic principle is like with

P(m) = 2(m^2 + 2m + 1) = (a_1 m + 2)(a_2 m + 1)

where most of you can probably guess what factor a_1 must have!!!

Now the polynomial I actually use is more complicated, so that I need to set m=0 to figure things out, but notice what happens here:

P(0) = 2, and dividing off 2 from P(m) gives

P(m)/2 = m^2 + 2m + 1

and notice that P(0)/2 = 1, which tells you that the independent term changed.

Given

(a_1 m + 2)(a_2 m + 1)

it's clear that a_1 has a factor of 2: expanding gives a_1 a_2 m^2 + (a_1 + 2 a_2)m + 2, and matching that against 2m^2 + 4m + 2 forces a_1 = 2 and a_2 = 1.
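The factorization above is easy to check mechanically. Here's a minimal sketch in Python, assuming the integer values a_1 = 2 and a_2 = 1 that matching coefficients forces (the names P, factored, a1, a2 are just illustrative, not from the original):

```python
# Check that with a1 = 2 and a2 = 1, the product (a1*m + 2)(a2*m + 1)
# reproduces P(m) = 2(m^2 + 2m + 1) for a range of integer m.

def P(m):
    return 2 * (m**2 + 2*m + 1)

def factored(m, a1=2, a2=1):
    return (a1*m + 2) * (a2*m + 1)

# Spot-check at several values, including m = 0 where P(0) = 2.
for m in range(-5, 6):
    assert P(m) == factored(m)

print(P(0))      # the constant term of P is 2
print(P(0) // 2) # dividing off 2 leaves constant term 1
```

This only spot-checks values of m, but since both sides are degree-2 polynomials, agreement at three or more points already means they're identical.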

It's a simple idea that I use with a more complicated polynomial, and I think the reason so many math people go out of their way to make it seem wrong is that they're embarrassed by the over-hundred-year-old error that I found.

I've been surprised at how dedicated they can be at trying to hide the truth.

After all, I've communicated with top mathematicians like Barry Mazur, Andrew Granville, and Ralph McKenzie — names many of you may not have heard, but in certain math circles they're well-known.

In McKenzie's case I explained it all to him in person, and he basically blew me off.

These mathematicians are hellbent on trying to hide that their discipline could actually have a flaw like this for as long as they can get away with it.

It's wrong, and it can't help world society.

Meanwhile it's not doing me a bit of good either, and I think that part of their motivation is making me miserable, knowing that what I found *should* get me accolades.

Yup, call me crazy, but I think the bastards are out to get me!!!




