### Sunday, August 24, 2008

## JSH: Understanding the distributive property result

I made a very simple post with a lot of detail explaining a rather basic result that follows from the distributive property, because for YEARS posters on the sci.math newsgroup have disputed it, and it's worth explaining in enough detail for people to see how easy the mathematics is.

Essentially the key result is that the distributive property doesn't care about the values of the functions involved, so

a*(f(x) + b) = a*f(x) + a*b
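The identity is easy to check mechanically. Below is a minimal sketch (the particular f, a, and b are my own arbitrary choices) confirming that a*(f(x) + b) = a*f(x) + a*b holds regardless of what value f takes:

```python
from fractions import Fraction

# A hypothetical function f; any function would do, since the
# distributive property does not depend on f's values.
def f(x):
    return x**3 - 2*x + Fraction(1, 2)

a, b = 7, 5

# Check a*(f(x) + b) == a*f(x) + a*b exactly at several values of x.
for x in [Fraction(-3), Fraction(0), Fraction(1, 4), Fraction(10)]:
    lhs = a * (f(x) + b)
    rhs = a * f(x) + a * b
    assert lhs == rhs, (x, lhs, rhs)

print("distributive identity holds at all sampled x")
```

Exact rational arithmetic (`Fraction`) is used so the comparison is a true equality check, not a floating-point approximation.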

but I have shown cases where, with expressions of the general form

c*P(x)=(f_1(x) + c)*(f_2(x) + c)

in the ring of algebraic integers, where P(x) is a polynomial with integer coefficients and 'c' is a non-zero, non-unit integer, the factors of c in common with f_1(x) + c or f_2(x) + c at one value of x are NOT factors at some other value. So it is as if the distributive property would have to be choosy, and NOT true in general that

a*(f(x) + b) = a*f(x) + a*b.
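For contrast, here is a simple polynomial illustration of the general form (the specific c, f_1, and f_2 are my own choices, not the disputed non-polynomial case): with c = 7, f_1(x) = 7x, and f_2(x) = x, we get (7x + 7)*(x + 7) = 7*(x + 1)*(x + 7) = c*P(x), and the factor of 7 stays with f_1(x) + 7 at EVERY integer x, which is the uniform behavior being discussed:

```python
c = 7

def f1(x): return 7 * x          # f_1(x) = 7x
def f2(x): return x              # f_2(x) = x
def P(x): return (x + 1) * (x + 7)  # P(x) = (x+1)(x+7)

for x in range(-20, 21):
    # The defining relation c*P(x) = (f_1(x) + c)*(f_2(x) + c)
    assert c * P(x) == (f1(x) + c) * (f2(x) + c)
    # The factor of c divides the SAME factor, f_1(x) + c, at every x,
    # since f_1(x) + 7 = 7x + 7 = 7*(x + 1).
    assert (f1(x) + c) % c == 0

print("c*P(x) relation holds and c divides f1(x)+c at every sampled x")
```

In this polynomial case the distribution of c across the factors is fixed once and does not change with x.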

For years, posters arguing with me against the distributive property have simply claimed that factors of 'c' bounce around depending on the value of x, which I call the tail wagging the dog, as if

a*(f(x) + b) = a*f(x) + a*b

is true sometimes depending on the value of f(x) but not true at other times.

That is also the objection that Professor Ralph McKenzie, of Vanderbilt University and formerly of the University of California at Berkeley, shot down immediately when I discussed a non-polynomial factorization argument with him at Vanderbilt, on a visit to my alma mater a few years back.

So for years I've had any number of reasons to know that absolute mathematical proof shows I'm correct, and that disputes about my research are really disputes about the distributive property, but posters would simply deny that and keep at it.

And my papers were being rejected by journals across the board, where SWJPAM was the only journal to at least briefly do the right thing and publish, before posters from the sci.math newsgroup conspired and went after the paper in an email campaign.

So there is easy mathematical proof of a "core" error, and I've contacted notables like Barry Mazur and Ralph McKenzie. I spoke with McKenzie in person and wrote things out on his chalkboard at his request, and he immediately shot down the objection that some sci.math'ers have used for years. So why hasn't anything happened?

Well, the result is so crushing because it removes the ring of algebraic integers completely from usefulness in number-theoretic analysis: you can create an apparent contradiction using that ring, where you appear to prove one thing one way but can definitively prove the opposite another way, which is what I did with my paper.

Considering the across-the-board rejection of my research, from that sweeping result, to my prime number research, to my factoring research, it appears to me that certain people in the number theory community may have decided to let the human race go hang, block progress in number theory indefinitely, and just hold on for as long as they can to prestige, and to paychecks that they now know they are not earning.

And, possibly most horrendously, to teach young minds flawed mathematical ideas, as THAT is the best way to guarantee they keep getting away with it. Maybe they're just cynical about human progress, if that is true, as what they are doing is stopping progress in number theory so that it is all "pure math": it will NEVER be of value, as it's all wrong, unless someday, some way, the truth comes out, possibly by computers checking?

But you see, calling it "pure" and saying that it's not supposed to have practical application removes the threat of discovery until computers develop to the point of checking these arguments, and I suspect they are blocking computer science in that area as well. So maybe they figure that if they can hold off human progress in number theory for 50 years or more, they'll be long dead before anyone realizes they are frauds.

So they will have had the rewards in their lifetimes and avoided the painful reality.

But that is a speculative theory as I try to figure out how people could do these things, especially to so many young people around the world, who trust them.
