The Almighty Buck Science

Mathematician Claims Proof of Riemann Hypothesis

TheSync points to this press release about a Purdue University mathematician, Louis de Branges de Bourcia, who claims to have "proven the Riemann hypothesis, considered to be the greatest unsolved problem in mathematics. It states that all non-trivial zeros of the zeta function lie on the line 1/2 + it as t ranges over the real numbers. You can read his proof here. The Clay Mathematics Institute offers a $1 million prize to the first prover."
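As a rough illustration of the claim being discussed (a toy sketch entirely unrelated to de Branges's manuscript; the series choice and term count are my own assumptions), the zeta function can be evaluated numerically near the first nontrivial zero at s ≈ 1/2 + 14.1347i using only the standard library:

```python
import math

def zeta(s, terms=100000):
    """Approximate zeta(s) for Re(s) > 0, s != 1, via the alternating
    Dirichlet eta series eta(s) = sum_{n>=1} (-1)^(n+1) / n^s and the
    identity zeta(s) = eta(s) / (1 - 2^(1-s)). Adding half of the next
    term damps the oscillating tail and accelerates convergence."""
    eta = 0j
    sign = 1.0
    for n in range(1, terms + 1):
        eta += sign * n ** (-s)
        sign = -sign
    eta += sign * (terms + 1) ** (-s) / 2  # tail-averaging correction
    return eta / (1 - 2 ** (1 - s))

print(abs(zeta(2) - math.pi ** 2 / 6))  # sanity check: zeta(2) = pi^2/6
print(abs(zeta(0.5 + 14.134725j)))      # nearly zero: first zero on the critical line
```

Evaluating at points just off the critical line gives visibly larger magnitudes, which is exactly the pattern the hypothesis says holds for every nontrivial zero.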
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Re:Apology (Score:4, Insightful)

    by badboy_tw2002 ( 524611 ) on Wednesday June 09, 2004 @07:10PM (#9382614)
    Uh, the above comment was a joke people. The quote in the parent post does NOT appear in the document. Apology in this case means a defense of the proof.
  • Re:Proof of theory (Score:5, Insightful)

    by k98sven ( 324383 ) on Wednesday June 09, 2004 @07:13PM (#9382639) Journal
    Interesting that the only time a proof of concept is ever challenged is when money is involved.

    Bull. There are thousands of mathematical researchers. Most don't have hefty salaries, and most aren't working on money-prize problems.

    Mathematicians are never in it for the money.

    Wonder what he'll do with the money?

    Seems like he wants to restore the old family castle:

    The ruin of the château de Bourcia overlooks a fertile valley surrounded by wooded hills. The site is ideal for a mathematical research institute. The restoration of the château for that purpose would be an appropriate use of the million dollars offered for a proof of the Riemann hypothesis.


    I must say that he seems a bit full of himself, or at least is getting a bit ahead of himself, given how many have tried and failed with this problem.
  • by TorKlingberg ( 599697 ) on Wednesday June 09, 2004 @07:19PM (#9382682)

    Will the media keep publishing claims of extraordinary mathematical findings without checking the facts forever?

    Just like this one over again:
    Swedish Student Partly Solves 16th Hilbert Problem [slashdot.org]

    That's what I like about /. If the article is wrong, the comments are always there to correct it.

  • Re:Proof of theory (Score:3, Insightful)

    by the_2nd_coming ( 444906 ) on Wednesday June 09, 2004 @07:20PM (#9382698) Homepage
    huh?

    Mathematicians have been working on this for a long time. It is not like one day this guy woke up and said "oh, 1 million dollars for it, eh? Well, I better get to work."
  • by stigin ( 729188 ) on Wednesday June 09, 2004 @07:21PM (#9382706)
    Uhm, because none of the media have the experts to check if claims like this are true. And there is no harm in publishing that a claim has been made.
  • Re:Good job (Score:5, Insightful)

    by nametaken ( 610866 ) on Wednesday June 09, 2004 @07:22PM (#9382707)
    You're probably right. But society does recognize a one million dollar prize. This one may actually get TV time. Funny how that works.
  • by pclminion ( 145572 ) on Wednesday June 09, 2004 @07:28PM (#9382751)
    So if a guy fails you should never listen to him again?

    It took Einstein many tries to arrive at the correct formulation for general relativity. I guess according to you, he should have just given up after his first failure?

  • by Lane.exe ( 672783 ) on Wednesday June 09, 2004 @07:35PM (#9382810) Homepage
    Not really. It means he's a prolific member of the community who is not afraid to take risks with his work. Consider an experimental scientist: an experiment that returns negative results, or one that fails outright, still produces important data. Similarly, this is like "experimental mathematics." If he fails, then we'll know why he fails, how far he got doing things right, and other things which can point us to the correct proof.

  • by Tsiangkun ( 746511 ) on Wednesday June 09, 2004 @07:49PM (#9382918) Homepage
    A long time ago, in the distant past, there were Finders. Dedicated individuals who wandered around outside the camps and found stuff. Over time, it became more difficult to find stuff, and the Finders became the Searchers. Many times the Searchers would return empty-handed. As technologies improved and new insights were gained, the same fruitless searches of the past were repeated. Sometimes with new results, sometimes as fruitless as before. Regardless, it was this not giving up on an idea just because it failed once that led to the change in title from Searcher to Researcher.

    Most researchers I know produce one magnificent failure after another on the quest for a new piece of knowledge. Everything that is easy to find has probably already been discovered, and mathematics is no different. So the guy made a few failed attempts at solving the puzzle; this doesn't make each successor to the first attempt a guaranteed failure.
  • by Txiasaeia ( 581598 ) on Wednesday June 09, 2004 @07:50PM (#9382923)
    What you're supposed to do is come up with a theory and then prove it. It's assumed that any mathematician/scientist would do so, given enough time (and years of life left). The guy's also dead, so we give him the benefit of the doubt - "If he were alive, yeah, he would have solved it. Good ol' Bernie was always good for stuff like that."
  • Re:Apology (Score:2, Insightful)

    by princewally ( 699307 ) on Wednesday June 09, 2004 @07:55PM (#9382951) Homepage
    Yeah, there's no chance I'm going to understand a mathematical proof, so I'm relegated to the realms of the ignorant.
  • by exp(pi*sqrt(163)) ( 613870 ) on Wednesday June 09, 2004 @07:57PM (#9382966) Journal
    But the Riemann hypothesis is hard. We can be pretty confident Riemann wouldn't have figured it out (whatever that means). He probably didn't even have the mathematical tools he needed available to him. The only way I would allow such a statement would be if he died and left a manuscript with a partial proof that could be extended to a full proof by a good mathematician in a reasonable time. We know that no such document exists.
  • by praxim ( 117485 ) <pat AT thepatsite DOT com> on Wednesday June 09, 2004 @08:17PM (#9383080) Homepage
    The origins of the hypothesis date back to 1859, when mathematician Bernhard Riemann came up with a theory about how prime numbers were distributed, but he died in 1866, before he could conclusively prove it.

    You need to read this a bit more carefully. It does not say "died before he proved it." It says "died before he could conclusively prove it," as in before he was able to do so.

  • by Anonymous Coward on Wednesday June 09, 2004 @09:31PM (#9383403)
    And who do you think is going to manage and live in the wonderful chateau doing all this research?
  • by EvanED ( 569694 ) <evaned@NOspAM.gmail.com> on Wednesday June 09, 2004 @09:42PM (#9383435)
    And the first attempt of Andrew Wiles to prove Fermat's Last Theorem also failed, but he managed to patch it.
  • Re:Proof? (Score:2, Insightful)

    by Anonymous Coward on Wednesday June 09, 2004 @11:04PM (#9383849)
    Math does not rely on the scientific method. Math relies on rigorous proof based on axiomatic definitions, i.e. logic. Proofs can be flawed, i.e. wrong, but we can never at a later date say, "Hey, it turns out that x^3 = y^3 implies x = y is just not true this time."

    Unless, of course, you think that logic could have some holes in it. But let's not go down that road.
  • by Phleg ( 523632 ) <stephen AT touset DOT org> on Wednesday June 09, 2004 @11:11PM (#9383878)

    Not to take away from the brilliant work of this guy, and I'm sure his work will have generated some good math on the way. But just knowing whether the Riemann hypothesis is true is not of much help (people have been assuming it to be true for a while).

    Your comment explains why discovering a proof for the Riemann Hypothesis is such a monumental event. Mathematicians have assumed it to be true for some time now, and there exists a massive amount of mathematical theory which rests upon its validity. Proving the hypothesis ensures that their reasoning is on solid ground. Without one, there's no way to know for sure whether or not their conjectures are true.

  • by crashnbur ( 127738 ) on Thursday June 10, 2004 @12:03AM (#9384100)
    The media just report the facts (insert joke here), and the fact -- in this case -- is that someone claims to have made an extraordinary mathematical discovery. Therefore, in this case, we are the fact-checkers. Or, rather, anyone who understands enough math to sift through 124 pages of an alleged proof (to prove the proof?) is the fact-checker.
  • by Dasein ( 6110 ) * <tedc@nospam.codebig.com> on Thursday June 10, 2004 @12:08AM (#9384116) Homepage Journal
    There's the occasional post that deserves to be modded to "+10 -- Best Damn Thing I've Read On Slashdot This Year".

    Thanks!
  • by Anonymous Coward on Thursday June 10, 2004 @06:32AM (#9385392)
    Yeah, but back then it was new, and people could recognize it would be useful someday (Nobel Prize in 1921). But this, everybody already assumes, so a proof really doesn't change much. Besides, it's probably flawed.
  • by dasmegabyte ( 267018 ) <das@OHNOWHATSTHISdasmegabyte.org> on Thursday June 10, 2004 @10:17AM (#9386693) Homepage Journal
    OK. So your claim is that he's not a real mathematician because, though he proves things, he does so in an unusual manner? And that he will be forgotten because of his eccentricity?

    I'm sorry, but if you had to make a brief list of mathematicians...just off the top of your head...I'm willing to bet real cash that most of them have "eccentricities" of their own. Are you claiming Newton wasn't a moody asshole? Or that Fermat wasn't a bit nutty? That Rene Descartes was some kind of boring pencil pusher with no ponderance on philosophy and life?

    What about DaVinci? What about Nash? And I hear that Galileo and Kepler were rogues as well.

    I mean, come on! Insanity and flair make a mathematician's career more than any actual achievement. I mean, look at your post. As much as you claim to hate de Branges, you know everything about him. And you don't even know the NAMES of the Russian students who rewrote his hypothesis.
  • by Lodragandraoidh ( 639696 ) on Thursday June 10, 2004 @12:33PM (#9388520) Journal
    This whole thread is precisely why Computer Science should have never been allowed to fall into the Mathematics Department.

    How many practical computing problems have I run into in my career that have been NP-complete? Zero, in 10+ years.

    99.999% of computer science graduates will not have to deal with this issue, which mainly concerns cutting-edge theoretical problems (for example, how to do ray tracing in real time in a video game). Most programming is algorithmic rather than mathematical, and what little math is needed is generally polynomial or matrix transformations.

    In many instances rigorous mathematics isn't needed at all, and fuzzy logic or rules of thumb can be used effectively to get the job done. However, due to computer science being tied at the hip to mathematics, people are getting educations which don't mesh with the reality they see in the business world (where 85% - give or take - of the graduates will end up).

    I propose the 2 following divisions of computer science:

    Theoretical Computing - the mathematics-laden branch. Includes logic design and engineering, as well as software to support 'deep science' in peripheral disciplines and in computing and theoretical mathematics themselves.

    Algorithmic Computing - the art of computer programming and system integration. This is the trial and error, get your hands dirty department.

    Finally, I don't know if I like the idea of having a separate Information Technology curriculum in the business department. From my experience, I always end up having to teach these folks on the job things that I learned in school (if they need to learn it, they should pay for learning it - particularly if they end up with a salary equivalent to mine). They are getting an incomplete education that is not useful in an environment where systems integration is the norm, and thus being a jack of all trades is more important than being able to write an SQL query or kick out a Cobol program to calculate the depreciation of someone's stock portfolio.

It is easier to write an incorrect program than understand a correct one.
