Interview With Turing-Award Winner Robin Milner

Martin Berger writes "Turing Award (1991) winner Robin Milner is one of the most influential computer scientists. He may not be as well-known as he deserves to be, but his research contributions are ubiquitous: he developed the first mathematically sound yet practical tool for machine-assisted proof construction. This line of research has been continued successfully and has led to many useful proof assistants such as HOL, Coq, and Isabelle that are being used heavily for verification purposes today." Read on for more information about Milner, and a link to Berger's excellent interview with him.
Berger continues "There is also a direct line from this strand of Milner's work to what may be one of the hottest topics in computer science: proof carrying code. Milner also headed the effort to develop ML (best known today by its descendant Ocaml), the first language to include polymorphic type inference together with type-safe exception-handling and module mechanisms. Most modern programming languages can trace some of their advanced features directly back to ML's pioneering efforts. Most of all, he established concurrency theory as a scientific field by creating and studying idealised concurrent programming languages like the Pi-Calculus. That calculus is becoming more and more influential in the design of new programming languages (for example Microsoft's XLANG) and the WWW infrastructure. A few weeks ago, I interviewed Milner. I wanted to find out about the man and the stories behind all this great research. I hope you find it as interesting as I do. The transcript of the interview can be found here."
  • by bsharitt ( 580506 ) <(moc.ttirahs) (ta) (tegdirb)> on Wednesday November 26, 2003 @02:58PM (#7571363) Journal
    So he's the reason I have to take so much calculus to be a computer programmer.

    • s/programmer/scientist/
    • Re:So he's the one (Score:5, Insightful)

      by Jagasian ( 129329 ) on Wednesday November 26, 2003 @03:24PM (#7571637)
First we should make the distinction between Integral Calculus and other calculi such as the pi-calculi and the lambda-calculi. The Integral Calculus can be categorized as continuous math, while the pi and lambda calculi can be put under the label of "discrete math" as both pi and lambda calculi are term rewrite systems.

      I am still trying to figure out why Integral Calculus is forced down everyone's throat. Computer Scientists are better off studying proof theory, axiomatic set theory, lambda-calculi, etc...

      There are subfields within CS that make use of Integral Calculus... but most subfields of CS do not use it and instead use things like proof theory, set theory, etc.

      One thing that you have to understand is that there are still people out there that think that "math" means numbers, and therefore sophisticated numerical math such as the Integral Calculus is crammed down everyone's throats. I think that for most people, a more conceptual math based on category theory or set theory or somesuch would be far more useful in the long run.
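To make the term-rewriting view above concrete, here is a toy OCaml sketch of the untyped lambda-calculus driven by a single rewrite rule, beta-reduction. The naive substitution deliberately ignores variable capture, which is exactly the alpha-conversion caveat raised in the reply below; treat this as an illustration of the idea, not a faithful implementation.

```ocaml
(* Terms of the untyped lambda-calculus. *)
type term =
  | Var of string
  | Lam of string * term        (* \x. body *)
  | App of term * term          (* application *)

(* Naive substitution t[x := s]: assumes s has no free variables that
   could be captured by binders in t (no alpha conversion here). *)
let rec subst x s = function
  | Var y -> if y = x then s else Var y
  | Lam (y, body) -> if y = x then Lam (y, body) else Lam (y, subst x s body)
  | App (f, a) -> App (subst x s f, subst x s a)

(* One step of the rewrite rule:  (\x. b) a  ~>  b[x := a]. *)
let rec step = function
  | App (Lam (x, body), arg) -> Some (subst x arg body)
  | App (f, a) ->
      (match step f with
       | Some f' -> Some (App (f', a))
       | None -> (match step a with Some a' -> Some (App (f, a')) | None -> None))
  | Lam (x, body) ->
      (match step body with Some b' -> Some (Lam (x, b')) | None -> None)
  | Var _ -> None

(* Keep rewriting until no rule applies (may loop forever on some terms). *)
let rec normalize t = match step t with Some t' -> normalize t' | None -> t

(* Example: (\x. x) (\y. y) rewrites to \y. y. *)
let () =
  let id = Lam ("y", Var "y") in
  match normalize (App (Lam ("x", Var "x"), id)) with
  | Lam (_, Var _) -> print_endline "reduced to the identity"
  | _ -> print_endline "unexpected result"
```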
      • Re:So he's the one (Score:1, Insightful)

        by Anonymous Coward
        as both pi and lambda calculi are term rewrite systems.

I'm not sure calculi like lambda or pi can easily be classified as term rewrite systems, because of (1) alpha conversion, i.e. renaming of bound variables, and (2) all the coinductive stuff that you have in the definition of the congruence relations that you use to quotient the syntax with...
      • Re:So he's the one (Score:5, Insightful)

        by Com2Kid ( 142006 ) * <com2kidSPAMLESS@gmail.com> on Wednesday November 26, 2003 @04:15PM (#7572077) Homepage Journal
        • I am still trying to figure out why Integral Calculus is forced down everyone's throat.


Because Differential/Integral Calculus makes up the basis of, well, everything. Understanding D&I Calculus has allowed me to grasp both concepts that I could sort of understand but never really comprehend, and new ideas that have expanded my awareness of the world around me.

For starters, lines, curves, and surfaces, the very basics of geometrical mathematics, have differentials and integrals as a fundamental idea running all the way through them. As someone who enjoys doing 3D modeling and is currently hired as a writer of 3D tutorials, I find that Calculus has given me such a deep understanding of the material that I am able to explain it to others with ease, including those who have not had calculus or even trig classes!

Second off, Calculus relates to Physics. Anyone can be taught to memorize Physics equations, but once you know calculus, you can derive them yourself! It is a far better feeling to have a good solid theoretical understanding of the material being covered rather than just saying "well, this equation fits the data decently well, so it must be true."

The first is a scientific mindset. The second is what a business major learns.

Continuing on the physics note, Calculus is part of understanding how the basic fundamentals of computers work. Being able to say "take the derivative of it here" and have it make sense, ah, a lovely thing. Hell, any time you have units and notation being thrown around, Calculus can make itself handy. A lot of relationships in the natural world only show up if you understand what taking the differential or the integral of something does to its units.

Then there is Linear Algebra, which is VERY fundamental to computers. Matrices are used throughout computers of all types and sizes (with some limited exceptions, yes yes), and teaching Linear Algebra becomes far easier if the students are required to learn Calculus first. Heck, everything becomes easier after Calculus is learned.

You are also ignoring that, at its heart, all of mathematics is related. The more a person understands in any field of mathematics, the more they can learn in all other fields. Calculus forms an excellent base for more knowledge to be built upon, and even if sight of that base is (unfortunately) lost by the student over time, it still exists there as a foundation for all that the student has learned.

        (That, and, quite frankly I found Calculus to be fun. :)
        • Very Nice. I agree with you 100%.
I think it is forced because CS is not only about using computers... but about general problem solving. Integro-Differential Calculus is the basis for solving a lot of problems, so it is just a matter of basic knowledge for any engineering or scientific career.

If you didn't know Integral Calculus, then you could still program (a lot of people are like this), but your approach is to actually solve problems as an engineer... and those are represented by integral calculus...

        This is what I think though, and to me it
        • So you are telling me that you don't learn how to solve problems in metamathematics? Come on, any non-trivial collection of math involves solving sophisticated problems.
      • I am still trying to figure out why Integral Calculus is forced down everyone's throat. Computer Scientists are better off studying proof theory, axiomatic set theory, lambda-calculi, etc...

        There are subfields within CS that make use of Integral Calculus... but most subfields of CS do not use it and instead use things like proof theory, set theory, etc.

        Not all problems potentially encountered by CS majors are discrete - maybe in the textbooks but real life differs from textbooks. You need a toolset fo

The problem is that you have a finite amount of resources (semesters) with which to teach undergrads "computer science". Teaching Integral Calculus to the detriment of teaching automata theory or formal logic is a real problem.

          Sure, given infinite resources I would agree that we should teach Integral Calculus along with every other concept in mathematics, engineering, economics, art, politics, etc...
      • I am still trying to figure out why Integral Calculus is forced down everyone's throat. Computer Scientists are better off studying proof theory, axiomatic set theory, lambda-calculi, etc...

        There are subfields within CS that make use of Integral Calculus... but most subfields of CS do not use it and instead use things like proof theory, set theory, etc

        Most undergraduates do not have an accurate idea which field, let alone which subfield, they are going to end up in. The integral/differential calculus is

        • In practice, the average CSci B.S. is going to end up writing database front ends for a business somewhere, so you could argue that all of these topics are academic and irrelevant.

          Which is why I like Graham's (article posted elsewhere in this thread - but it's been proposed by others) idea of breaking up CS. Put all the chairwarmers who complain about "academic irrelevance" into the non-science, non-mathematics faculty. I even have an idea where - Communications needs a little more money and a little mo

      • Re:So he's the one (Score:5, Insightful)

        by AxelTorvalds ( 544851 ) on Wednesday November 26, 2003 @04:59PM (#7572470)
        I am still trying to figure out why Integral Calculus is forced down everyone's throat. Computer Scientists are better off studying proof theory, axiomatic set theory, lambda-calculi, etc...

        I'm a CS and Math guy... to preface this opinion

Now it depends on the program and the school of thought. Anyone who's ever worked with a physicist in the tech business (they crop up from time to time) understands that the guy with a PhD in Physics is almost always better than the guy with a Masters in CS; it just works out that way. Physics and Calc are one and the same when you get through all the BS.

Everyone knows that physicists are better, and so there is a desire to teach the tools that they use. That's just a theory I have, nothing to back it up other than that everyone knows who Einstein was and everybody has an idea who Hawking is, and nobody knows who Turing was, or Euler, or Galois. If it wasn't for Russell Crowe, a fair number of math geeks wouldn't know who Nash is... Copy what works.

Secondly, better programs tie it all together. You can start off simultaneously learning continuous calc and Zermelo set theory in a discrete class. Keep learning calc and more discrete math. Then throw some linear algebra in and some abstract algebra, and right about that time one of the courses (the way I had it, it was a calc class) goes into the throes of a mathgasm and proves Euler's formula, using both discrete math and calc, and kind of ties the whole thing together (because after you've learned all the different methods of integration you're spending a lot of time doing what a classmate of mine called "that big E shit" with additive and multiplicative series; the standard derivation is sketched after this comment). If all goes well you'll be wondering what the true "key to math" is at about this time, and it's kind of like having God whisper into your ear when you see how it all links up. I think proving a lot of the linear algebra stuff is substantially easier if you have calc as a tool. Then you continue on and prove all of the calc stuff using the set theory that you had been building up, take a few more calc courses doing diffeqs, partials, and calc in 3D, which is all mostly mechanical at that point, and then after all of that you do whatever the hell you want in math. I think most of the stuff in typical stats classes is very difficult to prove without calc.

        The link between linear algebra, abstract algebra and discrete math is pretty easy to see as you're doing it. The bridge between discrete and continuous math is a bit more complex but it's really undeniable when you see it.
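For anyone who has not seen the proof the parent is alluding to, the usual route really is "that big E shit": compare the power series term by term. Nothing below is specific to any particular course; it is just the standard argument.

```latex
% Euler's formula from power series: split e^{ix} into even and odd terms.
\[
e^{ix} \;=\; \sum_{n=0}^{\infty} \frac{(ix)^n}{n!}
       \;=\; \sum_{k=0}^{\infty} \frac{(-1)^k x^{2k}}{(2k)!}
       \;+\; i \sum_{k=0}^{\infty} \frac{(-1)^k x^{2k+1}}{(2k+1)!}
       \;=\; \cos x + i \sin x .
\]
% Setting x = \pi gives the link-up moment: e^{i\pi} + 1 = 0.
```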

Now it depends on the program and the school of thought. Anyone who's ever worked with a physicist in the tech business (they crop up from time to time) understands that the guy with a PhD in Physics is almost always better than the guy with a Masters in CS; it just works out that way. Physics and Calc are one and the same when you get through all the BS.

That's like comparing a watermelon to a grape. A person who has a PhD is bound to have much more research experience than a guy with a Masters. A better

      • I am still trying to figure out why Integral Calculus is forced down everyone's throat

That's because you probably assume that the only point in learning Pythagoras' theorem is to save the effort of measuring the third side of a triangle. The reason that mathematics has been taught in schools for most of the last 2.5 millennia has had nothing to do with calculating stuff. Learning mathematics trains the mind to reason about things.

        To get to the point, calculus trains the mind to reason about dynamic sys

      • It surprises me that a Lisp advocate would refer to what he does as making "beautiful" software.

        Or should that be "(surprises (would (a Lisp advocate) (refer-to (does what he) (making beautiful software))) me)"? :p

Lambda calculi are different from your usual calculus of integration and differentiation - they're more about symbolic processing and discrete maths, and can be traced back to Church and Turing, so you can't really blame him for it.

      now some anecdote:

I took an undergraduate communicating automata/pi-calculus course in Cambridge in 1996 - I also had the good luck to be supervised by Prof. Milner himself (who was the head of the Computer Laboratory at that time) for that course. Then, you could download his
  • "He may not be as well-known as he deserves to be, but his research contributions are ubiquitous"

Now that just makes me feel so much more confident about his work...

  • by guacamolefoo ( 577448 ) on Wednesday November 26, 2003 @03:04PM (#7571427) Homepage Journal
    How do we know that the interview is with the Turing Award winner? Maybe the interview was just with a computer?

    GF.

    Obligatory disclaimer for the Comic Book Guys out there:
    I know, I know -- the Turing Prize isn't the Loebner Prize. It's the day before a holiday -- give it a rest and laugh a little.

  • by Space cowboy ( 13680 ) on Wednesday November 26, 2003 @03:06PM (#7571450) Journal
    Coq and Isabelle are used heavily "for verification purposes" today. ... poor old Isabelle... Give a girl a break!

    Simon.
  • I went to the Suez Canal. At the time, the British were still in charge.

    Must have been warm there.

  • One of the first things I learned as a computer scientist was the impossibility of proving a piece of software "correct".

    Still, I always found it an interesting concept.

Then, one day, I watched a Eureka (EU) team build a formal verification suite based on Z. One of my friends was on the team. One of the best programmers I ever knew, he decided the whole venture was a fraud, said so, and was fired.

    My opinion? Formal proofs are a holy grail, unreachable, and the reality of software is that small, testable
    • by heironymouscoward ( 683461 ) <heironymouscowar ... .com minus punct> on Wednesday November 26, 2003 @03:19PM (#7571590) Journal
      You're an ignorant bastard and you should RTFA instead of spouting your half-cooked opinions.

      (Help, moderators, mod me "-1 Insane in the membrain", I'm flaming myself!)
    • Re:Formal proofs? (Score:1, Insightful)

      by Anonymous Coward
      Er, what?

One thing you should have learned as a computer scientist is that proving a piece of software correct is *possible*.

The caveat is that you can only prove it correct with respect to a formal specification (as we all know, if your spec is wrong you're still shafted), and formal proofs are hard to construct for non-trivial programs. Hence the need for machine-assisted proof tools.
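To make the parent's point concrete, here is a toy OCaml sketch in which the specification is a single stated property and the "proof" only lives in a comment. The function and property are my own illustration; the whole point of machine-assisted tools like HOL, Coq, and Isabelle is to check arguments like this mechanically for programs that actually matter.

```ocaml
(* The program under consideration: naive list reversal. *)
let rec rev = function
  | [] -> []
  | x :: rest -> rev rest @ [ x ]

(* Specification, as a checkable property:
     for all l,  rev (rev l) = l
   Proof sketch (induction on l):
     base:  rev (rev []) = rev [] = []
     step:  rev (rev (x :: r)) = rev (rev r @ [x])
                               = x :: rev (rev r)   (lemma about rev and @)
                               = x :: r             (induction hypothesis)  *)
let spec_holds l = rev (rev l) = l

let () =
  (* Running a few cases is testing, not proving -- and if the spec itself
     is wrong, you are, as the parent says, still shafted. *)
  assert (spec_holds []);
  assert (spec_holds [ 1; 2; 3 ]);
  print_endline "spec held on the sampled inputs"
```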
    • That whole thing has set back computer science quite a bit. It's very silly to use only one example and say *all* programs cannot be proven/analyzed.

Gödel's proof is merely a way to show that no one *complete* program can decode the operation of all other programs... and his reasoning was, in summary, pretty simple.

      The first program might be hidden inside the second one and thus it can be programmed to act essentially with all the smarts of the first... and can trap it in a loop. The checker can be

    • One of the first things I learned as a computer scientist was the impossibility of proving a piece of software "correct".

      *Please* tell us where you learned this. Please. This is just too damned funny.

      Granted, there are programs that defy formal proofs, but it's nonsense to say that *no* program can be proved correct.

      • Granted, there are programs that defy formal proofs, but it's nonsense to say that *no* program can be proved correct.

        That's not what he said. "a piece of software" does not mean "all software", but rather gives a good impression of the fuzzy area called legacy code.

        Today it's not even imaginable to have a tool that can load a typical piece of software written in one of the major languages (be it C, C++, Delphi, Java, C#, Ada or Lisp), and have it verified.

        Current wisdom is to use a very special subset

    • You can prove that given certain inputs you achieve certain outputs. However, so much can go wrong with the average application (i.e. database connectivity, resource issues, memory leaks), you'll never have error free code. Most of the errors I see exist because of complexity and scope creep (management issues). So, program verification still won't catch all of the "errors" in our code. Not to mention that proving a verification is impossible. You're dealing with self-referential stuff here. Proving y
  • "It doesn't rain because it is so dry" as Milner said in the interview in regard to the Suez Canal region.

But such banalities aside, it is an interesting interview. I was particularly interested in learning that he doesn't have a doctorate -- on this side of the pond that would have been required (for better or worse) for his academic position.
    • Huh. I never made the connection.

      ( Comment source [bobandtom.com].)

I was particularly interested in learning that he doesn't have a doctorate -- on this side of the pond that would have been required (for better or worse) for his academic position.

      Indeed, he had a distinguished early career for someone without a PhD. Still, he was both a Fellow of the Royal Society and a Turing Award recipient several years before he became Prof. Milner and then became the head of the Computer Laboratory at Cambridge. (Note to US-based readers: in the UK, the title "professor" is very

      • Indeed, he had a distinguished early career for someone without a PhD. Still, he was both a Fellow of the Royal Society and a Turing Award recipient several years before he became Prof. Milner and then became the head of the Computer Laboratory at Cambridge. (Note to US-based readers: in the UK, the title "professor" is very senior, typically given only to the most distinguished figures in a university department.)

        FRS and Profs are much rarer than Turing award winners. Not all award winners are in the aca

  • remember MegaHAL (Score:3, Interesting)

    by DarklordSatin ( 592675 ) on Wednesday November 26, 2003 @03:18PM (#7571575) Homepage
    Mention of the Turing Award brings back memories of all the fun times we had with MegaHAL (http://megahal.sourceforge.net/ [sourceforge.net]). We set up a machine in my local dorm kitchen and our MegaHAL rapidly became a horrible swearing bigot; it provided endless amusement for all.
  • This can't be /.'s own Robin Miller [roblimo.com] (roblimo [slashdot.org]), can it? Of course not, it's Robin MilNer!
  • Proof of Code (Score:4, Interesting)

    by starseeker ( 141897 ) on Wednesday November 26, 2003 @03:41PM (#7571774) Homepage
    I have been wondering for some time now if proving code might be the next step in computers. If you think about it, most problems related to everyday use of computers have been solved, in one form or another - spreadsheets, word processors, databases, and communications seem to account for most of what we want to do, and their feature sets are largely well defined. The task now is not to figure out what we want to do - we know that very well. What we want to do now is do it WELL.

    I know little about the field of proving code, so whether this is possible I don't know, but it would be interesting to try something like the following:

    a) create a large, detailed specification of what a database (for example) should be able to do. Start general and work down to specifics. Map the full feature set out, eventually down to the function level.

b) translate those requirements into some proof language - Z or B are ones I've heard of; perhaps there are others more appropriate. Identify what the limits are - ultimately, the behavior of the program should ideally be well defined. Break it down to a point where the individual units under consideration can reasonably be expected to be provided by the operating system or system libraries (which, in an ideal world, would also have been created, or could be created, by a similar process).

    c) Having the proven structure, use code generation techniques to automatically produce code that will create the program.

In essence, basically all the work would be done at the specification level, and once we can specify in full detail what we want, the computer itself handles the job of writing the actual code (a minimal sketch of this spec-first idea appears after this comment).

    As I said, I don't know how much of this is possible, but if we were to start from scratch at the assembly level perhaps something like this could occur:

    0) Before anything else, based on language specs, create a proven compiler for the language(s) to be used. Without that, all practical work is useless.

1) Define a kernel or microkernel design, and map that design down to the hardware level (RISC might be an easier platform for this). EROS might be a good design starting point. Once it is clear what jobs the hardware would need to do for each command, map out and prove the assembly commands on the RISC platform that would do each job. Build off of those proven components to prove the behavior of each higher-level language command, and once the higher-level language behavior is defined and matched to what is needed, write and prove the kernel.

    2) Having a kernel whose behavior is now well defined and trusted, the real work begins. Working off of the well defined and proven components in the kernel, build up the rest of the tools needed to provide an operating environment expected on a modern machine. Compartmentalization is key, both for system security concerns and for proof concerns. Essentially the unix idea of one tool doing its specific job correctly, taken to its logical extreme.

3) Having a basic system developed, begin to work on the end-user components. Graphics libraries and toolkits would need to be implemented and proven. Porting current toolkits would be possible, but they would likely not be suitable for the rigorous hard-core proof testing, and a major system graphics setup would have to be designed, specified, and created. Fresco might be a good source of ideas here. Once the proven structure is available,

4) Identify and specify key end-user applications. Define an Office application, with various components like a word processor and spreadsheet, and define their features clearly. Treat the last 20 years of software usage as field research on what features are required. Implement them using the proven system tools. As they mature, replace ports of non-proof-based tools with new software. Instead of having many tools for one job, define the job itself clearly, and its solution clearly. If more features are needed or desired, the place to add them is in the proof structu
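This is not the Z/B pipeline described above, just a minimal OCaml sketch of the underlying spec-first idea: state the interface and its intended properties before writing any implementation, then supply code that is obliged to satisfy them. The COUNTER signature and its operations are hypothetical names chosen for illustration; a real verification toolchain would prove the commented properties rather than take them on faith.

```ocaml
(* Steps (a)/(b): the specification.  The comments stand in for the
   formal properties a proof tool would actually check. *)
module type COUNTER = sig
  type t
  val make  : unit -> t      (* spec: value (make ()) = 0            *)
  val incr  : t -> t         (* spec: value (incr c) = value c + 1   *)
  val value : t -> int       (* spec: value c >= 0 for reachable c   *)
end

(* Step (c): an implementation.  Here the properties hold by
   construction; a verifying toolchain would discharge them as proof
   obligations instead. *)
module Counter : COUNTER = struct
  type t = int
  let make () = 0
  let incr c = c + 1
  let value c = c
end

let () =
  let c = Counter.(incr (incr (make ()))) in
  Printf.printf "counter value: %d\n" (Counter.value c)
```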
    • The basic problem seems to be that the correctness proof starts from the premises of a logical description of what the program is supposed to do and how the abstract machine it runs on will operate, and it is at least as easy to make mistakes in these logical descriptions as it is to make mistakes in the program to be analyzed.
    • Re:Proof of Code (Score:4, Insightful)

      by nicophonica ( 660859 ) on Wednesday November 26, 2003 @04:21PM (#7572123)
Almost no branch of computer science has seen more countless hours of research devoted to it with more meager results than program verification theory. (And that is not primarily what Milner's work addresses.) The fundamental problem with program verification, and why you will not see any of the applications you mentioned for at least the foreseeable future and probably ever, is that even after you've developed a language that is amenable to correctness analysis, and after you have developed a specification requirements language to articulate the 'implication' of your programs written in the verifiable language, and after you have built a tool that allows you to automate the construction of correctness proofs, you find that all you've done is push the real problem solving work of programming (where most of the errors come from) into a fuzzy realm of prerequirements that is even less conducive to the types of problem solving that programmers do than the original programming language. These techniques can be useful for very narrow, specialized types of applications which must be correct. But they can never work for something even as specialized as an operating system, let alone a general-purpose business application.
      • Re:Proof of Code (Score:2, Insightful)

        by Anonymous Coward
Ignorance abounds. Just because it is hard to verify some properties, should we ignore verifying those that we can?
      • Re:Proof of Code (Score:2, Interesting)

        by Anonymous Coward
These techniques can be useful for very narrow, specialized types of applications which must be correct. But they can never work for something even as specialized as an operating system, let alone a general-purpose business application.

This is correct, BUT the applications for which you really NEED verified correctness are so vital (for example, airplane software) that we cannot afford not to develop the formal methods. That research will spin off to more general-purpose software. Just look at the countless mi
      • Re:Proof of Code (Score:3, Insightful)

        by starseeker ( 141897 )
        I expect this is true, but if I may...

        "all you've done is push the real problem solving work of programming (where most of the errors come from) into a fuzzy realm of prerequirements that is even less conducive to the types of problem solving that programmers do then the original programming language."

By prerequirements do you mean identifying the behaviors and properties needed at each level of programming? I know this won't suddenly enable the creation of "perfect" programs, but if errors are occurring i
      • Verification of arbitrary programs is a very different proposition from constructing programs given a specification. The first is based on taking a given program and proving it has certain properties. The second is based on creating a program that has certain properties because you constructed it that way.

        There are lots of ways to create an under-specified specification that admits multiple possible program implementations, and then allow the compiler to simply "choose one" arbitrarily. For examples, lo
    • I have been wondering for some time now if proving code might be the next step in computers. If you think about it, most problems related to everyday use of computers have been solved,

This has been pretty much true for at least 30 or 40 years, since computers became widely accepted as business tools. However, what's stopping more formally proven software is that it's incredibly expensive to do. Those fields which can justify the expense, e.g. flight control systems, are proven. General purpose applications

  • Q: How do we determine who gets the Turing award?

    A: By finding a geek who can pass the Turing test.
  • by C A S S I E L ( 16009 ) on Thursday November 27, 2003 @07:08AM (#7576348) Homepage
...way back in the mid-to-late 80's, mostly doing compiler programming in the ML project. In fact, my first exposure to ML was in 1979, with an early version running on a DECsystem 20. (I can't remember what it was written in - something like Lisp or Prolog.) For my Ph.D. I implemented a polymorphically typed hybrid functional/logic language based on the ideas of Milner and Ehud Shapiro.

The big push to ML as a language was, as stated, a redesign and implementation (in Pascal, I think) on VAX/VMS by Luca Cardelli back around 1981 - it was a lovely piece of design, with first-class records and union types, and pattern matching (a small sketch of those features appears after this comment). During the standardisation of the formal semantics (late 1980s), the implementation effort was shared with Dave MacQueen at AT&T and Andrew Appel at Princeton (whose compiler books and papers on continuation passing are a good read).

    Throughout, the implementations and the mathematics advanced hand-in-hand; on many occasions I'd be doing something in the compiler, and referring to the formal equations of the static or dynamic semantics to make sure I was doing the right thing.

    I've been out of touch for about ten years (just when the Pi-Calculus was emerging) but my patterns of thought and reasoning about software structure are still rooted in the experiences of a quarter of a century ago, and I still come across things whose lineage can be traced back to ML's theory and practice (for example, Java exceptions and inner classes).
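For readers who never met the early MLs, here is a small OCaml sketch of the features mentioned above: records, union (variant) types, and pattern matching. The shape type is purely illustrative.

```ocaml
type point = { x : float; y : float }      (* a record *)

type shape =                               (* a union / variant type *)
  | Circle of point * float                (* centre and radius *)
  | Rect of point * point                  (* two opposite corners *)

let pi = 4.0 *. atan 1.0

(* Pattern matching takes apart both the variant and the records. *)
let area = function
  | Circle (_, r) -> pi *. r *. r
  | Rect ({ x = x1; y = y1 }, { x = x2; y = y2 }) ->
      abs_float ((x2 -. x1) *. (y2 -. y1))

let () =
  Printf.printf "unit circle area: %f\n"
    (area (Circle ({ x = 0.0; y = 0.0 }, 1.0)))
```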

  • Did you get a temporary or permanent position in Edinburgh?

    I got a permanent position in 1973. That was coming back from the States, which was a good thing because we wanted our family to be educated in Europe, not in America.

What do you make of the increasing patenting of ideas in computing?

    It's terrifying. It's ridiculous and terrifying at the same time.
