Science

A Unified Calculus? 31

DeAshcroft writes "Science Daily is reporting that one Martin Bohner's work, "Asymptotic Behavior of Dynamic Equations on Time Scales," has made significant waves (ahem) in the mathematical community. The work is "part of a fairly new and exciting effort to unify continuous and discrete calculus." I guess it's time to re-learn long division."
  • by jerkface ( 177812 ) on Thursday January 30, 2003 @06:38AM (#5188012) Journal
    The relationship between the discrete time scales approach and the unification of calculus has been widely known since S. Hilger published "Ein Maßkettenkalkül mit Anwendung auf Zentrumsmannigfaltigkeiten" as his Ph.D. thesis in 1988. The problem remained for others to, um, elaborate on the connection. Martin Bohner, one of the few individuals taking a great interest in this somewhat narrow area, turns out to be the prime mover behind progress in the field. The really important development is that more people are going to take an interest now, and they will publish new and interesting results. Bohner's key accomplishment so far is proving to the community that this topic is worthy of more interest.
    • Yeah...mod me offtopic but yo...leave it to the Germans to come up with published works titled "Ein Maßkettenkalkül mit Anwendung auf Zentrumsmannigfaltigkeiten"...good god...Zentrumsmannigfaltigkeiten???
      • by Anonymous Coward on Thursday January 30, 2003 @07:23AM (#5188081)
        Considering how German words are formed, "Zentrumsmannigfaltigkeiten" probably means something like MathOvertheSpectrumContinuouslyILikeSaurkraut

      • Your subject line should have been Jesusgottinhimmelskind!

      • Relax. Those are nothing but noun phrases. The difference is that in the English writing system we use spaces to separate the nouns. This makes them easier to unravel, but leads to other ambiguities.

        Consider for instance ``Law school entrance requirement examination test score''.

        The noun phrase has a root, on the rightmost side, in other words ``score''. The other nouns modify the root, giving it more refined semantics. It's a test score; moreover, it is an examination test score, and so forth.

        Now if you were German, you would write lawschoolentrancerequirementexaminationtestscore. Probably with enough practice, you can learn to read this! ;)
  • by Bazzargh ( 39195 ) on Thursday January 30, 2003 @07:11AM (#5188064)
    ...what this is all about: after a little digging on Martin's site I found this paper [umr.edu], "Basic Calculus on Time Scales and some of its applications".

    It's readable enough if you can remember your calculus from first year at Uni.

    The gist: normally we do calculus over the set of real numbers, and difference equations over the integers. The 'time scales' notion is that instead of evenly spaced gaps between numbers, as with the integers, you can have gaps that vary independently, right down to no gap at all. A time scale is really just an arbitrary (nonempty, closed) subset of the reals. An example of a time scale might be:

    1_2 3_4 5_6

    (the underscore indicates a chunk of real numbers, the space a gap of numbers we don't use, and so on)

    It's hopefully obvious that the set of integers and the set of reals are special cases of time scales. So, if you derive the fundamental theorems of calculus using time scales, the equivalent theorems for the reals and the integers drop out as special cases. (A toy sketch of the basic 'delta derivative' follows below.)

    Cheers,
    Baz
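
    For the curious, here is a toy sketch of the delta derivative that drives time-scale calculus. The names (sigma, mu, delta_derivative) are ours, not from the paper, and a time scale is approximated by a finite sorted list of points rather than a true closed subset of the reals:

        # A toy illustration of the "delta derivative" from time-scale calculus.
        # A time scale is a closed subset of the reals; here we approximate one
        # by a finite sorted list of sample points.  All names are ours, not
        # from any particular library.

        def sigma(T, i):
            """Forward jump operator: the next point of T after T[i]
            (or T[i] itself at the right endpoint)."""
            return T[i + 1] if i + 1 < len(T) else T[i]

        def mu(T, i):
            """Graininess: the gap between T[i] and the next point of T."""
            return sigma(T, i) - T[i]

        def delta_derivative(f, T, i, h=1e-6):
            """Delta derivative of f at T[i]: a forward difference quotient
            where the graininess is positive (isolated points, as in the
            integers), and the ordinary derivative (approximated numerically
            here) where the graininess is zero (dense points, as in the reals)."""
            m = mu(T, i)
            if m > 0:
                return (f(sigma(T, i)) - f(T[i])) / m
            return (f(T[i] + h) - f(T[i])) / h

        f = lambda t: t * t

        # On the integers, the delta derivative of t^2 is (t+1)^2 - t^2 = 2t + 1 ...
        Z = list(range(6))
        print([delta_derivative(f, Z, i) for i in range(len(Z) - 1)])  # 1, 3, 5, 7, 9

        # ... while on a fine grid of reals it approaches the classical 2t.
        R = [k / 1000 for k in range(3001)]
        print(delta_derivative(f, R, 2000))  # roughly 4.0 at t = 2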
  • cool (Score:4, Interesting)

    by apsmith ( 17989 ) on Thursday January 30, 2003 @10:28AM (#5188754) Homepage
    I remember from math competitions way back that one of my favorite tricks, when an iterative problem looked like it lent itself to a difference equation, was to solve the related continuous calculus problem and then use that solution as a starting point for the difference-equation solution. It always worked much faster than anything else I could think of... Of course I was no expert in the calculus of difference equations, but this sounds really neat. And given how much application both calculus and difference equations have had in other areas of science, this could have big implications once somebody figures out what they are :-)
    • Re:cool (Score:3, Informative)

      by trixillion ( 66374 )
      Linear difference equations can be solved methodically using the Z-transform. This is dual to the use of the Laplace transform with linear differential equations. Find an advanced book on signal processing for more details. Similarly, there are methods for handling coupled difference equations in a manner dual to coupled differential equations.
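
      A minimal illustration of that duality, using SymPy's closed-form solvers (rsolve for recurrences, dsolve for ODEs) rather than the transform machinery itself; this assumes SymPy is installed:

          import sympy as sp

          n, t = sp.symbols('n t')
          y = sp.Function('y')

          # Discrete: the linear difference equation y(n+1) = 2*y(n) with y(0) = 1 ...
          print(sp.rsolve(y(n + 1) - 2 * y(n), y(n), {y(0): 1}))   # 2**n

          # ... and its continuous analogue, the ODE y'(t) = log(2)*y(t) with y(0) = 1.
          print(sp.dsolve(sp.Eq(y(t).diff(t), sp.log(2) * y(t)), y(t),
                          ics={y(0): 1}))   # y(t) = 2**t (possibly shown as exp(t*log(2)))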
  • This stuff looks really uninteresting. No, seriously, I'm saying that as a mathematician. It sets up a formalism for solving equations that are a mixture of discrete recurrence relations and continuous differential equations. But there are no new theorems or insights. Often when different branches of mathematics are combined, deep insights appear. But this stuff is trivial. Any half-competent mathematician who tried to unify these types of equation would probably come up with the same formalism, but there's absolutely no motivation for doing it. Every example he gives can be solved by standard methods. Honestly, I think this guy is a hack mathematician. Usually when only a handful of people publish papers on a subject, it's because few others can understand it. In this case it's all too easy to understand - it's just not interesting.
    • It looks like early days yet. Off the top of my head, I can think of a couple of areas where this might be very useful.

      In signal processing, remote sensing, image processing and so on, we want to do "continuous" things to discrete samples. If we can carry solutions over from the continuous world, we may get nice algorithms.

      There is a deep link between certain kinds of algebra and formal language theory. A recent discovery is that formal languages obey the rules of calculus. For example, DFA construction from a regular expression turns out to be a Taylor series expansion of the expression. (If anyone is curious, I can supply the details; a sketch of the construction appears at the end of this sub-thread.)

      It's not very deep so far, but you never know.



      • For example, DFA construction from a regular expression turns out to be a Taylor series expansion of the expression. (If anyone is curious, I can supply the details.)

        Ok, I'll bite. This statement doesn't make any sense to me, and a Google search turned up matches (with DFA in {"Deterministic Finite Automata", "Detrended Fluctuation Analysis", "Dynamic financial analysis", "Discrete functional analysis"...}), none of which seemed to be what you were referring to.

        -- MarkusQ

        • In this context DFA refers to "Deterministic Finite Automata". Among other things, they can be employed in highly efficient lexical analyzers. If you've got a decent book on compilers around, you can look up the details.

          • In this context DFA refers to "Deterministic Finite Automata". Among other things, they can be employed in highly efficient lexical analyzers. If you've got a decent book on compilers around, you can look up the details.

            In the context of formal languages, sure. (Note that that was the first interpretation on my list.) But in the context of Taylor series? Remember, the original poster was claiming a link between formal language theory and Calculus via a DFA <---> Taylor series mapping. How in the heck do you say that Deterministic Finite Automata are somehow the same as Taylor series? Structurally, I just can't see it. But I also can't see/find any other interpretation of DFA that makes this make sense. Thus my request that the original poster provide the offered details.

            -- MarkusQ

      • Whoa! This is too weird a coincidence. I was just reading, 10 minutes ago, about Kleene algebras and was looking at some curious Taylor-expansion stuff used in the proof of Parikh's theorem. Is that what you are talking about? If not, then I'd *love* to see the details you are talking about.
      • (If anyone is curious, I can supply the details.)

        I'm curious... details please. I'm not sure what you mean by a "Taylor series of the expression". Is it a power series? If so, there are lots of cool things that can be done once you have a power series... including Lagrange inversion...
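
        What the earlier poster presumably has in mind (an assumption on our part) is Brzozowski's 1964 derivative of a regular expression: the derivative of r with respect to a symbol a denotes { w : aw is in L(r) }, and repeatedly differentiating yields the states of a DFA, much as repeated differentiation yields the coefficients of a Taylor series. A toy sketch, with an ad hoc tuple encoding of regular expressions:

            # Brzozowski derivatives of regular expressions.  The tuple encoding
            # ('empty', 'eps', 'sym', 'alt', 'cat', 'star') is ours.

            EMPTY, EPS = ('empty',), ('eps',)

            def nullable(r):
                """Does L(r) contain the empty word?  (The 'constant term' of r.)"""
                tag = r[0]
                if tag in ('eps', 'star'):
                    return True
                if tag in ('empty', 'sym'):
                    return False
                if tag == 'alt':
                    return nullable(r[1]) or nullable(r[2])
                return nullable(r[1]) and nullable(r[2])     # 'cat'

            def deriv(r, a):
                """Brzozowski derivative of r with respect to the symbol a."""
                tag = r[0]
                if tag in ('empty', 'eps'):
                    return EMPTY
                if tag == 'sym':
                    return EPS if r[1] == a else EMPTY
                if tag == 'alt':
                    return ('alt', deriv(r[1], a), deriv(r[2], a))
                if tag == 'cat':
                    d = ('cat', deriv(r[1], a), r[2])
                    return ('alt', d, deriv(r[2], a)) if nullable(r[1]) else d
                return ('cat', deriv(r[1], a), r)            # 'star'

            def matches(r, word):
                """Run the implicit DFA: one derivative per input symbol."""
                for a in word:
                    r = deriv(r, a)
                return nullable(r)

            # (ab)*  -- accepts "", "ab", "abab", ... but not "aba"
            r = ('star', ('cat', ('sym', 'a'), ('sym', 'b')))
            print(matches(r, "abab"))   # True
            print(matches(r, "aba"))    # False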
    • Yeah, and this derivative stuff isn't that interesting either. It's just a simple ratio of the delta of a function value divided by the delta in its arguments... yeah, there's some sort of ad hoc limit thing that says you can somehow make the divisor zero and still get a value... I mean, c'mon! ...Just because a paper is simple and easy to understand doesn't mean it's without value... I for one find this time-scales stuff a very interesting insight...
      • I didn't say it was bad because it was easy to understand. And there's nothing ad hoc about derivatives. Sure, you can string the words 'ad hoc' and 'derivative' together in a sentence if you like. But it doesn't really demonstrate anything.
  • by Valar ( 167606 )
    The discrete summation from a to b of f(x) is equal to the integral from a to b+1 of f(int(x)). Proof left as an exercise to the reader (break the integral over the unit intervals; the integrand is constant on each piece).
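
    A quick numerical sanity check of that identity, using a crude left-endpoint Riemann sum; since the integrand f(floor(x)) is a step function, the unit-width rectangles come out exact (up to floating-point rounding):

        import math

        def lhs(f, a, b):
            return sum(f(x) for x in range(a, b + 1))

        def rhs(f, a, b, steps_per_unit=1000):
            # left-endpoint Riemann sum of f(floor(x)) over [a, b+1]
            n = (b + 1 - a) * steps_per_unit
            total = sum(f(math.floor(a + k / steps_per_unit)) for k in range(n))
            return total / steps_per_unit

        f = lambda x: x * x
        print(lhs(f, 0, 10))   # 385
        print(rhs(f, 0, 10))   # 385.0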
