Math

The 'Useless' Perspective That Transformed Mathematics (quantamagazine.org) 21

Representation theory was initially dismissed. Today, it's central to much of mathematics. From a report: When representation theory emerged in the late 19th century, many mathematicians questioned its worth. In 1897, the English mathematician William Burnside wrote that he doubted that this unorthodox perspective would yield any new results at all. "Basically what [Burnside was] saying is that representation theory is useless," said Geordie Williamson of the University of Sydney in a 2015 lecture. More than a century since its debut, representation theory has served as a key ingredient in many of the most important discoveries in mathematics. Yet its usefulness is still hard to perceive at first. "It doesn't seem immediately clear that this is a reasonable thing to study," said Emily Norton of the Technical University of Kaiserslautern in Germany.

Representation theory is a way of taking complicated objects and "representing" them with simpler objects. The complicated objects are often collections of mathematical objects -- like numbers or symmetries -- that stand in a particular structured relationship with each other. These collections are called groups. The simpler objects are arrays of numbers called matrices, the core element of linear algebra. While groups are abstract and often difficult to get a handle on, matrices and linear algebra are elementary. "Mathematicians basically know everything there is to know about matrices. It's one of the few subjects of math that's thoroughly well understood," said Jared Weinstein of Boston University.
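As a minimal illustration (not from the article; the helper names are made up): the symmetric group S3 -- the six permutations of three symbols -- can be represented by 3x3 permutation matrices, and multiplying the matrices mirrors composing the permutations exactly.

```python
import itertools
import numpy as np

def perm_matrix(p):
    """3x3 permutation matrix sending basis vector e_i to e_p(i)."""
    m = np.zeros((3, 3), dtype=int)
    for i, j in enumerate(p):
        m[j, i] = 1
    return m

def compose(p, q):
    """Composition p after q: (p o q)(i) = p[q[i]]."""
    return tuple(p[q[i]] for i in range(len(q)))

# The group S3: all permutations of three symbols.
s3 = list(itertools.permutations(range(3)))

# Defining property of a representation: the matrix of a composition
# equals the product of the matrices.
for p in s3:
    for q in s3:
        assert np.array_equal(perm_matrix(compose(p, q)),
                              perm_matrix(p) @ perm_matrix(q))
print("Matrix multiplication mirrors group composition for all 36 pairs.")
```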

  • Brave words (Score:5, Insightful)

    by phantomfive ( 622387 ) on Wednesday June 10, 2020 @09:38PM (#60169398) Journal

    "Mathematicians basically know everything there is to know about matrices. It's one of the few subjects of math that's thoroughly well understood," said Jared Weinstein of Boston University.

    Is he willing to take a bet on that?

    • Are you implying that gambling is better known in mathematics?
    • For one thing, it would be nice to have answers to the big questions about computing the permanent. [wikipedia.org]

    • He's not claiming we know everything about matrices. There are a lot of things we don't know about matrices. For example, suppose that an n by n matrix is allowed to have entries that are only 1 and -1; how large can you make its determinant (in terms of n)? (A tiny brute-force illustration for small n follows this thread.) This is a problem I'm picking not just because it is easy to state and open, but because I was at a talk a few years ago where Jared was in the audience (I did my PhD at BU). So I can give you one very concrete problem which is open here. But his sta
      • He's not claiming we know everything about matrices

        That's why math majors should take English classes. If he wasn't claiming it, he shouldn't have said it; he should have said something different.
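    For the curious, a tiny brute-force sketch of the maximal-determinant question above (the function name is made up; this only illustrates the statement for very small n and says nothing about the open general case):

    ```python
    from itertools import product
    import numpy as np

    def max_det_pm1(n):
        """Largest |determinant| over all n-by-n matrices with entries
        +1 or -1, by exhaustive search (feasible only for tiny n)."""
        best = 0
        for entries in product((1, -1), repeat=n * n):
            m = np.array(entries).reshape(n, n)
            best = max(best, round(abs(np.linalg.det(m))))
        return best

    for n in range(1, 5):
        print(n, max_det_pm1(n))   # maxima: 1, 2, 4, 16 for n = 1..4
    ```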

  • Yet its usefulness is still hard to perceive at first. "It doesn't seem immediately clear that this is a reasonable thing to study," said Emily Norton of the Technical University of Kaiserslautern in Germany.

    Except for pretty much all of machine learning, but that's pretty useless so...

  • by thadtheman ( 4911885 ) on Wednesday June 10, 2020 @10:29PM (#60169502)

    Yeah, it has its uses, but it is so much like doing math with a Crayola.

    It ignores many subtle relationships between objects, and you always have to show that your results are independent of the representation anyway.

  • ...had been discovered earlier and independently by Frobenius and Augustin Cauchy (Burnside's lemma [wikipedia.org]).
  • Paraphrasing: We can see no practical use for lasers... how wrong they were... again.
  • one sided (Score:4, Informative)

    by gtall ( 79522 ) on Thursday June 11, 2020 @04:57AM (#60170122)

    The article is only talking about group representations. Representation theory writ large is much bigger than that. In classical logic, Stone's representation theorem represents Boolean lattices as topologies on (what are now called) Stone spaces. Johnstone's book on Stone spaces shows that the basic theme extends to other classes of lattices (Boolean algebras can be viewed as lattices). Other areas of math have their own representations.
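    A finite-case sketch of the Stone theme (the example algebra is invented for illustration; the full theorem is topological and says much more): a finite Boolean algebra is isomorphic to the power set of its atoms, via the map sending each element to the set of atoms below it.

    ```python
    from itertools import combinations

    # Invented example: the Boolean subalgebra of subsets of {0,...,5}
    # generated by the partition blocks {0,1}, {2,3}, {4,5}.
    blocks = [frozenset({0, 1}), frozenset({2, 3}), frozenset({4, 5})]
    algebra = {frozenset().union(*combo)
               for r in range(len(blocks) + 1)
               for combo in combinations(blocks, r)}

    # Atoms: minimal nonzero elements (here, exactly the three blocks).
    atoms = [x for x in algebra
             if x and not any(y and y < x for y in algebra)]

    # Representation map: an element goes to the set of atoms below it.
    def rep(x):
        return frozenset(a for a in atoms if a <= x)

    # It is a bijection onto the power set of the atoms and preserves joins.
    assert len({rep(x) for x in algebra}) == 2 ** len(atoms)
    assert all(rep(x | y) == rep(x) | rep(y) for x in algebra for y in algebra)
    print(len(algebra), "elements, isomorphic to the power set of",
          len(atoms), "atoms")
    ```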

  • by coats ( 1068 ) on Thursday June 11, 2020 @08:13AM (#60170436) Homepage
    Representation theory was the key insight for developing the SMOKE atmospheric-chemistry-emissions model. It was "somewhat" practical -- the prototype in 1993 replaced its database-style predecessor's 11-hour overnight Cray supercomputer run by 2 minutes 43 seconds on a desktop SPARC-2 workstation...

    To be honest, it did also employ faster techniques for its set-up (not part of the 11 hours or the 00:02:43): fast sorts, binary instead of linear searches, and integer instead of character-string ID lookups. But the non-setup part was almost entirely (sparse) matrix multiplies.
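    Not the SMOKE code itself, but a toy of the underlying pattern: when the operator is mostly zeros, storing and applying it as a sparse matrix touches only the stored entries (scipy.sparse here; the sizes are arbitrary).

    ```python
    import numpy as np
    from scipy import sparse

    rng = np.random.default_rng(0)
    n = 200_000                 # arbitrary illustrative size
    nnz = 5 * n                 # roughly 5 nonzeros per row on average
    a = sparse.csr_matrix((rng.random(nnz),
                           (rng.integers(0, n, nnz), rng.integers(0, n, nnz))),
                          shape=(n, n))

    x = rng.random(n)
    y = a @ x   # work proportional to the ~1e6 stored entries,
                # not the 4e10 cells of a dense n-by-n matrix
    print(y[:3])
    ```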

  • by UnknownSoldier ( 67820 ) on Thursday June 11, 2020 @10:31AM (#60170916)

    History is littered with "useless" mathematics that was only later discovered to have a practical use.

    e.g.
    Hamilton invented Quaternions [wikipedia.org] in 1843. At the time they were thought to be pretty much useless, even though they are exactly equivalent [duke.edu] to a half-angle axis rotation (that itself was debated until we had a few proofs in the 1980s). After 150+ years, unit quaternions are used extensively in animation and its "slerp" (spherical linear interpolation), due to a few factors (a short numerical sketch follows the list):

    • They avoid Gimbal lock,
    • only take 4 floats instead of a full 4x4 matrix to represent a rotation,
    • are faster to multiply than two full 4x4 matrices,
    • the sign of the quaternion tells you which direction the rotation goes -- useful if you have "one-sided" joints like an elbow,
    • etc.
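    A short numerical sketch of those points (hand-rolled helpers with illustrative names; a production codebase would use a maintained library):

    ```python
    import numpy as np

    # Quaternions as arrays [w, x, y, z].
    def q_mul(a, b):
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def q_axis_angle(axis, angle):
        """Unit quaternion for a rotation by `angle` about `axis` (half-angle form)."""
        axis = axis / np.linalg.norm(axis)
        return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

    def rotate(q, v):
        """Rotate 3-vector v by unit quaternion q via q * v * conj(q)."""
        qv = np.concatenate(([0.0], v))
        q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
        return q_mul(q_mul(q, qv), q_conj)[1:]

    def slerp(q0, q1, t):
        """Spherical linear interpolation between unit quaternions."""
        dot = np.clip(np.dot(q0, q1), -1.0, 1.0)
        if dot < 0.0:                      # take the shorter arc
            q1, dot = -q1, -dot
        theta = np.arccos(dot)
        if theta < 1e-8:
            return q0
        return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

    # A rotation stored in 4 floats: 90 degrees about the z axis.
    q90 = q_axis_angle(np.array([0.0, 0.0, 1.0]), np.pi / 2)
    print(rotate(q90, np.array([1.0, 0.0, 0.0])))            # ~[0, 1, 0]
    print(slerp(np.array([1.0, 0.0, 0.0, 0.0]), q90, 0.5))   # halfway: 45-degree rotation
    ```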

    Likewise, the outer product, invented by Grassmann in 1844 and further refined by Clifford in 1878 into the geometric product and Geometric Algebra [wikipedia.org] (with scalars, bi-vectors, and tri-vectors), was basically ignored for decades. It is now starting to be used [google.com] in Physics.
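    A toy decomposition to make that concrete (vectors only, not a full multivector implementation; the function name is made up): the geometric product of two 3D vectors splits into a scalar (dot) part and a bivector (wedge) part, ab = a.b + a^b.

    ```python
    import numpy as np

    def geometric_product(a, b):
        """Geometric product of 3D vectors: (scalar part, bivector part)."""
        scalar = float(np.dot(a, b))
        # Bivector components on the basis e1^e2, e1^e3, e2^e3.
        bivector = np.array([a[0]*b[1] - a[1]*b[0],
                             a[0]*b[2] - a[2]*b[0],
                             a[1]*b[2] - a[2]*b[1]])
        return scalar, bivector

    a = np.array([1.0, 0.0, 0.0])
    b = np.array([0.0, 1.0, 0.0])
    print(geometric_product(a, b))   # (0.0, [1, 0, 0]): the pure bivector e1^e2
    ```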

  • Well, if it were a car, it would have 4 wheels and an engine and would be initially dismissed because it's no more useful than a horse and carriage!
