Has Productivity Peaked? 291

Putney Barnes writes "A columnist on silicon.com is arguing that computing can no longer offer the kind of tenfold-per-decade productivity increases that have been the norm up to now, as the limits of human capacity have been reached. From the article: 'Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent. I just can't work any faster.' Peter Cochrane, the ex-CTO of BT, argues that 'machine intelligence' is the answer to this unwelcome stasis: 'What we need is a cognitive approach with search material retrieved and presented in some context relative to our current end-objectives at the time.' Perhaps he should consider a nice cup of tea and a biccie instead?"
  • Cough (Score:5, Interesting)

    by caluml ( 551744 ) <slashdot@spamgoe ... minus herbivore> on Monday November 27, 2006 @09:01AM (#17000326) Homepage
    Cough [wikipedia.org]
  • by kahei ( 466208 ) on Monday November 27, 2006 @09:10AM (#17000392) Homepage

    My local law firm, for example, used to get about 20% of the town's law traffic 10 years ago. It's now computerized and processes far more documents and communications, at a far faster rate, than it ever used to. It still gets about 20% of the town's law traffic, as its competitors have upgraded in exactly the same way. The courts, of course, receive far more documents and messages from these lawyers than they ever used to, but the courts themselves have also computerized (just barely) and can handle the extra traffic.

    In terms of 'productivity', I'd think that the lawyers, paralegals, court administrators and so on have improved by 10 times. In terms of how much useful stuff gets done, it's exactly constant.

    So yeah, by all means integrate Google technology with your cornflakes to achieve a further tenfold increase in productivity. Go right ahead.

    In more important news, I currently have a co-worker who spends all day reading his friends' blogs (which doesn't bother me) and giggling over the witty posts he finds (which is driving me fucking mad). Can any Slashdotters suggest a solution that will not result in jail or in me being considered 'not a team player'?

  • Re:Obligatory (Score:3, Interesting)

    by lawpoop ( 604919 ) on Monday November 27, 2006 @09:25AM (#17000480) Homepage Journal
    "Anyway, once we've invented AI that can do our jobs, the whole human race is pretty much redundant. "

    Unless that AI can self-replicate, our new jobs will be building and maintaining that AI.

    We are now in the situation you describe, except with machines and labor. It used to be that we toiled in the field with sticks and rakes, smacking oxen on the back to keep them moving. Now, we ride in air-conditioned cabs of giant combines, listening to satellite radio and resting our buttocks on a leather seat, watching our progress on GPS screens. We also build, maintain, and finance those combines. Some of us work in the satellite, GPS, and technology fields.
  • by f00Dave ( 251755 ) on Monday November 27, 2006 @09:44AM (#17000616) Homepage
    Sounds to me like the old "information overload" phenomenon. The solution-pattern to this situation is never going to be found via incremental improvements in information processing, as the growth is exponential. Nor will an "add-on" approach solve the problem; while hyperlinks, search engines, and other qualitatively-impressive tools are awesome in their own right (and do help!), they only add a layer or two to an information-growth process that adds layers supralinearly ... they're another "stop-gap measure", though they're also the best we've come up with, so far.

    So how to solve an unsolvable problem? Rephrase it! IMO, the problem isn't "too much information", as that's already been solved by the "biocomputer" we all watch the Simpsons with: our senses/brains already process "too much information" handily, but with lots of errors. No, the problem is that we're using the wrong approach to what we call "information" in the first place! We're rather fond of numbers (numeric forms of representation), as they've been around for some eight thousand years, and words (linear forms of representation) go back even farther. Pictures, music, etcetera store far more information (qualitative, structural forms of representation), but usually get mapped back to bitmaps, byte counts, and Shannon's information theory when this discussion starts. And that's the heart of it right there: everyone assumes that reducing (or mapping) everything to numbers is the only way to maintain objectivity, or to measure (functional) quality.

    Here's a challenge: is there a natural way to measure the "information-organizing capability" of a system? Meaning some approach/algorithm/technique simple enough for a kid or grandparent to understand, that most human beings will agree on, and that puts humans above machines for such things as recognizing pictures of cats (without having to have "trained" the machine on a bajillion pictures first). [Grammars are a reasonable start, but you have to explain where the grammars come from in the first place, and what metric you want to use to optimize them.]

    A constant insistence/reliance on numeric measurements of accomplishment just ends up dehumanizing us, and doesn't spur the development of tools to deal with the root problem: the lack of automatic and natural organization of the "too much information" ocean we're sinking in. If we're not a little bit careful, we'll end up making things that are "good enough" -- perhaps an AI, perhaps brain augmentation, [insert Singularity thing here] -- as this is par for the course in evolutionary terms. But it's not the most efficient approach; we already have brains, let's use 'em to solve "unsolvable" problems by questioning our deep assumptions on occasion! :-)

    Disclaimer: the research group [cs.unb.ca] I work with (when not on "programming for profit" breaks, heh) is investigating one possible avenue in this general direction, a mathematical, structural language called ETS, which we hope will stimulate the growth of interest in alternative forms of information representation.
  • Re:Centuries-old saw (Score:5, Interesting)

    by FooAtWFU ( 699187 ) on Monday November 27, 2006 @09:56AM (#17000732) Homepage
    Economists, since productivity determines how much stuff will get produced, which determines how much stuff per person there is - and that's pretty much the measure of the standard of living that will result ("real GDP per capita").

    When you're talking about productivity in the entire economy, you can draw a graph - on the Y axis is "real GDP per capita" while on the X axis is "capital / labor" (K/L for short). If you add more capital (machines, computers, tools) people get more productive, but less so as you add more and more and more. This means the line you graph will start out somewhat steep, but then level off as you get higher (not entirely unlike the graph of sqrt(x)). The rough guideline for the economy at present is the "rule of one third" - increase your capital stock by 1% and you get roughly a third of a percent more output. This sort of rule determines how much capital we end up having - we keep increasing our capital stock with investment until the slope of this productivity curve (the return on one more unit of capital) falls to the "target rate of return", the point at which new investment just pays for itself.

    Then there are wonderful things like increases in technology. These end up shifting the productivity curve upward: people can do more with the same capital than they could before. This increases real GDP per capita directly, but it also means that at the same level of capital, the return on new investment is back above the target rate, so all sorts of new capital will now pay for itself - and we increase our capital stock as well.

    The good news is that technology keeps coming, and while it may not be quite the same Spectacular Breakthrough as the introduction of computers, there is plenty happening in a variety of industries. Take, for example, Wal*Mart (the company everyone loves to hate, yes...). They have achieved a substantial portion of their success by becoming more productive with managing their warehouses and inventories, and are actively looking to increase their productivity in this area. (In fact, I've seen studies that claim they were responsible for the bulk of retail productivity growth in the late '90s, directly or indirectly.) "Supply chain management" is trendy. And perhaps some day we will see RFID tags at the check-out line (to replace the last great checkout productivity enhancer, bar codes).
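
    A rough numeric sketch of the curve described above, assuming a textbook Cobb-Douglas production function with a capital share of one third (a stand-in of mine, not anything from the comment or the article):

        # Output per worker y = A * k^(1/3): diminishing returns to capital per worker k,
        # with A standing in for the level of technology.
        def output_per_worker(k, A=1.0, capital_share=1.0 / 3.0):
            return A * k ** capital_share

        k = 8.0
        print(output_per_worker(1.10 * k) / output_per_worker(k) - 1)  # ~0.032: 10% more capital, ~3.2% more output
        print(output_per_worker(k, A=1.2) / output_per_worker(k) - 1)  # 0.20: a technology shift lifts output at the same k

    The first line is the "rule of one third" in action; the second is the upward shift of the whole curve that renewed investment then chases.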

  • by Pedrito ( 94783 ) on Monday November 27, 2006 @10:03AM (#17000790)
    Sure, for most people, productivity isn't going to increase 10-fold. Hell, as a software engineer, I can't imagine getting 10 times as much stuff done in the same period of time anytime soon. Faster computers won't help, and about the only thing that would speed up my productivity as a programmer is software that would write the code for me, putting me out of a job.

    There are a lot of people working in the sciences who think differently, though. Chemists, biologists, and physicists could all do well with not just smarter programs but faster computers. As a couple of simple examples: molecular mechanics modeling for chemists and protein folding modeling for biologists (particularly the latter, and the two are related) are insanely computationally intensive, and if computers were able to provide the results in 1/10th or 1/100th of the time, it would make a big difference in their ability to get things done. So I think it kind of depends what you do. I mean, let's face it, if you're a secretary, a faster word processor isn't going to make you 10 times more productive. Maybe a faster copier would help...
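
    A rough sense of why the hardware still matters there: classical molecular mechanics evaluates an interaction for (nearly) every pair of atoms at every time step, so the work grows roughly with the square of the system size, times millions of steps. A toy illustration in Python/NumPy (a bare Lennard-Jones energy sum, not any real simulation package):

        import numpy as np

        def lennard_jones_energy(positions, epsilon=1.0, sigma=1.0):
            """Total pairwise Lennard-Jones energy - O(N^2) work per evaluation."""
            n = len(positions)
            energy = 0.0
            for i in range(n):
                for j in range(i + 1, n):
                    r = np.linalg.norm(positions[i] - positions[j])
                    sr6 = (sigma / r) ** 6
                    energy += 4.0 * epsilon * (sr6 ** 2 - sr6)
            return energy

        # 100 atoms is instant; a solvated protein can be ~10^5 atoms stepped
        # ~10^9 times, which is why 10x or 100x faster machines matter.
        atoms = np.random.rand(100, 3) * 10.0
        print(lennard_jones_energy(atoms))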
  • Re:Parameters (Score:3, Interesting)

    by lightknight ( 213164 ) on Monday November 27, 2006 @10:27AM (#17000974) Homepage
    How small a company? 5-10 people, or perhaps a hundred?

    If your co-worker isn't as technically literate as you, I recommend getting the site blocked. If it's a small company, kill it at the router (just add it to the blocked sites list yourself, no one will be the wiser). If it's a large company, talk to the network admin in charge of the proxy/firewall (under the guise of lost productivity attributed to employees using company assets for personal reasons).

    It's simple and effective.
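
    For what it's worth, the "kill it at the router/proxy" step usually amounts to a one- or two-line rule. A minimal sketch assuming a Squid proxy (the domain is a made-up placeholder):

        # squid.conf - deny one distracting site for everyone behind the proxy
        acl timewasters dstdomain .witty-blogs.example.com
        http_access deny timewasters

    On a consumer router it is the same idea, just entered in the "blocked sites" page instead of a config file.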
  • by hey! ( 33014 ) on Monday November 27, 2006 @10:52AM (#17001250) Homepage Journal
    I've never seen or heard of anything like a blanket ten-fold increase in productivity coming from the introduction of a new system or even a new technology. Perhaps certain tasks were sped up 10x, but the volume of revenue generation does not increase 10x. Of course there are cost reductions from staff reduction, but for some reason it seems rare to have large-scale downsizing as a result of introducing IT (as opposed to new manufacturing technologies or new business practices).

    Mostly we are talking about marginal improvements -- although these are often not to be sneezed at. Margins are where competition takes place; they're where the difference between profitability and unprofitability, or between positive and negative cash flow, is determined. For things that are done on massive scales, marginal improvements add up. But even doubling actual productivity?

    What IT mainly does is shift expectations. When I started work in the early 80s, business letters and memos were typed. Now we expect laser printed output or emails. A laser printed letter doesn't have 10x the business impact of a typed letter. An email gets there 1000x faster than an express package, but it seldom has 1000x the business impact when looked at from the standpoint of the economy as a whole. You only have to use email because your competition is using it as well, and you can't afford a competitive differential in speed.

    Many changes created by information technology are imponderable. For example, one great difference between the early 80s and today is that there are far fewer secretarial staff. Memos and letters used to be typed by specialists who were often responsible for filing as well. Now these tasks are mostly done by the author, arguably eliminating a staff position. On the other hand, the author spends much more time dealing with computer and network problems; not only is his time more expensive than the secretarial time on a unit basis, he also needs the support of highly paid technical staff.

    Some technology-mediated changes are arguably negative: we used to set deadlines for projects that allowed for delivery time plus a safety margin. Now it's common for proposals and reports to be worked on up to the last possible minute.

    There are, no doubt, many substantial business savings created by new practices enabled by technology. Outsourcing, specialization, accurate and precise cost of sales, these are things that over time may have a huge impact.

  • Re:Cough (Score:2, Interesting)

    by extremescholar ( 714216 ) on Monday November 27, 2006 @10:54AM (#17001288)
    I agree! At my current employer, the processes in the accounting department are in need of help. Ugly Access databases that have hideous queries. People creating and distributing three different versions of the same report. People producing reports that no one uses. "This is the Tuesday report. I don't know what it is, but I run it on Tuesday. Definitely, definitely Tuesday. It's the Tuesday report." Don't ask the drones what it is, and God help you if something goes wrong - like a spreadsheet that reaches column IV. There is plenty of productivity to be had by streamlining work that is already being done. Raw computing power makes these jobs easier, but intelligent design will make things 500% better.
  • Re:Obviously... (Score:4, Interesting)

    by name*censored* ( 884880 ) on Monday November 27, 2006 @10:55AM (#17001296)
    Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent.
    So HE'S the one slowing us down? Well that's easy, we just get rid of him. Problem solved.

    In all seriousness, computers have only reached a point where the interfaces are now outdated in comparison to how much data the machine can simultaneously accept and act on (e.g., I can click on an icon and it will be told both "click" and "open program" fast enough that I don't have to wait for it). Seems to me that it's just calling for the UIs to be upgraded - we could start using other body parts (cue jokes) such as eye focus for mouse pointer position (not my idea, another Slashdot pundit's). Or, as has been suggested in this topic, better voice commands and audible hotkeys (like that light-clapper thing, except it opens your web browser instead of turning the light on/off). Or we could have interfaces that carry more complex meanings than a single ASCII value - such as the internet keyboards with buttons for various programs, or hotkeys that speed up productivity.

    OR... we could have interfaces that don't rely on physical movement, since even the fastest typist (keyboard) or gamer (mouse) is still much slower than their own brain. All the real-life influences - the actual physics of arm momentum (don't go for the numpad too fast or you'll overshoot), appendage-anatomy limitations (RSI, anyone?) and having to account for other obstacles (don't knock that coffee over!) - slow them down. Perhaps we could have more intuitive machines, as the post suggests. Perhaps we could just have MORE task-queueing technology, which performs background tasks while waiting for user input (indexing the hard disk for searching, defragmenting, virus scanning, etc.) so that the machine is ALWAYS waiting for user input, and we cut out that last little bit of having the user wait on the machine. Maybe we could enlarge UI areas, like the control centres in The Matrix or Minority Report - that might be especially useful for coding (grab a variable name or three from one place and a chunk of code from another window of related code) or graphics/design work (grab colours, picture segments, morph shapes - you could assign a different line thickness to each finger!). Perhaps body alterations - installing extra "memory" for multitasking, a telly in your tubby, a USB in your knee, Bluetooth in your tooth or Wi-Fi in your thigh...
  • Re:Centuries-old saw (Score:4, Interesting)

    by Skim123 ( 3322 ) on Monday November 27, 2006 @11:36AM (#17001854) Homepage

    The difficulty lies in getting a good programmer and whether or not a program is worth the cost.

    I agree that it can be difficult to get a skilled programmer, but I think it will almost always be worth the cost.

    Even if you find yourself a good app developer there are costs to consider. If it's still cheaper to do it by hand, then why bother? Especially considering the glut of labor in the US. Heck, people go to college, get saddled with loans, and are happy to take $30,000-a-year jobs. Toss in all the foreign workers champing at the bit to come here too. From a business perspective, having them do the same old thing makes financial sense, and I'm sure some people look at automation with some amount of fear, as it might make them redundant.

    In the short term, yes, it may make sense to stick with a person doing the job. But in the long run, automation will be more profitable. For example, imagine it takes $90K to write the software to replace the job of a $30K/year worker. That will pay for itself in three years, and by year four the investment will have a positive ROI. While you're still paying that $30K worker, I'm getting the work done for free. Also, since I'm assuming this $30K worker has some intelligence, some ideas, and some skills in the marketplace, by automating his mundane job I can now turn him loose on more interesting projects. He can help lead new product lines, while you are still paying his equivalent just to do repetitive tasks that are only fit for a computer.

    I think the real challenge, and the hesitation people have about moving to an automated system, comes from familiarity with the old system or fear/experience of failure with an automated one. All it takes is one bad experience - a poorly written program that crashes one day and wipes out weeks of data because the backups weren't set up properly, for example - and many decision makers will insist on more manual approaches. Another factor may be that some business partner or regulating agency requires that work be performed in a particular manner or that certain items be made available in ways that essentially have to be handled by humans. I work on software for the health care industry, and some of the "complexities" in dealing with the county and state agencies greatly reduce the amount of automation that can be applied to a given task.
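
    The break-even arithmetic in the $90K-versus-$30K/year example above is simple enough to sketch (ignoring maintenance, discounting, and the redeployed worker's new output):

        def net_position(build_cost, yearly_wage, years):
            """Cumulative savings of automating vs. keeping the manual process."""
            return yearly_wage * years - build_cost

        for year in range(1, 6):
            net = net_position(90_000, 30_000, year)
            print(f"year {year}: {'+' if net >= 0 else '-'}${abs(net):,}")
        # Breaks even at year 3; every year after that is pure gain.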

  • Re:Centuries-old saw (Score:3, Interesting)

    by gvc ( 167165 ) on Monday November 27, 2006 @12:01PM (#17002256)
    Productivity is measured in money. A Manhattan lawyer is more productive than one in Grand Forks because he or she bills more per hour. The argument that the Manhattan lawyer produces more "stuff" other than money is tenuous at best.
  • Machine learning... (Score:2, Interesting)

    by theGil ( 1010409 ) on Monday November 27, 2006 @12:27PM (#17002626) Homepage
    "machine intelligence" is the answer to this unwelcome stasis
    This is exactly what I was thinking when I read the title. In fact, this is where everything is going. One shining example is my company, which uses machine learning algorithms in its software to boost the productivity of workers in the GIS industry. In time, and with the proper people involved, we'll all see more examples of "intelligent" software that decreases people's workloads... this isn't, however, complete automation as some might suspect (we still have a ways to go for Skynet); most of these programs will (at least at first) require a person to guide them. The object right away will be to use machine learning tactics to do the dirty work.
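
    As a generic illustration of that "person guides, machine does the dirty work" pattern (a scikit-learn sketch, not the poster's actual GIS software; the features and thresholds are invented):

        # A person hand-labels a small sample; the model pre-labels the rest
        # and sends only the low-confidence cases back for human review.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        features = rng.normal(size=(1000, 4))                 # stand-in per-record attributes
        labels = (features[:, 0] + features[:, 1] > 0).astype(int)

        hand_labeled = slice(0, 100)                          # the part a human actually labeled
        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(features[hand_labeled], labels[hand_labeled])

        proba = model.predict_proba(features[100:])[:, 1]
        confident = (proba < 0.2) | (proba > 0.8)
        print(f"auto-labeled: {confident.sum()}, sent for human review: {(~confident).sum()}")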
  • Re:Cough (Score:2, Interesting)

    by itlurksbeneath ( 952654 ) on Monday November 27, 2006 @12:59PM (#17003138) Journal

    I second that. There are tons of places at my current employer where the same problems exist. Old processes that nobody wants to take the time to streamline. The main issue? If you ask them, it's already streamlined; they just don't see that it could be made better, for lack of vision. "But it is automated! We take this data, load it into Excel, massage it with this Access database, upload it to Oracle, then download it into this other tool... etc, etc."

    Biggest problem? People who think they are "programmers" and "database developers" because they can use VBA in Excel or create a form and a report in Access on top of some hideous schema that probably makes Mr. Codd [wikipedia.org] spin in his grave at 7200 RPM.

  • Bad interfaces. (Score:3, Interesting)

    by MikeFM ( 12491 ) on Monday November 27, 2006 @04:35PM (#17006542) Homepage Journal
    The biggest productivity limitation of today's computers is the user interface. The desktop metaphor simply is not powerful enough to make accessing and manipulating large amounts of information efficient. Anyone who is good at using the command line, working through scripts, etc. knows that you can accomplish much more using these methods than you can using a desktop environment, and that when a task is even possible on the desktop it's quite a bit slower than working on the command line.

    What we need to do is stop making it okay to be computer illiterate. It's not okay to not know how to read, write, or do at least basic math, but at one point in our history people really believed that the average person didn't need to know those things. Nor are computers unique in requiring knowledge of a machine to get by - in most places you can't live without knowing how to use an automobile, and if you try, people think there is something wrong with you. Why aren't we teaching basic skills like common Unix commands, bash, Perl, and SQL in schools? Why don't we let the desktop evolve to work more seamlessly with the command line and scripting, and to handle task management better?

    Just because an interface is command-line and script driven doesn't mean it can't have powerful graphical interfaces too. A lot of CAD packages have graphical interfaces, command-line interfaces, and scripting tied together. Why can't more applications work that way? Or even the whole OS?
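
    As a small example of the gap being described: the chore below is a few lines from a script but an afternoon of clicking in a file manager (a generic Python sketch; the directory layout and naming scheme are hypothetical):

        # Rename every "report 2006-11-27.txt" under a tree to "2006-11-27_report.txt".
        import re
        from pathlib import Path

        root = Path("reports")                                   # hypothetical directory
        pattern = re.compile(r"report (\d{4}-\d{2}-\d{2})\.txt")

        for path in root.rglob("*.txt"):
            match = pattern.fullmatch(path.name)
            if match:
                path.rename(path.with_name(f"{match.group(1)}_report.txt"))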
