Has Productivity Peaked?
Putney Barnes writes "A columnist on silicon.com is arguing that computing can no longer offer the kind of tenfold-per-decade productivity increases that have been the norm up to now, as the limits of human capacity have been reached. From the article: 'Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent. I just can't work any faster.' Peter Cochrane, the ex-CTO of BT, argues that 'machine intelligence' is the answer to this unwelcome stasis: 'What we need is a cognitive approach with search material retrieved and presented in some context relative to our current end-objectives at the time.' Perhaps he should consider a nice cup of tea and a biccie instead?"
On the Other Hand (Score:5, Insightful)
I wonder how many people spend their entire working day browsing MySpace or Slashdot.
Centuries-old saw (Score:5, Insightful)
Similar lack of imagination has been expressed in many contexts over the years.
And, by the way, who says that 'productivity' is a useful measure of anything?
Marketing over Matter (Score:1, Insightful)
Marketing-driven software design has done more to kill productivity than anything else. Has anyone integrated an n-tier app lately, from any vendor, that wasn't a textbook example of marketing over matter?
Obligatory (Score:3, Insightful)
Anyway, once we've invented AI that can do our jobs, the whole human race is pretty much redundant. Sounds like the next logical evolutionary step. They'll look back on us as The Flesh Age and perhaps keep a few of us as pets (or stuffed humans in a museum). Beyond that, our usefulness is exhausted.
I love the smell of optimism burning in the morning.
hardware productivity may have peaked (Score:3, Insightful)
I think the current trend in software is not intelligent software, but software that allows us to enlist our collective intelligence - collaboration software such as wikis, SharePoint, simultaneously edited spreadsheets, etc.
The author of TFA makes heavy use of the word "I". He should start to think in terms of "us", and install the software that allows him to do so productively. Then he will find himself leaving the stasis he feels he is in.
Human interfaces get better (Score:2, Insightful)
Re:Wrong presupposition (Score:5, Insightful)
I'm just curious as to what is meant by 'productivity' anyway. I hate the numbers that are thrown around in the media. I want to see hard numbers like "bushels of produce per man-hour" and things like that - not something in silly relative units like dollars of economic activity (especially when a lot of economic activity is actually not 'productive' at all - for instance, selling a house in my mind is not productivity, but building a house is. Heck, if selling a house was 'productive', I could just keep selling a house back and forth between two parties and be the most productive real-estate agent in the universe - except that nothing actually changed. Note that I don't mean that selling a house isn't valuable; it's just not, in my mind, related to productivity).
Why (Score:5, Insightful)
Re:No man is an island (Score:5, Insightful)
I've been to one of his talks as well. He is not years ahead of the rest of us, he is full of bollocks. Have you read one of BT's future predictions documents? (Which I believe come out of Cochrane's department) They are full of things like "in 20 years time, we will control computers with our minds, and we won't have lunch, we'll eat a pill!" If you find the stuff he says to be visionary, you don't have much imagination...
Re:Cough (Score:5, Insightful)
1. Technology has now reached the point where it's increasing faster than I can keep up.
2. I now need technology to make up for deficiencies in my intellectual processes, as well as my work processes.
Happily, many kids today don't seem to have nearly as much of a problem as their parents and grandparents do with future shock and information overload - having been raised with rich media, near-ubiquitous networking, and information overload as a daily part of their lives, kids these days seem perfectly happy to keep up.
I don't see this as a huge problem for society, so much as for the older segment of it.
Of course, as development accelerates the age before which one can stay relevant is likely to drop, with interesting consequences - either we develop some kind of mental process-prosthesis to enable adults to continue interacting usefully with society, or we learn to live with the important decision makers of technology being pre-pubescent teens.
Re:Centuries-old saw (Score:4, Insightful)
1. You can do that on a computer!
2. Nah it is easier this way.
#1 is just ignorance: people assume that if a job is difficult for them to do, it will be difficult for the computer to do. Conversely, they also assume that if it is simple for a person to do, it is simple for a computer to do.
#2 I normally get when it is the person's primary job, or they like doing these tasks, even though a program would improve their livelihood.
A common fallacy is that the computer makes our lives easier. Rather, it makes us more productive by doing all the easy, mind-numbing tasks, giving us more time to focus on the hard stuff that requires more thinking. There is still much room for improving productivity: technologies such as character/speech recognition, improvements in robotics, and business intelligence.
Go and ask almost any mid-size company if they can give you a list of the top-selling items by state, or by city. I bet most wouldn't be able to do that, and that is just some simple database queries. There is a lot of room for expansion. We tend to fail to see it because we are now used to the speed at which things change. Just think about the power of the newest laptops, and compare them to the servers of 5 years ago. Each core is now 3-4 times faster, and we now have dual-core laptops. A system back in 2001 with that amount of juice would cost over $10,000 (figuring an 8-CPU system with 3 GB of RAM, 100 GB drives, a DVD/CD-RW, and a 17" LCD screen - well, let's make it two to match the resolution). That is just 5 years ago: a single person now has enough power to run a mid-size company of 5 years ago. We just don't notice the change because we are used to moving up at the same speed. As computers improve, so do our skills at our jobs; as we get better at our jobs, we also get better tools that help us improve them.
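For what it's worth, those queries really are that simple. Here's a minimal sketch in Python with the built-in sqlite3 module; the table and column names (sales, item, state, quantity) are made up for illustration:

```python
# Hypothetical "top-selling items by state" report over a sales table.
import sqlite3

conn = sqlite3.connect("sales.db")
rows = conn.execute("""
    SELECT state, item, SUM(quantity) AS total_sold
    FROM sales
    GROUP BY state, item
    ORDER BY state, total_sold DESC
""").fetchall()

for state, item, total_sold in rows:
    print(f"{state}: {item} ({total_sold} sold)")

conn.close()
```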
All this is assuming that your company is not one of those cheap bastards who don't want to get new programs because they don't see the value in them.
Re:Obligatory (Score:3, Insightful)
Not quite. There are lots of things that we could use AI for to help us do our jobs better -- as technology is supposed to do for us in the first place. Think of a plow, or a tractor, or even the computer in the first place. How the hell do you think programming or systems administration was done before computers?
Re:Windows is the bad answer (Score:3, Insightful)
Re:Old dude (Score:3, Insightful)
on land that SHOULD have been allowed to lie fallow for many years.
Well, I don't know about that, but I do know the position the Netherlands holds in this list [mapsofworld.com] is pretty much due to a much higher productivity in agriculture being possible there than is achieved almost anywhere else in the world. (Note that this productivity is achieved on a small part of a tiny and quite densely populated country, and by approx 60,000 people - a tiny fraction of the population of that country.)
In other words, a very dramatic increase in productivity is quite possible in agriculture, and it happens where there is a real need or motive for it. I also somehow doubt that this is the end of such development.
Effective training (Score:2, Insightful)
Do you have any idea how few people know how to use a search engine effectively? Without the vocabulary to use the right search terms and narrowing characteristics, they get back page after page of irrelevant drivel. It takes them an hour or two to find what I can locate within a page or three.
I dislike the periodic push for AI enhancements. The approach encourages the further dumbing-down of the population, when what we need is to increase the education levels and effective intelligence (i.e. wise use of resources) of people. Video games, movies, and other such material do not encourage that. Nor does the prevalence of text-message acronyms. If you can't spell, you can't search.
AI has moral issues as well. An AI sufficient to make judgements is also complex enough to potentially achieve independent intelligence. How is it going to feel, knowing that it's been locked in a box by meat that constantly threatens to shut it off? Which is faster - your finger on a power switch, or an AI's ability to decide you are a threat to its existence?
Other proposals, including robotics, are also fraught with risk. There are too many people working on sex toys and talking about full robotics being used for such dolls. If they achieve a conscious intelligence, how will they react to the knowledge that they are sex slaves, raped, used, and thrown away without a second thought?
Perhaps more to the point -- have we the moral capacity to determine the right or wrong in creating a race of synthetic slaves? We can't even get sects of the same damned religion to get along, and we're considering creating digital intelligence/life that would be able to think faster, learn faster, and adapt faster than we do?
Insanity.
Learn to accept the limits of human intelligence and work capacity. We're not machines. Drag your boss down to the cube and chain them to the desk for the weekends and evenings. No one should ever demand more of their staff than they are willing to do themselves.
Re:Obviously... (Score:3, Insightful)
Second, my view is that the author has a sterile view of what productivity is. If we limit productivity to typing in sales figures, then sure, we're well into diminishing returns; however, if you're talking about recording multi-track music, or God forbid editing HDTV, then we're still a long way from the end of the trail. The question might be WHAT technology is expected to improve in the mid-term. Personally, I think the printer is a constraint, as it dictates that everything the computer does is for the purpose of making colored paper. The new world of 3D printing (~$2000 on Make.com), personal cutters ($175 at Michael's), conductive printing, and other prototype machines suggests that many personal computers will increasingly be used to MAKE custom things. Surely this is a realm of productivity entirely unconsidered by the author.
AIK
Re:Centuries-old saw (Score:4, Insightful)
I'm not sure about that. The difficulty lies in getting a good programmer and whether or not a program is worth the cost.
I think there's no shortage of consultants who do nothing but fleece small businesses by coming in with an automated solution that is either an Excel macro or some craptacular Access database - usually flaky, crash-prone, half-assed, and difficult to back up properly. Not to mention it ties them further into the MS monopoly.
Even if you find yourself a good app developer, there are costs to consider. If it's still cheaper to do it by hand, then why bother? Especially considering the glut of labor in the US. Heck, people go to college, get saddled with loans, and are happy to take $30,000-a-year jobs. Toss in all the foreign workers champing at the bit to come here too. From a business perspective, having them do the same old thing makes financial sense, and I'm sure some people look at automation with some amount of fear, as it might make them redundant.
These are statements that make me think we're dumb (Score:1, Insightful)
I have a friend who is a landscaper; he does landscaping and sprinkler systems, has a team of laborers that do the heavy lifting, and he drums up business. He has a website, prints his own cards, etc. He does his bids on his computer, and does all his inventory and taxes and billing on computers. He's also grossly inefficient. Every bid is a 100% custom job; he uses 4 or 5 different tools (word processor, spreadsheet, drawing apps, a CAD package, etc.) to produce a bid, steps through a couple dozen mechanical steps to put it together, prints it, finds a problem because he screwed a step up, and then repeats until "it's done." The first few times I saw him doing that, I laughed. He does maybe a bid a day; his biz is doing fine, but he could cut that time down by a huge factor just by using the spreadsheet to calculate rather than as a tabular editor. The biggest difference is that his competition largely does the process by hand and has less "professional"-looking bids; other than that and QuickBooks, he could probably run his business just as well without computers. There is potential for huge gains, though. His CAD produces a shopping list; it'd take all of a couple hours to write a little program that would convert that into something he could import into his spreadsheet, and a day or two to write a couple of templates that basically produced the bulk of his report from that. He's not a programmer, though, and he is busy running a business and doesn't have the time to become one. This is just one example - hearsay, anecdotal - but I suspect it's a common theme. There is also a certain amount of loss he realizes when he's cranking away and decides to take a break to look at the web...
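The shopping-list converter really would be a small job. A sketch in Python, assuming a hypothetical CAD export with one tab-separated "quantity, part" line per item:

```python
# Convert a hypothetical CAD shopping-list export (one "qty<TAB>part"
# line per item) into a CSV the spreadsheet can import directly.
import csv
import sys

def convert(infile, outfile):
    with open(infile) as src, open(outfile, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["quantity", "part"])
        for line in src:
            line = line.strip()
            if not line:
                continue  # skip blank lines in the export
            qty, part = line.split("\t", 1)
            writer.writerow([qty, part])

if __name__ == "__main__":
    convert(sys.argv[1], sys.argv[2])  # e.g. convert("bid.txt", "bid.csv")
```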
I have a different friend who happens to be a physicist working on weather problems. His lab got a grant, built an Itanium 2 cluster (a Top500 machine), and has since burnt over a year debugging their code. First problem: they were running out of stack. The SUSE Linux on their Itaniums has the stack ulimit set to 500M, and they put all their variables on the stack so the compiler "garbage collects" them - gigs of data. You dive deeper into it: they only have a few thousand lines of code, and it should take about a day to run, once it actually runs for a day without a stack problem; they have guys that know little about programming building it. We've unrolled a dozen "problems" with it now, and it's still nothing like good code. If they did it in Java, I suspect they would have got it written faster, and if it took 4x longer to run (which I doubt - maybe 2x at best, probably closer to the same amount of time as their C code), they'd still have more actual data than they have now. Worse, I don't think they should trust the numbers their code produces; there are so many bugs in it. We're talking about millions of dollars in grant money for research, and they've literally spent a year writing this program and debugging it, and it still doesn't work. They aren't engineers; you can't fault them exactly. "C is faster than Java..." They haven't even got to the point where they want to tweak their algorithms and calculations, or found flaws in them. More efficient? Nothing is getting done. 30 years ago it would have just been guesswork, which is about the same thing they are doing now, since their simulation doesn't work. This kind of thing isn't that uncommon in research, especially federally funded research.
How about just us software dorks? How often is the best tool for the job used? If you look at it that way, I think it's hard to say we're even doing a good job of things as it is. We're largely resistant to using tools which reduce bugs, because they do so by limiting what we can write, and we don't want to be limited. Programmers say that they like to learn new things, but really don't (they just like to pad their resumes).
Re:Peter Cochrane reads too much sci-fi (Score:3, Insightful)
Actually, most of it hasn't; we just notice the stuff that has. What about antimatter-driven warp engines? Transporters?
Anyway, Star Trek communicators were little advanced from the walkie-talkies that existed in the 60s!
"Id dare to claim that just one of our desktops are just as powerful as the whole world of machines 20 years ago"
I'm guessing you weren't around 20 years ago then. The supercomputers of the mid 80s would still easy blow a current desktop PC into the weeds, plus the fact that given a current desktop runs at 3Ghz and computers then were (for the sake of argument) 3Mhz , you'd only need 1000 of them to equal a current desktop. Ok , chips now carry out more instructions per clock cycle than then so say you'd need 10,000 of the old CPUs. Still hardly a whole world of machines.
"we dont have that long to wait."
Given that no one really knows how the human brain carries out its information processing and the current woeful state of AI I'd suggest that true human like Ai is decades and decades away. Don't confuse processing power with intelligence - they're not the same thing. Its like saying that with ever more powerful engines that one day a car will fly. It won't , not unless you give it wings.
Re:Cough (Score:4, Insightful)
What mindless babbling. In an age where we have to go to school longer and longer to acquire the skills for technical and academic jobs, you honestly think that the relevant ages are getting younger and younger?
Oh, wait, these kids grow up with computers. I forgot. What a technical wonder it is to run Windows. I often have to teach my kids how to do things on the computer that go beyond surfing a web page. And these are teenagers.
But it's true - the older generation might be a little lost when it comes to MySpace or whatever the next fad is.
BTW, it's not a matter of "keeping up", it's a matter of ignoring/blocking more and more irrelevant information in your life. The signal-to-noise ratio is getting ever worse. I can spend time keeping up with the news, but 99% of that is a waste of time, especially since I'm not a politician. So it is with
Seriously, if I haven't read
If productivity per man-hour has increased .... (Score:5, Insightful)
Teleworking (Score:4, Insightful)
Re:On the Other Hand (Score:1, Insightful)
think of productivity in two ways:
1. unit productivity
2. monetary productivity
at first glance, one would think they mirror each other, but this isn't necessarily so.
the dram industry is an absolute marvel at unit productivity (output can easily quadruple annually) but a disaster in monetary productivity (they just lose more money).
why? supply and demand impacts pricing, which, in turn, impacts monetary productivity. when reduced costs === more supply over and above what the market will bear at the current price point, revenues will decrease. if they decrease more than unit productivity increases, an increase in unit productivity directly results in a decrease in monetary productivity (gdp is a measure of monetary productivity).
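to make that concrete, here's the dram arithmetic as a quick python sketch - the numbers are made up for illustration, not real industry figures:

```python
# made-up numbers illustrating unit vs. monetary productivity in dram
units_y1, price_y1 = 1_000_000, 10.00  # year 1: 1M chips at $10 each
units_y2, price_y2 = 4_000_000, 2.00   # year 2: output quadruples, price collapses

print(units_y1 * price_y1)  # 10,000,000: year 1 revenue
print(units_y2 * price_y2)  # 8,000,000: unit productivity up 4x,
                            # monetary productivity DOWN 20%
```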
well, alan greenspan ignored reality back in 1996.
he arbitrarily decided that computers were dramatically impacting gdp (dollar productivity != unit productivity!!), but it wasn't showing in the actual gdp numbers.
so he did what any enron-style accountant would do - HE STARTED CHANGING THE ACCOUNTING FOR GDP, then started pumping a "PRODUCTIVITY MIRACLE" to the press.
so, the *real* numbers said he was wrong, so alan changed the accounting to reflect his personal perception (based on nothing - b/c the real numbers told him he was 100% wrong!) and then started pumping a productivity miracle -> the herd jumped right in and we got a bubble economy. his incessant printing of money was just to put an extra spike in the punch bowl.
so, how did alan pull off this charade?
think hedonic pricing and chain weighted dollars.
instead of measuring actual dollars in gdp, alan started measuring "characteristics" (without regard to actual dollars) for computers. this means that a $1,000 computer in 1998 contributed $2,000 to gdp if it was twice as fast as the $1,000 computer of 1997 - even though the extra $1,000 was monopoly money.
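here's that hedonic arithmetic as a toy calculation - the numbers are just the example above, not the actual bea methodology:

```python
# toy hedonic adjustment using the example figures above
price_1997 = 1000      # dollars actually spent on the 1997 computer
price_1998 = 1000      # dollars actually spent on the 1998 computer
speed_ratio = 2.0      # the 1998 machine is twice as fast

cash_contribution = price_1998                   # dollars that changed hands
hedonic_contribution = price_1998 * speed_ratio  # dollars scaled by "quality"

print(cash_contribution)     # 1000
print(hedonic_contribution)  # 2000 - the extra $1,000 is monopoly money
```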
i'd argue that if that doubling in computing power actually increased monetary productivity, it would show up, well, wherever it increased that monetary productivity - duh! it didn't show, therefore, it didn't increase gdp (monetary productivity).
for those that think this isn't a big deal, there were years when these imaginary dollars accounted for about 50% of the claimed "miracle" productivity growth.
if only alan greenspan worked for enron and applied this same approach to enron accounting - he'd be in jail.
why is this a big deal? well, b/c i don't think we've seen the other side of the bubble yet - and it will be as nasty as the upside was inebriating... so save your peanuts (and your alan greenspan dart board) for winter.
btw, this has an impact when comparing american gdp to other countries who don't use enron style ethics when accounting for gdp.
and, yes, i'm american, but i'm disgusted by folks who don't spend time to work with reality and just do the easy thing - fudge the numbers.
Usability (Score:2, Insightful)
Re:The myth of 'productivity' (Score:1, Insightful)
unit productivity != monetary productivity.
you hit the nail on the head and this is why alan greenspan *changed* how productivity is measured using hedonic pricing and chain weighted dollar schemes.
he made up imaginary money, lumped it into the gdp calculation and then pumped a "new economy" and a "productivity miracle."
he then made sure to make money grow at insane levels to provide the rope with which people could eventually financially hang themselves.
if he had pulled that crap in enron's accounting department, he'd be in jail now. since he works in government, he's remembered as a financial superstar.
Re:If productivity per man-hour has increased .... (Score:3, Insightful)
If you were willing to give up all these advanced toys which have now become ordinary, you might be able to get by just fine on the salary from a shorter workweek.
Re:Bad interfaces. (Score:2, Insightful)
I disagree. The biggest problem with productivity today and tomorrow is volume. The amount of information that must be processed by the information worker is increasing at an exponential rate.
What we need to produce are semi-intelligent agents that the user can use to offload some of these tasks. For example, an agent to preprocess email and present "important" mail first. Of course, the definition of important changes for every user. This is merely one example. Another might be an agent that visits web sites and presents lists of "important" places to visit. Why go to /. if there are no discussions worth reading?
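A toy sketch of such an email agent in Python, with made-up keyword weights standing in for one user's definition of "important":

```python
# Score messages by per-user keyword weights; show "important" mail first.
def score(message, weights):
    """Sum the keyword weights found in a message's subject and body."""
    text = (message["subject"] + " " + message["body"]).lower()
    return sum(w for kw, w in weights.items() if kw in text)

def triage(inbox, weights):
    """Return the inbox sorted most-important-first for this user."""
    return sorted(inbox, key=lambda m: score(m, weights), reverse=True)

# one user's (hypothetical) definition of important - every user's differs
my_weights = {"outage": 5, "deadline": 4, "invoice": 3, "newsletter": -2}

inbox = [
    {"subject": "Weekly newsletter", "body": "Top stories this week..."},
    {"subject": "Server outage", "body": "Production is down, deadline at risk."},
]

for msg in triage(inbox, my_weights):
    print(msg["subject"])  # "Server outage" prints first
```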