How Algorithms May Affect You (phys.org)
New submitter Muckluck shares an excerpt from a report via Phys.Org that provides "an interesting look at how algorithms may be shaping your life": When you browse online for a new pair of shoes, pick a movie to stream on Netflix or apply for a car loan, an algorithm likely has a say in the outcome. These complex mathematical formulas are playing a growing role in all walks of life: from detecting skin cancers to suggesting new Facebook friends, deciding who gets a job, how police resources are deployed, who gets insurance at what cost, or who is on a "no fly" list. Algorithms are being used -- experimentally -- to write news articles from raw data, while Donald Trump's presidential campaign was helped by behavioral marketers who used an algorithm to locate the highest concentrations of "persuadable voters." But while such automated tools can inject a measure of objectivity into erstwhile subjective decisions, fears are rising over the lack of transparency algorithms can entail, with pressure growing to apply standards of ethics or "accountability." Data scientist Cathy O'Neil cautions against "blindly trusting" formulas to determine a fair outcome. "Algorithms are not inherently fair, because the person who builds the model defines success," she said. Phys.Org cites O'Neil's 2016 book, "Weapons of Math Destruction," which provides some "troubling examples in the United States" of "nefarious" algorithms. "Her findings were echoed in a White House report last year warning that algorithmic systems 'are not infallible -- they rely on the imperfect inputs, logic, probability, and people who design them,'" reports Phys.Org. "The report noted that data systems can ideally help weed out human bias but warned against algorithms 'systematically disadvantaging certain groups.'"
The end (Score:3)
It's the end of intelligence as we know it.
Re: (Score:3)
Maybe just the opposite, the beginning of intelligence. The problems are garbage data, conflation, risk analysis with random failures, entropy, and just ignoring facts-- among so many other problems.
How far does an algorithm take bias until it's actually discriminating based on such things as gender, race, etc? We're in the very early stages of "big data" and we're doing a bad job of it. The problem is this: we'll continue doing a bad job until we have more transparency, IMHO.
Credit ratings are often self-fulfilling prophecies (Score:2)
> You are leaving out the money part. Credit is not rich. It's debt. You
> want to be rich, have money as savings. You want to be wealthy, have
> money as assets/investments. Most don't need credit except for a car
> (still poor), a home (very common), school loans (credit may affect rates).
That's how it used to be. Nowadays, credit ratings are part of the hiring process. If you have a bad credit score, you can't get a promotion, or in some cases even a job. So you have no income and default on your loans.
Re: (Score:2)
Er, not really. As long as the "intelligence" takes the form of algorithms, that means human beings are devising sets of rules for computers to follow. That is not very intelligent - or, at least, the intelligence involved is indirect, remote and attenuated. The people who specify the software's behaviour must communicate what they want clearly, unambiguously, completely and consistently to the programmers, who then have to do the same thing in their code. Finally, the computer does whatever the original specification, as passed through all those layers, tells it to do.
Not so (Score:2)
No, it doesn't mean that at all. You're not considering that machines can write algorithms. And they certainly can. Genetic software (which we can very accurately describe as an implementation of "nature's algorithm") has been doing that for decades now, and the deep learning mechanisms we're just beginning to explore could be leveraged in similar ways.
a "certainty" code smell (Score:2)
I've spent my entire life trying not to be this dim. Yes, very clever work there treading on the narrow definition—while engaged in 100% baby flush.
The ridiculousness of this is apparent to any thinking person in less time than it takes to type "Wittgenstein".
Because some human process defined the solution gradient that the "genetic" software optimized over—ad infinite turtle—in an act of algorithmic emancipation.
some linguistic navel gazing (Score:2)
After pressing "submit", in a split-second evaluation, I noticed that the sentence I wrote does not quite work.
Problematic:
Less problematic:
Here's what you do (Score:4, Funny)
In my day, we had a simple and effective way to judge algorithms:
O(n log(n)) or faster: good
O(n^2) or slower: bad
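In that spirit, here is a sketch (function names invented for illustration) of the same question answered in both complexity classes: checking a list for duplicates pair-by-pair versus sorting first.

```python
def has_duplicate_slow(items):
    # O(n^2): compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_fast(items):
    # O(n log n): after sorting, any duplicates sit next to each other.
    s = sorted(items)
    return any(a == b for a, b in zip(s, s[1:]))

print(has_duplicate_slow([3, 1, 4, 1]))  # True
print(has_duplicate_fast([3, 1, 4, 1]))  # True
```

Same answers either way; only the growth rate as the input gets large separates "good" from "bad."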
Re: (Score:2)
In my day we wrote algorithms down on paper, and labelled the steps with numbers or letters!
1. Take a slice of bread out of the bag.
2. Place it in toaster
3. Set to light brown
4. Push down lever
We judged algorithms by taste.
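For what it's worth, those numbered steps translate almost line for line into code (the toy `Toaster` class here is invented purely for the sake of the joke):

```python
class Toaster:
    """A toy appliance, invented just to run the steps above."""
    def __init__(self):
        self.slot = None
        self.setting = None
        self.toast = None

    def push_lever(self):
        # Toasting only works if bread was actually loaded first.
        if self.slot is not None:
            self.toast = f"{self.setting} {self.slot}"

def make_toast(bag, toaster):
    slice_of_bread = bag.pop(0)       # 1. Take a slice of bread out of the bag.
    toaster.slot = slice_of_bread     # 2. Place it in toaster.
    toaster.setting = "light brown"   # 3. Set to light brown.
    toaster.push_lever()              # 4. Push down lever.
    return toaster.toast

print(make_toast(["bread"], Toaster()))  # light brown bread
```

Judging it by taste is left as an exercise for the reader.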
Way back when... (Score:2)
In my day, we had a simple and effective way to judge algorithms:
Well in MY day we didn't have the luxury of ignoring the 'C' so readily.
Re: (Score:2)
Re: (Score:2)
We still don't.
Transparency (Score:2)
Re: (Score:1)
First of all, the problem is that these systems increasingly take away power from the judicial and other democratic systems, which actually were somewhat transparent. You can read the law that affects you if you want; it's public. With algorithms, you can't. An example is the "risk of recidivism" scores that increasingly influence judges' sentencing. Those, it turned out, disproportionately flagged black people as risky.
https://www.propublica.org/ser... [propublica.org]
Secondly, those new algorithms are rarely transparent, for a number of reasons.
Re: (Score:1)
Demand access to pricing algorithms (Score:3)
We should be demanding access to the data and algorithms used to generate pricing for mandatory services like the ACA, home insurance, and automobile insurance. We should never be required to buy anything without even knowing the basis for the charges.
Re: (Score:2)
We should be demanding access to the data and algorithms used to generate pricing for mandatory services like the ACA, home insurance, and automobile insurance. We should never be required to buy anything without even knowing the basis for the charges.
When greed and corruption pervert capitalism, one doesn't have to look far to understand pricing.
Let's also not pretend insurance is a business concept that has ever struggled to survive. They collect a few billion, and then hire an army of lobbyists to ensure their flavor of greed is mandatory.
How this corrupt process works isn't some kind of mystery to solve.
Re: (Score:3)
But even if the algorithms are 100% open and transparent, that means nothing if the data fed into them is poor. If the bank uses an algorithm to determine whether it wants to lend money to you, how is the data about you collected? Who decided to classify you as, say, a medium-risk person? What criteria did they use for that? How thorough were they in gathering decision material? What did they miss, ignore, or misunderstand?
Unless there is full and complete transparency and accountability for data collection, too.
Re: (Score:2)
... At least the algorithms have the possibility of being looked at. Maybe that should have been the story.
Algorithms that determine what goods and services a person has access to, the quality of said goods and services, and how much he or she pays for them, need more than the "possibility" of being looked at. They need legislation that requires them to be publicly released in their entirety before being put into use, and every modification, bug-fix, etc. needs to be published as soon as, or before, implementation takes place. And there need to be really onerous and consistently enforced penalties for not releasing them.
Re: Who cares about this (Score:2)
Re: (Score:2)
Unless they don't care if you know, in which case that may be even worse. Now that they've taken 1600 Pennsylvania Avenue, who's going to stop them?
Certain groups? (Score:2)
Some federal database might buy a state database and find lots of illegal migrants getting free city or state services?
That a person is a religious convert? Does their faith or cult have issues? A person buying products or searching for topics that get reported and tracked?
A person looking to travel? Most of the US online tracking is looking for any trace of radicalization and mobilization. Is a person of interest looking up interesting things?
Get some US gov/mil work? Need a polygraph?
Re: Certain groups? (Score:2)
Algorithms or what? (Score:2)
The author seems to think that the alternative to "algorithms" is people making good decisions. That's simply not true. The alternative is people attempting, or not, to follow some agreed-on but ill-defined process.
Note that algorithms make it harder to ignore that we have to make tradeoffs. To take one of the examples from the article, we may have to choose between "well-respected teachers" and those who actually help students significantly. (Of course, we could reveal performance information and let people decide for themselves.)
Re:Algorithms or what? (Score:5, Interesting)
But because the program was actually used in judicial decisions in several states, it unnecessarily sent people to prison while recommending that high-risk people be set free on probation, and it did so with a strong racial bias that was contradicted by reality.
Re: (Score:2)
Re: (Score:2)
"persuadable voters" (Score:2)
"undereducated voters"
Re: (Score:2)
Re: (Score:2)
... mistake to assume that anyone who is susceptible to rhetoric is 'uneducated' ...
Think that over a bit:
In the 2016 election, a wide gap in presidential preferences emerged between those with and without a college degree. College graduates backed Clinton by a 9-point margin (52%-43%), while those without a college degree backed Trump 52%-44% [pewresearch.org].
Re: (Score:1)
The origin of the word ... (Score:2)
Al Gore [wikipedia.org] rhythm [wikipedia.org]
Computer says (Score:1)
Re: Thank you (Score:1)
Insurance claim adjudication is quite automated. All the BS about "death panels": that stuff is built into the insurance platforms already.
Car claim? There are vendors who do the adjudication of different aspects of your claim on behalf of your insurance company. There are companies that have a high rate of denying your claims. So naturally there are people you can call who will help advocate on your behalf to get what you deserve, because they have special knowledge those companies don't want you to have...
Health insurance is no different.
In other news... Integers: Why so many? (Score:4, Informative)
Al, go rhythm -- you'll like it. (Score:3)
Don't forget the algorithm that determined which hundreds of movies, out of the zillions that have been made, you got to choose from in the first place, and the algorithms the movie studios used, and the algorithms the effects companies used, and the algorithms that determine which actors were "hot"...
To say that making a choice on Netflix is "algorithm-free" is to not even remotely understand the world one lives in.
Re: (Score:2)
Agent Smith (Score:4, Insightful)
"Which is why the Matrix was redesigned to this, the peak of your civilization. I say your civilization, because as soon as we started thinking for you, it really became our civilization, which is of course what this is all about."
Re: (Score:2)
Sadly the last years of the 20th century were the peak of our civilization, for the years which followed brought us into the 21st century of the War On Terror, the Great Recession, and President Trump.
I more thought of the last years of the 20th century as peak corruption. You know, back when you could sell vaporware for millions, resulting in the dot bomb crash.
Or maybe peak corruption was in 2008 when the deregulated banking industry started fucking about, resulting in one of the worst financial crashes in history.
And then we come to today, where companies can file for multi-billion-dollar IPOs after demonstrating a unique ability to lose hundreds of millions per year, and may never sustain profitability.
Re: (Score:2)
I generally think of corruption as the state where people in positions of authority make decisions based on what they personally get from the people involved, rather than according to how their authority should be used. For example, a police officer who writes speeding tickets based on whether the driver hands over $100 is corrupt. By this criterion, I'm not sure your examples qualify.
Selling vaporware or stock typically involves unforced agreement on both sides. It's stupid to invest much in vaporware.
Algorithms are just more weapons (Score:2)
"A federal appeals court decisively struck down North Carolina’s voter identification law on Friday, saying its provisions deliberately “target African-Americans with almost surgical precision” in an effort to depress black turnout at the polls."-July 29, 2016
https://www.nytimes.com/2016/0... [nytimes.com]
Re: (Score:2)
One of 14 unequal outcomes that will make you say "fuck having law and order and shit".
Black Box Society (Score:2, Insightful)
A recent report on this in the Netherlands summarised it as "playtime has to be over". Big data and the algorithms that work on top of it are getting a serious amount of power over our lives. Any little scrap of data is starting to influence your chances of getting a job, a cheap loan, or even a date.
If you want to know how scary this gets, check out this presentation by Alexander Nix on how he used this type of data to influence the elections.
https://www.youtube.com/watch?... [youtube.com]
Or have a look at the new "Social
models may re-enforce themselves too (Score:2)
The model says certain people are not the right candidates for the job. Result: they don't get the job, and other people who look good to the model get the job. Again, more data that "proves" those people should not get that job and the other people should.
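A toy simulation makes that trap concrete (every number below is invented for illustration): if outcome data is only ever collected on the people the model accepts, a group that starts with a bad score can never produce the evidence that would correct it.

```python
import random

# Toy feedback loop: the model only "hires" when a group's current score
# clears a threshold, and only hires generate new outcome data, so a group
# that starts below the bar never gets a chance to disprove its score.
def final_score(successes, failures, true_rate=0.7, threshold=0.5,
                rounds=5000, seed=42):
    rng = random.Random(seed)
    s, f = successes, failures
    for _ in range(rounds):
        if s / (s + f) >= threshold:       # hire only if the score looks good
            if rng.random() < true_rate:   # both groups are equally capable
                s += 1
            else:
                f += 1
        # Rejected groups contribute no new data at all.
    return s / (s + f)

# Group A starts with a good track record, group B with a bad one:
print(round(final_score(10, 2), 2))   # drifts toward the true rate, ~0.7
print(round(final_score(2, 10), 2))   # frozen at its prior: 0.17
```

Both groups succeed at the same underlying rate; only the starting data differs, and the loop preserves that difference forever.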
Re: (Score:2)
Few people count the results of random searches in the crime rate. The crime rate depends on crime reported and recorded, not crime committed.
Re: (Score:2)
Before models, managers would interview potential new hires, and would generally hire or not hire based on the impression they got of the candidate. That's not necessarily better than a faulty algorithm. The algorithm can be examined and changed.
It's a long winded warning of... (Score:1)
... garbage in, garbage out.
Sjw's (Score:1)
The fact is that SJWs cannot seem to comprehend that inequality in result isn't itself proof of some bias, PARTICULARLY if the bias-factor isn't even part of the algorithm.
Further, the fear is that simple objective analysis will occur without human intervention, and thus lack someone to call racist, sexist etc (in essence, so they're pre-labeling the author of algorithms as racist, sexist etc.).
For example
Your algorithm shows that people below a certain income level fail to repay loans at the normal rate, so it prices their loans higher or declines them.
en francais (Score:2, Funny)
The first movie I ever watched on Netflix was Inside Out. At the end, Netflix's first recommendation was that if I liked Inside Out, I should watch Inside Out in French.
Great algorithm there. Oh, the complexity. What's next? The Spanish version?
Probably the worst suggestion any person could have ever made to anyone outside of a French class.
The algorithm must have been so happy. Think about it. It found a movie where every word spoken is totally different, but there's a 100% match on the title! Woohoo!
Meta Thinkers (Score:2)
Doesn't work on us. We can think at the same level as the creators of said algorithms. It's not a "War of Math Destruction"; it's a "War of Meta Thinking". It is akin to playing a game of chess and trying to guess, based on previous experience playing your opponent, what you think they will do, and making a move to counter it. But if your opponent is thinking in the same manner, and thinks you might think in this way and anticipates you arriving at that conclusion, he or she can counter your counter.