Cry To Beat Iris Scanners

Ant writes "The Register has an article on how crying beats iris scanners. An MP who volunteered to take part in the UK ID card trials says the iris scanner used is uncomfortable and made his eyes water... The water in his eyes actually stopped the scanner from working, and it seems long eyelashes and hard contact lenses could fox it too... So we're going to have a system that is derailed by a few tears and fluttering eyelashes?"
  • by brolewis ( 712511 ) on Tuesday May 11, 2004 @03:03AM (#9114429)
    Hm, so technology meets the stereotypical cop: bat your eyelashes, cry a little, and get out of the ticket.
  • by Ckwop ( 707653 ) * on Tuesday May 11, 2004 @03:05AM (#9114441) Homepage
    For the 123rd time: *how* does biometric data prevent terrorism, halt illegal immigration, or do any of the things it's meant to do?

    Terrorists: Is any (known) terrorist worth his/her salt going to fly on their own passport? What's stopping them from getting a *real* passport with the correct biometrics in a different name?

    Immigration: Anyone who wants to immigrate enough will get the *real* id in a fake name!

    Stopping Criminals: Yes because criminals are moral enough not to have fakes!

    The trade-off isn't worth it. The only person this affects is you: the law-abiding, honest citizen. Life is no harder for any of the above groups.

    Simon.
    • by Anonymous Coward
      It's designed to make contractors money.
    • by hak1du ( 761835 ) on Tuesday May 11, 2004 @03:17AM (#9114510) Journal
      What's stopping them from getting a *real* passport with the correct biometrics in a different name?

      Well, in the Bush/Ashcroft 1984 utopia, the biometric identifiers are not only stored on your passport, but also in centralized databases. They aren't only used to tie you to your passport, but they are also used to retrieve possibly matching identities from those centralized databases.

      Furthermore, the same centralized databases contain assessments of how much of a threat you likely pose, based on detailed information about where you have traveled, what kinds of political views you have stated in public forums (and maybe in private), the results of surveillance, contacts, purchasing history, insurance history, habits, and interests.

      Immigration: Anyone who wants to immigrate enough will get the *real* id in a fake name!

      That one's even easier. The general idea is that all US citizens would have their biometric identifiers registered in central databases with an indication that they may enter the country. Furthermore, the biometric identifiers of everybody who has ever been denied entry would also be registered. When you appear at the border and your biometric identifiers fall into the first category, you are permitted in. If they fall into the second category, you won't be let in, no matter what your (probably fake) passport says. And if you fall in between--well, prepare for a long wait.

      Furthermore, even if the biometric identifiers are not reliable enough to be able to distinguish between hundreds of millions of people in centralized databases, governments are also assuming that they can make id cards that are sufficiently forgery-proof to make "just getting a *real* id in a fake name" rather difficult.

      I'm not saying that any of this will work. I'm just saying that, if you assume that biometric identifiers actually work reliably and/or that you can produce ids that are difficult to fake, you can concoct scenarios in which they would be useful for the intended purpose.

      I think those are big "ifs", but if you are going to attack these policies, I think you need to dig a little deeper to do so.
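
      Purely to make the three-way check described above concrete, here is a toy Python sketch. The template names and the exact set-membership test are illustrative assumptions, not a description of any real border system; real biometric matching is a fuzzy nearest-match search against a huge database, not an exact lookup.

      ```python
      # Toy sketch of the three-way border check described above: admit people on the
      # "allowed" list, refuse people on the "denied" list, and send everyone else to
      # secondary screening. Real matching is fuzzy, not exact string equality.
      ALLOWED = {"citizen-template-1", "citizen-template-2"}   # illustrative enrolled templates
      DENIED = {"denied-template-1"}                           # templates of people refused entry before

      def border_decision(scan_template: str) -> str:
          if scan_template in ALLOWED:
              return "admit"
          if scan_template in DENIED:
              return "refuse entry, whatever the passport says"
          return "neither list: prepare for a long wait"

      for t in ("citizen-template-2", "denied-template-1", "unknown-template"):
          print(t, "->", border_decision(t))
      ```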
      • by Ckwop ( 707653 ) * on Tuesday May 11, 2004 @03:33AM (#9114570) Homepage

        Furthermore, even if the biometric identifiers are not reliable enough to be able to distinguish between hundreds of millions of people in centralized databases, governments are also assuming that they can make id cards that are sufficiently forgery-proof to make "just getting a *real* id in a fake name" rather difficult.

        A UK reporter was able to obtain a *real* fake ID for just over a grand, through a network of bribes. It's not as hard as you think.

        Ask yourself this: how much do you reckon they pay their staff at the passport-issuing office? Now ask yourself how much that passport could be worth to someone! The math does itself.

        ID cards are flawed because you can't secure a system that large. Criminals have cash to 'invest' in perverting your system.

        Simon

        • by Ckwop ( 707653 ) * on Tuesday May 11, 2004 @03:37AM (#9114581) Homepage
          Haha. Lesson 2 in security: authenticating a person doesn't tell you their motive.

          Simon.
        • Ask yourself this: how much do you reckon they pay their staff at the passport-issuing office? Now ask yourself how much that passport could be worth to someone! The math does itself.

          In Bush's mindset, any staff person that would do such a thing should probably be considered a terrorist and can just be shipped off to Guantanamo without a trial, where they can be raped and tortured courtesy of the US government. Given that downside, faking ids for a few bucks probably seems a lot less appealing to the staf
        • Here in the US, my brother tried to replace his driver's license (the de facto US identity card) because his old one was damaged. He tried to use cash to pay the fee for this (probably something like $20), but then he discovered the driver's license center would only accept a money order because the employees of the center weren't trusted to handle cash. Seriously! Our government over here doesn't even trust the people who hand out ID cards with twenty dollars of cash!
      • That one's even easier. The general idea is that all US citizens....

        Except (s)he was talking about the UK and not the US approach. If not for getting the country wrong, you'd be correct :-)

    • by Ralph Wiggam ( 22354 ) on Tuesday May 11, 2004 @03:19AM (#9114516) Homepage
      All of the 9/11 hijackers had valid state IDs. I think about that while I'm showing my ID to the sixth person in the airport. Speaking of those guys, there was a big report released last month showing that the federal TSA baggage screeners were just as incompetent as the private employees they replaced. It's all window dressing to make you feel safe enough to go out and spend your money. Meanwhile, our ports are wide open to someone slapping a stamp on a bomb.

      -B
      • Yeah, I remember Rush Limbaugh played a mock-advertisement for the new "Elite Federal Baggage Screening Corps" which was real funny.

        Just ask our new recruit Johnny here: "I used to just sit at the bus stop, picking my nose and annoying people, but this is much better!"

        "Just pass our rigorous qualification test -" *yup, this one's breathing* "-and you're on your way to an exciting career!"

    • Particularly as visitors here for less than 3 months will be exempt.

      Also, people will rely on the DNA database as evidence, and not do the proper police/intelligence work. Fakers will escape the net. I always remember a maths teacher telling us to apply "sanity tests": roughly do the maths in your head and then check against the detailed calculations. The problem with systems over humans is that this is often not done. (A bit like "why didn't Saddam fire those WMDs if he had them?")

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Tuesday May 11, 2004 @03:42AM (#9114597)
      Comment removed based on user account deletion
      • Rather than giving someone your timecard, you just make a mould of your finger and they press that against the fingerprint reader.

        Not only can you make a mould of a finger from a willing participant (and why not, if you want to commit fraud against an employer), you can also create fake fingerprints from residual prints that someone has left behind.

        http://www.totse.com/en/bad_ideas/locks_and_security/164704.html

        So, how easy do you want it to be for someone to steal your luxury car?
      • by ezzzD55J ( 697465 ) <slashdot5@scum.org> on Tuesday May 11, 2004 @05:36AM (#9114931) Homepage
        It's an interesting idea, but it's too dangerous, because the whole point of biometrics is that they are tied to your person. You can't change them (eyes, fingers), you can't get new ones if your old ones are lost (eyes, fingers) or their information stolen (iris pattern, fingerprint), not everybody has them (eyes, fingers), and all scanners can probably be fooled with a little or much effort.

        Another reason I don't like biometrics, however, is that you cannot compartmentalise your authentication information any more. If, say, the tax people, phone company, bank and the police all use your biometric information to authenticate you, then that provides for a massive spillover in (authentication) information that you can't control - for the same reason that it is a bad idea to have the same PIN code on your ATM card and your GSM phone PIN, it's a bad idea for everybody using the same info to authenticate you. Nowadays, if somebody can impersonate you to the phone company, all they can do is run up high bills or get you disconnected or something. But if you're a phone company employee with access to someone's biometric info, you're a small step away from being able to impersonate that person to their bank, passport authority, etc., and take over their life.

        Even worse, as above, you can't change your info if it's compromised. Remember that biometric info is just a fancy password, with all the password weaknesses, with the advantage that you don't have to remember it, and the disadvantage that you can't change it or get a new one. People can intercept and replay your password (biometric info) to scanners; it's just very simple, symmetric, unreliable information in the end, and the whole scheme relies on the biometric scanners being trustworthy, as well as on the path from the scanners to the device interested in your identity.

        Biometrics aren't a silver bullet.
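
        To illustrate the "fancy password you can never change" point with a sketch: below is a toy Python verifier (the function names and hashing scheme are invented for illustration, not any real product's protocol) that trusts whatever template bytes arrive at it. An eavesdropper who captures those bytes once can replay them forever, and unlike a password the underlying biometric can never be rotated. Real deployments lean on liveness detection and trusted sensors, which is exactly the scanner-trust dependency mentioned above.

        ```python
        import hashlib

        # Toy model: the "biometric template" is just a blob of bytes derived from a scan.
        # A real system would use iris codes or fingerprint minutiae, but the replay
        # problem is the same whenever the verifier trusts raw template bytes.
        enrolled_db = {}

        def enroll(user, template):
            """Store a hash of the user's template at enrollment time (illustrative)."""
            enrolled_db[user] = hashlib.sha256(template).hexdigest()

        def verify(user, template):
            """Naive verifier: accept any byte string that hashes to the enrolled value."""
            return enrolled_db.get(user) == hashlib.sha256(template).hexdigest()

        alice_scan = b"iris-code-for-alice"      # stands in for a real scan
        enroll("alice", alice_scan)
        print(verify("alice", alice_scan))       # True: legitimate use

        captured = alice_scan                    # bytes sniffed between scanner and verifier
        print(verify("alice", captured))         # True again: the replay succeeds, and
                                                 # Alice cannot "change" her iris afterwards
        ```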

    • Terrorists: Is any (known) terrorist worth his/her salt going to fly on their own passport. What's stopping them getting a *real* passport with the correct Biometerics on a different name?

      Especially if said terrorist has the resources of a nation state behind him or her. In any such ID system there will be mechanisms for issuing bogus identities with valid biometrics. For such things as undercover cops, spies, "witness protection", etc.

      Stopping Criminals: Yes because criminals are moral enough not to ha
    • Also for the 123rd time...

      What possible value is there in making me carry a card around with me that matches my eyes/fingerprints? I ALWAYS carry my eyes and fingerprints with me at all times. Does anyone other than me spot the redundancy here?

      The standard reasoning is that the card carries more than just eye/print data.

      Well, if you want to store data that relates to terrorists' eyes/fingerprints, isn't it obvious that that data should be held somewhere a bit more secure and tamperproof than in the terroris
      • I have - it's one of the things I've noted in the margins of my printout of the consultation document [homeoffice.gov.uk] (120 pages of PDF). Comments are invited, closing date is the 20th of July. I suspect the government's answer will be that it allows banks and other non-government parties to check your biometrics without having access to the National Identity Register.
    • Terrorists: Is any (known) terrorist worth his/her salt going to fly on their own passport

      Why not? I don't think any terrorist worth his/her salt is going to give 'terrorist' as their occupation on the passport...

    • "it is important that we do not pretend that an entitlement card would be an overwhelming factor in combating international terrorism" - David Blunkett 3 July 2002

      ("entitlement card" was the proposed name for the ID card back then)
    • by mumblestheclown ( 569987 ) on Tuesday May 11, 2004 @05:43AM (#9114960)
      Of all the bullshit logic we see on slashdot, this has got to be the most persistent and annoying kind... the sort of logic that supposes that if something doesn't provide absolute security, then the security it provides must be worthless.

      In practice, this is a nonsense argument. For example, most people here know that WinXP copy protection can be broken with the help of a few Google searches that lead to a few Russian websites. There are trivial ways to defeat Master locks and the ordinary sort of locks that 'secure' house doors. Modern money *can*, with enough patience and technical skill, be counterfeited.

      And yet Microsoft continues to have a keycode unlock for WinXP, houses continue to have locks, and treasury departments still spend quite a bit per bill to give them 'security features.' Why?

      Because, as anybody who would think about this for two seconds (rather than just whoring for +5 Insightful, as you have) could see, protection in a real and complex world is not about *absolute* protection; it's about decreasing the *rate* of violation/infringement.

      I know several people who have bought XP where they pirated 95/98/whatever because of their fear of the online activation system. People continue to have locks on their houses because it will make their house less likely to be burgled, and the counterfeit protection on money stops all but the most determined counterfeiters.

      Likewise, biometric data will NOT "prevent" or "halt" illegal immigration in an absolute sense, and it is unreasonable to claim that's what it's "meant to do." Rather, it will SLOW THE RATE of illegal immigration (if not terrorism, which is obviously less of a statistical process because of the smaller data set). What is stopping them from getting a *real* passport with the correct biometrics in a different name? Have you ever tried getting an illegal passport of the regular kind? It's not easy! Now, try finding somebody who provides an illegal passport with an embedded chip in it! Not easy at ALL, especially given that, for example, when a UK passport is scanned at a US border, the US queries (or can query) the UK systems to vouch for the authenticity of the passport.

      To claim that anybody who wants to "immigrate enough" will simply get a real ID in a fake name is bullshit. Sure, there will always be the top n% who are determined, clever, and connected enough to beat any system. But with increased smart security such as biometrics in concert with other ideas, this n% becomes smaller and smaller.

      MOD PARENT DOWN as he has provided NO INSIGHT

    • It's worse than that. Much worse. It's not even a security trade-off (where some convenience and privacy is sacrificed to gain a little security); it can quite convincingly be argued that systems such as these make security *worse*. Consider the following, insanely optimistic assumptions:
      • Everyone will travel under their real identity.
      • There will exist no fake-but-valid-looking biometric identity-cards.
      • No one will be able to obtain real cards under an assumed name.
      • The BigBrother database manages, by collec
  • uhh.. (Score:4, Funny)

    by Anonymous Coward on Tuesday May 11, 2004 @03:05AM (#9114444)
    So we're going to have a system that is derailed by a few tears and fluttering eyelashes?

    We already have a system like that. It's called Windows.

  • by guycouch ( 763243 ) on Tuesday May 11, 2004 @03:07AM (#9114458) Journal
    "So we're going to have a system that is derailed by a few tears and fluttering eyelashes?" Yes. They're called women.
  • by Alsee ( 515537 ) on Tuesday May 11, 2004 @03:08AM (#9114463) Homepage
    I bet sandpaper works too!

    -
  • by scubacuda ( 411898 ) <scubacuda@gmai[ ]om ['l.c' in gap]> on Tuesday May 11, 2004 @03:09AM (#9114464)
    When I hear "beats iris scanners," I think of an iris scanner giving some sort of false positive.

    Sure, there's a problem with it correctly identifying the real people. But is this really "beating" the scanner?

    Just a thought...

    • You're right. The Register had a pretty misleading and incorrect article title.
    • by Vellmont ( 569020 ) on Tuesday May 11, 2004 @03:15AM (#9114499) Homepage

      But is this really "beating" the scanner?


      If 7% of the time the scanner can't ID you, those people will probably just routinely be let in. If all you have to do is tear up a little, have long eyelashes, or whatever, then anyone that'd be caught by this system will do just that. A system where it's easy to become incorrectly identified is a useless one.
      • Very true.

        And for immigration purposes, not showing up on the system IS beating the system. The immigrant can then claim that they have just arrived at port and begin the immigration process again, despite having been in the country for a while and previously having had their application rejected.

        The application looping is what these systems are supposed to prevent and is much of the basis for the ID card proposals.

        This system is worthless.
      • If 7% of the time the scanner can't ID you, those people will probbably just routinely be let in.
        Or, every time there's an error you get a free body cavity search.
    • From the point of view of us British people who are facing the threat of ID cards, "beating the iris scanners" means defeating the whole concept and forcing the government to listen when we say we don't want them.

      If we all cry when they come to scan us, we can stop this.
  • by MoThugz ( 560556 ) on Tuesday May 11, 2004 @03:11AM (#9114477) Homepage
    This sort of thing happens all the time when you're using a new technology. Nothing just works as expected the first time round, and it's precisely because of such issues that people innovate.

    And, IIRC, the UK is just doing a trial run of this biometric ID card thingy, and the purpose of such trial runs is to catch "gotchas" like this.

    I'm not going to rant on the "privacy issues"... heck, my country uses an ID card system as well, and as far as I'm concerned, it eases a lot of trivial processes (loan applications, etc. etc.) and in case something happens to me, at least people will know who I am.
    • A trial?

      How many government trials with political backing don't get implemented?

      If it goes bad, Blunkett will just say that there were issues to iron out. I can't imagine for 1 minute that he'll cancel it.

      • How many government trials with political backing don't get implemented?

        I would say quite a few, if it proved massively unpopular, especially when the government is democratically elected.

        If they push on with it despite massive protests and so on... chances are they will not get re-elected, and the winning party is almost surely campaigning primarily for the axing of said unpopular program.

        I too, can't say if the program will get axed... and if it's based on sincere fact finding and R&D, that

        • I would say quite a few, if it was proven massively unpopular, especially when the government is democratically elected.

          If only you were right. The poll tax was unpopular in Scotland and still got implemented.

          Also, Blunkett completely ignored the public feedback on ID cards, where something like 80% of respondents were opposed, complaining that that was because of an orchestrated campaign (like people are sheep or something).

        • Both Labour and the Conservatives support the introduction of biometric ID cards. Labour because they believe it will give them control and the Conservatives because of the amount of money their contributors are going to make while rolling the system out.

          We're lucky in that there is one party who are definitely against ID cards, the Liberal Democrats, but realistically, they don't matter. The UK has an election system which favours the largest minority (35%-40% is enough), handing them a disproportionate
  • by Wasteofspace ( 777087 ) on Tuesday May 11, 2004 @03:12AM (#9114479)
    I recently had a bad fall and ended up in hospital (no need to mention the shopping trolley and the amount of alcohol that caused this situation)

    After some standard tests, the doctor spotted that one of my irises was larger than the other, which had something to do with the head trauma.

    Basically that means that if you need to pass an eye scan, just drink lots, grab a trolley, fall on your head, and nothing will be able to recognise you by your eyes any longer, as their features will have changed.

    (probably talkin s%$t, but i could be right, right??)
  • by 0xC0FFEE ( 763100 ) on Tuesday May 11, 2004 @03:13AM (#9114489)
    There ain't no such thing as a technology that gets worse or doesn't improve. So in due time things will be perversely efficient and operate in a wide range of conditions. Yeah, it takes time, but in this particular case, the more the better in my view.

    Anyway, when I go get my eyes examined, there's this machine taking a picture of my retina and blowing air into it so as to remove water. Oh and they ask me to remove my lens first, imagine!

    • by prockcore ( 543967 ) on Tuesday May 11, 2004 @03:29AM (#9114559)
      There ain't no such thing as a technology that gets worst or doesn't improve.

      True, but there is such a thing as a technology that has been proven to be inherently flawed.

      Just google for "Bertillonage" for an example of a failed biometrics concept, which no amount of technology could save.

      Is iris scanning inherently flawed? I don't know, but if they're just now finding out that crying gives a false negative, I don't think anyone has done any real tests to prove it one way or the other.
      • This is going OT, but we're not talking inter-operability or performance here where "flawedness" is caused by the infrastructure in place. We're talking basic stuff like data acquisition and data analysis algorithms.

        Now if the data acquisition is flawed, there's nothing you can do and there's no algorithm to correct the flaws. Following my earlier suggestions, it is not really _hard_. If the algorithms are flawed then it's no big problem because 1) You've acquired data through a proper acquisition pr

        • I looked at your "example" of 19th century biometrics. Interesting historical value. Your point was?

          My point was that here was a system where, no matter how accurate your measurements were, it didn't matter, since the data wasn't unique enough.

          It seems to me that no one has done any real tests of iris scanning to show that it isn't easily circumventable. Does LASIK surgery affect the scan? What about opaque contacts?

          No amount of technology is going to help if iris scanning is inherently flawed... and we're not goi
  • by jamesh ( 87723 ) on Tuesday May 11, 2004 @03:15AM (#9114497)
    I have eyelashes long enough that they rub on most sunglasses I wear. They also blur my peripheral vision unless I open my eyes up really wide. How long do they have to be to interfere with such a system?

    I've never been game to trim them though :)

    My daughters have inherited the long eyelashes, and they suit them much better.
  • by geekanarchy ( 769840 ) on Tuesday May 11, 2004 @03:17AM (#9114507)
    I may just start selling signs that say "Secure Area: No Chopped Onions Allowed".
  • by grendel_x86 ( 659437 ) on Tuesday May 11, 2004 @03:18AM (#9114511) Homepage
    From Poindexter (yes, the evil Big Brother guy), where he said "in a lot of ways we have the worst of both worlds: no security and no privacy".

    http://www.wired.com/wired/archive/12.05/poindexter.html

    (It was in this past wired, good article)
    • quote from the Poindexter article [wired.com]:

      It's a little like the Sims - you create a virtual world that has real addresses, real airports, but is populated with imaginary people. We built them by taking a list of all the last names in the country and then adding first names at random. Then we had them take trips. We had a team of a dozen people who came up with scenarios. You introduce terrorists into your world, and then you start looking for ways to pick them out from the data.

      so they actually pay some people

  • by Tony.Tang ( 164961 ) <slashdot&sleek,hn,org> on Tuesday May 11, 2004 @03:23AM (#9114537) Homepage Journal
    The title of the post is poorly worded. Crying doesn't BEAT iris scanners -- that seems to imply that by crying, the iris scanner goes "okay, you're good." Instead, the iris scanner FAILS if you cry. That means, if your eyes water, the iris scanner may not recognise you.

    Needless to say, this makes a lot more sense, and is actually more acceptable. After all, (and here's my layman's view coming in) iris scanners are essentially cameras with some pretty cool-dude computer vision algorithms in the back. If your eyes are teary, the CV algorithms get messed up -- it's kind of like having a distortion lens (like an oddly shaped magnifying lens) on the front of the camera.
    • Well, it may BEAT iris scanners as a tool to use on millions of people in an airport: if you have 1000 people sitting and waiting for positive ID, and the number of people having trouble with these machines (teary eyes, red eyes from the flight, alcohol from an intercontinental flight, drugs, eyelashes, or anything else that upsets the precious algorithms) is too large, the system becomes impractical.
      In that sense, you can say "beat".
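
      As a rough illustration of the parent's "camera plus algorithms" point: iris systems in the Daugman style reduce each eye to a binary iris code and compare codes by Hamming distance against a threshold. The toy Python below uses made-up codes, noise levels, and a 0.32 threshold purely for illustration; it shows how the distortion from tears or occluding eyelashes flips enough bits to push a genuine user past the threshold, i.e. a false reject, not a false accept.

      ```python
      import random

      CODE_BITS = 2048      # iris codes are on the order of 2048 bits
      THRESHOLD = 0.32      # illustrative accept threshold on normalised Hamming distance

      def hamming(a, b):
          """Fraction of bits that differ between two equal-length bit lists."""
          return sum(x != y for x, y in zip(a, b)) / len(a)

      def rescan(code, flip_prob):
          """Simulate a fresh scan of the same eye: each bit flips with probability flip_prob."""
          return [bit ^ (random.random() < flip_prob) for bit in code]

      random.seed(0)
      enrolled = [random.randint(0, 1) for _ in range(CODE_BITS)]

      scans = {
          "clean scan":     rescan(enrolled, flip_prob=0.10),  # ordinary sensor noise
          "teary scan":     rescan(enrolled, flip_prob=0.40),  # heavy distortion/occlusion
          "stranger's eye": [random.randint(0, 1) for _ in range(CODE_BITS)],
      }

      for name, scan in scans.items():
          d = hamming(enrolled, scan)
          print(f"{name:14s} distance={d:.3f} accepted={d < THRESHOLD}")
      ```

      The teary scan fails in the same direction as a stranger's eye: the system refuses to recognise the right person, which is exactly the failure mode the thread is describing.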
  • accuracy (Score:2, Interesting)

    by noelo ( 661375 )
    While people may joke about this technology and the whole ID verification process/big brother, the fact is that it's here to stay, and I'd rather that flaws like this one are discovered in the initial test stages than having to spend hours proving who I am at an airport.
  • Astigmatism (Score:4, Interesting)

    by groupthink ( 568205 ) on Tuesday May 11, 2004 @03:38AM (#9114585)
    Where I work we use these iris scanners [lgiris.com]. I wear glasses for my astigmatism [m-w.com] and the system reads just fine through my glasses, unless I turn them perpendicular to my face. Other people who work here have to remove their glasses regardless.
  • Failure rates. (Score:2, Interesting)

    by rew ( 6140 )
    ... fails to correctly identify people in just 4 percent of cases ...

    If you do a test run with 1000 individuals, and find that 4% of the subjects are identified as someone else, then you really have a problem.

    If you then scale up to 1 million people, you will find that a MUCH larger percentage of people will be misidentified: There is a much larger database of people who might have an iris that to the computer looks almost the same. That's when the shit hits the fan.
    • Re:Failure rates. (Score:2, Insightful)

      by spacefrog ( 313816 )
      How in the hell you got modded up is beyond me.

      What specific evidence, or even real reasons, do you cite that "If you then scale up to 1 million people, you will find that a MUCH larger percentage of people will be misidentified"?

      Do you have anything real to cite?
      • Applications are quoted as "Identification", as in "who is this person?". Not as Identity verification, as in: "His pass says he's Roger Wolff, is that true?". Maybe that's a common misrepresentation by the press.

        If you pose the "who is this" question to the computer, your scan will be matched against 1000 others. If the per-match chances of going wrong are 0.004 percent, then doing 1000 matches will result in about 4% error rate.

        With that error rate, trying 1 million matches will result in a correct iden
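
        A quick way to sanity-check that scaling argument (the 0.004 percent per-comparison figure is just the illustrative number from the comment above): if each one-to-one comparison has false match probability p, then a one-to-many identification search against N records goes wrong with probability roughly 1 - (1 - p)^N, which grows quickly with N.

        ```python
        # Sketch of the scaling argument above: per-comparison false match rate p,
        # database of N enrolled people, probability that at least one wrong record matches.
        p = 0.00004   # 0.004 percent per one-to-one comparison (illustrative figure from above)

        for n in (1_000, 100_000, 1_000_000):
            p_any_false_match = 1 - (1 - p) ** n
            print(f"N={n:>9,}  P(at least one false match) ~ {p_any_false_match:.1%}")
        ```

        That is why the distinction the sibling comments draw matters: one-to-one verification ("is this the person the card claims?") scales very differently from one-to-many identification ("who is this?").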
    • Re:Failure rates. (Score:3, Insightful)

      by bobbis.u ( 703273 )
      Where did you get your quote? The article states that it

      failed to match people with their details in just four per cent of cases

      That is totally different from saying that 4% of the subjects are identified as someone else, which your quote does not imply either.

      Anyway, surely the system is only for authentication and not identification? I.e. they have your iris on record, you input your name and give them the iris scan. If the two match, you are who you say you are. I seriously doubt they will just scan you

  • First they are confiscating fingernail clippers; next it'll be eye drops. Will using eye drops in an airport mean an automatic strip search? People with contacts, beware.
  • by Raindeer ( 104129 ) on Tuesday May 11, 2004 @03:52AM (#9114629) Homepage Journal
    The pain with biometrics is that it is so sexy and so hyped up that people aren't willing to look at the numbers behind it. Contrary to what privacy and security people always shout, the biggest problem isn't that it doesn't stop criminals and terrorists. The single biggest problem of biometrics is its failure rate.

    If you want to roll out biometrics on a massive scale, a 0.1 percent chance of falsely rejecting a person means that at an average large airport, like JFK, Atlanta, or Heathrow, 1 in a thousand scans fails. Now this might not sound like a big chance, but you need to go through the biometric scanner twice, once when you get on and once when you get off, so roughly 1 in 500 travellers hits a failure. The result is that, with the hundreds of millions flying on a yearly basis in Europe and the US, over 100,000 people might not get on or off a plane.

    You might be one of them!
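
    For what it's worth, the back-of-the-envelope numbers above can be checked directly; the 0.1 percent false reject rate and the passenger volume below are illustrative figures, not measured ones.

    ```python
    # Check of the parent comment's rough arithmetic.
    frr = 0.001                        # 0.1% chance a legitimate traveller is rejected per scan
    scans_per_trip = 2                 # scanned once at departure and once on arrival
    passengers_per_year = 300_000_000  # "hundreds of millions" flying in Europe and the US (illustrative)

    p_trip_fails = 1 - (1 - frr) ** scans_per_trip      # about 1 in 500 trips hit a failure
    expected_failures = passengers_per_year * p_trip_fails

    print(f"Per-trip failure probability: {p_trip_fails:.4f} (about 1 in {round(1 / p_trip_fails)})")
    print(f"Expected travellers affected per year: {expected_failures:,.0f}")
    ```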
  • It stopped the scanner from working != gave a false positive on the scanee.

    This thing isn't going to let anyone by who has watery eyes; it's just going to give an error and ask them to scan again. Just like a bank card with a weak magnetic strip: they don't just automatically approve your purchase, it gives an error and asks you to swipe again.

    Of course, I'm very skeptical about how biometrics helps ANYTHING... but this is outlined well in a +5 post here; read that.
  • by NewtonsLaw ( 409638 ) on Tuesday May 11, 2004 @03:55AM (#9114644)
    Given that the integrity, honesty, competence and trustworthiness of those at the top of the political power-pyramid has been well and truly drawn into question by recent events related to the treatment of prisoners in Iraq, am I the only one worried that these centralized databases of personal ID and info represent a *huge* potential for abuse?

    It really scares me that what was frightening science fiction yesterday, looks like becoming reality tomorrow.

    Looks as if one of our most important rights (the right to privacy and anonymity) is about to be expunged forever, with nary a whimper from the general population.

    When *used* only as promised, modern sophisticated ID and tracking systems may pose no threat to the general public -- but what happens when (and that is *when*, not "if") they are abused?

    What protection mechanisms are incorporated to stop some bureaucrat or politician (ab)using such a system to track a foe and use that information for their own means?

    Isn't about time we told our politicians to back off and mind their own business?

    While I'm most certainly not anti-American, I think the simplest and most effective way the USA could reduce the risk of terrorist attacks is to get out of Iraq and stop trying to expand its empire and the reach of its military muscle.

    I can imagine how much better life would be for US citizens if the US government spent as much on the health, welfare and education of its own people as it has on war in the past 60 years or so. And ultimately, what have they got to show for their involvement in Vietnam, Grenada, Somalia, Iraq, etc.?

    Yeah, we all know that Saddam was a despot -- but I'd wager that there are just about as many people who regard Bush as a despot. Surely that gives them no more right to attack the USA than the USA had to attack Iraq. All sides in this battle are completely and utterly mad.

    Uh-oh, off topic :-(

    • No, you're not the only one worried by it. What bugs me is that my lame-ass government will jump on this bandwagon too. Your government will say something like anyone flying into the US must be registered in our database - which is shared with airlines and other governments (this already happens with the new microchipped passports). To smooth things out, we'll get scanned when we renew our passports, and hey presto, I'm being automagically monitored by your government next time I'm travelling to Europe. If I unw
    • by Lord Ender ( 156273 ) on Tuesday May 11, 2004 @12:23PM (#9117582) Homepage
      If the US hadn't spent so much on its military in the past 60 years, much of the world would be communist (in the model of the USSR) and would not have this "freedom" we now enjoy. There would still be a cold war. Countries like South Korea would be in the sad state of countries like North Korea. If we had taken that money and put it toward social services, we would currently have an unsustainable population, because every unproductive bum in the world would come here for free health care, shelter, and food. And these bums would have a disproportionately high number of children, who inherit their freeloading attitude. But enough of this alternate timeline.
  • by thesp ( 307649 ) on Tuesday May 11, 2004 @03:55AM (#9114645)
    This seems a worrying trend with biometric systems - even innocent fear/nerves cause physiological changes which can make a scanner report 'no match'. If biometric ID were to become compulsory, there is the distinct possibility of this problem becoming a real danger to the population.

    For example, if you have some nerves or phobia about the screening process (big men with guns, what-ifs about false positives), your physiology changes, and your biometrics no longer match your card. You are therefore taken in for further questioning.

    Even if you are cleared, the next time it happens, you are more nervous, and eventually this becomes a common event for you.

    In extreme cases, some people's reinforced phobia would then prevent them from claiming benefits, travelling, or doing anything that the ID was required for, since they fear the accusations and questioning.

    This is similar to effects seen on the now-discredited polygraph, still in use by agencies worldwide.

    For example, I always get tense going through metal detectors. This is partly due to a childhood visit to Washington from the UK, when by accident I triggered the bomb detectors on a visit to the CIA buildings. (I was about 7, and didn't realise my pocket fan would set off the detectors.) I was taken away from my parents, and searched. This is a big thing when you're seven, and now these sorts of checks make me (irrationally, I know) very twitchy.

    If failing these tests due to phobia were to become a pattern with me, even if it meant I was often singled out in any sort of official process, I am sure my phobia's symptoms would increase, just driving up the error rate. Positive feedback, you see.
  • An MP who volunteered to take part in the UK ID card trials says the iris scanner used is uncomfortable and made his eyes water.

    Secondary tests revealed that he doesn't have glaucoma.

    (I hate that damn 100 PSI glaucoma test. You might think your dentist is sadistic. I *know* my ophthalmologist is a complete psycho.)

  • ...people with security clearance, with one or both their eyes missing. That will mean the iris scanners have been improved!

    (and if you see those people without fingers... well, that will mean electronic fingerprint recognition became popular)
  • by Zog The Undeniable ( 632031 ) on Tuesday May 11, 2004 @04:23AM (#9114743)
    Nationwide Building Society [nationwide.co.uk] in the UK tried iris scanning for ATMs a few years ago, and it was 100% successful. The technology wasn't rolled out further because of (a) cost and (b) it was fairly useless as a fraud prevention measure unless all other banks did it too - you could just use a non-iris ATM if you only had a card and PIN.

    Rather gruesomely, the system checked for a pulse in the iris to ensure that you hadn't got a life-size photograph...or cut off the account owner's head.

    • by TheLink ( 130905 ) on Tuesday May 11, 2004 @05:00AM (#9114846) Journal
      Well, they're not all made the same; just like anything, there are different specs. I'm not sure about the ones used in the ATMs - I heard some of those can work from quite a significant distance, 1 metre? The one I played around with could only do about 10 cm to 20 cm, maybe 30 cm.

      To register a person you'd want the best pic possible, so you normally want a cooperative subject. But after that, the one I tested was pretty OK; it even IDs people with scratched eyewear and even some sunglasses.

      As for the danger-to-epileptics claims, that's stupid - the stuff can work with IR light. The one I played around with had 3 red LEDs for illumination and was made by LG.

      Just buy the right iris scanner for the task and it'll work OK, unless the iris is obscured - I suppose really thick/long eyelashes might cause problems.

      The epileptic thing really sounds fishy; perhaps there's a hidden story/agenda somewhere. Now, if they had said that fake contact lenses could cause problems, I'd believe them - then you need fancy scanners that detect pulses and the usual involuntary iris size changes, and I doubt the cheap scanners do that.

      Whatever it is, with biometrics for real security you always need a guard there, otherwise you can bring in equipment to fool the sensors. No self respecting guard is going to let you stick some fancy gizmo into/in front of a biometric sensor...

  • by Moderation abuser ( 184013 ) on Tuesday May 11, 2004 @04:34AM (#9114773)
    Iris scanners have a failure rate of around 4% to 7%. This is a failure to identify a legitimate person against a *previously stored scan*, i.e. the scan stored in your biometric card or the scan stored in the government database.

    Fingerprint scanners have a failure rate of around 2%.

    Facial scanners have a failure rate of 10+%.

  • by JaJ_D ( 652372 ) on Tuesday May 11, 2004 @04:52AM (#9114821)
    So we're going to have a system that is derailed by a few tears and fluttering eyelashes?

    Yep, it's called my love life :-]

    Jaj
  • by rpjs ( 126615 ) on Tuesday May 11, 2004 @05:18AM (#9114882)
    I had ops for cataracts when I was a child. As a result my pupils aren't the nice round sort the rest of you have but are sort of ragged. I wonder how Mr Blunkett's rinky-dinky little fascist scanner equipment will cope with my eyes?

    Well, no matter; hopefully the soon-to-be-missus and I will have emigrated to somewhere saner by the time the "voluntary" ID cards have stopped being voluntary.
  • Since when (Score:2, Funny)

    by SlashDread ( 38969 )
    is iris scanning a "good idea"?

    Or does Halliburton own some iris-scanning patents?

    In that case, I, for one, welcome our new techno info patent overlords.

    "Dread"
  • by darthdrinker ( 150713 ) on Tuesday May 11, 2004 @07:05AM (#9115174)
    I work at a high-security department of a large company. I have to pass the iris scan on a daily basis and have never had any trouble with the machine not accepting my eye. And you don't want to know how my eyes look after a weekend of drinking and barely any sleep. You don't have to open your eyes very wide or do anything that would make your eyes water. You just look into the machine the same way you normally look at something. Very rarely the system doesn't accept you the first time, but when you try a second time, it gets it. We are talking about a 10-15 second procedure, so you can't complain about that. I don't see the problem.
  • by mwood ( 25379 ) on Tuesday May 11, 2004 @09:20AM (#9115745)
    Nah, this is just what happens when starry-eyed techies meet the real world. The gadget works under perfect conditions, and now the field trials will shake out all of the practical problems that were not thought of in the lab.

    I think the real impediment is going to be the natural trepidation of one who finds himself expected to submit his *eyes* to a machine which will decide whether he's good or evil.
  • by msheppard ( 150231 ) on Tuesday May 11, 2004 @09:40AM (#9115918) Homepage Journal
    Beating the device would imply somehow fooling it into granting you access. The crying effect makes it so the device will not work. So it might be a useless technology if some people can't use it.

    M@
  • by failedlogic ( 627314 ) on Tuesday May 11, 2004 @09:41AM (#9115932)
    "crying beats iris scanners"

    This report is patently false. Why? This news comes from a politician. We all know that they are void of human emotion; therefore they cannot cry.
