Chipping is for Pets, not People


Earlier this year, Medical Daily published an article titled, “Adventures In Biohacking: Hi-Tech Swedish Building Complex Implants RFID Microchips In Employees”.   In a nutshell:  the Epicenter building in Stockholm allows employees to opt in to having an RFID chip implanted in each hand, in the skin between the thumb and index finger.  The chips are provided by BioNyfiken, a Swedish biohacking group.  A tattoo artist inserts the chips.

Such articles are near and dear to me:  my biometrics research has been cited several times on end-times websites.  These sites suggest that biometrics is one possible “Mark of the Beast” – a prophecy from Revelation 13 in which it becomes difficult for people to do much of anything without having a specific mark affixed to them.  Implanting RFID in people is another possibility.

Regardless, the implanted RFID chip does have benefits.  The chip contains a unique digital code associated with that person, and facility access can be achieved simply by holding one’s hand in front of the RFID reader.  Likewise, the chip can be used to access specific offices inside the building, and will soon be used to pay for lunch in the Epicenter cafeteria.  And it’s really hard to lose.

None of this is news in the biometrics world.  Fingerprint, vein imaging, iris imaging, and other modalities already cite success stories such as payment for lunches in school cafeterias, entry to buildings, rooms, vaults, and even entry into IT systems.

None of this is news to pet owners, either.  My dog is chipped, so that he can be identified when he escapes down the trail and into the creek.  But there is a huge difference between pet-chipping and people-chipping:  there is little reason for anyone to steal my dog’s identity.  Put a chip in a person, though, and it’s different:  there are lots of reasons that someone might want to steal a person’s identity.  The use cases are far from identical.

And that’s my real beef.  What is the need to implant RFID chips to identify people when biometrics can already do that?  Properly used, DNA identification can reach a false acceptance rate (FAR) of 1 in 250 billion.  Given a current world population approaching 7.5 billion, that’s pretty much unique.  Even a less expensive technology, Fujitsu’s palm vein identification, claims a FAR of 0.0008% in day-to-day usage.
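To put those two vendor-quoted figures on the same footing, here is a quick back-of-the-envelope conversion (the rates come from the claims above; the arithmetic is mine):

```python
# Compare the two false acceptance rates (FAR) quoted above.
dna_far = 1 / 250e9       # DNA identification claim: 1 in 250 billion
palm_far = 0.0008 / 100   # Fujitsu palm vein claim: 0.0008%

# Express the palm vein FAR as odds, and the gap between the two.
print(f"Palm vein FAR = 1 in {1 / palm_far:,.0f}")                  # 1 in 125,000
print(f"DNA is roughly {palm_far / dna_far:,.0f}x more selective")  # 2,000,000x
```

So palm vein identification is about two million times less selective than DNA – yet a 1-in-125,000 false acceptance rate is still more than adequate for opening a door or buying lunch.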

And no matter what, I think somebody is missing the point.  Whether the RFID chip is on a card or buried in your hand, it’s still an RFID chip.  It’s harder to lose, that’s true.  But if a person’s RFID is compromised, it may be a pain – literally – to issue a new identity.

RFID leader HID Global acquired fingerprint vendor Lumidigm in 2014, with a view to adding biometric building access to its existing RFID card access.  If that sounds the same as implanting a chip in someone’s hand, think for a moment.  A chip is a chip, wherever it is.  But a chip and a fingerprint together… now we’re talking multi-factor authentication – far stronger than an RFID chip on its own.   Banks get this – fingerprint readers are starting to appear on cash dispensers to reduce fraud.
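Why is the combination so much stronger?  If an impostor must pass both checks, and the two factors fail independently, the combined false acceptance rate is the product of the individual rates.  A minimal sketch, using purely illustrative numbers (neither rate is a vendor figure):

```python
# Illustrative only: assumed per-factor false acceptance rates.
rfid_far = 1e-4     # chance a cloned or forged chip credential is accepted
finger_far = 1e-5   # fingerprint reader false acceptance rate

# Both factors must pass; assuming independent failures,
# the combined false acceptance rate is the product.
combined = rfid_far * finger_far
print(f"Combined FAR: {combined:.0e}")  # 1e-09
```

Even with these modest per-factor rates, the pair is five orders of magnitude harder to spoof than the chip alone – which is the point of pairing the card with a fingerprint rather than burying the card under the skin.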

As one who researches new technology markets for a living, I always applaud new approaches – this case included.  But I do have to wonder:  Why bother implanting chips in people instead of just putting a biometrics reader on the door or at the lunch counter?  At least there’s no risk of infection when installing the reader.
