Biometric Technology: a brief history
Modern biometric technology began in the 1960s and has evolved into
high-tech scanners that read biological markers with close to 100%
accuracy. In 2020, biology-based science is disrupting the
authentication industry at remarkable speed. The future is now passwordless.
A brief history of biometric technology
1960s—exploration
In 1960, scientists began identifying the physiological components of
acoustic speech and phonic sounds. This was the precursor to modern
voice recognition technology. In 1969, the Federal Bureau of
Investigation (FBI) pushed for automated fingerprint identification,
which led to the study of minutiae points, the distinctive features
used to map a fingerprint's unique pattern of ridges.
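A minutia is simply a point where a fingerprint ridge ends or splits in two. As a rough, hypothetical illustration (the types, thresholds, and matching rule below are invented for this sketch, not taken from any real system), minutiae-based matching boils down to comparing sets of labeled points:

```typescript
// A simplified, hypothetical representation of fingerprint minutiae:
// each point records where a ridge ends or splits, plus its orientation.
type Minutia = {
  x: number;     // position on the scanned image (pixels)
  y: number;
  angle: number; // ridge direction at the point (radians)
  kind: "ridge_ending" | "bifurcation";
};

// Naive illustration of matching: count the points in a live scan that
// sit close to a stored template point of the same kind and orientation.
function countMatches(template: Minutia[], scan: Minutia[]): number {
  const close = (a: Minutia, b: Minutia) =>
    Math.hypot(a.x - b.x, a.y - b.y) < 10 && // within 10 pixels (arbitrary)
    Math.abs(a.angle - b.angle) < 0.3 &&     // similar ridge direction
    a.kind === b.kind;
  return scan.filter((s) => template.some((t) => close(t, s))).length;
}
```

Real matchers first align the two point sets and score them far more carefully, but the core idea is the same: reduce a full fingerprint image to a small set of labeled points that can be stored and compared automatically.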
1970s and 1980s—FBI funding
By 1975, the first scanners to extract fingerprint points were
prototyped, funded by the FBI. Digital storage costs were prohibitive,
so the National Institute of Standards and Technology (NIST) worked on
image compression and matching algorithms.
The work at NIST led to the development of the M40 algorithm, the
first operational matching algorithm used at the FBI. Used to narrow
the human search, this algorithm produced a significantly smaller
set of images that were then provided to trained and specialized
human technicians for evaluation. Developments continued to improve
the available fingerprint technology.
—biometricupdate
Work by NIST and others also advanced speech, ocular, and face
recognition during this period: patents were filed for iris
identification and for subcutaneous blood vessel patterns, and mugshots
were digitized into databases.
The 1990s—biometric science takes off
The 1990s was a boom time for biometrics. Researchers showed that
fewer than one hundred values were enough to differentiate normalized
face images, the insight behind the early eigenface approach to face
recognition. The National Security Agency (NSA) formed the Biometric
Consortium. The Department of Defense (DoD) partnered with the Defense
Advanced Research Projects Agency (DARPA) to fund face recognition
algorithms for commercial markets, and commercial products soon
followed. Lockheed Martin was contracted to build an automated
fingerprint identification system for the FBI. CODIS (the FBI's
forensic DNA database) was born to digitally store, search, and
retrieve DNA markers. Retrieval was bottlenecked because the data
existed in silos and lab sequencing of DNA markers took hours. A
network would soon facilitate the electronic mass exchange of stored
biometric data.
The 2000s—rollout of biometric tech
West Virginia University (WVU) established the first bachelor’s
program in Biometric Systems and Computer Engineering. The
International Organization for Standardization (ISO) standardized
generic biometric technologies, promoting the collaborative exchange
of international biometric research and development. Palm print
biometrics made an appearance on the biometric stage. The European
Biometric Forum was established to address market fragmentation and
barriers to adoption. Face recognition became the accepted global
biometric authenticator for passports and other Machine Readable
Travel Documents (MRTDs).
United States immigration authorities used biometrics to screen visa
applicants at ports of entry and exit, tightening security against
offenders while facilitating travel for millions of legitimate
travelers. The DoD gathered biometric data (fingerprints, face, voice
samples, iris images, and DNA swabs) to track and identify national
security threats. President Bush made personal identification cards
mandatory for all government personnel, and automated palm print
databases were deployed statewide. In 2008, the Department of Homeland
Security
stopped a suspected terrorist at the border by cross-matching
biometric data.
2010s—the shift from state to smartphones
Biometric technology continued to power terrorist identification, but
a shift happened in 2013 when Apple introduced Touch ID on the iPhone
5S. Biometrics was no longer strictly a government tool; it was now in
the hands of the public.
Touch ID is heavily integrated into iOS devices, allowing users to
unlock their device, as well as make purchases in the various Apple
digital media stores (iTunes Store, the App Store, iBookstore), and to
authenticate Apple Pay online or in apps. On announcing the feature,
Apple made it clear that the fingerprint information is stored locally
in a secure location on the Apple [chip], rather than being stored
remotely on Apple servers or in iCloud, making it very difficult for
external access.—Source: National Science & Technology Council
2020—the explosion
Over the last 60 years, the barriers that made biometric
authentication inaccessible for consumers, enterprises, and even code
developers—the digital artisans of tech—were removed. Then, something
happened in 2020:
Momentum.
The moving parts came together, catapulting biometric authentication
into the mainstream.
Two technologies collided at the right time to facilitate
the adoption of biometric authentication:
The science behind scanning sensors improved to near-perfect accuracy
The use of smartphones went through the roof
Scanning sensors integrated into smartphones
The science behind biology-based scanners improved to near-zero error
rates and the price of sensors went down. That made integrating
scanning sensors into smartphones easy. Around 2010, smartphone use
exploded when 4G (LTE) wireless networks made streaming possible for
millions of mobile consumers. Samsung and Apple built biometric
fingerprint scanners into smartphones to authenticate the millions of
smartphone users—and the public accepted this integration with open
arms. Apple transitioned to face recognition with iPhone X.
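A quick aside on what "near-zero error rates" means: biometric sensors are usually judged by two figures, the false accept rate (how often the wrong person gets in) and the false reject rate (how often the right person is locked out). The numbers below are invented purely to show how those two rates are calculated:

```typescript
// Hypothetical test results for a fingerprint sensor. All numbers are
// invented for illustration; real figures vary by sensor and algorithm.
const impostorAttempts = 1_000_000; // scans by the wrong person
const falseAccepts = 10;            // impostors wrongly accepted
const genuineAttempts = 1_000_000;  // scans by the right person
const falseRejects = 2_000;         // genuine users wrongly rejected

// False Accept Rate: fraction of impostor attempts that succeed.
const far = falseAccepts / impostorAttempts; // 0.00001, i.e. 0.001%

// False Reject Rate: fraction of genuine attempts that fail.
const frr = falseRejects / genuineAttempts;  // 0.002, i.e. 0.2%

console.log(`FAR ${(far * 100).toFixed(3)}%  FRR ${(frr * 100).toFixed(1)}%`);
// Tightening the match threshold pushes FAR down but FRR up, and vice
// versa; "near-zero error rates" means both can now be kept tiny at once.
```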
Now, 5G is poised to put the Internet of Things (IoT) and big data in
back pockets, and standards bodies and groups like FIDO and W3C not
only recognize biometric authentication but standardize it. The roadblocks
that hindered biometrics have fallen away. Biology-based security as a
futuristic concept has been replaced by widespread acceptance and a
race to implement biometric tech—today.
The future of biometric authentication
Technology constantly improves to combat threat actors' efforts to
find weaknesses in a system, but biometric technology is already at a
level of maturity and security that supports full-scale integration.
Biometric authentication can charge forward because biometric traits
are next to impossible to forge, algorithms continue to improve, and
hardware scanners already operate at near-zero error rates. Facial
recognition has also been deployed outside of phones, making itself
known in airports and stadiums.
Experts who study biometrics predict there will only be more options
in the coming years, such as voice or heart-rate detection,
signature authentication, and even devices that can tell who you are
by the way you walk. The Pentagon is already working on tools for
gait and heartbeat identification.
—Washington Post
As passwordless authentication gains support and momentum from open
standards bodies like FIDO, the integration of biometric technology
into browsers, smartphones, and real-world applications will only
continue to grow and improve. It didn't take us all that long to get
here; sixty years brought us from science fiction prediction to
real-world mass adoption. Where will we be in 2030? Predictions point
to passwordless.
Which biometric authentication technology will take off more than
others is up in the air. In March of 2020, Google made Android devices
(v7 and higher) FIDO2 certified. That means those users can now log in
without passwords. That same month, Microsoft, Dropbox, Chrome, Edge,
and Firefox began integrating FIDO2 WebAuthn, an open web protocol
approved by W3C (source: The Verge). A minimal sketch of what that
login flow looks like for a web developer follows below. Soon, risky
password habits will fall into obscurity, and devices will recognize
who we are automatically, open up, and let us get on with our lives.
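For developers curious what that flow actually looks like today, here is a minimal sketch in TypeScript of a passwordless sign-in built on the WebAuthn API mentioned above. The /auth/challenge endpoint name, the base64 challenge handling, and the final logging are placeholders for this sketch, not part of any real service.

```typescript
// A minimal sketch of passwordless, biometrics-backed login in the
// browser with the WebAuthn API (part of FIDO2). A real deployment
// registers a credential first and verifies the signed response on the
// server; both steps are elided here.
async function passwordlessLogin(): Promise<void> {
  // 1. Check for a built-in (platform) authenticator such as Touch ID,
  //    Face ID, or an Android fingerprint sensor.
  const hasBiometrics =
    await PublicKeyCredential.isUserVerifyingPlatformAuthenticatorAvailable();
  if (!hasBiometrics) {
    console.log("No built-in biometric authenticator on this device.");
    return;
  }

  // 2. Fetch a one-time challenge from the server (placeholder endpoint,
  //    assumed to return a base64-encoded string).
  const { challenge } = await (await fetch("/auth/challenge")).json();

  // 3. Ask the browser for an assertion. The operating system prompts for
  //    a fingerprint or face scan; the biometric never leaves the device,
  //    only a cryptographically signed response does.
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge: Uint8Array.from(atob(challenge), (c) => c.charCodeAt(0)),
      userVerification: "required", // insist on biometrics or a device PIN
      timeout: 60_000,
    },
  });

  // 4. A real client would now serialize the signed response and POST it
  //    back to the server for verification; here we just confirm success.
  if (assertion) {
    console.log(`Signed in with credential ${assertion.id}, no password needed.`);
  }
}
```

The design point worth noticing is that the fingerprint or face scan happens entirely on the device; the website only ever receives a signed proof that the check succeeded, which is why biometric login and passwordless login go hand in hand.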