Defense Department eyes behavioral biometrics with new contract
The Department of Defense awards a $2.4 million contract to Twosense.AI to create a behavioral biometrics system that could replace the current ID card system.
The Department of Defense awarded a contract to a behavioral biometrics startup whose technology could eventually replace the department's current ID card system.
The $2.42 million contract was awarded to Twosense.AI, a startup based in Brooklyn, N.Y., that was founded in 2013 and specializes in "invisible, continuous authentication" using behavioral biometrics. Twosense.AI will be working with the DoD's Defense Information Systems Agency (DISA) to create a "government product to further secure employee identity while improving the usability of secure systems," according to the announcement.
DISA has been exploring various potential identity initiatives to replace the current common access card system as part of its Assured Identity initiative, which was announced in March 2018.
Mitchell Jukanovich, vice president of federal sales at Tripwire, based in Portland, Ore., said the DoD has had a plan in place for years to replace the chip-based ID cards.
"During that time, the agency communicated its intention to deploy multiple identity management technologies -- behavioral biometrics being one of them. This approach of utilizing multiple identity management technologies is a good one that has been proven in other industries," Jukanovich said. "Similar to the government's many other attempts to modernize, it's not a matter of whether the technology is there to replace legacy systems, but whether the new technology is interoperable across disparate legacy systems throughout departments, agencies or government-wide. That will be the greatest challenge in fully utilizing this and other capabilities."
Twosense.AI behavioral biometrics
The current product offered by Twosense.AI authenticates users by modeling their behavior, "such as the way they walk, interact with their phone, commute to work, and how and where they spend their time."
"Through the power of deep learning, algorithms are highly personalized, learning the personal characteristics that make each user unique on an individual level. The product leverages mobile and workstation behavioral biometrics, as well as proximity, to create invisible continuous multi-factor authentication for the workplace," Twosense.AI wrote in the announcement. "Continuous authentication drastically reduces the risk of a breach while improving the user experience by removing authentication challenges."
Dawud Gordon, co-founder and CEO of Twosense.AI, said via email that user data would be protected in this type of behavioral biometrics system.
"We protect the employee by not tracking or recording any PII. Their name, phone number, email, address, etc. is all contained within existing employer IAM systems that we connect to through API integrations that share only an ID hash with us. Our APIs support only authentication and risk-related uses," Gordon said. "Our service is software only and is deployable to existing mobile, laptop, desktop and virtual devices, and only the devices they are already using. Nothing is needed to change on how they use their devices, or which devices they use."
Michael Cobb, SearchSecurity contributor and cybersecurity professional, said imitating someone well enough to bypass behavioral biometrics is "nearly impossible," but accuracy, false positives and potential login delays have been issues with other attempts in this space.
Sam Bakken, senior product marketing manager at OneSpan in Chicago, said behavioral biometrics are aimed at improving the user experience "by providing a base layer of authentication without having to interrupt the user experience unless necessary."
"This can benefit users with a less disruptive experience and benefit organizations, which can increase the fidelity of their risk data about a user and only interrupt users for transactions when absolutely necessary when a transaction exceeds a certain risk threshold," Bakken said. "It's a great base layer for certain levels of authentication or as additional contextual data to be ingested by a fraud/risk management solution, but it needs to be part of a layered, risk-based approach to authentication."
Privacy implications
Rebecca Herold, CEO of The Privacy Professor, said it seems "naïve to state that no personal data is being collected if such intimate characteristics are being collected, analyzed, monitored and used to know exactly who a person is by those characteristics."
"Generally, if data can be associated with a specific individual, it is a type of personal data," Herold said. "Just because traditionally considered information items such as name, address, and other HR type of information will not be stored within the IAM systems, all that other data that is unique to individuals, and can be associated with a specific individual, is considered to be personal data under a wide variety of laws and regulations throughout the world."
Bakken noted that behavioral biometrics systems will claim no PII is collected because they "are only calculating variances between the current user's keystrokes, swipes, etc. and the enrolled profile."
"Because the data is essentially just a collection of these variances, in the end the data is rather useless to an attacker, for example. Many jurisdictions don't consider behavioral biometrics information to be personally identifiable information. That could change as we see more and more behavioral biometric deployments," Bakken said. "It boils down to what data exactly is being collected/analyzed. But potentially tracking people's daily routes, especially when it comes to military personnel (see the recent example of the fitness app Strava) is essentially giving away the locations of military bases."
Herold added that the language in the announcement felt like "an attempt to sugarcoat the term 'surveillance.'"
"Monitoring individuals continuously without their awareness of such monitoring is also the general definition of surveillance. They are basically using continuous surveillance and monitoring instead of getting point-in-time authentication from the individuals. This seems to be a semantic distinction without an actual difference in described goals and results," Herold said. "I really don't know enough from the information to give a clear critique. Just enough to be highly skeptical of the claimed benefits, and highly concerned of the possible privacy issues. Perhaps their PR is giving just enough information about the system to create more concerns than is justified by how the system actually works."
The DoD had not responded to requests for comment at the time of publication.