NEWS/COMMENT

...Continued from page 11

Jin explained: “It doesn’t matter what the sound is, everyone’s ears are different and we can show that in the audio recording. This uniqueness can lead to a new way of confirming the identity of the user, equivalent to fingerprinting.”

The researchers believe EarEcho could be used to unlock smartphones, but Jin sees its greatest potential use in continuously monitoring smartphone users. EarEcho works while users are listening through their earbuds, so it is a passive system: users need not take any action, such as submitting a fingerprint or voice command, for it to work. Such a system, Jin argues, is ideal for situations where users are required to verify their identity, such as making mobile payments. It could also eliminate the need to re-enter passcodes or fingerprints when a phone locks after a period of inactivity.

A prototype of the system was described in last month’s Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies journal. UB’s Technology Transfer office has filed a provisional patent application for the technology.

Additional co-authors of the study include Yang Gao and Wei Wang, both graduate students in Jin’s lab; Wei Sun, associate professor in the Department of Communicative Disorders and Sciences in the College of Arts and Sciences; and Vir V Phoha, professor of electrical engineering and computer science at Syracuse University.

Zhanpeng Jin: what else could you do with speakers? Credit: University at Buffalo.
Surveillance technology
US seeks long-range biometrics
America’s IARPA intelligence agency is looking to develop new long-range biometric surveillance systems. In an RFI (request for information) issued on 13 September, IARPA invited responses from biometric tech suppliers for its BRIAR – Biometric Recognition and Identification at Altitude and Range – project, with a response date of 21 October. The aim is to research and develop facial and related recognition systems that can be deployed from rooftops or unmanned aerial vehicles like drones.

Initially, IARPA (Intelligence Advanced Research Projects Activity) says it is “seeking information on research efforts and datasets that may be useful in planning a programme focused on advancing the state-of-the-art of biometric recognition and identification at altitude and range”.

The agency explained: “There have been notable advances in computer vision and biometric approaches to facilitate unconstrained face recognition. However, there remain challenges in diverse face identification when dealing with low-resolution or noisy imagery (eg, motion blur, atmospheric turbulence). In addition, limited research has been performed on face recognition using imagery captured at high camera pitch angles, such as those collected from security cameras on building tops or from airborne platforms, such as unmanned aerial vehicles (UAVs).”

IARPA says its main aim is to research systems that could protect critical infrastructure and military forces, and support border security. “Examples may include (but not limited to) whole-body identification, gait recognition and/or anthropomorphic classification (eg, height, gender). The fusion of multiple biometric signatures to address these limitations remains under-served by the research community,” IARPA said.

The information it wants to source is biometric research datasets that:
• Include imagery captured at long range (over 300 metres) or severe pitch angles (more than 20 degrees).
• Capture images using UAVs like quadcopters or fixed-wing platforms.
• Include whole-body video imagery.

IARPA also wants to hear from organisations that have conducted research on multi-modal fused biometric identification (such as face and gait recognition) in standoff scenarios – meaning ranges greater than 20 metres from sensor to subject.
COMMENT

There’s talk these days of politicians in the West retreating into their own ‘tribes’: Democrats and Republicans in the US becoming increasingly hard-line; members of Parliament in the UK becoming increasingly insulting and unyielding towards each other’s point of view. Whether that’s true or not, a phase or not, the same impression holds in the area of biometric technology and especially facial recognition. Privacy and civil rights campaigners on the one hand condemn the technology and call for an outright ban on video mass surveillance. On the other side of the fence, biometric tech and law enforcement advocates emphasise its ability to cut crime.

Last month it was the turn of the biometrics industry to put its case, when a coalition of some 40 technology companies, law enforcement groups and individuals sent an open letter to the US Congress opposing an outright ban on the use of facial recognition by law enforcement. The industry voices include Acuant, Cognitec, HID Global, the IJIS Institute, the IBIA (International Biometrics + Identity Association), JENETRIC, NEC, Thales and Vision-Box. Their letter highlights the recent rapid improvements in FR systems, and how the technology is being used “to help identify individuals involved in crimes, find missing children and combat sex trafficking”. They ask Congress to consider the “viable alternatives” to a ban, including setting performance standards and offering guidance and additional training for law enforcement officers. They also point out that polls consistently show Americans trust law enforcement to use facial recognition technology responsibly.

That claim is supported, and extended, by an Ipsos survey we cover in this issue (page 2), which found that citizens globally support government use of facial recognition – with limits. The survey of some 20,000 people across 26 countries found that supporters of a total ban on FR were a minority in all the countries involved. A majority of citizens in all 26 supported limited government use of AI and facial recognition.

In that same spirit, we report on page 1 how Detroit has bucked the trend of other US cities like San Francisco, Oakland and Berkeley: instead of an outright ban on FR, Detroit has allowed its police force to keep using the technology. But there are strict safeguards. Detroit police can only access still images, not the city’s CCTV camera network. They can use FR technology to investigate only serious crimes like sexual assault, murder and home invasions. They are not allowed to search based on an individual’s race or gender, nor use FR to assess a person’s immigration status.

This (middle) ground-breaking move offers a practical way forward for cities to get past partisanship and use FR technology securely to combat crime.

Tim Ring