There was just an article about how facial-recognition (FR) systems have a long way to go with respect to accuracy.
https://www.washingtonpost.com/techn...use/?tid=sm_fb
"Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Native Americans had the highest false-positive rate of all ethnicities, according to the study, which found that systems varied widely in their accuracy.
Algorithms developed in the United States also showed high error rates for “one-to-one” searches of Asians, African Americans, Native Americans and Pacific Islanders. Such searches are critical to functions including cellphone sign-ons and airport boarding schemes, and errors could make it easier for impostors to gain access to those systems. [...]
Women were more likely to be falsely identified than men, and the elderly and children were more likely to be misidentified than those in other age groups, the study found. Middle-aged white men generally benefited from the highest accuracy rates. [...]
The study could fundamentally shake one of American law enforcement’s fastest-growing tools for identifying criminal suspects and witnesses, which privacy advocates have argued is ushering in a dangerous new wave of government surveillance tools.
The FBI alone has logged more than 390,000 facial-recognition searches of state driver’s license records and other federal and local databases since 2011, federal records show. Members of Congress this year have voiced anger over the technology’s lack of regulation and its potential for discrimination and abuse."
From my own experience: a couple of weeks ago I flew to Europe out of a major US airport. As we presented our boarding passes to get onto the plane, each passenger was asked to look into a camera. I found out later that this was supposed to be "voluntary", but at no point did I hear anyone say so, nor was there a sign indicating this (except further away - not at the gate but when one first entered the secure area). I watched as the two gentlemen (not traveling together) ahead of me waited patiently while the camera tried and failed repeatedly to capture their image - they were both quite dark-skinned. However, the woman ahead of them, who was of a fair complexion, had no such issues - her picture was captured in seconds and away she went.
All this to say: there are still significant technological and legal issues with FR/AI.
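The "one-to-one" errors the article mentions come from verification systems (like boarding-gate or phone-unlock checks) accepting an impostor whose similarity score clears a fixed threshold. Here's a minimal sketch, with entirely invented scores (not from the NIST study), of how a single global threshold can produce very different false-match rates across demographic groups:

```python
# Hypothetical sketch: a fixed match threshold can yield very different
# false-positive rates across groups. All numbers are made up for
# illustration; they are NOT from the study the article describes.

def false_match_rate(impostor_scores, threshold):
    """Fraction of impostor (different-person) comparisons that the
    system wrongly accepts as a match at the given threshold."""
    hits = sum(1 for s in impostor_scores if s >= threshold)
    return hits / len(impostor_scores)

# Invented similarity scores for impostor pairs in two groups.
# Group B's scores cluster higher, so the same global threshold
# produces more false matches for group B.
group_a = [0.10, 0.20, 0.30, 0.40, 0.50, 0.55, 0.60, 0.65, 0.70, 0.95]
group_b = [0.40, 0.55, 0.60, 0.70, 0.75, 0.80, 0.85, 0.90, 0.92, 0.96]

threshold = 0.9

fmr_a = false_match_rate(group_a, threshold)
fmr_b = false_match_rate(group_b, threshold)
print(f"Group A false-match rate: {fmr_a:.0%}")  # prints 10%
print(f"Group B false-match rate: {fmr_b:.0%}")  # prints 30%
```

The point of the sketch is just that accuracy isn't one number: a threshold tuned to one population can be several times more error-prone for another, which is exactly the disparity the study reports at much larger magnitudes.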
Bonus item: now we also have to weigh everyone going onto a plane, with the privacy concerns that brings. Given the precision of AI facial recognition, I suspect algorithms already exist to estimate body mass and weight from a video camera in any public space, so an airport would be no different, except that it would have a stated objective. One step further: the TSA body scanners probably collect this visual data without physically weighing anyone. In theory, if you walked past a camera in public, AI may exist that recognizes not only your face but also your height, build, and body weight.