Probably just reading the room, with states like Texas making abortion illegal and allowing random citizens to enforce that.

Famously, abortions are a woman thing.

Anyway, looking through the facts, it's just some random woman. There's better evidence that these facial recognition systems perform much worse on minorities than they do across genders.

An interesting bias is own-gender bias: https://pmc.ncbi.nlm.nih.gov/articles/PMC11841357/

Racial bias:

https://mitsloan.mit.edu/ideas-made-to-matter/unmasking-bias...

Miss rates:

https://par.nsf.gov/servlets/purl/10358566

Although you can probably interpret the facts differently, we've seen how any search function gets enshittified: once people get used to searching for things, they tend to prefer a system that returns results over one that fails to return anything.

Rather than blaming themselves, users blame the search engine. As such, any search system over time will bias toward returning results (e.g., Outlook) rather than toward accuracy.

So if these systems more easily miss certain classes of people (women, minorities), those groups are more likely to be surfaced as inaccurate matches, while men, who are matched with higher confidence, are more reliably screened out.
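A minimal sketch of that dynamic, with entirely made-up score distributions: suppose the matcher returns any candidate whose score clears a threshold, and non-match ("impostor") scores for one group sit closer to the threshold because the model is less accurate for them. Lowering the threshold so searches "always return something" inflates false positives for everyone, but far more for the worse-modeled group.

```python
import random

random.seed(0)

def false_positive_rate(mean_impostor_score, threshold, trials=10_000):
    """Fraction of non-matching probes whose score still clears the threshold."""
    hits = sum(random.gauss(mean_impostor_score, 0.1) >= threshold
               for _ in range(trials))
    return hits / trials

# Hypothetical impostor-score means: group B's non-matches score closer
# to the threshold, modeling a system that is less accurate for that group.
fpr_a = false_positive_rate(mean_impostor_score=0.3, threshold=0.6)
fpr_b = false_positive_rate(mean_impostor_score=0.5, threshold=0.6)

# "Always return something" pressure: lower the threshold for everyone.
fpr_a_low = false_positive_rate(mean_impostor_score=0.3, threshold=0.45)
fpr_b_low = false_positive_rate(mean_impostor_score=0.5, threshold=0.45)

print(fpr_a, fpr_b, fpr_a_low, fpr_b_low)
```

The numbers are arbitrary; the point is only the shape of the effect: the group whose impostor scores sit nearer the cutoff absorbs most of the extra false matches once the system is tuned to return results.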

That's how I interpret this two-second comment.