icowden
Squire
Do you work in data? I work in software design and database management. The programming would not have been deliberately biased; that would be utterly pointless. There will be a mapping, and there will likely be a constraint which ensures that no null data can be recorded. That constraint is probably the reason that, if "no title" was selected, line 2 incorrectly got a title. The logic for women is a little more complex than that for men, for obvious reasons.

What you describe as a bug doesn't make sense. What is more likely is inadvertent bias in the programming. That is the more likely cause of the problem.
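A minimal sketch of how a "no null data" rule plus a default mapping could produce exactly that bug. The field names, codes, and default are invented for illustration; the DVLA's actual schema is not public:

```python
# Hypothetical illustration only: invented title codes, not the DVLA's schema.
TITLE_CODES = {"MR": 1, "MRS": 2, "MISS": 3, "MS": 4, "DR": 5}
DEFAULT_TITLE = "MR"  # fallback chosen so the stored field is never null

def encode_title(selected):
    """Map the selected title to its code; a not-null rule forces a default."""
    if selected is None or selected == "NO TITLE":
        # Bug: instead of a dedicated "no title" code, the mapping
        # silently substitutes the default, so the record gains a title.
        selected = DEFAULT_TITLE
    return TITLE_CODES[selected]

print(encode_title("NO TITLE"))  # the "no title" choice comes back as 1 (MR)
```

The fix would be an explicit "no title" code in the mapping rather than a silent default; nothing in the sketch requires malice, only a careless fallback.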
The reason that gender is captured and coded on the driving licence isn't for fun. It will be used for statistical analysis. It may also be used as a way of validating that a driving licence is genuine.

What can we take away from this? That gender markers and titles are unnecessary: the prevalence of titles is variable, there is an option not to have one anyway, and gender markers are coded in such a way that only people who read blogs know they are there.
That article is entirely about highly complex AI training, not logic programming. The two are chalk and cheese. Programming is "if this, then that".

Al Gore Rhythms (sorry) can make processing and decisions faster, but they can create unintended bias that has the appearance, at least, of sexism or racism. See here ...
I can assure you that the DVLA is not using AI. Government agencies have not progressed beyond bog-standard systems.