Dr Jie Zhang from the Department of Informatics at King’s College London, in collaboration with colleagues, assessed eight artificial intelligence (AI) powered pedestrian detection systems used in autonomous vehicle research.
Testing more than 8,000 images across these systems, they found that detection accuracy was almost 20 per cent higher for adults than for children, and just over 7.5 per cent higher for light-skinned pedestrians than for darker-skinned pedestrians.
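A group-wise comparison of this kind can be sketched as a simple miss-rate calculation per demographic group. The sketch below uses made-up detection labels purely for illustration; it does not reflect the study's data or its exact methodology.

```python
# Hypothetical sketch: comparing pedestrian-detection miss rates across
# demographic groups, in the spirit of the study's fairness comparison.
# The results list below is invented toy data, not figures from the paper.

from collections import defaultdict

def miss_rates(samples):
    """samples: iterable of (group, detected) pairs; returns miss rate per group."""
    totals = defaultdict(int)
    misses = defaultdict(int)
    for group, detected in samples:
        totals[group] += 1
        if not detected:
            misses[group] += 1
    return {g: misses[g] / totals[g] for g in totals}

# Toy labelled results: (group, was the pedestrian detected?)
results = [
    ("adult", True), ("adult", True), ("adult", True), ("adult", False),
    ("child", True), ("child", False), ("child", False), ("child", True),
]

rates = miss_rates(results)
gap = rates["child"] - rates["adult"]  # disparity between the two groups
print(rates, gap)
```

A disparity near zero would indicate the detector treats both groups comparably; the study's finding is that, for real systems, this gap is substantial.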
According to King’s, a major cause of this discrepancy is that the main collections of pedestrian images used to train pedestrian detection AI systems feature more people with light skin than dark skin. The result of this imbalanced training data is a lack of fairness in the AI systems it is used to train.
In a statement, Dr Jie Zhang said: “Fairness when it comes to AI is when an AI system treats privileged and under-privileged groups the same, which is not what is happening when it comes to autonomous vehicles. Car manufacturers don’t release the details of the software they use for pedestrian detection, but as they are usually built upon the same open-source systems we used in our research, we can be quite sure that they are running into the same issues of bias.
“While the impact of unfair AI systems has already been well documented, from AI recruitment software favouring male applicants, to facial recognition software being less accurate for black women than white men, the danger that self-driving cars can pose is acute. Before, minority individuals may have been denied vital services, now they might face severe injury”.
The researchers also found that the detection bias against darker-skinned pedestrians increases significantly in low-contrast and low-brightness conditions.
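One simple way to probe a detector under the low-brightness conditions the study highlights is to darken test images before re-running detection. The helper below is an illustrative assumption on my part, not the study's procedure: it scales 8-bit pixel values by a brightness factor, which a researcher could apply to a test set and then re-measure the per-group miss rates.

```python
# Hypothetical illustration: simulating low brightness by scaling pixel values,
# so a detector can be re-evaluated on darkened copies of its test images.
# `darken` operates on a flat list of 8-bit pixel values for simplicity.

def darken(pixels, factor):
    """Scale 8-bit pixel values by `factor` in [0, 1] to simulate low brightness."""
    if not 0.0 <= factor <= 1.0:
        raise ValueError("factor must be in [0, 1]")
    return [min(255, max(0, int(v * factor))) for v in pixels]

row = [0, 64, 128, 255]
print(darken(row, 0.5))  # halved brightness: [0, 32, 64, 127]
```

In practice an image library such as Pillow offers equivalent brightness adjustment; the point is only that the evaluation is repeated under degraded lighting and the group disparity compared against the baseline.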
The researchers hope that manufacturers will be more transparent about how their commercial pedestrian detection AI models are trained, and how they perform, before these vehicles are allowed on the roads.
“Automotive manufacturers and the government need to come together to build regulation that ensures that the safety of these systems can be measured objectively, especially when it comes to fairness,” said Dr Zhang. “Current provision for fairness in these systems is limited, which can have a major impact not only on future systems, but directly on pedestrian safety. As AI becomes more and more integrated into our daily lives, from the types of cars we ride, to the way we interact with law enforcement, this issue of fairness will only grow in importance.”