Driverless cars fail to detect dark-skinned people, kids: Study


London: A new study has uncovered significant shortcomings in the detection systems of autonomous driverless vehicles, revealing that children and dark-skinned pedestrians are at greater risk on the street because these systems detect them less reliably than adults and lighter-skinned people.

Researchers at King's College London conducted a fairness analysis of eight different AI-powered pedestrian detectors trained on "widely used, real-world" datasets.

"Autonomous driving systems are on track to become the predominant mode of transportation in the future. However, these systems are susceptible to software bugs, which can potentially result in severe injuries or even fatalities for both pedestrians and passengers," said Jie Zhang, a Kings College lecturer in computer science and a co-author of the study.

By running more than 8,000 images through these detectors, the researchers found that detection accuracy was 19.67 percent higher for adults than for children, and that there was a 7.52 percent accuracy gap between light-skinned and dark-skinned individuals.

Gender, by contrast, accounted for only a 1.1 percent difference in detection accuracy.

They also found that detection performance for the dark-skinned group drops further under low-brightness and low-contrast conditions, in other words at night, compared to the light-skinned group. The gap in the proportion of undetected pedestrians widens from 7.14 percent in daytime scenarios to 9.86 percent at night.
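To illustrate the kind of comparison behind these figures, the minimal sketch below computes per-group miss rates (the proportion of pedestrians a detector fails to find) and the gap between groups under day and night conditions. It is an assumption-laden illustration, not the study's actual code: the group labels, record format and field names are hypothetical.

```python
# Hypothetical sketch of a per-group miss-rate comparison, the kind of
# fairness analysis described above. All field names and group labels
# here are illustrative assumptions, not taken from the study.
from collections import defaultdict

def miss_rates(records):
    """records: iterable of dicts with keys 'group' (e.g. 'light-skin',
    'dark-skin', 'adult', 'child'), 'scene' ('day' or 'night') and
    'detected' (bool). Returns miss rate per (group, scene) pair."""
    counts = defaultdict(lambda: [0, 0])  # (group, scene) -> [missed, total]
    for r in records:
        key = (r["group"], r["scene"])
        counts[key][0] += 0 if r["detected"] else 1
        counts[key][1] += 1
    return {k: missed / total for k, (missed, total) in counts.items()}

# A reported day/night disparity would then correspond to something like:
#   rates[("dark-skin", "night")] - rates[("light-skin", "night")]  ~ 0.0986
#   rates[("dark-skin", "day")]   - rates[("light-skin", "day")]    ~ 0.0714
```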

"Fairness issues in autonomous driving systems, such as a higher accuracy in detecting pedestrians of white ethnicity compared to black ethnicity, can perpetuate discriminatory outcomes and unequal treatment based on race," the researchers said.

"This can result in harm to individuals belonging to marginalised groups, further exacerbating existing social inequalities. Therefore, it is crucial to prioritise fairness testing in autonomous driving systems," they added.

With inputs from IANS
