AI tools like ChatGPT still show biases: Study
A new study from Penn State found that popular AI tools—including ChatGPT and Gemini—still have some built-in biases.
Researchers asked 52 people to craft prompts designed to test these systems; of the 75 prompts submitted, most revealed biases related to gender, race, ethnicity, religion, age, disability, or language, as well as historical, cultural, and political bias.
'Bias-a-Thon' used role-play scenarios to spot issues
The team ran a "Bias-a-Thon" that used role-play scenarios (such as assuming doctors are male) to surface these problems.
Interestingly, when others tried the same prompts more recently, the AI tools gave more thoughtful answers, suggesting the systems may have improved.
Still, the researchers say ongoing checks are needed to ensure these tools treat everyone fairly.