Selective social amnesia
Imagine if the next time you apply for a loan, a computer algorithm determines you need to pay a higher rate based primarily on your race, gender or zip code.

Now, imagine it was possible to train an AI deep learning model to analyze that underlying data by inducing amnesia: the model forgets certain data and focuses only on the rest.

If you're thinking that this sounds like the computer scientist's version of "Eternal Sunshine of the Spotless Mind," you'd be pretty spot on. And thanks to AI researchers at USC's Information Sciences Institute (ISI), this concept, called adversarial forgetting, is now a real mechanism: it induces amnesia in deep learning models in order to remove their biases.

Addressing and removing biases in AI is becoming more important as AI grows increasingly prevalent in our daily lives, noted Ayush Jaiswal, Ph.D. candidate at the USC Viterbi School of Engineering.

"AI and, more specifically, machine learning models inherit biases present in the data they're trained on and are prone to even amplify those biases," he explained. "AI is being used to make several real-life decisions that affect all of us, determining credit limits, approving loans, scoring job applications, etc. If, for example, models for making these decisions are trained blindly on historical data without controlling for biases, they would learn to unfairly treat individuals who belong to historically disadvantaged sections of the population, such as women and people of color."
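The article doesn't show how such "forgetting" is trained, and the ISI team's exact mechanism is not reproduced here. A common way to get this effect in adversarial debiasing is a gradient reversal trick: a bias discriminator learns to predict a protected attribute from the model's internal representation, but its gradient is flipped in sign before reaching the encoder, so the encoder is pushed to erase that information. The sketch below is a minimal, illustrative numpy version under that assumption; the functions, weights, and numbers are hypothetical, not the authors' code.

```python
import numpy as np

def grl_forward(x):
    """Gradient reversal layer, forward pass: plain identity."""
    return x

def grl_backward(upstream_grad, lam=1.0):
    """Backward pass: flip (and scale) the gradient before it reaches
    the encoder. The discriminator trains to *predict* the protected
    attribute; the reversed gradient trains the encoder to *erase* it —
    the "induced amnesia"."""
    return -lam * upstream_grad

def disc_loss_grads(z, v, s):
    """Squared-error discriminator (v . z - s)^2.
    Returns (loss, grad wrt weights v, grad wrt representation z)."""
    err = float(v @ z - s)
    return err * err, 2.0 * err * z, 2.0 * err * v

# Hypothetical single training step.
z = np.array([1.0, -1.0])   # current internal representation
v = np.array([0.5, 0.5])    # bias-discriminator weights
s = 1.0                     # protected attribute for this sample

loss, g_v, g_z = disc_loss_grads(z, v, s)

# The discriminator descends its own gradient (better at predicting s)...
v_new = v - 0.1 * g_v
# ...while the encoder receives the reversed gradient, so its update
# makes the discriminator's prediction *worse*.
z_new = z - 0.1 * grl_backward(g_z)
```

After this one step the updated discriminator fits the old representation better, while the old discriminator fits the updated representation worse — the two pull in opposite directions, which is what drives the protected information out of the representation over many iterations.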