The Movement to Hold AI Accountable Gains More Steam

  • 7 mths ago

Algorithms play a growing role in our lives, even as their flaws are becoming more apparent: A Michigan man wrongly accused of fraud had to file for bankruptcy; automated screening tools disproportionately harm people of color who want to buy a home or rent an apartment; Black Facebook users were subjected to more abuse than white users. Other automated systems have improperly rated teachers, graded students, and flagged people with dark skin more often for cheating on tests.

Now, efforts are underway to better understand how AI works and hold users accountable. New York’s City Council last month adopted a law requiring audits of algorithms used by employers in hiring or promotion. The law, the first of its kind in the nation, requires employers to bring in outsiders to assess whether an algorithm exhibits bias based on sex, race, or ethnicity. Employers also must tell job applicants who live in New York when artificial intelligence plays a role in deciding who gets hired or promoted.

In Washington, DC, members of Congress are drafting a bill that would require businesses to evaluate automated decision-making systems used in areas such as health care, housing, employment, or education, and report the findings to the Federal Trade Commission; three of the FTC's five members support stronger regulation of algorithms. An AI Bill of Rights proposed last month by the White House calls for disclosing when AI makes decisions that impact a person's civil rights, and it says AI systems should be "carefully audited" for accuracy and bias, among other things.

Elsewhere, European Union lawmakers are considering legislation requiring inspection of AI deemed high-risk and creating a public registry of high-risk systems. Countries including China, Canada, Germany, and the UK have also taken steps to regulate AI in recent years.

Julia Stoyanovich, an associate professor at New York University who served on the New York City Automated Decision Systems Task Force, says she and students recently examined a hiring tool and found it assigned people different personality scores based on the software program with which they created their résumé. Other studies have found that hiring algorithms favor applicants based on where they went to school, their accent, whether they wear glasses, or whether there’s a bookshelf in the background.

Continue reading: https://www.wired.com/story/movement-hold-ai-accountable-gains-steam/
