Consumer Watchdog released a report spotlighting the flaws of algorithmic profiling and submitted a letter this week to the state privacy agency outlining how new state regulations can be drafted to curb them.

The report, “Unseen Hand: How Automatic Decision-making Breeds Discrimination and What Can Be Done About It,” details the ways in which automated decision-making disproportionately denies people of color, women, religious groups, people with disabilities, and low-income people access to essentials such as mortgages, jobs, and education.

The California Privacy Protection Agency (CPPA) is beginning to draft regulations for the new California Consumer Privacy Act (CCPA) concerning algorithmic logic and profiling based on personal information. Personal information includes data that identifies or can be linked to a person, such as their race, religion, geolocation, sexual orientation, biometric data, or financial information.

The first comment period for regulations concerning automated decision-making, cybersecurity audits, and risk assessments ended Monday. Read Consumer Watchdog’s letter here.

“Automated decisions are the unseen hand of discrimination, using biased filters to prevent people from achieving important goals such as acquiring a home. As it begins drawing regulations governing algorithms, the CPPA has the authority to let Californians know when and how they are being profiled, and their right to opt out of automated decision-making,” said Justin Kloczko, Consumer Watchdog’s tech and privacy advocate.

In 2019, home mortgage lenders approved loans for white applicants 40% to 80% more often than for applicants of color with similar financial characteristics, according to The Markup. In addition, high-earning Black applicants with less debt were denied loans more often than high-earning white applicants with more debt. Under CCPA disclosure regulations, people deserve to know what personal data was processed, the automated decision’s consequences for the subject, and the most important factors used to reach the decision, said Consumer Watchdog.

Black taxpayers are at least three times as likely to be audited by the Internal Revenue Service because of the algorithm used to determine who is chosen for an audit, according to a New York Times report this year. However, it’s not entirely clear why. The program skews toward auditing those who claim the earned-income tax credit, but Black Americans are still selected for audits at higher rates than other groups who claim the same credit. The algorithm also appears to target less complex returns rather than ones that include business income. Under the report’s recommendations, consumers would know how these decisions are made and could choose to opt out of them.

Click here to read the full report.
