Random Stuff about Artificial Intelligence, Data Science and Visualisation

Algorithms Allegedly Penalized Black Renters. The US Government Is Watching

In a recent statement, the United States Department of Justice (DOJ) warned a provider of tenant-screening software that its technology must comply with the Fair Housing Act (FHA) or face enforcement action. The provider in question is RentGrow, whose software collects and analyzes tenant data from background checks, credit histories, and rental applications in order to assess an applicant’s suitability for a tenancy.

The DOJ’s move is part of a broader effort to ensure that companies are not using software tools that may prove discriminatory. Specifically, RentGrow’s software was found to incorporate metrics prohibited by the FHA, such as race, religion, national origin, and even age. Under the FHA, it is illegal to discriminate against prospective tenants or homebuyers on these or any of the statute’s other protected grounds.

The DOJ is concerned that landlords will rely on RentGrow’s analysis to automatically reject applicants without giving each person due consideration. RentGrow maintains that its software evaluates potential tenants objectively, but the DOJ is taking steps to ensure the company follows the law. The warning is the first step in that process: the DOJ has stated it will make sure RentGrow complies with the FHA and that its software neither results in discrimination nor encourages it.

This warning is an important reminder for both landlords and software companies that the FHA must be taken seriously. Software companies have a responsibility to ensure their products do not facilitate discrimination, and landlords need to be aware that their use of such software carries risks. Both parties should understand that their actions could have serious legal consequences if they fail to comply with the FHA.
