Data brokers’ conduct can be “particularly worrisome” because the sensitive data driving the use of artificial intelligence can be collected from military personnel, people experiencing dementia, and others, according to Rohit Chopra, director of the US Consumer Financial Protection Bureau.
“The CFPB will be taking steps to ensure that modern-day data brokers in the surveillance industry know that they cannot engage in illegal collection and sharing of our data,” he said in a statement.
President Joe Biden last year called on the US Federal Trade Commission (FTC) to help protect the data privacy of women seeking reproductive healthcare who may face law enforcement action in some states. The FTC has also sued an Idaho company for selling mobile phone geolocation data, saying it could be traced to places like abortion clinics, churches and addiction treatment centers.
In advance of Tuesday’s announcement, CFPB officials said the proposal would expand the number of companies currently subject to the Fair Credit Reporting Act – a 1970 law governing the privacy of consumer data provided to lenders – to cover the use of data derived from payment histories, personal income and criminal records.
Of particular concern, officials said, was the disclosure of “credit header data” – personal information such as names, addresses and Social Security numbers that the three largest credit bureaus, Experian, TransUnion and Equifax, share. Some of the people affected, such as domestic violence victims, may be trying to keep that information private to avoid contact.
The CFPB in March opened a public inquiry into the conduct of companies like credit bureaus and background screening firms. According to agency officials, the results indicated that data brokers reported consumers in financial distress to companies that then targeted them with predatory debt products, among other findings pointing to the need for greater privacy protections.