Regulator Wades Into Big Data Credit Swamp

By Gregory Roberts

Federal regulators are looking into how lenders are using “alternative data,” such as utility payments, Facebook posts or whether someone capitalizes text messages, to make loans to consumers locked out of the traditional credit system.

Such data is gathered on a fintech frontier where advanced technology meets lending and credit, with implications for discrimination, privacy and the role of the consumer in an always-connected society. It offers the promise of greater access to the financial system for those often left out – but also brings complex regulatory challenges.

“There’s a tremendous amount of interest in this area,” Kevin Petrasic, head of the Global Financial Institutions Advisory practice at White & Case LLP, in Washington, told Bloomberg BNA. “We’re sort of at a unique period in time.

“There is a tremendous amount of expectation, tempered by, what are the risks? Are there risks being created that we’re not fully aware of?”

The Consumer Financial Protection Bureau in February put out a request for information about “ways to expand access to credit for consumers who are credit invisible” – a population the bureau put at 26 million – “or who lack enough credit history to obtain a credit score,” a category that numbers another 19 million. The focus of the inquiry is “the benefits and risks of tapping alternative data sources” to make decisions on lending and credit.

“This is a new industry, and it’s a very diverse industry, so they’re trying to decide how they approach it from a regulatory and supervision standpoint,” said David Skanderson, vice president of the Charles River Associates consulting firm.

‘All Data Is Credit Data’

Traditional data for credit decisions include such items as the applicant’s income, employment history and payment records for credit cards and other conventional debts, as well as public information such as bankruptcy filings and foreclosures. Those are the types of factors that go into a FICO score or a rating from one of the major credit bureaus, and their use generally has passed muster over time with government regulators.

As for alternative data, Skanderson said, “I would pretty much define it in the negative, as ‘everything else.’ ”

“Everything else” could include much of the information collected by so-called alternative credit bureaus, such as payment history for rent, utilities or cell-phone bills; payday loans or check-cashing activity; and bank account records.

Beyond that lies “big data” and the cutting edge of credit scoring: a consumer’s email addresses, brand of car, Facebook friends, educational background and college major, even whether he or she sends text messages in all capital letters or in lower case.

Some of the more esoteric of those items may be more talked-about than actually applied by online platform lenders or other companies making loans or extending credit. But the nearly infinite possibilities, in a world awash in smartphones, mobile devices, laptops and data points, trouble Persis Yu, of the National Consumer Law Center advocacy organization, who cites privacy concerns and potential discrimination in decision-making.

Yu points to the motto that formerly appeared on the web site of the fintech company ZestFinance: “All data is credit data.”

Easier Access to Credit

The declared intent of ZestFinance is to make the extension of credit to consumers fairer and more inclusive, by expanding the measures used beyond the traditional ones that may work to the disadvantage of certain kinds of borrowers. Those borrowers might include young adults, immigrants or the recently divorced.

Thanks in part to alternative credit scoring, “It’s pretty clear there’s more credit available, more diverse sources of credit and also easier credit for people who have limited or no credit experience,” Skanderson said.

“The CFPB has recognized the utility of some of these models, but at the same time sounds a cautionary note, which is that fair-lending laws still apply,” Donald Lampe, a partner in the Washington office of Morrison & Foerster LLP, said. “Don’t throw out the fair-lending rulebook just because you think you’ve built a better decision engine that can help people get credit who can’t get it through traditional means. So it’s very much a balancing test.”

Credit scoring works by comparing an applicant to others with similar characteristics; the decision on whether to grant credit, and at what rate, rests on how those similar borrowers performed: paying on time, defaulting or declaring bankruptcy.
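To make that comparison-to-similar-borrowers idea concrete, here is a minimal sketch in Python. The borrower records, field names and the simple nearest-neighbor logic are invented for illustration and are not drawn from any actual scoring model.

```python
# A toy, similarity-based credit score: compare the applicant to past
# borrowers with similar attributes and use the share of those "neighbors"
# who repaid on time as the score. All records and fields are invented.

from dataclasses import dataclass

@dataclass
class Borrower:
    income: float        # annual income, in thousands of dollars
    utilization: float   # fraction of available credit in use (0-1)
    repaid: bool         # historical outcome: paid on time

HISTORY = [
    Borrower(40, 0.9, False), Borrower(55, 0.4, True),
    Borrower(60, 0.3, True),  Borrower(35, 0.8, False),
    Borrower(70, 0.2, True),  Borrower(45, 0.6, True),
]

def distance(income: float, utilization: float, b: Borrower) -> float:
    # Crude normalized distance between the applicant and a past borrower.
    return abs(income - b.income) / 100 + abs(utilization - b.utilization)

def score(income: float, utilization: float, k: int = 3) -> float:
    # Take the k most similar past borrowers; the fraction who repaid
    # stands in for the applicant's probability of repayment.
    neighbors = sorted(HISTORY, key=lambda b: distance(income, utilization, b))[:k]
    return sum(b.repaid for b in neighbors) / k

print(score(income=50, utilization=0.5))  # 1.0 here: all three nearest borrowers repaid
```

Real models use far more variables and far more statistical machinery, but the underlying logic – judging an applicant by the track record of similar borrowers – is the same.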

The principal federal laws that apply to credit underwriting are the 1974 Equal Credit Opportunity Act (ECOA) and the 1970 Fair Credit Reporting Act (FCRA).

The best-known provisions of ECOA bar lenders from discriminating on the basis of race, religion, national origin, sex, marital status or age, regardless of how accurately those characteristics may predict creditworthiness.

“The point of it is that you cannot take the color of somebody’s skin or their marital status as a proxy for their creditworthiness,” Petrasic said. “There are clearly better means by which you can make a credit decision than by the color of their skin.”

Disparate Treatment and Impact

It’s relatively easy for lenders to avoid what’s known as disparate treatment of protected categories of people, which refers to how lenders intentionally approach and evaluate potential borrowers in marketing, granting and servicing loans. A trickier standard is disparate impact, which can occur unintentionally, based on what may appear to be neutral criteria, and shows up after the fact.

For example, if research showed that graduates of certain selective universities made for better credit risks, and a lender based its decisions on that criterion, the process might result in a disparate impact if the admissions procedures for the universities discriminated against a protected group. A lender could run into disparate-impact problems if it evaluated applicants on whether they shopped online for wedding anniversary gifts, even if that correlated with creditworthiness, because that behavior could be a proxy for marital status.

Not all disparities are illegal. A straightforward application of a FICO-score cutoff, for example, could well deny credit to a disproportionate share of minority applicants. Any system a creditor wishes to use must meet certain tests of statistical validity and must serve a legitimate business need.
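One common after-the-fact screen for disparate impact compares approval rates across groups, as in the sketch below. The approval counts are invented, and the 0.8 threshold borrows the “four-fifths” rule of thumb from employment-discrimination analysis – a trigger for closer review, not a legal bright line.

```python
# A simple disparate-impact screen: compare approval rates across groups.
# A ratio below 0.8 (the "four-fifths" rule of thumb) flags the outcome
# for further review. All counts below are invented for illustration.

def approval_rate(approved: int, applied: int) -> float:
    return approved / applied

def impact_ratio(protected, control) -> float:
    # Protected group's approval rate relative to the control group's.
    return approval_rate(*protected) / approval_rate(*control)

ratio = impact_ratio(protected=(120, 300), control=(450, 750))  # 0.40 vs. 0.60
print(f"impact ratio: {ratio:.2f}")  # 0.67
if ratio < 0.8:
    print("possible disparate impact: check business justification and alternatives")
```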

There’s another provision of ECOA that could prove especially thorny for companies wishing to mine big data to refine their underwriting: It says that when a creditor denies credit to an applicant, the creditor must explain to the applicant why the application was rejected.

That requirement is relatively easy to comply with if traditional credit measures are used. But it could be increasingly difficult to satisfy as the number of data points incorporated in a computerized underwriting algorithm multiplies exponentially. “It can become more challenging in some cases to isolate what the reason, or the top three reasons, were that resulted in the decision to decline the applicant,” Skanderson said.
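For a linear scoring model, those “top reasons” can be derived by ranking each factor’s contribution to the applicant’s score and reporting the most damaging ones – the logic behind adverse-action reason codes. The sketch below shows the idea with invented weights and feature names; real systems are considerably more involved.

```python
# A sketch of adverse-action "reason codes" for a linear scoring model:
# rank each feature's contribution to the score and report the most
# negative ones. Weights and feature names are invented for illustration.

WEIGHTS = {                          # model coefficients
    "on_time_rent_payments": 0.8,    # positive: helps the score
    "months_of_credit_history": 0.5,
    "recent_payday_loans": -1.2,     # negative: hurts the score
    "credit_utilization": -0.9,
}

def top_denial_reasons(applicant: dict, n: int = 3) -> list:
    # Contribution of each feature = its weight times the applicant's
    # (normalized) value; the most negative contributions are the reasons.
    contributions = {f: w * applicant[f] for f, w in WEIGHTS.items()}
    return sorted(contributions, key=contributions.get)[:n]

applicant = {
    "on_time_rent_payments": 0.2,
    "months_of_credit_history": 0.1,
    "recent_payday_loans": 1.0,
    "credit_utilization": 0.95,
}
print(top_denial_reasons(applicant))
# ['recent_payday_loans', 'credit_utilization', 'months_of_credit_history']
```

With thousands of interacting data points instead of four hand-picked weights, isolating the top reasons becomes the challenge Skanderson describes.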

Credit Rating Concerns

The escalating volume and complexity of the data also carries implications for FCRA, which requires credit-rating services to disclose the contents of a consumer’s file to the consumer and to provide for the consumer to challenge entries in the file.

“With big data, the whole thing becomes much messier,” Yu said.

The potential for further complication is highlighted by the current motto on the ZestFinance web site: “Machine learning is the future of underwriting.”

Machine learning is a form of artificial intelligence by which computers, acting without specific programming instructions, revise their calculations in response to incoming data and the patterns identified in that data. If applied to credit underwriting, that could make complying with the transparency and disclosure requirements of ECOA and FCRA even more challenging.
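The toy example below, with invented features and outcomes, illustrates machine learning in that sense: an online model that revises its own weights as each new loan outcome arrives, so the factors driving a denial can drift over time without anyone rewriting the rules.

```python
# A toy online learner: a logistic model that nudges its own weights
# after every observed loan outcome, with no hand-written rules.
# Features and outcomes are invented for illustration.

import math

weights = [0.0, 0.0]     # learned coefficients, start neutral
LEARNING_RATE = 0.1

def predict(x):
    # Logistic score: estimated probability of repayment for features x.
    z = sum(w * xi for w, xi in zip(weights, x))
    return 1 / (1 + math.exp(-z))

def update(x, repaid):
    # One gradient step: shift each weight to reduce prediction error.
    error = repaid - predict(x)
    for i in range(len(weights)):
        weights[i] += LEARNING_RATE * error * x[i]

# Every new outcome reshapes the model, so the factor that sinks an
# application this month may not be the one that sinks it next month.
for features, outcome in [([1.0, 0.2], 1), ([0.3, 0.9], 0), ([0.8, 0.4], 1)]:
    update(features, outcome)
print(weights)
```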

“That’s a very real concern that institutions have,” Petrasic said. “As they become more and more focused on trying to add in artificial intelligence and machine learning aspects of credit decisioning, and more and more information sort of gets assimilated into the algorithm and it becomes nuanced, the concern is very real about whether it will be clear for the basis of denial of credit.

“It really puts a tremendous degree of emphasis on institutions to understand exactly how their model is working,” Petrasic said.

There’s a learning curve for regulators, too, he said.

“The challenge for both the industry and the government will be trying to understand, first off, where technology is taking us, to make sure we understand exactly where the opportunities are, as well as the risks that accompany those opportunities,” Petrasic said.

It’s a challenge 21st-century lenders and regulators will have to meet, Skanderson said.

“The kind of innovation that’s happening in the online segment of the credit market,” he said, “is probably something that’s here to stay.”

To contact the reporter on this story: Gregory Roberts in Washington at gRoberts@bna.com

To contact the editor responsible for this story: Michael Ferullo at MFerullo@bna.com

Copyright © 2017 The Bureau of National Affairs, Inc. All Rights Reserved.