Racial and Socioeconomic Equity—Fintech Can Do Better

Technology is supposed to be a great equalizer. However, inequities persist in both physical and digital spaces. As our organization stands with our communities against racism, the SF Fed Fintech team knows we need to meet this moment wherever we can. So, today, I’d like to focus on a section of our latest Fintech Edge report that discusses racial and socioeconomic inequities built into technology. Keep reading for questions we can all ask to move toward positive change.

The promise of fintech

New technology and data use, such as alternative data in underwriting, give hope that more people can gain access to banks, credit unions, and non-bank financial services. At the same time, however, some innovations do not adequately incorporate or address the challenges faced by people of color and marginalized groups. As data governance systems and new digital infrastructure solutions are built, their designers need to understand the challenges and needs of diverse communities. New products and systems will only be successful when they reflect the preferences and needs of all individuals.

How do old problems get into new technology?

Technology can reinforce existing racial, gender, and socioeconomic biases. Here are a couple of examples.

Research tells us that physical sensors and photo recognition programs don’t recognize darker skin tones as well as they recognize lighter skin tones. Why? The underlying training data lacked diversity.

The diversity gap extends to software used to make judgments every day, from approving bail requests to hiring, because artificial intelligence predicts outcomes based on existing data sets. The fact that, historically, more men than women have been hired in STEM fields and African Americans have been disproportionately held before trial does not mean the future should look the same.

Programming has to do more than repeat a broken pattern.
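To see how this happens mechanically, consider a deliberately simplified sketch (the group labels, hire rates, and "model" here are all hypothetical, chosen only to illustrate the dynamic). A system that learns from historical outcomes and predicts the most common outcome for each group will faithfully reproduce whatever disparity exists in its training data:

```python
import random

random.seed(0)

# Hypothetical historical hiring records: group "A" was hired 60% of
# the time and group "B" only 20%, regardless of qualifications.
history = [("A", random.random() < 0.6) for _ in range(1000)] + \
          [("B", random.random() < 0.2) for _ in range(1000)]

# A naive "model" that learns each group's historical hire rate.
hire_rate = {}
for group in ("A", "B"):
    outcomes = [hired for g, hired in history if g == group]
    hire_rate[group] = sum(outcomes) / len(outcomes)

def predict(group):
    # Predict "hire" whenever the group's historical rate exceeds 50%.
    return hire_rate[group] > 0.5

print(predict("A"))  # True  -- the historical advantage is repeated
print(predict("B"))  # False -- the historical disadvantage is repeated
```

Real machine learning models are far more sophisticated than this toy, but the underlying failure mode is the same: optimize for fidelity to past data, and past disparities become future predictions.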

Finally, access itself remains an ongoing challenge. In the United States, not everyone can obtain high-quality computer technology or get online. COVID-19 has thrown this into stark relief as many are forced to learn and work remotely.

Diverse impacts of data collection

The impacts of increasing data collection, which powers much of this technology, can also vary by socioeconomic status, race, and ethnicity. For example, while more diversity in the training data used for artificial intelligence can reduce some inequities, collecting more information always comes with the risk of data breaches or data misuse. Unfortunately, these risks (such as identity theft) fall disproportionately on already disadvantaged groups. Furthermore, resolving issues like data breaches or fraud takes time, which can compound their impact.

Beyond baseline concerns about data protection, new systems are emerging around the world to enable individuals to direct information sharing between companies. These new opportunities are exciting, but certain groups, such as immigrants, may be particularly sensitive to sharing information about themselves if it could end up being used for other purposes.

How can fintech become more equitable?

Those working in technology can acknowledge these inequities and work toward inclusiveness.

In our latest fintech research paper, we consider the potential for widespread data protection and a framework of data rights to enable more individual control over information. However, a fundamental challenge to creating these kinds of systems is the tension between standardization and diversity. Having a common set of rules and tools is useful so businesses, regulators, and individuals know how to act and who is accountable. But every individual has different needs when it comes to data and technology. Therefore, a uniform system could leave some better or worse off than others.

How can industry and policymakers design standardized data governance and technical systems that incorporate the unique risks and benefits different communities face? Questions to consider:

  • Should standardized data protection be designed around those who face the most risk, or would that limit potential benefits for others?
  • When establishing data rights, how can control systems be built inclusively so they don’t rely on technology and information that remains inaccessible?
  • How can new technology concepts and systems, such as digital identities and real-time payments, best serve marginalized communities, who may have both the most to gain and the most to lose from innovation?

These are exceptionally challenging questions, and I will not find the answers based on my own experiences, nor will you. The answers lie in communities around the country. Their input is essential for designing inclusive systems, more innovative technology, and an effective and resilient digital infrastructure. I have sat with founders and designers as they innovate, and they have amazing techniques for generating ideas and collaboration. Unfortunately, those techniques don’t always scale to engage with the diversity of communities in the U.S.

It is time to start thinking about technology development and innovation from a community perspective, and to find new techniques to bring in those who have been left out. There are inherent tensions between standardizing practices and customizing to different needs, between quick iteration and intensive community engagement, and between serving those with the greatest needs and reaching scale. But the existence of these tensions underscores how important these issues are, and the responsibility to think deeply about digital equity.

The SF Fed’s Fintech Team is grateful for the opportunity to contribute to this dialogue, and we look forward to continuing to serve the public. For deeper discussion of these issues, visit Fintech Edge.

Kaitlin Asrow is a fintech policy advisor at the San Francisco Fed.

Image credit: metamorworks via iStock.

The views expressed here do not necessarily reflect the views of the management of the Federal Reserve Bank of San Francisco or of the Board of Governors of the Federal Reserve System.