
Tech companies need a history lesson and civil rights groups can provide it


From calling out biases in Amazon’s facial recognition technology to warning against rollbacks of anti-discrimination rules in the era of artificial intelligence, civil rights groups like the ACLU, the NAACP and others have proven themselves effective watchdogs against discriminatory practices in the digital age. Their work has raised public awareness, but sounding the alarm may not be enough anymore.

By the time algorithmic biases become known to the public, detrimental practices have often been in place for years. Even seemingly minor algorithmic tweaks, like those that allow companies to exclude groups of people from viewing certain ads, can quickly accrue discriminatory harm.

In order to take a proactive stance for equity, rather than a reactive stance against discrimination, it is time for civil rights advocates to team up with the tech companies they’ve been scrutinizing.

Housing is one area ripe for that collaboration. Housing costs continue to outpace the overall cost of living, creating incredibly competitive markets in many areas. Alongside this challenge lies a separate but equally consequential shift: the housing market has moved from yard signs and community boards to online sites including Zillow, HotPads, Trulia and Facebook, but regulation has not yet caught up.

When searching for housing online, users are directed to certain postings by search algorithms: automated rules that use the information companies have about users, such as age, gender and location, to help advertisers find ideal candidates. These algorithms also can and do violate the Fair Housing Act (FHA) by allowing advertisers to weed out potential renters by race, disability and religion.

One notable example of this practice comes from Facebook, which just yesterday was accused of housing discrimination by the Department of Housing and Urban Development. Its ad tools, which allow for traditional market segmentation, lend themselves easily to discriminatory practices, as advertisers can restrict ads from being shown to groups who fall under the protections of the FHA.

Whereas housing posts that explicitly state that certain groups are unwelcome are quickly removed from most sites, discriminatory ad policies allow essentially the same practice, only carried out by algorithms.

To be more effective advocates, civil rights organizations need a better understanding of algorithmic design; to avoid discriminatory results, technologists need a better understanding of the law and its historical underpinnings.

The only way to accomplish these mirror goals is through serious, intentional and widespread collaboration between technologists and civil rights organizations. Three strategies could help: starting with a conversation, defining a common language and scaling efforts.

Starting with a conversation. Civil rights leaders and technologists can organize conversations with each other to explain the history of specific civil rights legislation. In the case of housing, civil rights organizations can explain why and how the FHA has historically protected people on the basis of race, religion, national origin, sex, disability and familial status.

By learning how those groups explicitly or implicitly fall into market segmentations for ads, technologists can understand how certain manipulations of data can, in effect, become a form of digital redlining. These conversations can deepen civil rights organizations’ understanding of the modern advocacy landscape and strengthen technologists’ support for civil rights.

Defining common language. Each group tends to speak in the language of its own field, so defining terms in language familiar to both could be hugely beneficial to collaboration. For example, technologists often use the phrase “agile methodology.” Though the term, which refers to an iterative project management style in which goals and progress are collaboratively revisited every few weeks, may be unfamiliar to civil rights organizations, it encapsulates the spirit of the two groups working together to develop policy.

Similarly, civil rights leaders are used to jargon common in Washington. In the course of a conversation about housing, referring to the Fair Housing Act by its full name, instead of “FHA,” could invite technologists into the conversation on a more even footing.

Scaling efforts. These groups must create mechanisms to have interdisciplinary conversations on a broad scale. A good first step would include inviting technologists to civil rights conferences and civil rights leaders to technology conferences; the logical next step would be the creation of more conferences like Data for Black Lives, which focus on the intersection of civil rights and technology. As it stands, the national conversation on tech and civil rights is far too siloed — an overview of an upcoming AI conference with over 150 speakers has only one session covering law and ethics.

In this example, adding civil rights leaders to speak on existing and potential bias in AI development could lead to a richer conversation about technologists’ ability to mitigate potential harms.

These strategies lay out the potential for greater collaboration between two groups that need not be adversaries. As algorithms that influence what people see and how they are treated continue to spread into new areas of life, the tech and civil rights worlds must overlap in order to provide services and legal protections that match the needs of vulnerable populations.

By working together, civil rights organizations and technologists can ensure that as our understanding of this space evolves, so too do our protections and ability to hold designers accountable.

Afua Bruce is the director of Engineering for the Public Interest Technology program at New America and oversees projects where technologists work alongside nonprofits and governments to develop solutions in a new, human-centered way. Maria Filippelli is the Public Interest Technology 2020 Census fellow coordinating census tech awareness campaigns for community organizations and creating intro-to-tech resources for civil rights organizations.