Technology and Social Justice Movements

One would have to be living under a rock not to be aware of the social justice movements sweeping across the United States in the past few weeks. As a technologist, I am often asked what part technology plays in these social justice movements and what we can do to make sure we are part of the solution rather than adding to the problem. In the middle of the current crises with law enforcement, it was interesting to note how some technology companies responded, most notably Amazon and Microsoft, which suspended law enforcement access to their facial recognition technology. (See Story) This decision came on the heels of IBM suspending development of new facial recognition products and Google asking for a temporary ban. Law enforcement has often found its motives questioned by technology companies, especially when it has requested (or, in some cases, demanded) access to a specific technology or a product developed by a company. I believe we in the technology industry have a responsibility not only to guide the development of these new technologies but also to push for laws and guidelines governing their proper use. The ACLU has recognized how these technologies, used improperly or too broadly, could have a detrimental impact on certain communities.

A case in point is that of Robert Julian-Borchak Williams. When I read this story the first time, I was drawn to the fact that the incident occurred in Farmington Hills, MI, a community I am aware of from my youth. This man was arrested on the front lawn of his home and accused of stealing $3,800 worth of jewelry from a nearby store. The arrest was based on facial recognition technology used by the State of Michigan, which, in this case, was wrong. Was this the first time facial recognition technology was wrong, or, as Clare Garvie, a lawyer at Georgetown University’s Center on Privacy and Technology, put it, just the first time we know of a mistake being made? This is dangerous and scary ground, especially because of the recent social issues in the United States. Law enforcement is using technology such as this to comb through massive amounts of social media data and try to determine who is actually committing crimes. This is valuable and necessary work, but the technology may not be where we need it to be. In fact, a 2019 federal study found that the algorithms behind over 100 facial recognition systems were biased, falsely identifying African-American and Asian faces 10 to 100 times more often than Caucasian faces. Not incidentally, Robert Williams is a black man.

Technology can be a game changer, and it can help us solve crimes, but there has to be a human element in its application. We also need to remember that innocence is presumed, even when a picture seems to paint a different story. Let’s make sure the technology works 100% of the time (okay, 99.99%) before we use it to alter someone’s life irrevocably.