Posted: 2016-09-12 | Updated: 2016-09-12

The First Step in Fighting Racial Bias is Recognizing it in Ourselves


Sports is not a topic I write about often, but a report out of Louisiana State University caught my attention and is relevant to the implicit biases that exist in the boardrooms, living rooms, locker rooms and courtrooms of American culture.

A pair of researchers at Louisiana State University issued a paper this month through the National Bureau of Economic Research that examines "the effects of emotional shocks associated with unexpected outcomes of [LSU] football games." They concluded that judges hand down harsher punishments in the wake of an unexpected defeat. "Unexpected wins, or losses that were expected to be close contests, have no impact," the researchers continued. "The effects of these emotional shocks are asymmetrically borne by black defendants."

Here's the kicker: Those revealing results were driven by judges who received their bachelor's degrees from LSU.

I'll give you another example, this one from the glamorous world of beauty. A recent beauty contest was judged by artificial intelligence; its robot judges were built with the intent of demonstrating the utmost objectivity in matters of symmetry, skin color, wrinkles and the many other parameters that affect our perception of beauty.

But the project backfired badly.

Beauty.ai requested images from all over the world, promising to judge them fairly. But when the results were revealed (44 winners out of 6,000 entrants), most were white. Only six were Asian, and just one was dark-skinned. Beauty.ai claimed that few African Americans, Asians and Native Americans submitted photos, but the organization also admitted that biased algorithms might have been involved.

As The Guardian reports: "The simplest explanation for biased algorithms is that the humans who create them have their own deeply entrenched biases."

Such ingrained behavior lies at the root of institutional racism, and indeed of all racism, in our world.

I'll give you one more example: Members of the Oakland Police Department and the Department of Psychology at Stanford University worked with the Oakland-based organization Neighbors for Racial Justice to present a report outlining racial profiling in listserv alerts, alerts that target not behavior against persons or property but black and brown skin color alone. As an example, they included this observation: "Black male driving by slowly."

That presentation concludes that individuals posting listserv alerts should examine their implicit biases before engaging, often unwittingly, in racial profiling, and should ask themselves: Would this behavior be suspicious if the perceived race were the same as our own?

How do we know our biases?

It's important for all of us to acknowledge our racial biases, so we can become more aware of our judgment and decision-making processes. Along the way, we might be able to decrease incidents of racism in our daily lives, at work and around the world.

It's also critical to allow ourselves to recognize that, even though we may not believe our behavior is racially motivated, it often is. And in my line of work, I witness people becoming aware of their biases every time Professional Passport conducts our Global Awareness Program, a two-day workshop that offers presentations, demonstrations, activities and exercises designed to help reveal the cultural DNA of each participant, which in turn can reveal incidents of institutional racism. Attendees are guided through seven dimensions that analyze how their original culture shaped the lenses and filters through which they examine situations, problems and solutions. This exercise allows us to better understand why we so quickly judge people who do not meet or share our cultural expectations. This revealing work is done through an online assessment that provides class attendees with their cultural DNA.

Identifying one's cultural DNA is only the first step in acknowledging that we are all biased. Additional steps we recommend include taking the free Project Implicit: Social Attitudes test developed by researchers from Harvard University, the University of Virginia and the University of Washington, which identifies implicit associations about race, gender, sexual orientation and other topics; watching Democracy Now!, an independent, global news hour anchored by award-winning journalists Amy Goodman and Juan González that airs on more than 1,300 TV and radio stations around the world and around the clock on YouTube; and subscribing to Fortune.com's raceAhead column, which explores culture and diversity in corporate America.

Taking control of our biases

As mentioned at the beginning of this piece, biases exist everywhere, not only in American culture but in cultures all over the world.

Once we have a broader and more complete understanding of our own implicit biases, we can better recognize them and make focused efforts to change. For inspiration, look no further than two technology companies.

Nextdoor, a private social network for neighborhoods, tapped the insight behind Neighbors for Racial Justice's work with Oakland police and Stanford researchers. According to the news website Fusion, Nextdoor users who post to their neighborhood's Crime and Safety forum are now asked for additional information if their post mentions race. The result? A reported 75 percent reduction in racial-profiling posts.

Even more recently, an email sent Sept. 8 to the Airbnb community from co-founder and chief executive officer Brian Chesky admits that his company has been slow in addressing bias and discrimination, and seeks to rectify that by asking Laura Murphy, the former head of the American Civil Liberties Union's Washington, D.C., Legislative Office, to review every aspect of the Airbnb platform. The resulting 32-page report, titled "Airbnb's Work to Fight Discrimination and Build Inclusion," reveals that the company now requires everyone who uses the travel accommodations network to agree to a detailed nondiscrimination policy, provides VIP treatment for any guest who feels discriminated against, and offers anti-bias training for (and public recognition of) members of the Airbnb community.

As Chesky concludes, "Every time you make someone else feel like they belong, that person feels accepted and safe to be themselves."
