
There are 2 Types of Injustice and You Respond Differently to Each of Them

Scroll through your social media feed and it can seem like every other post is designed either to pull your heartstrings or fuel your outrage. A teen is holding a bake sale to fund her baby sister's cancer treatment. An arsonist burned down three churches. Maybe your favorite actor got in a car crash, and a politician you dislike just had a spike in their approval ratings. Bad things are happening to good people while good things are happening to bad people, but how you respond depends on which of those we're talking about.

You Can't Always Get What You Want

There's been a lot of research looking into what people do when faced with injustice. People are generally willing to give money to help innocent victims of crimes, and they're also in favor of punishment for people who do wrong. But those two desires aren't equal.

For example, one study asked people to imagine a pickpocket stealing from someone. Participants recommended a larger transfer of money between the two when the sum was framed as punishment for the pickpocket than when it was framed as compensation for the victim. Punishment for wrongdoing is more important to us than reward for doing the right thing, perhaps because doing the right thing is what you're supposed to do in the first place.

But in a recent study published in the journal PLOS One, researchers Jeff Galak and Rosalind M. Chow from Carnegie Mellon University took issue with the current research. "In ... these paradigms, which are quite typical, the victim has not caused his/her negative outcome, but the transgressor caused his/her positive outcome," they write. That could give undue weight to a desire to punish — the pickpocket did wrong, but the victim didn't actually do anything. The situations aren't equal.

To bring more balance to the question, Galak and Chow put a twist on a familiar game in psychology research known as the dictator game. In their version of the game, there were three players: a decider, a receiver, and an observer. The rules were simple: The decider would be given a certain sum of money and had to decide how much of it to share with the receiver. The players were also informed that one player would be chosen at random to win or lose $25. After watching for several rounds, the observer got the chance to use their own money to either reward the decider's good behavior or punish their bad behavior.

Here's an example: Say you log into the game and you see that two other players have logged in as well. You play a few practice rounds where each player gets a chance in a different role, then the real game begins and your role is randomized. You get to be the observer, who gets $10 per round. You watch as the decider is given $20 at the beginning of each round and gets the chance to give some amount to the receiver. The decider keeps all $20, every single time. At the end of five rounds, the decider is randomly awarded $25. Then you're given a choice: Assuming that every dollar you pay takes $4 away from the decider, how much would you pay from your $50 to punish them for playing the game unfairly?
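To make the game's arithmetic concrete, here's a minimal sketch of the payoffs in that scenario. The function name and structure are illustrative, not from the study itself; the dollar figures are the ones in the example above (five rounds, $20 per round for the decider, $10 per round for the observer, a $25 random bonus, and a 4-to-1 punishment multiplier):

```python
# Illustrative payoff arithmetic for the modified dictator game
# described above. Names and structure are our own sketch, not the
# researchers' code; figures match the example in the text.

ROUNDS = 5
DECIDER_PER_ROUND = 20    # decider pockets all $20 each round
OBSERVER_PER_ROUND = 10   # observer earns $10 each round
RANDOM_BONUS = 25         # decider "randomly" wins $25 at the end
PUNISH_MULTIPLIER = 4     # each $1 the observer spends removes $4

def payoffs_after_punishment(spend):
    """Return (observer_total, decider_total) after the observer
    spends `spend` dollars of their $50 budget on punishment."""
    observer = ROUNDS * OBSERVER_PER_ROUND - spend              # $50 budget
    decider = ROUNDS * DECIDER_PER_ROUND + RANDOM_BONUS         # $125 before punishment
    decider -= PUNISH_MULTIPLIER * spend                        # punishment bites 4x
    return observer, max(decider, 0)

print(payoffs_after_punishment(0))   # no punishment: (50, 125)
print(payoffs_after_punishment(25))  # spend half your earnings: (25, 25)
```

Spending $25, the median punishment the study reports, costs the observer half their $50 but strips the decider of $100 of their $125, which is why the article can say a $25 punishment "was enough to take nearly all of the bad person's money."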

Of course, there was a twist. Participants only thought they were playing a game with two other people over the internet, and they also thought that their role was randomly chosen. In fact, the participants were playing this game with a computer (a fact the researchers took such great pains to hide that fewer than 5 percent of participants suspected it), and they could only play as the observer. In some conditions, the decider kept all the money and then "randomly" won $25; in others, the decider gave half to the receiver every time and then "randomly" lost $25. In the latter scenario, the participant got the option to reward the decider, rather than punish them.

This leveled the injustice playing field: Players had to decide how much to compensate a good person who had a bad thing happen to them, or how much to punish a bad person who had a good thing happen to them. What did the players do?

Give a Little Bit

When there was no injustice (that is, nobody randomly won or lost that $25), participants were equally likely to compensate good people as to punish bad people. But when injustice was in the mix, the scales shifted toward compensation: Participants compensated good people 76 percent of the time but punished bad people only 36 percent of the time.

But hold up: That's not the whole story. Even though they compensated good people more often, when they did punish bad people, they brought the hammer down. People were willing to spend a median of $10 to compensate a good person who had faced misfortune. But to punish a bad person who had something good happen, they spent a median of $25 — half of their earnings. That was enough to take nearly all of the bad person's money.

So why the mismatch? Why were people more likely to compensate good people yet gave only small amounts when they did, and less likely to punish bad people yet paid large amounts when they did? The researchers think it's because people consider the only fair punishment to be one thorough enough to ensure the bad actors never do wrong again, which takes a lot of effort or resources to pull off. Anything shy of that isn't worth it. But when it comes to bad things happening to good people, the researchers write, "It appears that even small acts of compensation are seen as sufficient to convince individuals that they have fulfilled their moral obligations and have restored justice."

That's why you're willing to throw a few bucks toward a GoFundMe for a stranger who's down on their luck, but in the face of a bad company or celebrity catching a lucky break, you just scream into the void and don't take any action. According to this research, it's because you want more punishment for the wrongdoers than you're able to provide, so you just throw up your hands. If that bothers you, then try to keep in mind that righting injustice is like giving to charity: You don't need to solve the whole problem; you just need to chip in. And every little bit helps.



Written by Ashley Hamer, May 13, 2019
