It’s hard to be a moral person. Technology is making it harder.

It was on the day I read a Facebook post by my sick friend that I started to really question my relationship with technology. An old friend had posted a status update saying he needed to rush to the hospital because he was having a health crisis. I recognized the post as a plea for support, but did nothing about it.

Digital technology often seems to make it harder for us to respond in the right way when someone is suffering and needs our help.

What if digital technology is degrading our capacity for moral attention – the capacity to notice the morally salient features of a given situation so that we can respond appropriately?

  • There is a lot of evidence to indicate that our devices really are having this negative effect
  • Tech companies continue to bake in design elements that amplify the effect

It’s a curiosity gap.

When you see the number [go up], it’s tapping into novelty seeking – same as a slot machine. It’s making you aware of a gap in your knowledge and now you want to close it. “We’ve all been there,” he assures me.

Human Downgrading

Evidence suggests that digital tech erodes our attention – and with it, our moral attention and our empathy

  • Our devices disconnect us even when we’re not using them
  • In a 2015 Pew Research Center survey, 89 percent of American respondents admitted that they whipped out their phone during their last social interaction
  • 82 percent said it deteriorated the conversation and decreased the empathic connection they felt toward the other people they were with

Democracy itself is at stake

The past few years have seen mounting concern over the way social media gives authoritarian politicians a leg up

  • By offering them a vast platform where they can demonize a minority group or other “threat,” social media enables them to fuel a population’s negative emotions
  • Negative emotions last longer and are stickier
  • Five people died in the Capitol riot

The idea of moral attention goes back to ancient Greece

Simone Weil, an early 20th-century French philosopher and Christian mystic, wrote that to be able to properly pay attention to someone else – to become fully receptive to their situation in all its complexity – you need to first get your own self out of the way

  • Attention consists of suspending our thought, leaving it detached, empty, ready to receive in its naked truth the object that is to penetrate it
  • Buddhist traditions emphasize the importance of relinquishing our ego and training our attention so we can perceive and respond to others’ needs

Why don’t we build tech that enhances moral attention?

Companies such as Facebook have found a winning strategy for capturing our attention

  • They’ve got supercomputers testing precisely which colors, sounds, and other design elements are best at exploiting our psychological weaknesses
  • Technology may have caused some problems – but it can also fix them
  • Thus far, though, most interventions in the digital sphere aimed at enhancing moral attention have not worked out so well
  • Tenzin Priyadarshi, the director of the Dalai Lama Center for Ethics and Transformative Values at MIT, wants to design technology that enhances people’s moral attention on the platforms where they already spend time
  • He wants users to be able to integrate a plug-in that periodically peppers their feeds with good behavioral nudges, like, “Have you said a kind word to a colleague today?” or, “Did you call someone who’s elderly or sick?”

What was really happening the day I got distracted from my sick friend’s Facebook post and went to look at my Gmail instead?

Tristan Harris, a former design ethicist at Google, leads the Center for Humane Technology, which aims to realign tech with humanity’s best interests.

The consequences can be catastrophic

In Myanmar, Facebook users used the platform to incite violence against the Rohingya, a mostly Muslim minority group in the Buddhist-majority country.

  • Thanks to the Facebook algorithm, these emotion-arousing posts were shared countless times, directing users’ attention to an ever narrower and darker view of the Rohingya.

So, what can we do?

We have two main options: regulation and self-regulation.

  • On a societal level, we have to start by recognizing that Big Tech is probably not going to change unless the law forces it to, or it becomes too costly (financially or reputationally) not to change.
  • One thing we can do as citizens is demand tech reform, putting public pressure on tech leaders and calling them out if they fail to respond.
  • Meanwhile, tech policy experts can push for new regulations.
  • These regulations will have to change Big Tech’s incentives by punishing unwanted behavior and rewarding humane behavior.

What is needed, then, is ongoing training: the ability not just to withdraw attention, but to invest it somewhere else, to enlarge and proliferate it, to improve its acuity.

Jenny Odell, author of How to Do Nothing, describes how she’s trained her attention by studying nature, especially birds and plants

  • There are many other ways to do it, from meditating to reading literature
  • In the year since my sick friend’s Facebook post, I’ve become more intentional about birding, meditating, and reading fiction in order to train my attention

With notifications coming at us from all sides, it’s never been easier to find an excuse to turn away from an uncomfortable stimulus

By fragmenting my attention and dangling before it the possibility of something newer and happier, Gmail’s design had exploited my innate psychological vulnerabilities and had made me more likely to turn away from my sick friend’s post, degrading my moral attention.

  • The problem isn’t just Gmail. Silicon Valley designers have studied a whole suite of “persuasive technology” tricks and used them in everything from Amazon’s one-click shopping to Facebook’s News Feed to YouTube’s video recommender algorithm
  • Sometimes the goal of persuasive technology is to get us to spend money, but often it’s just to keep us looking and scrolling and clicking on a platform for as long as possible

Recent US election

As former President Donald Trump racked up millions of votes, many liberals wondered incredulously how nearly half of the electorate could possibly vote for a man who had put kids in cages, mishandled a pandemic that had killed many thousands of Americans, and so much more. How was all this not a dealbreaker?

The business model shifts our collective attention to certain stories to the exclusion of others

We become increasingly convinced that we are good and the other side is evil

  • By narrowing our attention, the business model also narrows our moral attention – our ability to see that there may be other perspectives that matter morally
