19/01/24

Algorithms on social media pick what people read. There is a worry that these algorithms are creating filter bubbles, so that users never have to read anything they disagree with, which encourages tribal thinking and confirmation bias. A filter bubble is your own personal universe of information, generated by algorithms that try to guess what you are interested in. Increasingly, we live online inside these bubbles. They follow us around, they form part of the fabric of most websites we visit, and we are starting to see how they create challenges for democracy.


Watch

Glossary

  • filter bubble – a situation in which an Internet user encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual’s online experience
  • the fabric of sth – the basic structure of a society, an organization, etc. that enables it to function successfully
  • to conform to sth to obey a rule or reach the necessary stated standard, or to do things in a traditional way
  • to mediate sth – to act as a link between (people or things)
  • to miss a piece of the picture – to fail to notice part of the whole situation
  • landscape – the distinctive features of a sphere of activity
  • confirmation bias – the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories
  • fraction – a small or tiny part, amount, or proportion of something
  • compulsion – an irresistible urge to behave in a certain way
  • to amp sth up – to increase the power or force of (something)
  • to impose sth on smb – to establish as something to be obeyed or complied with
  • to dodge sth – avoid (someone or something) by a sudden quick movement
  • to grapple with sth – to try to deal with or understand a difficult problem or subject


Watch and listen

Answer the questions below.

  • How does a filter bubble form? (0:10)
  • How do we actually choose our media? (0:35)
  • What is the deeper problem with filter bubbles? (1:50)
  • Are algorithms really neutral? (3:15)
  • Why is all social media activity essentially creating a “list” and what are the associated inherent dangers? (3:45)

Practice Makes Perfect

The following is a transcript of the video. Fill in the blanks with the correct form of the word in brackets. Go to Big Think to find out more.

A filter bubble is your own personal 1)________ (universal) of information that’s been generated by algorithms that are trying to guess what you’re interested in. And 2)________ (increase) online we live in these bubbles. They follow us around. They form part of the fabric of most websites that we visit and I think we’re starting to see how they’re creating some challenges for democracy. We’ve always chosen media that conforms to our views and read newspapers or magazines that in some way 3)________ (reflection) what we’re interested in and who we want to be. But the age of kind of the algorithmically mediated media is 4)________ (real) different in a couple of ways. One way is it’s not something that we know that we’re choosing. So we don’t know on what 5)________ (base), who an algorithm thinks we are and therefore we don’t know how it’s deciding what to show us or not show us. And it’s often that not-showing-us part that’s the most important – we don’t know what piece of the picture we’re missing because by 6)________ (define) it’s out of view. And so that’s increasingly I think part of what we’re seeing online is that it’s getting harder and harder even to imagine how someone else might come to the views that they have, might see the world the way they do. Because that information is literally not part of what we’re seeing or 7)________ (consumption). Another feature of kind of the filter bubble 8)________ (land) is that it’s automatic and it’s not something that we’re choosing. When you pick up a left-wing magazine or a right-wing magazine we know what the bias is, what to expect. A deeper problem with algorithms choosing what we see and what we don’t see is that the data that they have to base those decisions on is really not 9)________ (represent) of the whole of who we are as human 10)________ (be).
So Facebook is basically trying to take a handful of a sort of decisions about what to click on and what not to click on, maybe how much time we spend with different things and trying to extract from that some general 11)________ (true) about what we’re interested in or what we care about. And that clicking self who in fractions of a second is trying to decide am I interested in this article or am I not it just isn’t a very full 12)________ (represent) of the whole of our human self. You can do this experiment where you can look back at your web history for the last month and obviously there are going to be some things there that really gave you a lot of value, that represent your true self or your 13)________ (in) self. But there’s a lot of stuff, you know, I click on cell phone reviews even though I’ll always have an iPhone. I never am not going to have an iPhone. But it’s just some kind of 14)________ (compulsion) that I have. And I don’t particularly need or want algorithms amping up my desire to read 15)________ (use) technology reviews.

Answers:

1. universe   2. increasingly   3. reflect   4. really   5. basis   6. definition   7. consuming   8. landscape   9. representative

10. beings   11. truths   12. representation   13. innermost   14. compulsion   15. useless


Explore it more on TED
