Tweaks to social media algorithms have minimal impact on opinion, studies find

Researchers had special access to data from Meta to test the changes’ effects.

By Michael Marks | August 8, 2023 1:17 pm

Much has been made of the role of social media in our current political climate.

Sites like Facebook, Instagram and X, the platform formerly known as Twitter, have all been blamed for harboring echo chambers and pushing voters further and further toward the political poles.

Tweaking the algorithms themselves, some have argued, could affect how people form these opinions. A group of researchers recently got the chance to test that idea in a series of experiments using data from Meta itself.

Talia Stroud, a professor of communication studies and journalism at the University of Texas at Austin, was one of the lead researchers on the project. She spoke to Texas Standard about their findings. Listen to the interview above or read the transcript below.

This transcript has been edited lightly for clarity:

Texas Standard: I want to get to the core of why you thought this was important to research. At the heart of a lot of complaints that we’ve been hearing in the political conversation for the past several years has been this idea that social media is pushing individuals sort of into certain patterns of political thinking by using algorithms – trying to sort of increase clicks – and you wanted to know whether or not a tweak of those algorithms could change the status quo. Is that what you were thinking or was there something different?

Talia Stroud: You’re absolutely correct. We wanted to understand what happened to people’s on-platform experience and to their attitudes when we made changes to the algorithm. And then we also had the opportunity to look at data from the platform itself in aggregated form to find out where people were going.

So when you were testing this, what were the variables? What were some of the changes that you were experimenting with?

So in one study, what we did is we switched people from the normal algorithm that they would see on Facebook and Instagram to a chronological feed where they saw the most recent content first in their feeds.

The most recent content.

Yeah, exactly right. And in a second study, what we did is we held back reshared content from the feed. And then the third one, what we did is we demoted content from like-minded sources.

So people who share your political identity, and pages and groups where the audiences are often people who share your political identity – we demoted those in your algorithmic feed.

Wow. I love the setup, the design. Okay, so how much of a difference did these tweaks to the algorithm really make when it came to people’s attitudes?

So for the consenting participants – these are people who agreed to have their feeds altered; we didn’t just do this to people who were unaware that this was a possibility – what we found is that these changes to the algorithm really did change the type of content that they saw.

But surprisingly, even though it changed the composition of their newsfeed, it didn’t have a large impact on people’s attitudes.

Wow. Why do you think that is? Do you have any theories?

You know, it could be a lot of different things. And we can’t know for sure from these studies, but it could be the length of time.

So we changed people’s feeds for a period of three months. And although by social science standards that’s a long time, maybe that wasn’t long enough.

It’s also possible that, you know, these platforms have been around for decades. And it’s also the case that Facebook and Instagram are just one source of many, many sources that people have at their disposal. And so it’s possible that just changing one aspect of one of them isn’t enough to change people’s attitudes.

Does this upend the conventional wisdom around algorithms and the power thereof?

I think it raises some substantial questions that we need to think through.

So the first is, you know, I think that there have been some popular proposals of, “oh, if we just change the algorithm, that will result in some curing of societal ills.” And I think that this shows we really need to be very thoughtful about any prescriptive policy ideas and test them to find out whether or not they actually have those effects.

But I think it also confirms the idea that these algorithms are very powerful: when we changed them, the composition of what people saw on their feeds really did change.

But I guess some would say, well, if you change the composition and attitudes don’t seem to change, as a practical matter, what difference does it make? Although I suppose some would say that maybe, when it comes to participants in the study, the damage was already done, so to speak – social media has already done its work of locking us into our respective echo chambers, and it’s a little hard for folks to get out.

You know, I think that’s one fair assessment from the study.

We had one aspect of the study where we were able to look at political news URLs posted at least 100 times on Facebook. And when we looked at who saw these URLs, they were primarily engaged with by conservatives or by liberals, but not both.

And I’ll give you one other anecdote that I think is worth sharing. In the study where we held back reshared content from people’s feeds, it resulted in changes to their feeds – they saw less political news, for example. And among the study participants there, when they saw less political news, they actually were less knowledgeable about what was happening in the news.

So it’s not to say that there aren’t effects that we should think about. It is to say, though, that at least in this series of studies, we did not find that these tweaks to the algorithm affected people’s political polarization.
