Journal article
Recommender systems and the amplification of extremist content
Internet Policy Review, Volume 10, Issue 2
Swansea University Authors: Joe Whittaker, Sean Looney, Alastair Reed
Full author list: Joe Whittaker, Sean Looney, Alastair Reed, Fabio Votta
PDF | Version of Record
Released under the terms of a Creative Commons Attribution 3.0 Germany
Download (483.51KB)
DOI (Published version): 10.14763/2021.2.1565
Abstract
Policymakers have recently expressed concerns over the role of recommendation algorithms in forming “filter bubbles.” This is a particularly pressing concern in the context of extremist content online: these algorithms may promote extremist content at the expense of more moderate voices. In this article, we make two contributions to this debate. Firstly, we provide a novel empirical analysis of three platforms’ recommendation systems when interacting with far-right content. We find that one platform – YouTube – does amplify extreme and fringe content, while two – Reddit and Gab – do not. Secondly, we contextualise these findings within the regulatory debate. There are currently few policy instruments for dealing with algorithmic amplification, and those that do exist largely focus on transparency. We argue that policymakers have yet to fully understand the problems inherent in “de-amplifying” legal, borderline content, and that a co-regulatory approach may offer a route towards tackling many of these challenges.
Published in: Internet Policy Review
ISSN: 2197-6775
Publisher: Internet Policy Review, Alexander von Humboldt Institute for Internet and Society
Published: 30 June 2021
Keywords: Filter bubble, Online radicalisation, Algorithms, Extremism, Regulation
Funder: Global Research Network on Terrorism & Technology
URI: https://cronfa.swan.ac.uk/Record/cronfa57054