Alt-right pipeline

The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model of internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. The model posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups.[1][2] The process is most commonly associated with, and has been documented on, the video platform YouTube, and is largely shaped by the way recommendation algorithms on social media platforms work: by suggesting content similar to what users already engage with, they can quickly lead users down rabbit holes.[2][3][4] The effect of YouTube's algorithmic bias in radicalizing users has been replicated by one study,[2][5][6][7] although two other studies found little or no evidence of a radicalization process.[3][8][9]

Many political movements have been associated with the pipeline concept. The intellectual dark web,[2] libertarianism,[10] the men's rights movement,[11] and the alt-lite movement[2] have all been identified as possibly introducing audiences to alt-right ideas. Audiences that seek out and are willing to accept extreme content in this fashion typically consist of young men, commonly those who experience significant loneliness and seek belonging or meaning.[12]

The alt-right pipeline may be a contributing factor to domestic terrorism.[13][14] Many social media platforms have acknowledged this path of radicalization and have taken measures to prevent it, including the removal of extremist figures and rules against hate speech and misinformation.[3][12] Left-wing movements, such as BreadTube, also oppose the alt-right pipeline and "seek to create a 'leftist pipeline' as a counterforce to the alt-right pipeline."[15]

  1. ^ Cite error: The named reference Lewis 2018 was invoked but never defined.
  2. ^ a b c d e Horta Ribeiro, Manoel; Ottoni, Raphael; West, Robert; Almeida, Virgílio A. F.; Meira, Wagner (27 January 2020). "Auditing radicalization pathways on YouTube". Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. pp. 131–141. doi:10.1145/3351095.3372879. ISBN 9781450369367. S2CID 201316434.
  3. ^ a b c Ledwich, Mark; Zaitsev, Anna (26 February 2020). "Algorithmic extremism: Examining YouTube's rabbit hole of radicalization". First Monday. arXiv:1912.11211. doi:10.5210/fm.v25i3.10419. ISSN 1396-0466. S2CID 209460683. Archived from the original on 28 October 2022. Retrieved 28 October 2022.
  4. ^ "Mozilla Investigation: YouTube Algorithm Recommends Videos that Violate the Platform's Very Own Policies". Mozilla Foundation. 7 July 2021. Archived from the original on 25 March 2023. Retrieved 25 March 2023.
  5. ^ Lomas, Natasha (28 January 2020). "Study of YouTube comments finds evidence of radicalization effect". TechCrunch. Retrieved 17 July 2021.
  6. ^ Newton, Casey (28 August 2019). "YouTube may push users to more radical views over time, a new paper argues". The Verge. Archived from the original on 27 July 2023. Retrieved 17 July 2021.
  7. ^ Horta Ribeiro, Manoel; Ottoni, Raphael; West, Robert; Almeida, Virgílio A. F.; Meira, Wagner (22 August 2019). "Auditing Radicalization Pathways on YouTube". arXiv:1908.08313 [cs.CY].
  8. ^ Hosseinmardi, Homa; Ghasemian, Amir; Clauset, Aaron; Mobius, Markus; Rothschild, David M.; Watts, Duncan J. (2 August 2021). "Examining the consumption of radical content on YouTube". Proceedings of the National Academy of Sciences. 118 (32). arXiv:2011.12843. Bibcode:2021PNAS..11801967H. doi:10.1073/pnas.2101967118. PMC 8364190. PMID 34341121.
  9. ^ Chen, Annie Y.; Nyhan, Brendan; Reifler, Jason; Robertson, Ronald E.; Wilson, Christo (22 April 2022). "Subscriptions and external links help drive resentful users to alternative and extremist YouTube videos". arXiv:2204.10921 [cs.SI].
  10. ^ Cite error: The named reference Hermansson 2020 was invoked but never defined.
  11. ^ Cite error: The named reference Mamié 2021 was invoked but never defined.
  12. ^ a b Cite error: The named reference Roose 2019 was invoked but never defined.
  13. ^ Piazza, James A. (2 January 2022). "Fake news: the effects of social media disinformation on domestic terrorism". Dynamics of Asymmetric Conflict. 15 (1): 55–77. doi:10.1080/17467586.2021.1895263. ISSN 1746-7586. S2CID 233679934. Archived from the original on 25 July 2023. Retrieved 4 November 2022.
  14. ^ Cite error: The named reference Munn 2019 was invoked but never defined.
  15. ^ Cite error: The named reference Cotter 2022 was invoked but never defined.
