Charlie Gstalder is a first-year English major. He is a staff writer for the Newswire from Westchester, N.Y.

I fear the intent of my previous opinion piece may have been unclear. As a writer, a journalist and an American, I have no qualms about free speech. The target of this series is extremism, and the criminalization of hate speech was merely one proposed solution. This week I will offer another: the removal of internet algorithms.
If you have ever shopped online, browsed a social media feed or conducted a Google search, algorithms have shaped what you saw. In the context of internet use, an algorithm is a finite sequence of calculations that determines your experience. Certain algorithms can be beneficial (there’s nothing nefarious about Amazon suggesting some headphones you may be interested in), but others are contributing to the rise of extremism in American society.
For most of my generation, the primary association with algorithms is the Instagram feed. In 2016, Instagram announced that it would be abandoning its chronological feed in favor of an algorithmically controlled one.
Users were furious, worried that they would no longer see their friends’ posts. Instagram went ahead with the change regardless, and honestly, the results did not ruin the app. However, this shift from chronology to algorithm becomes more troublesome when you remember that Instagram is owned by Mark Zuckerberg’s Facebook.
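To make that difference concrete, here is a toy sketch in Python of the two feed models. It is not Instagram’s actual system (the real ranking signals are proprietary); the Post fields and the scoring weights are invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float  # seconds since epoch; larger means newer
    likes: int
    comments: int

def chronological_feed(posts: list[Post]) -> list[Post]:
    # The pre-2016 model: newest posts first, nothing else considered.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def ranked_feed(posts: list[Post]) -> list[Post]:
    # The algorithmic model: an engagement score decides what you see first.
    # The weights here are invented; the real ones are not public.
    return sorted(posts, key=lambda p: p.likes + 2 * p.comments, reverse=True)
```

The structural change is the whole point: in the first function, time alone decides the order; in the second, a formula chosen by the company does.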
Facebook played a critical role in Russia’s 2016 election interference. It was primarily on Facebook that disinformation and outright fabricated news stories were spread. Facebook was the natural target for two reasons: the prevalence of news consumption on social media and the platform’s reliance on algorithms.
A 2018 Pew Research Center poll found that 68% of Americans at least occasionally get news on social media. Facebook’s news feed is dictated by algorithms. The algorithms themselves are dangerous for two seemingly contradictory reasons: the ease with which they can be manipulated and their non-human nature.
The ease with which one can manipulate algorithms cannot be disputed. Simply search the name of any social media platform followed by the word “algorithm” and revel in the multitude of websites explaining how to turn that algorithm to your advantage. This is what made the work of Russian troll farms so simple: all they had to do was create content that fit the algorithms’ criteria and release it. Then they could sit back and watch their disinformation proliferate across users’ news feeds.
It’s that simple because an algorithm is nothing more than a fixed series of choices and calculations. Algorithms are non-human: their artificial nature frees them from the unpredictability of our species. They are deterministic, so the same inputs always produce the same outputs, which means they can be understood, and predicted, in a way living beings never could be. Master a website’s algorithm and you can dictate the content that website displays. Viewed this way, it is easy to see how, in the hands of extremists, algorithms can be weaponized.
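A toy example shows how mechanical this is. Assume a ranker like the one sketched above, weighting likes and comments with known coefficients; the names and numbers below are invented, not any platform’s real data. A bad actor does not need to persuade anyone, only to maximize the score:

```python
posts = [
    ("friend_a", {"likes": 40, "comments": 5}),
    ("friend_b", {"likes": 25, "comments": 12}),
    # A bad actor optimizes directly for the known formula, e.g. by
    # buying likes or provoking comment storms.
    ("troll_farm", {"likes": 500, "comments": 300}),
]

def score(stats: dict) -> int:
    # Mirrors the hypothetical ranker above: likes plus double-weighted comments.
    return stats["likes"] + 2 * stats["comments"]

ranked = sorted(posts, key=lambda p: score(p[1]), reverse=True)
print([name for name, _ in ranked])
# ['troll_farm', 'friend_a', 'friend_b']: the engineered post wins.
```

Whoever best satisfies the formula rises to the top of every feed it governs, regardless of when the post went up or who follows the account.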
Potential for weaponization aside, internet algorithms are dangerous because we are forced to use them. One cannot simply press a button to return Instagram to a chronological order, nor can one use Facebook, YouTube or Twitter without tacitly agreeing to participate in their algorithms.
Honestly, I would not have nearly as much of a problem with this if a choice existed. If each of us could choose whether or not to use algorithms, there would be no conflict. But this is not the case. Using the internet should not equate to consenting to algorithms, not in a society in which internet use is necessary for daily life.
Algorithms are too easily manipulated, they are non-human, they hinder our free will and they are used for nefarious purposes. This cannot continue unchecked. We must allow for an algorithm opt-out provision: if you are willing to let Zuckerberg’s computer scientists determine what you see, you are free to do so. But it must not be mandatory.
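What might that opt-out look like? Here is a minimal sketch, assuming a single hypothetical per-user flag; no platform exposes exactly this switch, and the post format is invented for illustration:

```python
def build_feed(posts: list[dict], use_algorithm: bool) -> list[dict]:
    if use_algorithm:
        # Opted in: engagement-ranked, as the platforms do today.
        return sorted(posts,
                      key=lambda p: p["likes"] + 2 * p["comments"],
                      reverse=True)
    # Opted out: plain reverse chronology, nothing more.
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)
```

A single setting would restore the choice.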