At the annual Internet Governance Forum (IGF) in Geneva, researchers recommended third-party middleware software to help block inappropriate content on the internet. They argued that such software could protect users from social media algorithms that feed them content they don't want to see, without encroaching on free speech rights.
The IGF is an annual meeting of government representatives, academics, and other stakeholders that addresses public policy issues relating to the internet. Held in Geneva, Switzerland, it is an important forum for examining the role of governments in internet governance, and it gives officials from around the world the opportunity to work with non-governmental organizations and technical experts on ways to address internet-related risks and challenges.
Barak Richman, professor of law and business at Duke University, argued that "third-party software which moderates what users see, bypassing images and videos that the user selects as objectionable, could act in place of other solutions to tackle disinformation and hate speech." He argued that it would help create a buffer between harmful social media algorithms and the user.
The spread of harmful and inappropriate content on the internet is becoming a more serious problem every day. Social media algorithms feed people content they don't want to see, and many users are unaware that these algorithms are designed to manipulate their emotions in order to maximize engagement. Middleware technology such as web-filtering software could provide a safer online environment for everyone.
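To make the middleware idea more concrete, the sketch below shows one way such a filter could sit between a platform's recommendation feed and the user, dropping items that match categories the user has marked as objectionable. The FeedItem structure, the category labels, and the user_blocklist are illustrative assumptions for this example only; they do not describe any specific product discussed at the IGF.

from dataclasses import dataclass, field

@dataclass
class FeedItem:
    """A single post delivered by a platform's recommendation algorithm (illustrative only)."""
    author: str
    text: str
    categories: set[str] = field(default_factory=set)  # e.g. labels assigned by a content classifier

def filter_feed(items: list[FeedItem], user_blocklist: set[str]) -> list[FeedItem]:
    """Return only the items whose categories do not overlap the user's blocklist.

    This is the "buffer" idea: the middleware sits between the platform's
    ranked feed and the user, removing content the user has chosen not to see
    before it is ever displayed.
    """
    return [item for item in items if not (item.categories & user_blocklist)]

if __name__ == "__main__":
    # Hypothetical feed and user preferences, purely for demonstration.
    feed = [
        FeedItem("newsbot", "Local election results announced", {"news"}),
        FeedItem("anon123", "Inflammatory conspiracy post", {"disinformation"}),
        FeedItem("friend42", "Vacation photos", {"personal"}),
    ]
    user_blocklist = {"disinformation", "hate_speech"}

    for item in filter_feed(feed, user_blocklist):
        print(f"{item.author}: {item.text}")

In this sketch the user, not the platform, decides which categories are filtered, which is the property Richman highlights: the middleware moderates what a particular user sees without removing the content from the platform for everyone else.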
Further Resources
How Social Media is Compromising Our Youth Today: The Fall of Facebook