The Films and Publications Amendment Act (FPAA) of 2018 was promulgated on 1 March 2019. The Amendment Act makes online content distributors liable for the content they host, which means it will have a significant impact on South African ISPs as well as foreign-based ISPs that provide services in South Africa.
The FPAA aims to regulate the distribution of, and access to, games and other material depicting cruelty to animals, adults performing sexual acts with children, child pornography, and violence against women. It also prohibits any person from distributing or exhibiting such material in any form.
If you are an ISP and you fail to comply with the Act, penalties may include:
- A fine not exceeding R10 million or imprisonment for a period not exceeding 10 years in the case of an individual; or
- A fine not exceeding R100 million for companies; or
- Any combination thereof
The FPAA applies to all online content distributors, including social media platforms. Under the law, ISPs have an obligation to block access to pornographic sites on their networks. As ISPs work to comply with this requirement, they also need to consider how they can support their subscribers in preventing children from accessing unsuitable content.
In addition, if an internet service provider knows that its services are being used to host or distribute child pornography, incitement of violence, advocacy of hatred or propaganda for war, it is required to report the offending content and the user responsible to the police.
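The Act does not prescribe a technical format for such referrals, but an ISP's abuse desk will need to capture, at a minimum, what was found, why it is reportable and which user was involved. The Python sketch below is purely illustrative; the record structure and field names are assumptions, not requirements of the FPAA or of any specific reporting system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AbuseReferral:
    """Hypothetical record an ISP's abuse desk might assemble before
    reporting offending content to the authorities."""
    url: str               # where the offending content was found
    offence_category: str  # e.g. "child pornography", "incitement of violence"
    subscriber_ref: str    # internal reference identifying the account involved
    detected_at: datetime  # when the ISP became aware of the content
    notes: str = ""        # free-text context for investigators

referral = AbuseReferral(
    url="http://example.invalid/offending-page",
    offence_category="incitement of violence",
    subscriber_ref="ACC-0001",
    detected_at=datetime.now(timezone.utc),
)
print(referral)
```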
How Web Filtering Can Help ISPs Meet Their Regulatory Requirements
The changes to this legislation are a clear indication that governments around the world are looking to exercise their digital sovereignty: they want to enforce the same laws online that they already enforce offline. ISPs will need to implement tools to support these goals, or they will be mandated to do so, as is already the case in many jurisdictions.
Web filters are designed to block access to websites containing content that is harmful to children, adults, or society in general. Legally mandated ISP filtering is a complex issue: the legislation specifies certain categories of material that ISPs must block their users from accessing, while leaving discretion over what else may or may not be blocked. Some examples are given below, followed by a short sketch of how such category rules might be applied:
- Blocked content
  - Child pornography
  - Terrorist training manuals
  - Material that may incite racial hatred or violence against ethnic or religious minorities
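As a rough illustration of how an ISP-side filter might apply such rules, the Python sketch below maps a requested domain to a content category and refuses it if that category is on the mandated block list. The category labels and the lookup function are stand-ins for whatever categorization feed an ISP actually uses; none of this reflects a specific product or the wording of the Act.

```python
# Minimal sketch of category-based blocking, assuming the ISP already has a
# service (stubbed here) that maps a domain to a content category.
BLOCKED_CATEGORIES = {
    "child_sexual_abuse_material",
    "terrorist_training_material",
    "incitement_to_violence",
}

def lookup_category(domain: str) -> str:
    """Stand-in for a real categorization service or curated block list;
    returns a category label for the given domain."""
    demo_db = {"bad.example": "terrorist_training_material"}
    return demo_db.get(domain, "uncategorized")

def should_block(domain: str) -> bool:
    """Block the request if the domain's category is legally mandated for blocking."""
    return lookup_category(domain) in BLOCKED_CATEGORIES

print(should_block("bad.example"))   # True  -> request is refused
print(should_block("news.example"))  # False -> request is allowed
```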
Web filters like Netsweeper work with the Counter Terrorism Internet Referral Unit (CTIRU) and the Internet Watch Foundation (IWF) to remove harmful content from the internet, categorize content, and provide reports on the type of content available.
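Such a report can be as simple as a count of requests per content category over a period. The snippet below sketches that idea, assuming a log of (domain, category) pairs already produced by a categorization service; the labels and log format are illustrative and do not represent a Netsweeper, IWF or CTIRU interface.

```python
from collections import Counter

# Illustrative only: aggregate categorized requests into a simple summary
# that an ISP could share with a regulator or investigating authority.
request_log = [
    ("news.example", "news"),
    ("bad.example", "terrorist_training_material"),
    ("cams.example", "adult_content"),
    ("bad.example", "terrorist_training_material"),
]

report = Counter(category for _domain, category in request_log)
for category, hits in report.most_common():
    print(f"{category}: {hits} request(s)")
```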
In conclusion, the South African government needs to work closely with ISPs to help them comply with the legislation. Web filtering can support this by blocking access to content that is harmful to users and by helping ISPs report the presence of harmful content so it can be investigated by the relevant authorities.