Red flags as Film and Publication Board becomes regulator of all online content
Technically, bloggers and all other online content producers could land in trouble with the content classification and regulation authority.
The Film and Publications Amendment Act has granted the Film and Publication Board (FPB) wide-ranging powers, which could threaten free speech.
With the commencement of the act on 1 March 2022, the FPB became the regulator of all online content in South Africa.
The act allows the board (which is accountable to the minister of communications) to censor content. The update of the Film and Publications Act of 1996 was intended to balance constitutional freedoms with the need to protect children, in particular, from online harm.
Until the final regulations are gazetted, online distributors of games, films and publications do not have clarity on how the act will affect them. As it stands, it does not apply to traditional broadcasters, because broadcasting is self-regulated and protected in terms of the Constitution.
However, all creators of online content could fall foul of the act if a complaint is lodged with the board about content posted to Instagram, YouTube, Netflix, Facebook or another online platform.
And those who make a living from their films, games or other user-generated content are technically deemed to be commercial operators. If they’re not registered, fee-paying entities, they could be liable for hefty fines should their content’s classification be out of sync with the state’s.
Deputy Communications Minister Philemon Mapulane told media on 4 March that South Africa has moved away from censorship post-apartheid and adopted the content classification system that provides clearly labelled ratings and consumer advisories.
“The only content that is not allowed relates to those aspects that we as South Africans find offensive or unconstitutional, such as hate speech, incitement to violence, child sexual abuse … bestiality, etc.
“This puts the power in the hands of the consumer to make informed choices of what they want to view or read, and to learn to likewise make these choices to protect their children from exposure to content that can cause developmental and psychological harm,” Mapulane said. We’ve come a long way, he suggested.
Not everyone agrees. And though aspects of the act are to be welcomed, there are concerns that it is out of step with international approaches to the classification of content on global platforms.
Media lawyer Janet MacKenzie says: “It has an exceedingly wide application and it does include user-generated content and any content published online by persons who are not members of the Press Council. The difficulty is the requirement in the act that all online distributors must register as a distributor with the FPB.
“Online distributors then have the option to either submit all content to the FPB for classification prior to distribution, or to apply to the FPB for the right to self-classify content or the right to use the classification system of an accredited foreign institution. For a person who generates their own content and puts it on platforms, that’s not going to be possible.”
It also means platforms would have to register with the board and submit all their content to it for classification. “This is impossible, by the very nature of the vast amounts of content on such platforms, if one just considers the extent of user-generated content available online.”
Platforms will have to ensure their classification of content is in line with the FPB’s classification guidelines. “This is where it becomes really difficult for platforms that are global in nature who in terms of international approaches to the regulation of online harms are subject to self-regulation systems which allow such platforms to implement their own classification systems, complaint processes and take-down procedures. Global content platforms already have extensive content moderation systems in place, and they do their own self-classification in accordance with their own platform rules, guidelines and standards,” she explains. “They also employ content moderators worldwide, to perform that function.”
MacKenzie says, though the positive aspects of the act – to prevent hate speech and child abuse – are laudable, those aims could have been achieved with less interference.
Technically, the act applies to all films – from drone footage to stills, GIFs and live-streaming. This means vloggers, bloggers, streamers, webinar hosts, influencers and other content creators are affected.
Anyone who has not classified a film, game or publication is guilty of an offence, and liable to pay a R150,000 fine or be sentenced to eight months in jail, or both. And if it’s distributed, that could mean a fine of up to R500,000, five years in jail, or both.
It also makes live-streaming legally impossible, as all content first needs to be classified and approved.
William Bird of Media Monitoring Africa says that, although major platforms include specifications on their programming, bloggers and other content producers would have to submit their work for preclassification, even if it is not for commercial purposes. “For bloggers, who are producing or distributing content, it’s a big problem. If you’re producing video content and you fail to submit it for classification, there are quite serious penalties. That’s a big stick to beat content producers with.”
The duplication, fees and administrative hurdles could act as a disincentive to invest in South Africa’s content and creative industries, MacKenzie says, adding that the state would have a “wonderful excuse” to block access to social networking sites, should it feel so inclined. DM168
This story first appeared in our weekly Daily Maverick 168 newspaper, which is available countrywide for R25.