FPB on harmful social media and online content; SA Post Office financial sustainability (postponed); with Deputy Minister
Meeting Summary
The Film and Publication Board (FPB) briefed the Select Committee on the challenges posed by social media and online platforms, and the measures in place to address harmful content.
Mandated by the Film and Publications Act, the FPB regulates content and acts against material such as child sexual abuse material, hate speech, cyberbullying, and non-consensual intimate images. The FPB said its key challenges included limited access to illegal online spaces, platform community standards that were not aligned with its guidelines, legislative gaps, and resource constraints. In response, it monitors digital activity, issues take-down notices, conducts awareness campaigns, and collaborates with law enforcement, civil society and global networks, while working to improve regulatory frameworks.
The Committee discussed the need to address the sexualisation of children and regulate harmful online content, including hate speech and disinformation. There was debate over whether content such as Julius Malema’s song “Kill the Boer” fell under the FPB’s jurisdiction.
The Committee raised concerns about who determines what constitutes disinformation and harmful content, with suggestions that such definitions could be misused or politicised. The issue of deepfakes was highlighted, with questions on whether the FPB monitors them and how they are categorised. Members questioned the FPB on its ability to regulate online platforms, including WhatsApp, and how it planned to align international companies with South African standards without overstepping into censorship. There were concerns about learners’ use of artificial intelligence, the FPB’s school outreach efforts, and how it deals with shops selling prohibited material.
The Chairperson pressed the Department on legislative delays that hindered the FPB’s full regulatory authority, and criticised the expansion of the FPB’s mandate without a corresponding increase in budget. She also raised concerns about the volume of child sexual abuse material online, and asked what strategies and impact metrics were being used to combat it.
The Deputy Minister and Department officials discussed the need for flexible, responsive legislation that balances freedom of expression with public protection. They emphasised education, collaboration with other regulators, and building local regulatory capacity over outsourcing. The FPB’s mandate was clarified as being protective rather than censorious, with an emphasis on encouraging digital literacy and safety, particularly among children and parents.
The Committee criticised the South African Post Office's business rescue practitioners for failing to attend the meeting, and considered legal steps to compel their attendance, noting their ongoing request for significant additional funding.
Meeting report
FPB on measures to address harmful social media and online content
Mr Ephraim Tlhako, Chief Executive Officer (CEO), Film and Publication Board (FPB), Ms Zama Mkosi, Chairperson, FPB, and Ms Nonkqubela Jordan-Dyani, Director-General, Department of Communications and Digital Technologies (DCDT), briefed the Committee on the challenges and measures taken by the FPB to address harmful content in social media and other online platforms.
Background and mandate
The FPB was a statutory content regulator established under the Film and Publications Act (Act 65 of 1996, as amended in 2019). Its purpose was to protect the public from harmful and prohibited content. Harmful content was content that could cause emotional, psychological or moral distress, while prohibited content included propaganda for war, incitement to violence, and hate speech.
Legislative prohibitions
The FPB enforces laws prohibiting the unclassified distribution of films, games and publications, and criminalising content such as child sexual abuse material, revenge porn, deepfakes, and depictions of violence against children. It also addresses the online distribution of content without consent and the incitement of harm through digital means.
Enforcement mechanisms
Enforcement tools include an Enforcement Committee led by a retired judge, an Appeal Tribunal, criminal prosecution, take-down notices, compliance notices, revocation of registration, and fines.
Online platforms
Popular social media platforms in South Africa include WhatsApp, Facebook, Instagram, TikTok, Twitter/X, YouTube and LinkedIn. Streaming services include Netflix, Showmax, DSTV Stream, Amazon Prime Video, Disney+, Apple TV+, VIU and MTN Cinemagic.
Online harms
The FPB monitors online platforms for harmful content such as child sexual abuse material (CSAM), child exploitation and violence, cyberbullying, revenge porn, deepfakes, bestiality, hate speech, and incitement to violence. Between April and December 2024, the FPB handled 18 CSAM cases involving 178 411 files, with 7 461 confirmed as CSAM. In 2025, two additional cases with over 40 000 images were ongoing, along with one large 2024 case of 137 415 images.
Public complaints
Since April 2024, the FPB has received 20 public complaints involving CSAM, impersonation, bestiality, and non-consensual sharing of intimate images.
Compliance & enforcement
Between April and December 2024, 574 inspections were conducted, resulting in 108 compliance notices, 14 take-down notices, 19 raids, and the confiscation of material valued at R1 million.
Online safety education & awareness
Between April and June 2024, 50 workshops were held, reaching 4 625 participants. From July to September, 88 workshops were conducted with 10 874 attendees. From August to December, 70 workshops reached 6 214 people. Audiences included parents, educators, learners, law enforcement, vulnerable groups, and community members.
Challenges
Challenges include the distribution of inappropriate content on social platforms, the misalignment of platform community standards with FPB guidelines, lack of access to dark web content, limited legislative powers, the emotionally taxing nature of CSAM analysis, and insufficient resources and funding for broader outreach and enforcement.
Counter measures
Measures include dedicated monitoring units, issuing take-down notices in cooperation with platforms, and running online safety education programmes. The FPB also works closely with law enforcement, civil society, and government departments, including the DCDT, the Department of Basic Education (DBE) and the Department of Social Development (DSD). It is also part of international safety networks such as the Association of Internet Hotline Providers (INHOPE) and the Global Online Safety Regulators Network. Legislative collaboration is ongoing to strengthen regulatory capabilities.
see attached for full presentation
Discussion
Mr N Pienaar (DA, Limpopo) opened by noting the importance of addressing the sexualisation of children and the harm to children.
He highlighted hate-based groups, and asked whether, for example, Julius Malema singing ‘Kill the Boer’ fell under the FPB’s purview. Would it investigate cases like that?
He asked who decided what was disinformation, as what was considered disinformation could change, and also who considered what was harmful. He commented that these issues could be abused.
He said more attention should have been given to deepfakes. He asked whether the FPB investigated deepfakes, or whether they were categorised as cartoons.
How did the FPB plan to regulate these platforms? How would it be able to regulate companies and align their standards with South Africa?
He wanted to know whether WhatsApp content should be controlled or not. He said the FPB should not try to control the companies, but rather the people who were engaging in hate speech or sharing harmful content.
Mr M Modise (ANC, Gauteng) said that the pace at which technology was progressing, and the exposure to harmful content, was scary. Were there any specific issues the Deputy Minister wanted the Committee to address?
Ms M Kennedy (EFF, Limpopo) noted her concern about learners using artificial intelligence (AI). She asked how many schools the FPB had reached to address this issue.
How did the FPB deal with shops that sold prohibited material? How were they regulating and preventing the sale of those materials?
Mr H van den Berg (FF+, Northern Cape) noted that the Department had not reported on its budget and how it was using its funding. He asked whether it would be better to outsource digital media regulation to more advanced countries to monitor content. How did the FPB balance monitoring content with respecting the privacy of citizens?
The Chairperson asked the Deputy Minister to justify the Department’s delay in legislation that would have given the FPB the full mandate to monitor online harms. She said that global platforms did not align with FPB standards, and asked why the Department was not creating binding legislation or memorandums of understanding (MoUs) to ensure compliance.
She questioned why the FPB’s mandate had been increased without the budget being adjusted to account for these added responsibilities. She referred to the large volumes of content of child sexual abuse, and asked what strategies were in place to prevent the distribution of this content. How was the efficacy of countermeasures evaluated?
She asked whether the Enforcement Committee was appointed for a fixed term, how long it took to rule on offensive content, and how long it took for that content to be taken down.
Responses
Mr Mondli Gungubele, Deputy Minister of Communications and Digital Technologies, said he had been interested in seeing how the FPB would answer the question on Julius Malema. He noted that the courts had ruled on the matter.
He said that many resources needed to go towards creating foundational structures that would ensure more people had access to education on technology.
Ms Jordan-Dyani said she would invite the Committee to meetings with the Minister and the Department so that the Members could hear the challenges that other countries were facing.
She explained that the rights a person had offline were the same rights that a person should have online. This was balanced with allowing people to express their views. She said the FPB was working with the Information Regulator and the Domain Name Regulator to take down harmful content.
When the Department introduced the information and communication technology (ICT) White Paper in 2016, it had hoped that there would be one regulator. An MoU was created between the regulators to try to ensure that they worked together where their mandates overlapped.
She said that national laws took precedence when working with multinational companies.
She said the audio and audiovisual policy, which would regulate platform providers, would be finalised this year.
She said that citizens had a responsibility, and the FPB was trying to encourage citizens to be aware of their rights and responsibilities online and the content they put out. It was trying to educate children and parents not to share personal information online so that they were not exploited. She said that the Protection of Personal Information Act (POPIA) made it difficult to share information on people who were exploiting others.
She said there was a new branch in the Department that dealt with media and content, specifically content classification and online safety.
She responded to the question on outsourcing, and said that the country should capacitate people in this area so that they could adequately respond to the issues discussed.
Ms Mkosi said that legislative changes took time, and that technology changed faster than legislation could. This required legislation to be agile. She added that legislation should not hinder freedoms of consumption and speech. Content needed to be classified so that proper warnings could be placed on the content.
Mr Tlhako explained that the courts and the Human Rights Commission (HRC) had more to do with the Julius Malema issue. Any issues regarding freedom of speech were governed by the Bill of Rights. He added that freedom of expression had certain limitations.
On the question of who decided what harmful content was, he said that the Act defined what was considered harmful. He agreed that it was difficult to determine what was real or not. Users needed to be vigilant, and the FPB was trying to educate people to be aware of deepfakes.
He said that the FPB did not dictate to platforms what guidelines they must use, but tried to have the guidelines aligned across different platforms.
He said the FPB worked with the South African Police Service (SAPS) to confiscate illegal material sold by hawkers. If child sexual abuse material was found, criminal charges were laid. If illegal material was found, the seller's licence was revoked and they had to reapply for it.
There was a target number of schools that the FPB tried to reach so that children could be educated about online content. They spent most of their time at rural schools.
The FPB tried to find a balance between freedom of expression and monitoring online content.
Mr Tlhako referred to the budget, and said the FPB was looking at generating its own revenue. It was trying to increase this percentage, which was currently about 50%. He emphasised that the FPB was working to increase its cybersecurity skills so that it could be proactive and prevent the distribution of harmful content.
Adv Makhosazana Lindhorst, Acting Executive: Technology and Platform Monitoring and Regulatory Development and Enforcement, FPB, said that members of the enforcement committee were appointed for five years. The committee needed to give a judgment within 45 days.
She said that the regulators were trying to ensure that South Africans were protected and that they provided guidelines to educate people on how to identify harmful content and how to conduct themselves online. The FPB was working on a guideline to explain what was considered harmful content.
She said that Julius Malema singing ‘kill the Boer’ did not fall under the FPB’s purview. They could investigate content online only, and after there had been a complaint.
Deputy Minister Gungubele raised the need for flexible policies and legislation, as technology was always evolving.
Further discussion
The Chairperson asked what safeguards were in place to stop the FPB from being used as a political tool. Could they provide evidence that they applied regulations equally across political, religious and cultural views?
Mr Pienaar questioned what was considered disinformation or misinformation.
Mr Tlhako explained that the Enforcement Committee was chaired by a retired judge, and that the Appeal Tribunal was an independent body which adjudicated decisions, to ensure that the FPB was not used as a political tool.
With regard to misinformation, the FPB intervened if the content was seen to be harmful, as defined by the Act.
Adv Lindhorst emphasised that the FPB was there to balance the right to freedom of expression. It was not there to censor people, but to protect them.
The Chairperson indicated that she would send follow-up questions to the FPB.
SAPO briefing
The Chairperson said that the business rescue practitioners (BRPs) of the South African Post Office (SAPO) had sent a letter to her stating that they would not be available, and would be sending other representatives in their stead. They had asked the Minister for a deferment, but this was denied, as it would have been the second time they had requested a deferment. She said they had had four months to prepare, and felt that the BRPs were shirking their responsibilities.
Mr Pienaar asked what the reason was for the BRPs not attending.
The Chairperson shared the letter, which gave no reason for their non-attendance, with the Committee.
Mr P Mabilo (ANC, Northern Cape) was concerned that the BRPs were not present, and commented that the Committee had been lenient with them. He said the Committee should subpoena SAPO so that they appeared before them. He said that it would not be fair to engage with their representatives.
Mr Van den Berg noted that SAPO was asking for R2.5 billion in additional funds. He did not think engaging with the representatives would be valuable.
Mr B Farmer (PA, Western Cape) agreed with the other Members' views on the matter.
The Chairperson said she would engage with Parliament's legal team to discuss the legal route the Committee could take to have SAPO appear before them. She expressed disappointment with the BRPs for not being in attendance.
Deputy Minister Gungubele said he respected the Committee’s decision on this matter, and would support them.
Committee minutes
The Committee reviewed the minutes from its meeting on 26 March. Mr Pienaar moved their adoption, and Mr Farmer seconded.
The minutes were duly adopted.
The meeting was adjourned.
Present
Boshoff, Ms SH (DA), Chairperson
Farmer, Mr B (PA)
Gungubele, Mr M (ANC)
Kennedy, Ms M (EFF)
Mabilo, Mr SP (ANC)
Modise, Mr MG (ANC)
Mokoena, Ms SM (MKP)
Ndlangisa-Makaula, Ms MB (ANC)
Pienaar, Mr NH (DA)
Sithole, Ms SL (ANC)
Van den Berg, Mr H (FF+)