The general public has expressed its distrust of the NITDA Code of Practice (Code) for Interactive Computer Service Platforms/Internet Intermediaries. Specifically, there are concerns that the Code is yet another attempt by the Nigerian Government to regulate social media and curtail free speech.
In this article, our Davidson Oturu and Agboola Mubarak Dosunmu examine the Code and highlight provisions that should be addressed, drawing comparisons with laws in other jurisdictions.
On 13th June 2022, the National Information Technology Development Agency (NITDA) issued the draft Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries (“the Code”).
The objectives of the Code include setting out best practices for Platforms and making the digital ecosystem safer for Nigerians and non-Nigerians in Nigeria. The Code is also expected to set out measures to combat harmful online information and adopt a co-regulatory approach toward implementation and compliance. The Code thereafter sets out provisions across six parts to achieve these objectives.
According to NITDA, the Code was developed in collaboration with the Nigerian Communications Commission (NCC) and the National Broadcasting Commission (NBC), with input from “interactive computer platforms” such as Twitter, Facebook, WhatsApp, Instagram, Google, and TikTok. NITDA further stated that the Code is aimed at “protecting the fundamental human rights of Nigerians and non-Nigerians living in the country, as well as defining guidelines for interacting in the digital ecosystem”.
As expected, Nigerians have been distrustful of the Code, with many concluding that it is an attempt by the Nigerian Government to regulate social media and quash freedom of expression. This is understandable considering the antecedents of the Nigerian Government when it comes to its posturing regarding social media Platforms.
For instance, in 2019, the Social Media Bill was before the National Assembly, through which the Government explored ways of curbing the perceived excesses of social media users. That ill-fated Bill was closely followed by the Prohibition of Hate Speeches Bill (“Hate Speech Bill”). When the public outcry against both Bills became resounding, they were stepped down.
However, in 2021, following Twitter’s deletion of a tweet posted by the President of the Federal Republic of Nigeria, Twitter was banned for several months. Users in Nigeria were unable to access the Platform directly, and many resorted to Virtual Private Networks (VPNs) to reach the microblogging site.
In that same year, there was an attempt by the Nigerian Government to amend the National Broadcasting Commission Act to empower the NBC to regulate social media Platforms.
It is against this background that we examine the provisions of the Code and determine if it is indeed a tool designed to restrict free speech in Nigeria.
WHAT ENTITIES ARE AFFECTED BY THE CODE?
It is pertinent to determine at the outset who would be affected by the Code.
The following entities are expected to comply with the Code:
- Interactive Computer Service Platforms – the Code defines these as “any electronic medium or site where services are provided by means of a computer resource and on-demand and where users create, upload, share, disseminate, modify, or access information, including websites that provide reviews, gaming Platforms, and online sites for conducting commercial transactions”.
The inference drawn from this definition of interactive computer service Platforms is that it would cover Platforms such as companies’ websites, fintechs, gaming companies, edtechs, healthtechs, e-commerce Platforms, social media Platforms and other service providers that offer goods and services through their Platforms.
- Internet Intermediary – defined in the Code as including, “but not limited to, social media operators, websites, blogs, media sharing websites, online discussion forums, streaming Platforms, and other similar oriented intermediaries where services are either enabled or provided and transactions are conducted and where users can create, read, engage, upload, share, disseminate, modify, or access information”.
This definition captures a number of companies already covered under interactive computer service Platforms. It includes streaming Platforms (like Netflix, YouTube, etc.), social media Platforms, internet service providers, e-commerce intermediaries, fintechs, etc. Indeed, both Interactive Computer Service Platforms and Internet Intermediaries are referred to as a “Platform” under the Code.
- Large Service Platforms (Large Platforms) – defined as “an Interactive Computer Service Platform/Internet Intermediary whose users are more than one hundred thousand (100,000)”.
This simple definition indicates that Platforms and Intermediaries (collectively referred to in this article as “Platforms”) with more than one hundred thousand users are classified as Large Platforms.
The Code contains some commendable provisions, such as the provision mandating the removal of non-consensual sexual content, provisions addressing content harmful to children, provisions introducing a notice-and-take-down regime, and provisions concerning Platform rules.
Item 1 of Part II also promotes equal distribution of information for Nigerian users.
CONCERNS WITH THE CODE
The Code contains certain provisions that may be used by an abusive government to curtail or infringe free speech. Indeed, the three major areas of concern under the Code with respect to restricting free speech are the provisions allowing the Government to order the removal of content, the provisions mandating Platforms to proactively remove false information likely to cause public disorder, and the provisions requiring local incorporation of Platforms. These major areas of concern, and other notable areas, are examined below.
Mandatory incorporation of foreign Platforms
The Code imposes additional obligations on Large Platforms, including obligations to be incorporated in Nigeria, maintain a physical contact address in Nigeria, and appoint a liaison officer for communications with the Government.
It is likely that Large Platforms that do not carry out the obligations set out above would be prevented from operating in Nigeria.
The first problem with this position is the sheer number of Platforms that would be classified as Large Platforms. The 100,000-user threshold is extremely low, particularly when contrasted with the threshold of 45 million active monthly users within the jurisdiction for a Very Large Online Platform under the proposed EU Digital Services Act (DSA).
Similarly, the DSA does not require Platforms to be locally incorporated or to have local addresses; the appointment of a legal representative typically suffices. It should also be noted that, in certain situations, NITDA may require a Platform with fewer than one hundred thousand users to comply with the obligations of a Large Platform.
Takedown of content
A Platform is required to take down content within 24 hours of receiving notice from an Authorised Government Agency (“Agency”) of the presence of unlawful content. Unlawful content is defined under the Code to mean any content that violates an existing law in Nigeria.
However, the problem is that the Agency is not required to specify how or why the content is unlawful, and the Platform is given no time or avenue to verify the unlawfulness of the content, particularly where it is unclear whether the content is in fact unlawful.
The position under the Code can be contrasted with that under the German Network Enforcement Act, where content must be manifestly unlawful, and under the French “Lutte contre la haine sur internet” (“fighting hate on the internet”) law, where content must be patently illegal, before a takedown within 24 hours is required.
In addition, under the DSA, the relevant agency or court ordering the takedown of content is required to, among other things, provide a statement of reasons explaining why the information is illegal, with reference to the specific provision of law infringed.
Removal of false information likely to cause public disorder
In addition to creating a general obligation to monitor, this provision of the Code is riddled with vagueness that can easily be exploited by an abusive Government, as the Code defines neither “false information” nor “public disorder”. Consequently, Platforms like Twitter and Instagram may, for example, be sanctioned if they fail to proactively take down posts related to the shootings at Lekki and other similar incidents.
This position can be contrasted with other frameworks, such as the EU E-Commerce Directive and the DSA, where States are prevented from imposing a general obligation to monitor, as well as a long line of case law requiring that any legislation attempting to restrict free speech be “sufficiently precise to enable the citizen to regulate his conduct: he must be able – if need be with appropriate advice – to foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail”.
Indeed, there are other areas of concern under the Code. For example, the Code contains no provisions for reviewing content that has been taken down. In addition to taking down unlawful content, the Code also requires all Platforms to take all reasonable steps to ensure that such content stays down. Consequently, it is conceivable that an Agency could claim that content is unlawful, and the Platform would be forced not only to take down the content but also to take steps to prevent it from resurfacing. The only option available to affected Nigerians would be to approach the Court for a declaratory judgment that the content was not in fact unlawful.
Similarly, the definition of prohibited material as content or information “objectionable on the grounds of public interest, morality, order, security, peace, or is otherwise prohibited by applicable Nigerian laws” is another potential window for abuse. For the purposes of clarity, the mere fact that content is objectionable (without being unlawful) should not be sufficient ground to remove it, especially in light of how loosely the term “objectionable” can be interpreted.
In addition, the absence of an internal complaint-handling system under the Code can be contrasted with the framework under the DSA, where users can lodge a complaint against a decision to remove content or suspend a user.
Finally, it appears the Government is trying to regulate five different categories of information, namely misinformation, disinformation, harmful content, unlawful content, and prohibited material. It is therefore conceivable that an abusive Government would be able to fit unfavourable content into at least one of these five categories and consequently take steps to remove it.
Amongst many other recommendations, we suggest that the problematic provisions identified above be amended to introduce safeguards, or removed outright, to obviate the possibility of abuse by an overreaching government and to provide a measure of protection for the general public and the affected Platforms.
We further recommend the following:
- all provisions imposing general monitoring obligations be removed;
- the introduction of a notice-and-action framework; and
- the introduction of a suspension mechanism for persons/Agencies shown to frequently submit notices or complaints that are manifestly unfounded.
We also recommend that time be taken to study how other jurisdictions are able to effectively manage social media and hate speech so that we do not have a Code that does more harm than good to the digital ecosystem.
 NITDA, “Code of practice for interactive computer service Platforms/ internet intermediaries” available at https://nitda.gov.ng/wp-content/uploads/2022/06/Code-of-Practice.pdf accessed 21 June 2022
 NITDA, “Press Release” available at https://twitter.com/NITDANigeria/status/1536392359977664512?s=20&t=O76Ofq9GPDtcFgkpJ5Y0gg accessed 21 June 2022
 Timi Odueso, “Nigeria seeks to regulate social media with new NITDA Code of Practice” available at https://techcabal.com/2022/06/14/nigeria-seeks-to-regulate-social-media-with-new-nitda-code-of-practice/ accessed 21 June 2022
 Social Media Bill available at https://guardian.ng/wp-content/uploads/2019/11/Protection-from-Internet-Falsehood-and-Manipulation-Bill-2019.pdf accessed 21 June 2022
 Hate Speech Bill available at https://www.movedemocracy.org/wp-content/uploads/2020/10/Hate-Speech-Bill.pdf accessed 21 June 2022
 AlJazeera, “Nigeria ends its Twitter ban after seven months” available at https://www.aljazeera.com/economy/2022/1/12/nigeria-ends-its-twitter-ban-after-seven-months accessed 21 June 2022
 PLAC, “The National Broadcasting Cooperation (NBC) Act (Amendment) Bill” available at https://placng.org/i/tag/the-national-broadcasting-cooperation-nbc-act-amendment-bill/ accessed 21 June 2022
 Part I(4) of the Code
 Part I(5) of the Code, Part II(2 & 3) of the Code and Part V(7) of the Code
 Part II (1, 3 & 8) of the Code
 Part III of the Code
 Article 25(1) of the Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC (“DSA”)
 Article 11 DSA
 Classified under the Code as NITDA, NBC, NCC or any agency authorised by its enabling law.
 Part I(3) of the Code provides that: All Interactive Computer Service Platforms/Internet Intermediaries (Platform) shall act expeditiously upon receiving a notice from a user, or an authorised government agency, of the presence of an unlawful content on its Platform. A Platform must acknowledge the receipt of the complaint and take down the content within 24 hours.
 Article 8(2) DSA
 Part V(7) of the Code provides that: Where a false information is likely to cause violence, public disorder, or exploitation of a child, the Platform shall caution the publisher and remove the content as soon as reasonably practicable.
 Recital 47 and Article 15 Directive 2000/31/EC (“E-Commerce Directive”)
 Article 7 DSA
 See generally The Sunday Times v. United Kingdom (Application no. 6538/74); Wingrove v. The United Kingdom (Application no. 17419/90)
 Part I(6) of the Code provides that: All Interactive Computer Service Platforms/Internet Intermediaries (Platform) shall: Exercise due diligence to ensure that no unlawful content is uploaded to their Platform. Where a Platform receives a notice from a user or any authorised government agency that an unlawful content has been uploaded, such Platform is required to take it down and ensure it stays down. No liability shall be incurred by a Platform where such Platform has taken all reasonable steps to ensure that an unlawful content is taken or stays down.
 Part IV of the Code provides that: A Platform shall not continue to keep prohibited materials or make them available for access when they are informed of such materials. Prohibited material is that which is objectionable on the grounds of public interest, morality, order, security, peace, or is otherwise prohibited by applicable Nigerian laws.
 Misinformation: unintentional dissemination of false information
 Disinformation: verifiably false or misleading information that, cumulatively, is created, presented, and disseminated for economic gain or to deceive the public intentionally and that may cause public harm
 Harmful content: content which is not unlawful but harmful
 Unlawful content: any content that violates an existing law in Nigeria
 Prohibited material: content or information objectionable on the grounds of public interest, morality, order, security, peace, or is otherwise prohibited by applicable Nigerian laws
Read the full publication at ǼLEX.