Approved by the Federal Senate on June 30 and still pending consideration by the House of Representatives, Bill of Law No. 2,630/2020 establishes the Brazilian Law on Freedom, Responsibility, and Transparency on the Internet, nicknamed the "Fake News Law". The text proposes to define "standards, guidelines, and transparency mechanisms" for social networks and private messaging platforms, but ends up establishing several obligations for providers, which will assume new roles and responsibilities in their operations and in the moderation of content.
Placed in a specific political and social context, the Fake News Bill of Law emerges as an attempt to stop the dissemination of fake news on the internet and to mitigate its impacts in the social, electoral, and public health spheres. However, the approved draft does not define the term "fake news", focusing instead on the inauthentic behavior of user accounts on social networks and on transparency regarding paid content, which will be moderated by application providers.
Even though the purpose of the law is legitimate, these moderation practices raise a number of concerns, among them: the extension of liability of application providers and the risk of censorship and violation of users' rights to information, freedom of expression, and privacy.
In this analysis, we highlight some of the obligations imposed by the Bill of Law on application providers and possible effects thereof.
Measures regarding accountability and transparency in the use of social networks
The Bill of Law establishes several measures that providers should adopt to protect freedom of expression and access to information, among which we highlight those related to (i) prohibition of inauthentic accounts; (ii) prohibition of automated accounts not identified as such; and (iii) obligation to identify promotional and advertising content.
- Inauthentic accounts are defined by the Bill of Law as profiles that assume or simulate the identity of third parties in order to deceive the public, with the exception of accounts that are explicitly humorous or parodies, as well as those that identify corporate names or pseudonyms. According to the Bill of Law, providers should adopt measures, within the scope and limits of their service, to prohibit the operation of such accounts with a view to protecting freedom of expression and the right of access to information, as well as promoting the free flow of ideas on the internet.
- The Bill of Law also provides that social networks should require the identification of automated accounts, which is to say, accounts managed by technology that simulates or replaces human activities in the distribution of content on social networks, popularly known as "robot accounts". If the automated account has not been identified as such to the providers or the public, the providers may require the account holder to confirm their identity by presenting a valid identity document, under penalty of deletion of the account.
In addition, the Bill of Law imposes on providers the responsibility to develop mechanisms to "detect fraud in the registration and use of accounts in breach of the legislation", as well as to track and control the behavior of automated accounts in order to confirm their authenticity.
- Finally, the Bill of Law provides that providers must identify promotional and advertising content in a prominent manner for the users, including when later shared, forwarded, or passed on. The identification of the promotional and advertising content must contain information regarding the account responsible for the action or the advertiser, including contact information.
The obligation to maintain this identification can be quite complex, since such content, even if initially identified, can be modified by users when the content is shared, making it difficult to control the changes and dissemination of the modified content.
In addition, the obligation to provide the contact information of the responsible person or advertiser to any user who requests it, in a broad and undefined manner, may give rise to abusive requests with no legitimate interest, as well as damage the privacy and freedom of expression of its owners.
Moderation and transparency proceedings
Also under the justification of guaranteeing the right of access to information and freedom of expression for users of social networks, the Bill of Law establishes "moderation proceedings" that providers must comply with, including notifying users whose content or accounts are affected.
There are situations, however, in which notification of the user may be dispensed with, such as violations of children's and adolescents' rights or a risk of "immediate damage that would be difficult to repair". In such cases, it would be incumbent on the provider to make the account or content unavailable without notifying the user.
The Bill of Law further establishes that the provider shall be responsible for repairing, within the scope and technical limits of the services, any damage arising from erroneous moderation. However, it is not clear what form this remedy would take (whether it would only correspond to making the mistakenly removed content available again or whether it would include, for example, indemnification for moral and material damages), which leaves room for expansion of the providers’ liability.
Under the Bill of Law, providers must demonstrate their operations through a quarterly report on the content- and account-moderation measures they have applied. This report must contain information such as the grounds and methodology used to detect irregularities and the type of measures taken.
In addition to being released to the public, the report may be evaluated by the Internet Transparency and Accountability Board, a body that will be responsible for monitoring the measures established by the Bill of Law and suggesting guidelines on the subject.
Extension of providers’ civil liability and risks to users’ freedom of expression and right to information
Currently, the civil liability of application providers for content generated by third parties is regulated by Law No. 12,965/2014, popularly known as the Brazilian Civil Rights Framework for the Internet. According to this law, application providers may only be held civilly liable for damages arising from content generated by third parties if, after a specific court order, they have not taken the applicable measures to make unavailable the content indicated as infringing.
That is, under current legislation, application providers have no obligation to previously inspect content generated by third parties; they are only obliged to make such content unavailable if they receive a specific court order determining its removal. The Fake News Bill of Law alters this regime, raising two main concerns.
- The first is the expansion of the platforms' civil liability, as they will now be held liable, irrespective of a court order, both for infringing content that they fail to remove and for content that has been mistakenly made unavailable. This expansion is quite problematic because it imposes a disproportionate burden on application providers, which may end up making their activities unfeasible.
- The second is the risk of censorship and violation of users' rights to information and freedom of expression, insofar as it gives application providers the power to determine what may or may not be published on their social networks, which today are one of the main means of communication.
Although the purposes of Bill of Law No. 2,630/2020 are laudable, the way it proposes to combat the dissemination of fake news is quite problematic, as it ends up legitimizing violations of the users’ rights to information, freedom of expression, and privacy perpetrated by private agents and, at the same time, increases the liability of these agents, which may render their activities unfeasible.
The fight against the misinformation targeted by the Fake News Law is urgent, but hasty regulation can have serious consequences for internet freedom. As seen, Bill of Law No. 2,630/2020 is complex and encompasses sensitive topics that need to be discussed from different perspectives with civil society, in a manner similar to what occurred during the discussions regarding the Brazilian Civil Rights Framework for the Internet.
Society, as the most interested party, should be invited to participate in the discussions regarding the Bill of Law, helping the Legislature find alternatives for the management and moderation of content by providers, with the aim of guaranteeing fundamental rights in the virtual world, especially freedom of expression and the right to information, the prohibition of censorship, and the protection of privacy.
 The term "social network" is defined in article 5, VIII, of Bill of Law No. 2,630/2020 as "an internet application that is designed to connect users to each other, allowing, and having as the center of activity, communication, sharing, and dissemination of content in the same information system, through accounts connected or accessible to each other in a connected manner."
 The term "private messaging service" is defined in article 5, IX, of Bill of Law No. 2,630/2020 as an “internet application that makes it possible to send messages to certain and determined recipients, including those protected by end-to-end encryption, so that only the sender and recipient of the message have access to its exclusive content, excluding those primarily intended for corporate use and electronic mail services.
 Bill of Law No. 2,630/2020 does not define the term "social network and private messaging service providers" (only the terms "social network" and "private messaging service" in article 5, subsections VIII and IX). From the wording proposed in the Bill of Law, we believe that the term “providers of social networks and private messaging services" may be treated as a synonym of application providers, defined by the Brazilian Civil Rights Framework for the Internet (Law No. 12,965/2014). However, the law is only applicable to application providers whose platforms have at least two million registered users (article 1, paragraph 1, of Bill of Law No. 2,630/2020).