In a joint judgment of Theme 533 and Theme 987 of general repercussion, the Federal Supreme Court (STF) decided, on June 26, that article 19 of the Civil Rights Framework for the Internet (Law 12,965/14) is partially unconstitutional. That article conditions the civil liability of providers on the existence of a prior court order for the removal of content generated by third parties.

As discussed in a previous article, the Plenary had already formed a majority for unconstitutionality on June 12, by a vote of 7 to 1, with disagreements over the scope of the unconstitutionality.

When the trial resumed on the 25th of the same month, Minister Cármen Lúcia voted in favor of unconstitutionality and Minister Edson Fachin voted in the opposite direction. The next day, Minister Nunes Marques delivered his vote, following the understanding of Minister André Mendonça.

This last vote, although favorable to the constitutionality of the provision, consolidated the decision: article 19 of the Civil Rights Framework for the Internet was declared partially unconstitutional by the STF.

By a vote of 8 to 3, the judgment therefore established a new paradigm: platforms will have to act with greater agility and responsibility to prevent the circulation of illicit content, under penalty of civil liability even without a prior court order. At the end of the trial, the Federal Supreme Court established a new thesis on the liability of platforms for third-party content, which will guide the issue throughout the country.

The liability regime for digital platforms in Brazil, redefined by the Supreme Court, institutes new parameters for the protection of fundamental rights in the virtual environment and imposes stricter duties on technology companies.

New thesis established

The main points of the thesis established by the STF in the judgment are set out below.

  • Article 19 of the Brazilian Civil Rights Framework for the Internet (Law 12,965/14), which required a specific court order for the civil liability of internet application providers for damages arising from content generated by third parties, was declared partially unconstitutional. The STF recognized that the general rule of article 19 does not provide sufficient protection to constitutional legal assets of high relevance, such as the protection of fundamental rights and democracy.
  • Until new legislation is enacted, internet application providers are subject to civil liability, except where the specific provisions of electoral legislation and the regulations issued by the TSE apply.
  • The internet application provider will be held civilly liable, in accordance with article 21 of the Brazilian Civil Rights Framework for the Internet, for damages arising from content generated by third parties in cases of crimes or unlawful acts, without prejudice to its duty to remove the content. The same rule applies to accounts reported as inauthentic.
  • In crimes against honor, article 19 of the Brazilian Civil Rights Framework for the Internet applies, without prejudice to the possibility of removal by extrajudicial notification.
  • In the event of successive replications of an offense already recognized by a court decision, all social media providers must remove publications with identical content upon judicial or extrajudicial notification, regardless of new court decisions.
  • Providers are presumed liable for illegal content in the case of paid ads and boosted posts or artificial distribution networks (chatbots or robots), and may be held liable regardless of notification. To avoid liability, providers must prove that they acted diligently and within a reasonable time to take the content down.
  • The internet application provider is liable if it does not immediately make unavailable content involving serious crimes, such as anti-democratic conduct and acts, terrorism, inducement of or incitement to suicide, incitement to discrimination, crimes against women, sexual crimes against vulnerable persons, and human trafficking.
  • Provider liability in these cases depends on the finding of a systemic failure, characterized by the absence of adequate measures to prevent or remove illegal content. An isolated instance of unlawful content is not sufficient to give rise to civil liability; in that case, article 21 of the Brazilian Civil Rights Framework for the Internet applies.
  • The party responsible for the content may request its reinstatement in court, provided the absence of an unlawful act is demonstrated, without any right to compensation from the provider.
  • Article 19 of the Brazilian Civil Rights Framework for the Internet remains applicable to providers of e-mail services, applications for closed meetings by video or voice, and private messaging, exclusively with regard to personal communications, safeguarding the confidentiality of communication.
  • Internet providers that operate as a marketplace are civilly liable under the Consumer Defense Code.
  • Providers must adopt self-regulation covering a notification system, due process, and annual transparency reports, in addition to providing accessible and public service channels.
  • The rules should be routinely reviewed in a transparent fashion.
  • Providers operating in Brazil must maintain headquarters and a representative in the country, with authority to respond administratively and judicially, provide information to the authorities, and comply with judicial decisions.
  • There will be no strict liability in the application of the announced thesis.
  • The Supreme Court appealed to the National Congress to draft legislation capable of eliminating the deficiencies of the Civil Rights Framework for the Internet in relation to the protection of fundamental rights.
  • The modulation of the effects of the decision provides for prospective application, except for decisions that have already become final.

How the ministers voted

Minister Luís Roberto Barroso argued that article 19, by requiring a court order for the removal of content, does not adequately protect interests of great relevance, such as the protection of fundamental rights and democracy.

For him, the rule needs to be made more flexible: crimes against honor and civil offenses would continue to depend on a judicial decision, while other crimes could be removed after extrajudicial notification. Paid ads and boosted posts, in turn, would give rise to a presumption that the platform is aware of the illicit content.

Minister Barroso proposed the adoption of a dual model: subjective liability of the platforms (i.e., liability dependent on fault or willful misconduct) combined with a duty of care to prevent and mitigate systemic risks, especially in relation to extraordinarily harmful content. In addition, he suggested measures such as reporting channels, due process, and transparency reports.

Minister Gilmar Mendes went further, stating that platforms have ceased to be "mere conduits" and have become regulators of public discourse. He considered article 19 structurally insufficient and proposed an enhanced regime of joint and several liability for platforms in cases of failure to immediately remove serious content (e.g., child pornography, inducement of or assistance to suicide or self-mutilation, human trafficking, terrorist acts, and the violent abolition of the Democratic Rule of Law, as well as hate speech and extremist ideologies). The minister also defended the creation of a specialized regulatory body and a robust set of procedural obligations.

On the other hand, Minister André Mendonça upheld the full constitutionality of article 19. He argued that the removal or suspension of user profiles is unconstitutional, except when the profile is proven to be false, automated, or aimed at committing serious crimes.

For him, freedom of expression must occupy a preferred position, and it is up to the Judiciary, not the platforms, to arbitrate conflicts of rights. In addition, in observance of due process of law, a judicial decision ordering the removal of content should state specific grounds and be accessible to the platform responsible for compliance, giving it the opportunity to challenge it. Minister Mendonça rejected the idea of prior monitoring, but acknowledged the need for transparency and reporting channels and defended self-regulation under judicial supervision.

Minister Edson Fachin also defended the constitutionality of article 19 of the Brazilian Civil Rights Framework for the Internet, but partially diverged from Minister André Mendonça by not adhering to the additional obligations the latter suggested. The minister stressed the importance of content removal continuing to depend on a court order, as provided for in the Civil Framework, and proposed the thesis that providers that merely offer data access, search, or storage services, without interfering with the content, can only be held liable if, after a specific court order, they fail to take the necessary measures to make the content identified as infringing unavailable, within the technical limits of their service and the deadline indicated.

Minister Flávio Dino recognized a partial omission in article 19 and, in general terms, followed Minister Barroso, but with additions: the provider would be liable in cases of systemic failure, including those involving artificial intelligence. Minister Dino suggested that, if content is removed as a result of the platform's duty of care, the plaintiff may request its reinstatement in court, without any right to compensation. He also proposed that mandatory self-regulation rules be monitored by the Attorney General's Office until a specific law is approved.

Minister Cristiano Zanin also adhered to the view of partial unconstitutionality, but drew a distinction between "neutral" providers, which only host content, and "active" providers. The latter, which promote or boost content, would be subject to stricter rules, such as liability after extrajudicial notification and the requirement to combine artificial intelligence mechanisms with human review.

Minister Zanin also defended the digital education of users and the application of the new rules only prospectively, that is, only to facts occurring after the decision becomes final and unappealable, ensuring legal certainty.

Minister Dias Toffoli adopted the most radical position: for him, article 19 should be considered entirely unconstitutional. In his view, the rule confers excessive immunity on digital platforms, which perpetuates the dissemination of harmful content in the virtual environment. Until a new law is approved, extrajudicial notification should be enough to characterize unequivocal knowledge and give rise to platform liability for violations of honor, image, and privacy.

Minister Toffoli expanded the range of situations in which liability is strict, including recommendation activities, boosting, and even marketplaces that offer prohibited products. He also proposed a list of content requiring immediate removal, such as terrorism, violence against children, and hate speech, in addition to suggesting a "Decalogue against digital violence" and demanding that the Legislature adopt a new public policy within 18 months.

Minister Luiz Fux, in turn, acknowledged that there is a deficit of protection. For him, restricting liability to cases involving a specific court order is improper; he argued that platforms should be held liable for damage to honor, image, and privacy after a reasoned notification.

In addition, with regard to evidently illegal content, the minister argued that companies have a duty of active monitoring and that, in cases of paid boosting, there is a presumption of knowledge of the illicit content. Minister Fux also considers confidential and functional reporting channels necessary.

Minister Alexandre de Moraes argued that digital platforms, social networks, and private messaging services should be treated in the same way as other means of communication until new regulation is passed by the National Congress, considering that they are no longer simple repositories and have become active agents in the dissemination of information, not always in a plural manner.

For Minister Moraes, in the face of technological innovations, the massive use of artificial intelligence, and the striking social impact of these platforms, there is no longer room for neutrality: digital platforms, social networks, and private messaging services must be held civilly and administratively liable for content driven by algorithms, paid advertising, and omission in the face of unlawful acts.

The minister also highlighted the need for all networks and messaging services operating in Brazil to have headquarters or a legal representative in the country. He stated that the platforms' self-regulation has proved insufficient and cited as an example the calls for the anti-democratic acts of January 8, which circulated without effective intervention by the companies.

Minister Cármen Lúcia took a stand in favor of expanding the liability of social networks for content published by users, thereby defending the unconstitutionality of article 19 of the Civil Rights Framework for the Internet.

She pointed out that the digital landscape has changed considerably since the law was enacted and that social networks concentrate power over the circulation of information, not always in a plural manner, operating with algorithms whose logic is not accessible to the public. For the minister, the accountability of platforms needs to reflect these transformations and be applied in a manner equivalent to other contexts of damage and risk.

Last to vote, Minister Nunes Marques followed the minority and voted for the constitutionality of article 19, maintaining the principle that a judicial decision is required before social networks can be held liable for publications by their users.