The Federal Supreme Court (STF) is going through a decisive chapter for the future of the liability of digital platforms for unlawful third-party content in Brazil. In a joint judgment of Theme 533 and Theme 987 of general repercussion, the Supreme Court is evaluating whether article 19 of the Civil Rights Framework for the Internet (Law 12,965/14) — which conditions the civil liability of providers on a prior court order to remove unlawful content — is compatible with the Brazilian Constitution.

In practice, the question is whether technology companies can be held liable for third-party publications when they fail to act in a timely manner, even without a court order, in the face of clearly illicit content. It is also under discussion whether it falls to the platforms themselves to monitor and remove illegal content.

Theme 533 has as its backdrop Extraordinary Appeal 1.057.258, reported by Minister Luiz Fux, in which a digital platform contests a decision that held it liable for failing to remove offensive publications from social media within a reasonable time.

The Prosecutor General's Office (PGR), in opining for the dismissal of the appeal, maintained that there is no duty of prior control of posts, but that, even before the Civil Framework came into force, offensive content should be removed at the simple request of the offended party. After 2014, the PGR added, there remains a duty of diligence that requires the immediate removal of content that is known to be criminal or based on notoriously false facts, regardless of a court order.

The Federal Attorney General's Office (AGU), in a similar vein, argued that the removal of illegal material does not require a court order or formal notification. It also argued that this duty extends to the use of artificial intelligence by the platforms.

Theme 987, in turn, stems from Extraordinary Appeal 1.037.396, reported by Minister Dias Toffoli and filed by a social network that pleads for the full validity of article 19 of the Civil Framework.

In this case, the PGR also took a position against granting the appeal – and therefore against the thesis that the provision is fully constitutional – in line with the understanding that extrajudicial notification is enough to generate liability if the platform does not act after being alerted.

The AGU, while recognizing the need to interpret the article in accordance with the Constitution, agrees with the possibility of holding companies liable without a court order, invoking the duty of precaution of content hosts.

The breadth of the debate attracted several amici curiae. On the one hand, entities such as the Brazilian Association of Radio and Television Broadcasters, the São Paulo Public Prosecutor's Office and consumer protection institutes defend the unconstitutionality of article 19, arguing that it imposes barriers to access to justice and perpetuates damages. On the other hand, giants of the digital economy maintain that the current rule should be preserved to protect freedom of expression and ensure proportionate moderation.

Specialized organizations — the Wikimedia Foundation, the Alana Institute, the São Paulo Lawyers Institute (IASP) — brought nuances, defending models that combine duties of diligence, expansion of exceptions for objectively illicit content, and reinforced protection for children and adolescents.

Against this backdrop, the Plenary resumed the trial on June 12. Having started in the morning, the session extended throughout the day, with important votes delivered by several ministers, and a majority was consolidated in favor of modifying the current regime of platform liability. There are already seven votes for the unconstitutionality, whether partial or total, of article 19 of the Brazilian Civil Rights Framework for the Internet, with divergences on the scope of that unconstitutionality.

Among these votes, some ministers defend partial unconstitutionality and propose replacing the regime of "court order as a rule" with a more rigorous model of duties of care. Others defend full unconstitutionality and suggest applying the model of article 21 of the Brazilian Civil Rights Framework for the Internet until a new law is passed, under which extrajudicial notification would already be sufficient to characterize unequivocal knowledge and generate liability.

On the other hand, there is a vote for the full constitutionality of the provision, which highlights the need for a court order to remove content, the distinction between private messaging services and social networks, and the unconstitutionality of the removal of profiles, except in exceptional cases.

How the ministers have voted to date

Minister Luís Roberto Barroso argues that article 19, by requiring a court order for the removal of content, does not adequately protect interests of great relevance, such as the protection of fundamental rights and democracy.

For him, the rule needs to be made more flexible: crimes against honor and civil offenses would continue to depend on a judicial decision, while other crimes could be removed after extrajudicial notification. Paid ads and boosted content, in turn, would generate a presumption of knowledge of the illicit act by the platforms.

Minister Barroso proposes the adoption of a dual model: subjective liability of platforms (i.e., dependent on fault or willful misconduct) combined with a duty of care to prevent and mitigate systemic risks, especially in relation to extraordinarily harmful content. In addition, he suggests measures such as reporting channels, due process, and transparency reports.

Minister Gilmar Mendes goes further and says that the platforms have ceased to be "mere conduits" and have become regulators of public discourse. He considers article 19 structurally insufficient and proposes a stricter regime of joint and several liability for platforms in cases of non-immediate removal of serious content (e.g., child pornography, inducement of or assistance to suicide or self-mutilation, human trafficking, terrorist acts, violent abolition of the Democratic Rule of Law, as well as hate speech and extremist ideologies). The minister also defends the creation of a specialized regulatory body and a robust set of procedural obligations.

On the other hand, Minister André Mendonça maintains the full constitutionality of article 19. He argues that the removal or suspension of user profiles is unconstitutional, except when they are provenly false, automated, or aimed at committing serious crimes.

For him, freedom of expression must occupy a preferred position, and it is up to the Judiciary, not the platforms, to arbitrate conflicts of rights. In addition, in compliance with due process of law, the judicial decision ordering the removal of content should state specific grounds and be accessible to the platform responsible for compliance, giving it the opportunity to challenge it. Minister Mendonça rejects the idea of prior monitoring, but admits transparency obligations and reporting channels, and defends self-regulation under judicial supervision.

Minister Flávio Dino recognizes a partial omission in article 19 and, in general terms, follows Minister Barroso, but with additions: the provider would be liable in cases of systemic failure, including those involving artificial intelligence. Minister Dino suggests that, if content is removed under the platform's duty of care, the plaintiff may request its reinstatement in court, without a right to compensation. He also proposes that mandatory self-regulation rules be monitored by the Attorney General's Office until a specific law is approved.

Minister Cristiano Zanin also adheres to partial unconstitutionality, but draws a distinction between "neutral" providers – which only host content – and "active" providers. The latter, which promote or boost content, would be subject to stricter rules, such as liability after extrajudicial notification and the requirement to have artificial intelligence mechanisms combined with human review.

Minister Zanin also defends the digital education of users and the prospective application of the new rules, that is, only to facts occurring after the final and unappealable decision, thus ensuring legal certainty.

Minister Dias Toffoli adopts the most radical position: for him, article 19 should be declared totally unconstitutional. In his view, the rule confers excessive immunity on digital platforms, which perpetuates the dissemination of harmful content in the virtual environment. Until a new law is approved, extrajudicial notification should be enough to characterize unequivocal knowledge and generate platform liability for violations of honor, image and intimacy.

Minister Toffoli expands the range of situations in which liability is strict, including recommended and boosted content and even marketplaces that offer prohibited products. He also proposed a list of content requiring immediate removal, such as terrorism, violence against children and hate speech, in addition to suggesting a "Decalogue against digital violence" and demanding that the Legislature adopt a new public policy within 18 months.

Minister Luiz Fux, in turn, recognizes that there is a lack of protection. For him, restricting liability to cases with a specific court order is undue; he argues that the liability of platforms for damage to honor, image and privacy should arise after reasoned notification. In addition, with regard to evidently illegal content, he argued that there is a duty of active monitoring on the part of the companies and that, in cases of paid boosting, there is a presumption of knowledge of the illicit act. Minister Fux also demands confidential and functional reporting channels.

Minister Alexandre de Moraes, who voted on Thursday, June 12, defended that digital platforms, social networks and private messaging services be treated in the same way as other means of communication until there is new regulation by the National Congress, considering that they are no longer simple repositories and have become active agents in the dissemination of information, not always in a plural manner.

For Minister Moraes, in the face of technological innovations, the massive use of artificial intelligence, and the striking social impact of these platforms, there is no longer room for neutrality: digital platforms, social networks, and private messaging services must be held civilly and administratively liable for content driven by algorithms, paid advertising, and omission in the face of unlawful acts.

The minister also highlighted the need for all networks and messaging services operating in Brazil to have headquarters or a legal representative in the country. He stated that the platforms' self-regulation has proved insufficient and cited as an example the calls for the anti-democratic acts of January 8, which occurred without effective intervention by the companies.

Despite the divergences, there are points of convergence among the ministers. All agree on the need for transparency, reporting channels and the right to review, although there is no consensus on who should supervise these obligations or how to punish non-compliance.

Most admit the liability of platforms without a court order in serious cases or after notification, with the exception of Minister Mendonça. All recognize that paid advertisements generate, at least, a presumption of knowledge of the illicit act – for Ministers Mendes and Toffoli, this already gives rise to strict liability. Finally, while Ministers Mendes, Barroso, Dino and Zanin defend the creation of a specialized regulatory body, Ministers Fux and Mendonça prefer self-regulation under judicial control.

The trial was suspended and will resume in the session of June 25. The votes of Minister Cármen Lúcia and Ministers Edson Fachin and Nunes Marques are still pending. However, barring an unlikely turnaround, the STF signals that it will establish a new paradigm: platforms will have to act with greater speed and responsibility to prevent the circulation of illicit content, under penalty of civil liability even without judicial provocation.

The Supreme Court is thus expected to establish theses that reconcile freedom of expression and the protection of fundamental rights, imposing transparency duties, protocols for the removal of publications and, possibly, differentiated regimes according to the seriousness and type of the material. The case thereby becomes a milestone for the Brazilian digital environment, capable of redefining the fine line between the free circulation of ideas and the safeguarding of honor, security, and democracy in the virtual space.