This Thursday (26), the Supreme Federal Court (STF) concluded its judgment on the liability of Big Techs for criminal posts made by their users.
Let's go through it in chronological order.
On the 12th, the court had already formed a majority to hold these platforms accountable. The final count was eight to three: ministers Alexandre de Moraes, Dias Toffoli, Luiz Fux, Flávio Dino, Cristiano Zanin, Gilmar Mendes, Cármen Lúcia and the president of the Supreme Court, Luís Roberto Barroso, were in favor of liability. André Mendonça, Edson Fachin and Kássio Nunes Marques, the only one to vote this Thursday (26), voted against.
What did Nunes Marques say?
Nunes Marques understands that Article 19 of the Marco Civil da Internet, which was also on the agenda, is constitutional (understand below). "Declaring that Article 19 is constitutional does not prevent Parliament from debating the topic broadly and deeply. Indeed, this discussion is recommended," he pointed out.
Marques also defended Congress, stating that the house has not been silent on regulating social networks. He cited bills under discussion in the legislature, such as the Fake News Bill, emphasizing that current legislation is able to regulate the platforms.
He also understands that freedom of expression is an entrenched clause of the Constitution. "The solution is not the prior restriction of freedom of expression. The greatest possible freedom should be guaranteed, and with that, the individual and society will be in a better position to make decisions," he continued.
What was being voted on?
- The trial was based on two appeals analyzing the responsibility of digital platforms for damage caused by their users' posts, even without a court order to remove those publications;
- The question, therefore, is whether such platforms can be ordered to pay compensation for moral damages if they do not remove the posts;
- The posts in question involve hate speech, fake news or speech that harms third parties.
Another subject under discussion was whether Article 19 of the Marco Civil da Internet, which deals with the theme, is valid. The article says that social networks can only be held responsible in these situations if they fail to remove publications after the issuance of a court order.
The Marco Civil da Internet has been in force since 2014 and operates as a kind of Internet Constitution.
“Habemus Consensus”
The ministers completed this stage this Thursday (26). The final decision was that platforms can be held responsible for illicit or offensive publications made by their users only if, after receiving an extrajudicial notification, they do not remove the content and it is later found illegal in court.
In addition, the majority understood that Article 19 of the Marco Civil da Internet is partially unconstitutional. Under this new reading of the article, a platform's omission after receiving an extrajudicial notification can generate liability, without the need to await a court decision.
The thesis holds that Article 19 does not adequately protect fundamental rights, especially citizens' honor, dignity and image, which makes it partially unconstitutional.
The Supreme Court thus concluded that, until new legislation is passed, the article will be interpreted as follows:
- Social networks will be held civilly responsible if they do not remove content after extrajudicial notification;
- The decision, however, does not apply to electoral legislation. Thus, the normative acts of the Superior Electoral Court (TSE) remain untouched;
- Big techs can be held civilly liable, under Article 21, for damage generated by their users' content, including when the case involves false profiles;
- In cases of crimes against honor, a court order is still required, as provided for in Article 19. However, this does not rule out removal upon receipt of an extrajudicial notification.
Another point of the thesis holds that, in situations involving hate speech, racism, pedophilia, incitement to violence or calls for a coup d'état, big techs must act proactively to remove content, even without receiving prior notification.
Thus, the way social networks operate in Brazil is changing, leading technology companies to review their reporting and moderation policies.
Beyond Nunes Marques: how each minister voted
Edson Fachin
Fachin was the second minister to diverge from the majority. He said that technology is in "incessant mutation" and that the current judgment is not enough to solve the problems generated by the power concentrated in the big techs.
He understands that regulation must come from Congress, through broad legislation. "I do not believe this issue will necessarily be solved or exhausted by removing or not removing platform content. There is a need for structural and systemic regulation, preferably not via the Judiciary," he said.
The magistrate also weighed the risks and benefits of holding platforms responsible for content. On the one hand, Fachin understands that regulation would help protect fundamental rights; on the other, it could generate "collateral censorship." "The adoption of control over users' speech is not part of the democratic rule of law," he said.
Therefore, the minister believes that Article 19 of the Marco Civil da Internet is constitutional and should be kept as it currently stands. "The requirement of a court order to remove content generated by third parties seems to me to be the only constitutionally adequate way to reconcile freedom of expression with the regime of subsequent liability," he said.
Cármen Lúcia
Cármen Lúcia, like most of the magistrates, understands that the networks do bear responsibility and that Article 19 must be given a conforming interpretation in order to preserve it, for example, in cases of crimes against honor.
"Censorship is constitutionally prohibited, it is ethically prohibited, it is morally prohibited, I would say even spiritually. But neither can we allow ourselves to be in a situation where there are 213 million little sovereign tyrants. Sovereign is Brazil, sovereign is Brazilian law. So, we need to comply with the rules," she said.
Alexandre de Moraes
- While reading his vote, Moraes cited several posts containing racist, anti-Semitic and homophobic speech;
- "Only negligent minds would not fight to remove this from social networks. This is not freedom of expression, this is a crime," he said;
- Another theme addressed by the magistrate was the call to vandalize the Three Powers Square during the attempted coup of January 8, 2023; he also showed images of acts against democracy;
- "Selma Festival, as extremists organized a coup through social networks. And the social networks watched it multiply and carried on," he said;
- Moraes then showed images of the invasions. "They destroyed, called for military intervention and posted at the same time, and the social networks showed no self-regulation. It is the bankruptcy of the self-regulation of social networks that makes us have to judge this in these sessions," he added.
Dias Toffoli
Toffoli was the rapporteur of one of the appeals and voted for the unconstitutionality of the article, arguing that, in cases of offensive or illicit content, such as racism, platforms must act as soon as they are notified extrajudicially, whether by the victim or by their lawyer, dispensing with the need to wait for a court decision.
In serious situations, the minister understands that social media must take the same approach and that, if they fail to act, they must be held responsible.
Luiz Fux
Fux was the rapporteur of the other case and shared Toffoli's understanding, also arguing that irregular posts should be removed shortly after extrajudicial notification.
The minister considers illicit any content that promotes hate speech, racism, pedophilia, incitement to violence, apology for the violent abolition of the democratic rule of law or a coup d'état.
Fux also signaled support for holding social networks accountable if they remain inert after extrajudicial notification, and argued that platforms should create channels for receiving confidential complaints and actively monitor publications.
Luís Roberto Barroso
The president of the Supreme Court understands that liability can arise when companies do not take the necessary steps to remove such publications.
In cases of crimes against honor, such as insult, slander and defamation, Barroso thinks the removal of the harmful post should only occur through a court order.
The minister also proposed that companies have a duty of care, requiring them to prevent content such as child pornography, instigation of or assistance to suicide, human trafficking, acts of terrorism, the violent abolition of the democratic rule of law and coups d'état.
André Mendonça
One of the dissenting votes, Mendonça understands that Article 19 is constitutional and diverged on other points as well. But he stated that the provision must be interpreted in accordance with the Constitution to settle certain points.
One of them is that the removal or suspension of user profiles should be barred, except in cases of proven falsity or illicit activity; that platforms in general have a duty to identify users who violate the rights of third parties; and that the social network cannot be held directly responsible, without a prior court decision, when the possible irregularities involve opinions.
Flávio Dino
Dino suggested a thesis providing for the accountability of internet providers via Article 21 of the Marco Civil. That article says liability can arise when no steps are taken to remove content after extrajudicial notification. If there is a crime against honor, Article 19 prevails.
For Dino, platforms have a duty to prevent the creation of false profiles. In that situation, the liability regime of the Civil Procedure Code applies, regardless of prior judicial or extrajudicial notification.
The decision would also apply to bot profiles and to paid or boosted ads. If the provider removes content under its duty of care, the author can ask a court to reinstate it; if the post is authorized, the provider will not be punished with compensation.
Cristiano Zanin
The last to vote, Zanin understands that Article 19 of the Marco Civil is "partially unconstitutional" and defends three criteria. One of them holds that, in cases of criminal content, the platform is responsible for removing the content without having to wait for a court decision.
As for the application of Article 19, it would be maintained only in cases where the provider has reasonable doubt about the lawfulness of the content; that is, there would be no immediate accountability when the legality of the reported content is in doubt.