Minister Dias Toffoli, of the Federal Supreme Court (STF), concluded on Thursday the 4th, after three sessions, the reading of his vote in the ruling on the civil liability of platforms and providers for content published by users. He proposes new requirements for content moderation and, at the same time, defends greater responsibility for these companies over the publications that circulate on the internet. According to the minister, platforms must “assume the risks and burdens of their deliberate actions or omissions.”
The trial revolves around Article 19 of the Marco Civil da Internet, which shields platforms from liability for content published by users unless they fail to comply with court decisions ordering the removal of publications.
For Toffoli, the restriction is unconstitutional because it creates “immunity” for technology companies and, at the same time, leaves users without protection amid growing digital violence, such as cyberbullying, stalking, fraud and scams, hate speech and fake news.
“The justice system can no longer wait, can no longer remain silent. It is necessary to find mechanisms to protect physical integrity. We must take care of our young people, our children,” he argued. “The virtual today is real.”
The minister proposes punishing platforms that ignore extrajudicial notifications, preferably submitted through their own service channels, requesting the removal of illegal content such as fake news and insults. Under this model, the companies’ liability for irregular publications would begin the moment users themselves report them.
Toffoli called the proposed model a “notification and analysis system.” If the changes are approved by the full STF, it will be up to the platforms to analyze contested publications and decide whether they should be removed. They could be punished for keeping criminal posts active, but also for unduly removing legitimate content.
“With profit comes responsibility,” Toffoli said. “Content recommendation, promotion and moderation activities are intrinsic to the business model adopted by many providers and, in this case, as providers profit from them, they must assume the risks and losses they cause.”
The minister also defined a list of “particularly serious practices” which, according to the vote, should be promptly removed by the platforms, without the need for user notification or a court decision. In these exceptional cases, companies must remain vigilant and act on their own initiative to prevent the circulation of criminal publications, under penalty of liability. The vote also stipulates that fake profiles must be blocked from social networks.
See the list of content that may lead to platform liability even without extrajudicial notification:
– Crimes against the democratic rule of law;
– Acts of terrorism or preparatory to terrorism;
– Inducing, instigating or encouraging suicide or self-harm;
– Racism;
– Violence against children, adolescents and vulnerable people;
– Rape of women;
– Health violations during a national public health emergency;
– Human trafficking;
– Incitement or threat to acts of physical or sexual violence;
– Dissemination of facts “known to be false or seriously out of context” leading to incitement to physical violence, threats to life or acts of violence against groups or members of socially vulnerable groups;
– Disclosure of facts known to be false or out of context that could cause damage to the balance of elections or the integrity of the electoral process;
– Fake profiles.
Toffoli argues that while publications are created by users, they are often driven by platforms, which profit from engagement. A central argument of the vote is that platforms, providers and social networks are not neutral and influence the flow of information in their ecosystems.
“The content continues to come from third parties, because it was originally created and/or published by them, but by recommending or promoting it to an indefinite number of users, the provider ends up becoming co-responsible for its dissemination,” he said.
The minister distinguishes between different types of platforms according to the activities they carry out. Email providers, closed videoconferencing platforms, journalism platforms and blogs, for example, are exempt. Messaging applications are not liable for private conversations, but they can be held responsible for content posted in public groups and open channels.
Electronic commerce platforms, known as marketplaces, will also be subject to the new system if it is approved by a majority of the STF. They can be punished if they allow the advertising of products whose sale is prohibited or which lack certification and approval from the competent bodies, such as TV boxes prohibited by the National Telecommunications Agency (Anatel).
The proposals run counter to the interests of big tech. Lawyers for Facebook and Google presented arguments in favor of keeping the rules as they are. The companies consider it a “trap” to be held responsible for what users post. They predict that the change will create incentives for the automatic removal of controversial posts and, ultimately, prior censorship on social media. One concern is how to differentiate, in practice, what is merely objectionable from what is criminal, a distinction that goes beyond the contractual terms of use.
The trend is for the STF to change the current rules of the Marco Civil da Internet, expanding the obligations of big tech. The trial will resume next week.
Source: Terra
