Breaking
Fri, 18 Apr 2025

Meta clarifies changes for Brazil, keeping its drastic moderation overhaul restricted to the US

Meta, the parent company of platforms like Facebook, Instagram, and WhatsApp, recently sparked global discussion by announcing significant changes to its content moderation program in the United States. The termination of the Fact-Checking Program in the country raised concerns in various markets, including Brazil, about the potential impact on content moderation and on efforts to combat disinformation. To reassure Brazilian authorities, the company stated that the changes will, for now, be restricted to U.S. territory.

In a statement sent to the Office of the Attorney General of Brazil (AGU), Meta emphasized that it is testing a new feature called Community Notes exclusively in the US. This functionality allows users to add information and context to posts, resembling the model adopted by X (formerly Twitter). The company assured that any expansion to other countries would be carefully evaluated.

Despite these assurances, Brazilian institutions remain vigilant. The Federal Prosecutor’s Office (MPF) in São Paulo requested clarifications from Meta about the ramifications of these changes in Brazil, particularly concerning the fight against disinformation and the protection of digital rights. Furthermore, the Brazilian government reiterated the importance of digital sovereignty and the alignment of platforms with local laws.

Impact of changes on combating disinformation

The decision to end the Fact-Checking Program in the United States has sparked intense debates about its potential effects in other markets. Meta explained that the primary goal of Community Notes is to enhance content moderation by enabling greater user participation. However, experts have raised doubts about the effectiveness of this system in complex situations, such as electoral periods or health crises, where disinformation can spread rapidly.

Statistics reveal that platforms like Facebook and Instagram account for over 50% of digital news consumption in Brazil, according to 2023 reports. In a scenario where social networks play a central role in information dissemination, the absence of a robust fact-checking policy could amplify the reach of false or misleading content.

Response from the Brazilian government and authorities

The Brazilian government, through the AGU, organized a public hearing to discuss the implications of Meta’s changes in the country. During the event, representatives from the Ministry of Justice and the MPF emphasized the importance of maintaining clear moderation policies that comply with the Internet Civil Framework (Marco Civil da Internet). This legislation, in effect since 2014, sets guidelines for content removal and the accountability of digital platforms in cases of violations.

The AGU stressed that Brazil has solid legislation to protect citizens’ digital rights and will not accept setbacks that could compromise informational security. In response, Meta reiterated its commitment to equality, safety, and privacy, and pledged continued investment in technologies to identify severe violations, such as terrorism and drug trafficking.

Community Notes: solution or challenge?

Community Notes were designed to offer greater transparency on Meta’s platforms. They allow users to contribute additional information to posts, adding context and correcting potential errors. While this approach is innovative, it raises questions about its large-scale applicability.

Some key advantages include:

  • Active user participation in combating disinformation.
  • Greater diversity of perspectives on controversial content.
  • Potential reduction in censorship, promoting open debates.

However, critics point to significant challenges:

  1. Risk of manipulation by organized groups.
  2. Difficulty in verifying the credibility of contributions.
  3. Lack of guarantees about users’ impartiality.

Technology experts warn that to avoid distortions, rigorous monitoring and validation mechanisms for Community Notes will be essential.
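To make that concern concrete, the sketch below shows, in Python, one simplified validation rule of the kind experts describe: a note is only surfaced when raters from at least two distinct viewpoint groups independently find it helpful, so that a single coordinated group cannot push a note through on its own. The data structures, thresholds, and group labels are hypothetical illustrations; neither Meta nor X has confirmed that its system works this way.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Rating:
        rater_group: str   # hypothetical viewpoint cluster assigned to the rater
        helpful: bool      # whether the rater marked the note as helpful

    def should_publish(ratings: list[Rating],
                       min_ratings_per_group: int = 5,
                       min_helpful_share: float = 0.7,
                       min_groups: int = 2) -> bool:
        """Toy bridging-style rule: surface a note only if raters from at least
        `min_groups` distinct viewpoint groups each rated it helpful often enough.
        This is an illustration, not Meta's or X's actual algorithm."""
        by_group = defaultdict(list)
        for r in ratings:
            by_group[r.rater_group].append(r.helpful)

        groups_in_agreement = 0
        for votes in by_group.values():
            if len(votes) >= min_ratings_per_group and \
               sum(votes) / len(votes) >= min_helpful_share:
                groups_in_agreement += 1

        return groups_in_agreement >= min_groups

    # Example: a note rated helpful by two otherwise-divergent groups is shown.
    ratings = [Rating("group_a", True)] * 6 + [Rating("group_b", True)] * 5 \
            + [Rating("group_b", False)]
    print(should_publish(ratings))  # True under these toy thresholds

Even under a rule like this, the challenges listed above remain: organized groups can attempt to mimic several viewpoint clusters at once, which is why experts insist on continuous monitoring of the rating data itself.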

Data and history of Meta’s moderation policies

Since its inception, Meta has faced pressure to balance freedom of expression with the responsibility to moderate harmful content. In 2020, the company invested over $1 billion in initiatives to combat disinformation, including partnerships with fact-checking agencies. In Brazil, these efforts were intensified during the 2022 general elections, when platforms removed thousands of posts deemed misleading.

Recent data indicates that more than 80% of Brazilians believe social networks should be responsible for controlling the spread of false information. However, implementing moderation policies often encounters legal and cultural challenges specific to each country, requiring tailored approaches.

Importance of the Internet Civil Framework

The Internet Civil Framework is one of the pillars of digital regulation in Brazil, providing clear guidelines on privacy, freedom of expression, and platform responsibility. It requires companies like Meta to notify users before removing content and obtain judicial orders in cases involving constitutional rights.

This legislation has been crucial in ensuring that changes in global companies’ moderation policies do not harm Brazilian citizens’ rights. The AGU emphasized that it will continue to closely monitor Meta’s actions, demanding transparency and ongoing dialogue.

The future of moderation policies

While Meta has assured that current changes will not be implemented in Brazil, the possibility of future expansions is not ruled out. To protect digital integrity, it is essential for governments and civil society to maintain active discussions about the limits and responsibilities of platforms.

Studies conducted in 2023 show that 62% of Brazilians use social networks as their primary source of information. This underscores the need for effective mechanisms to prevent the spread of disinformation, especially in sensitive contexts like election campaigns and public health crises.

Global concerns and international examples

In addition to Brazil, other countries have expressed concerns about the changes announced by Meta. In the European Union, the European Commission has already launched investigations to assess whether the company’s new policies violate local regulations on disinformation and hate speech.

In the United States, where the fact-checking program was discontinued, digital rights organizations warn of the risk of proliferating misleading content. On the other hand, freedom of expression advocates believe that the decentralized model of Community Notes could be a viable alternative to balance these interests.

Partial conclusion and continued monitoring

Although Meta has taken measures to reassure Brazilian authorities, the debate over content moderation and disinformation is far from over. Brazilian society, along with its government and institutions, will continue to closely follow the company’s actions, ensuring that any changes respect democratic principles and fundamental rights.


