A probe by the news agency Reuters indicates that the network is still being used to encourage violence against the Rohingya Muslim minority in Myanmar, despite the tech firm's promise to tackle the issue.
It said some of the material had been online for six years.
Facebook's rules prohibit "violent or dehumanising" attacks on ethnic groups.
However, the US-based firm mostly relies on users to flag offending posts rather than seeking them out itself, in part because its software has not had enough training to reliably interpret Burmese text.
The BBC understands Facebook has now removed all the flagged material.
The investigation was carried out in conjunction with the Human Rights Center at the UC Berkeley School of Law.
It is likely to add to pressure on the tech company to invest more resources into tackling the problem. Its efforts have previously been criticised by the United Nations as well as politicians in the US and UK.
About 700,000 Rohingya Muslims have left Myanmar since 2017, many of whom now live in refugee camps in Bangladesh. They have reported that Burmese soldiers and vigilantes murdered and raped members of their community and burned their homes.
Myanmar's military says it is fighting Rohingya militants and denies targeting civilians in Rakhine state.
Genocide posts
Reuters said that most of the anti-Rohingya comments, images and videos it had discovered were in the Burmese language.
It said they included:
- calls for Rohingya citizens to be shot, set on fire and fed to pigs
- suggestions of genocide - one person wrote: "We must fight them the way Hitler did the Jews"
- pornographic anti-Muslim images
- descriptions of the group as dogs, maggots and rapists
In a statement, Facebook acknowledged that it had originally been slow to spot the problem of hate speech spreading "in countries like Myanmar, where many people are using the internet for the first time".
"We're now working hard to ensure we're doing all we can to prevent the spread of misinformation and hate," it added.
"In the last year, we have established a team of product, policy and operations experts to roll out better reporting tools, a new policy to tackle misinformation that has the potential to contribute to offline harm, faster response times on reported content, and improved proactive detection of hate speech."
Reuters reported that Facebook had outsourced most of its Myanmar-related moderation to an outside firm, Accenture, which as of June had about 60 people reviewing reports of hate speech from the country.
About 50 million people live in Myanmar, of whom about 18 million regularly use the social network.
'Terrible tragedy'
In March, UN investigators said that the use of Facebook had played a "determining role" in stirring up hatred against Rohingya Muslims in Myanmar.
The following month, Facebook chief executive Mark Zuckerberg was questioned by US senators about the social network's involvement in related attacks on the ethnic minority.
At the time, he acknowledged that his firm needed to do more, describing what was happening in the country as "a terrible tragedy".
He listed three specific things Facebook was doing:
- hiring dozens more Burmese-language content reviewers
- working with civil society in Myanmar to identify specific hate figures
- making product changes in Myanmar to prevent similar issues in future
When the UK Parliament's Digital, Culture, Media and Sport (DCMS) Committee conducted its own inquiry into fake news, it was not impressed with the answers given by Facebook's chief technology officer, Mike Schroepfer, about efforts to tackle hate speech in Myanmar.
He was unable to say how many fake accounts had been identified and removed in the country or how much money the firm had made there.
In its report, the committee said that the social network had failed to demonstrate "that it has done anything to stop the spread of disinformation against the Rohingya minority".