
Facebook launches campaign to combat child exploitation online in PH, Asia Pacific

Social media giant Facebook is tightening the noose on distributors of content involving online sexual exploitation of children (OSEC) and child sexual abuse material (CSAM) in the Philippines and other nations in the Asia Pacific region.

On Wednesday, June 30, 2021, Facebook launched #ReportItDontShareIt, a targeted campaign in the Asia Pacific to eradicate the sexual exploitation and abuse of children online.

“As Facebook, our primary goal is to make sure people are safe,” Facebook safety policy manager for Asia Pacific Malina Enlund said during the launch. “We really wanted to create interventions that matter, interventions that make a difference, and build on the existing work we have already been doing.”

The public safety campaign aims to raise awareness of how the public can help prevent the online abuse of children by reporting CSAM both to Facebook and to local law enforcement agencies.

Locally, Facebook has partnered with the government’s Inter-Agency Council Against Trafficking (IACAT) and the Inter-Agency Council Against Child Pornography (IACACP), as well as NGOs like Stairway Foundation and Child Rights Network to roll out the campaign.

The social media giant kicked off the campaign with an animated video that stresses the impact of CSAM on victimized children. The video specifically encourages the public to report CSAM on Facebook’s platform through in-app reporting channels and by contacting local authorities.

Another major thrust of the campaign is educating users on the harmful consequences of sharing CSAM. Recent findings from research begun by Facebook and the US-based National Center for Missing & Exploited Children (NCMEC) in 2020 revealed that 90 percent of the CSAM reported on the platform consists of shares or reshares of previously reported content.

In addition, the study showed that more than 75 percent of shares were not meant to be malicious or to harm a child. More often than not, CSAM was shared for other reasons such as outrage at the content or poor humor. The intent, however, does not change the fact that sharing CSAM is illegal and can even harm the child further.

“We’ve always told those who we work with via our child protection trainings that in the process of manifesting our outrage or concern towards the existence of a certain child sexual abuse material, that there will always be a child involved in the said image. We always have to think about the best interest of the child,” recounted Ace Diloy, senior advocacy officer at Stairway Foundation.

“Once CSAM is produced and distributed, the impact on the children involved becomes much, much longer and deeper, as aside from the contact sexual abuse, they would have to deal with the idea and possibility that their images might be out there forever, being downloaded, streamed, and viewed countless times.”

Beyond educational campaigns, Facebook is deploying other measures to stamp out CSAM and support children. The social media platform is partnering with organizations like Tech Matters and NCMEC to streamline its reporting process so that unique CSAM is prioritized, victims are spared further harm, and offenders can be identified.

Facebook’s in-app search feature also has a pop-up that appears to people who search using terms associated with child exploitation. The pop-up contains offender diversion resources from child protection organizations and shares information about the consequences of viewing illegal content.

An additional measure is a safety alert for users who attempt to share viral memes incorporating CSAM. The alert informs these users of the harm the content causes the victim, warns that such memes violate Facebook’s policies, and notes that there are legal consequences for sharing the material.

Moreover, by employing technology such as artificial intelligence, Facebook removes 98.9 percent of CSAM before it reaches the platform. The less than 2 percent that slips through its systems, however, requires a collective effort to eliminate, the company said.

“We need all of us to work together because child safety should be and for us, is the main priority,” Enlund said. “It is vital for all of us to get that message across, that we require everyone. We require people to report, we require children to be supported by people, we require all of us to stand together and make sure that kids have a safe and healthy and joyful experience offline and online.”
