As part of its commitment to battling child sexual exploitation (CSE) online, X will hire 100 people for its content-moderation team. In addition to CSE, the moderators will handle matters related to fraud and spam, and offer relevant support to customers.
The company has been criticised for the way it handles explicit content and has been under scrutiny for some time now.
Linda Yaccarino, CEO of X, is expected to testify before the Senate Judiciary Committee on 31 January, where she will explain how X handles child sexual exploitation material (CSEM). Moving forward, X plans to adopt a ‘zero tolerance’ policy for such content.
The newly formed moderation team will handle CSEM and also moderate hate speech, according to job listings for these roles.
Going forward, X will improve its mechanisms for detecting and identifying inappropriate and reportable content, and will partner with the National Center for Missing and Exploited Children (NCMEC).
In 2023, X suspended 12.4 million accounts for violating its CSE policy, up from about 2.3 million suspensions the year before.
A few years ago, Facebook raised the pay of its content moderators. The minimum wage was increased after a content reviewer filed a lawsuit against Facebook, alleging that the work had caused her mental trauma.