Short video app TikTok has formed a group of outside experts to advise on its content-moderation policies, it said on Wednesday, the latest in a series of steps it has taken to address data security and content censorship concerns in the US.

Why it matters: Content moderation has become an increasingly pressing problem for social media platforms including Twitter, Facebook, and Google's YouTube. Coronavirus-related misinformation is rampant on the internet, and a US presidential election, perhaps ground zero for the phenomenon, is approaching.

  • TikTok, owned by Beijing-based startup Bytedance, is drawing particular scrutiny from US lawmakers concerned that the company may transfer personal data belonging to its US users to the Chinese government and censor content on the platform to please Beijing.

Details: The group, which the company calls a content advisory council, will provide “unvarnished views” and advice around its content-moderation policies and practices, TikTok said in a statement on Wednesday.

  • The council chair is Dawn Nunziato, a professor at George Washington University Law School who specializes in the areas of internet law, free speech, and digital copyright.
  • Other members include Hany Farid, a renowned deepfake expert; Dan Schnur, a political strategist; a social worker who specializes in social media and youth mental health; and the head of a technology think tank.
  • “I am working with TikTok because they’ve shown that they take content moderation seriously, are open to feedback, and understand the importance of this area both for their community and for the future of healthy public discourse,” Nunziato said in the statement.
  • The seven-member committee will meet at the end of March to discuss topics around platform integrity, including policies against misinformation and election interference, the company said.

“It’s clear that the social media sector has attracted a great deal of interest and potential regulatory oversight in recent years from a number of US government entities. I have been impressed by TikTok’s efforts to voluntarily address these types of concerns, not for the purpose of avoiding such scrutiny but in order to establish itself as a cooperative partner in an effort to achieve these goals for the benefit of consumers and society.”

—Dan Schnur in an email to TechNode

Context: TikTok announced last week that it plans to open a content moderation transparency center in its US office to show outside experts how the app moderates content on the platform.

  • The Guardian reported in September, citing leaked documents detailing the platform's guidelines, that TikTok instructed its moderators to censor videos deemed politically sensitive by the Chinese government. The company said in November that the guidelines had been retired in May.
  • A US national security panel launched a review in November of Bytedance's $1 billion acquisition of Musical.ly, the predecessor of TikTok, in 2017. Experts say the review may force Bytedance to sell TikTok to a US company.

Updated to include comments from Dan Schnur.
