When Facebook’s Mark Zuckerberg outlined plans to merge WhatsApp, Instagram, and Facebook’s flagship platform into a small-group-focused network, it immediately drew comparisons to WeChat.
“With all the ways people also want to interact privately, there’s also an opportunity to build a simpler platform that’s focused on privacy first,” Zuckerberg wrote in a post titled “A Privacy-Focused Vision for Social Networking.” He claims that in addition to a digital “town square,” people “also want to connect in the digital equivalent of a living room.” Facebook is trying to address concerns about privacy and misinformation on the platform—but small groups do nothing to improve the quality of information users share.
Just look at WeChat to see small groups in action. Allen Zhang, the founder of WeChat, built most of its multitude of functions out of intuitions about how groups on the platform work. Zhang, a user-experience design obsessive, recently gave a four-hour speech on his design philosophy, in which he talked about cultivating a space as friendly and familiar to users as “an old friend.”
WeChat has a misinformation problem of its own: friend-sourced content isn’t necessarily factual, objective, or higher quality. Zuckerberg’s assumption that a digital living room will stop misinformation is misleading. This change will simply hide some of the present information problems, as WeChat’s own problems show.
Neither Zhang nor Zuckerberg seems to understand the political and cultural spaces that they’ve created. Neither Facebook nor Tencent is a government agency—but both corporations handle information, data, and decisions that are increasingly political.
Lies in living rooms
According to a report by the Tow Center, 79% of US-based WeChat users use it as a primary source for political information. In political campaigns with candidates of Chinese descent, WeChat has also become a political clearinghouse that can foster “digital intimacy” between candidate and voter; candidates can personally field voters’ policy questions within politics-related group chats. Most of these fast-paced, strongly worded conversations are difficult to find and track without group membership.
Chinese government regulators censor all WeChat data. But even after content goes through censorship filters, information on WeChat remains siloed within groups.
WeChat’s most active communications channels are person-to-person and group chats, as opposed to more public-facing walls that fueled Facebook’s rise.
Within these groups, members pass along viral articles, allowing rumors to spread just as quickly as in a Facebook news feed. For example, a news item about a driver with an expired visa fatally hitting a Chinese jogger quickly turned into the misleading headline “Kill a Chinese person, get a green card.”
Given WeChat’s current content standards, stories like the traffic accident headline wouldn’t necessarily register as a high-priority information problem. WeChat’s moderation prioritizes content deemed politically sensitive to Chinese authorities, along with sensitive current events, including critical excerpts of U.S. lawmaker speeches.
Self-appointed fact-checkers have emerged to challenge falsehoods on WeChat, but despite their efforts, chats are fast-moving and tend to lump together reliable and unreliable sources. Because group membership is the only way to truly locate the memes, false health advertisements, and hate speech in circulation, the few fact-check reports produced by volunteers must themselves be forwarded into groups to counterbalance bad information in the first place.
Further, since private information ecosystems are invisible to non-members, it’s almost impossible to know exactly how much damage misinformation campaigns have done. For media organizations and government entities already struggling to combat bad information, false information campaigns previously waged in publicly observable spaces will go into hiding.
Moderation tools within WeChat groups remain limited as well, and group admins have no control over what information is shared. Instead, they are given only a tool to boot troublesome users, along with responsibility for all communications within the group.
Lessons for Facebook
WeChat-like group chat won’t solve the problems that are plaguing Zuckerberg in the digital town square. With a souped-up group interface, it will be even more of a challenge to pinpoint and manage the spread of hate speech and curb foreign political influence on the United States.
While having users engage with smaller peer groups sounds appealing, WeChat’s current iteration of the “digital living room” demonstrates that misinformation can thrive in smaller environments as well. Without considering the myriad misinformation problems that group-based platforms cannot easily resolve, Zuckerberg risks simply sweeping bad information under the rug. There, it would sit untraceable until the next time hoax-inspired violence resurfaces.
Facebook’s suite of messaging apps matters to more nationalities than WeChat, which still primarily serves mainland Chinese at home and overseas. Facebook’s handling of misinformation overseas has been clumsy: insufficiently answered questions linger about its role in anti-Rohingya violence in Myanmar.
Zuckerberg’s proposed changes to Facebook’s messaging tool kit must learn from the reality of WeChat—or risk fanning the flames of future upheavals. Lofty mission statements about connecting people aside, both Facebook and WeChat should remember that their design decisions affect the news intake, security, and wellbeing of billions.