Editor’s note: This op-ed was contributed by Elliott Zaagman, a trainer, coach, and change management consultant who specializes in aiding Chinese companies as they globalize. To contact him, check him out on LinkedIn, or add ezaagman on WeChat.
Updated on 07 Feb 2018 to include information about Elliott’s previous relationship with the TopBuzz platform as well as his attempts to get official statements from the company.
Over the last year, few Chinese tech firms have made a bigger splash than news aggregator Jinri Toutiao (今日头条). Leaping from relative obscurity just a few short years ago, the app has become a magnet for Chinese eyeballs, with over 120 million daily active users, who spend an average of 74 minutes per day on the platform.
What my Chinese friends (and Toutiao’s investors) seem to like so much about it is the use of machine learning to recommend content to its users based on their preferences and habits. Since the majority of the media I consume is in English, I hadn’t had much of an opportunity to see how its algorithms work. That changed recently when I downloaded TopBuzz, the English-language app of Jinri Toutiao’s parent company, Bytedance. While TopBuzz’s user interface is a bit different, it relies on the same core back-end architecture as its Chinese sibling to deliver a tailored content stream to its users.
This past autumn, I gave TopBuzz a try. I opened an account as a content creator and republished some articles I had previously run on other outlets. After posting a couple of articles, I was turned off from publishing on the platform, mostly because I found much of its content to be click-bait, with little thought or substantive value behind it. I did not think it was the right platform for me.
However, the platform drew my attention again a few months later. On December 12th, in a tightly-contested special election for an Alabama Senate seat, Democrat Doug Jones defeated Republican Roy Moore. A few hours after the major news networks declared that Jones had won, I received this article in a push notification from TopBuzz:
While every credible news outlet in the world was reporting Jones’ victory, TopBuzz was sending me updates from little-known far-right-wing news site One America News Network.
I also get conspiracy-theory-style news from TopBuzz.
A couple weeks ago, I received this headline in a push notification, saying that former Beatle John Lennon’s wife Yoko Ono was claiming to have had a lesbian affair with Hillary Clinton in the 1970s:
This headline is also objectively false. According to fact-checking website Snopes.com, the story originated in 2015 on a site which produces fabricated hoax stories for the purpose of generating click traffic. There is no evidence to support any claim that such an affair occurred, nor that Ono ever claimed that it did.
The fake news seems to be bipartisan as well, as I received a headline earlier this month reporting a scandal involving Republican Arizona Senator John McCain:
For this one, it seemed as though the source of the article was not even a news organization or a hoax site, but just a user named “haitim738653.” I searched online and could not find any reference to what scandal he was referring to. Nevertheless, TopBuzz put it at the top of my feed.
I even received another one just today, claiming that actress Julia Roberts made a crude remark in comparing former first lady Michelle Obama and the current one, Melania Trump:
This claim is also false. Julia Roberts never made such a comment, according to both fact-checking websites Politifact.com and Snopes.com. In fact, Roberts has been a vocal supporter of the Obamas in the past, publicly fundraising for Barack Obama’s 2012 presidential campaign.
Stories like this seem to appear on my TopBuzz feed every day. Even when the story is factually true, the sensational click-bait nature of the headlines tends to be misleading, and the content itself is usually from far-right or far-left-wing fringe websites.
Bytedance’s global ambitions
TopBuzz is one of a series of apps in Bytedance’s portfolio that it hopes will power its global expansion. In November, it announced the acquisition of video app Musical.ly, which has a strong user base in North America and Europe, for an estimated $800 million. That same month, it also acquired global news aggregator News Republic. In Japan, Korea, and southeast Asia, the company has had some success with its short video app TikTok, which it launched in September. TopBuzz seems to be primarily focused on the North American and European markets.
While Bytedance is using different apps for different media and markets, they all operate with the same China-based back-end architecture. “We have different products that cover different regions, but the back-end recommendation engine is the same,” explained Bytedance Senior VP Liu Zhen on the 996 Podcast. “All the engineers and product people are based in China.”
Bytedance aims to do two things that have proven challenging for Chinese tech firms: succeed by delivering content to overseas, non-Chinese users, and develop and manage apps that catch on in high- and middle-income foreign markets.
Silicon Valley sea change
As Bytedance has laid plans for global domination, one must wonder how their product and ambition will mesh with the shifts that have taken place in the West and Silicon Valley, in particular, over the past year.
America’s tech giants, which have gained the trust of users in no small part due to their ability to project themselves as friendly, democratic “forces for good,” have recently come under greater public scrutiny. Most notable among them has been Facebook, which came under fire in 2017 after it revealed that it had received approximately $100,000 during the US election to publish Russia-linked ads aimed at “amplifying divisive social and political messages across the ideological spectrum.”
While there does not seem to be an expert consensus on the extent to which Russian Facebook content influenced the election, one thing is clear: Politics in the US, and many other democracies around the world, are more divided than they have been in decades. This is in no small part due to the algorithms that social media platforms use, which create information “echo chambers.”
This effect was chronicled in a June 2015 study published in the journal Science entitled “Exposure to ideologically diverse news and opinion on Facebook.” Let’s say person A and person B both live in the US. Person A tends to vote Democrat more often, while person B tends to vote Republican. Person A starts to click on stories from Democrat-leaning news outlets, while person B clicks on more Republican-leaning stories. Recognizing their clicking tendencies, Facebook’s algorithm tries to curate content that is aligned with their habits and begins to suggest more and more stories that confirm, rather than challenge, their existing worldviews.
Eventually, the recursive effect of the algorithm’s interactions with the individuals creates online universes where people not only receive different opinions in their news but different facts altogether. The effect of this has been a further polarization of the populace, as this Pew Research graph indicates:
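The feedback loop described above can be sketched as a toy simulation. This is a hypothetical model of my own, not Facebook’s or Bytedance’s actual system: a recommender that optimizes only for clicks, starting from a balanced feed, drifts toward whichever leaning the user already favors.

```python
import random

random.seed(42)

def simulate_feed(user_lean="left", rounds=1000):
    """Toy click-driven recommender: shows a story category in
    proportion to past click counts, so every click reinforces
    the category the user already prefers."""
    clicks = {"left": 1, "right": 1}  # smoothed click counts
    shown = {"left": 0, "right": 0}
    for _ in range(rounds):
        total = clicks["left"] + clicks["right"]
        # Recommend each category with probability proportional to past clicks
        category = "left" if random.random() < clicks["left"] / total else "right"
        shown[category] += 1
        # The user clicks aligned stories 80% of the time, others 20%
        p_click = 0.8 if category == user_lean else 0.2
        if random.random() < p_click:
            clicks[category] += 1
    return shown

shown = simulate_feed("left")
share = shown["left"] / (shown["left"] + shown["right"])
print(f"Share of left-leaning stories shown: {share:.0%}")
```

Even though the simulated user still clicks a fifth of the opposing stories they see, the click-proportional recommender ends up showing them a feed dominated by aligned content, with no step in the loop that ever pushes back toward balance.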
In democratic societies where governance is based on the ability to have a rational debate and reach a compromise, this effect is downright dangerous.
Last December, when speaking to an audience at the Stanford Graduate School of Business, former Facebook VP of user growth Chamath Palihapitiya used no uncertain terms when expressing “tremendous guilt” for the company he helped create, saying “I think we have created tools that are ripping apart the social fabric of how society works.”
“The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” he continued, referring to the system of “clicks, likes and thumbs-up” on which the platforms of Facebook and Toutiao are based. “No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.” He also described an incident in India where hoax messages about kidnappings shared on WhatsApp led to the lynching of seven innocent people.
Palihapitiya is just one of many tech pioneers who are beginning to re-examine just what impact their products have made on society. In November, Napster founder and early Facebook investor Sean Parker spoke publicly about becoming a “conscientious objector” to social media, saying that Facebook and other social media platforms were succeeding by “exploiting a vulnerability in human psychology.”
This has prompted some serious soul-searching at Facebook. In February 2017, Mark Zuckerberg published a nearly 6,000-word manifesto, in which he spoke in detail about his vision for the platform as a force for social good around the world and the changes that need to be made in order to ensure that is the case. In January 2017, Facebook launched the Facebook Journalism Project, a comprehensive program designed to filter out fake and hoax stories, train and educate journalists and publishers, and improve news literacy among readers. In October, they announced new media guidelines designed to stop fake news and introduced a “more information” button that would allow readers to check the credibility of the publisher before opening a link. Earlier this year, Facebook stock dropped 4 percent after Zuckerberg announced that the platform would be moving away from its content and ad-based model to emphasize more interaction between friends and family members.
But what is Bytedance doing?
As Silicon Valley and Facebook regroup from the backlash over the social problems they have helped create, it is difficult to see what impact Bytedance envisions making. When I recently asked about its policy for managing fake news and its stance on the societal impact of social media, Bytedance PR declined to comment on the record.
I reported the fake news problem to the TopBuzz team. I let them know that, while I did not immediately plan to write an article about it, I was researching a piece on Bytedance’s globalization process and was paying close attention to the TopBuzz platform, including the fake news issue, and that if it proved a consistent theme, it would be worth writing about.
To be fair, Bytedance does seem to acknowledge that social responsibility is something that they at least should reference publicly, but it is difficult to see exactly where it falls in the company’s hierarchy of priorities.
In December, they hosted their first “Global Festival for AI Ideas”, at which founder and CEO Zhang Yiming made the following statement:
“As AI becomes an increasingly integral part of our society, Bytedance believes that we – and our industry peers – have a duty to ensure that we understand and can anticipate the social impact of these new technologies, and manage this impact responsibly. We are delighted to provide this platform where some of the greatest minds and most influential people in our industry can come together to exchange ideas about the future of AI in our society, and drive the best ideas forward.”
The challenge of managing fake news is something they at least seem to recognize, and they have publicized how they are using their technology to help find missing children in China. Their AI and methods for content monetization have also proven useful in helping small creators find audiences and convert their followings into income. They have also partnered with Tsinghua University School of Public Administration to establish the Innovation and Governance Center, which focuses on researching the social impact of, and response to, new technologies, and have formed something called the Bytedance Strategic Technology Committee to provide a sustained platform for cross-industry discussion of critical challenges at the intersection of AI and society.
At the same December event at which Zhang spoke, the head of Bytedance’s AI lab Dr. Ma Wei-Ying addressed foreign journalists and spoke of how the company is developing techniques to train its AI to recognize and remove fake news. According to Ma, the platform also lets users report what they believe to be fake news and analyzes comments to detect whether they suggest the content might be fake. When the system identifies fake content that has been posted on its platform, it will notify everyone who read it that what they read was fake.
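Ma did not describe the mechanics of the comment-analysis signal, but the basic idea can be illustrated with a deliberately simple sketch. The phrase list, threshold, and function below are my own invention for illustration, not Bytedance’s actual system, which would presumably use trained classifiers rather than keyword matching.

```python
# Toy comment-based fake-news flag: count how many comments contain
# skeptical phrases and flag the story if that share crosses a threshold.
SKEPTICAL_PHRASES = ("fake", "hoax", "not true", "debunked", "clickbait")

def looks_fake(comments, threshold=0.3):
    """Return True if at least `threshold` of comments sound skeptical."""
    if not comments:
        return False
    flagged = sum(
        any(phrase in comment.lower() for phrase in SKEPTICAL_PHRASES)
        for comment in comments
    )
    return flagged / len(comments) >= threshold

comments = [
    "This is a hoax, Snopes debunked it",
    "Wow, can't believe this!",
    "FAKE news, not true at all",
]
print(looks_fake(comments))  # 2 of 3 comments are skeptical -> True
```

Even a crude signal like this would have flagged the stories described above, whose comment sections were full of readers pointing out the hoax, which is what makes the absence of any correction notifications notable.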
Despite Ma’s claims, this is far from my experience with TopBuzz. Although I receive news that is verifiably fake on a near-daily basis, often in the form of push notifications, I have never once received a notification from the app informing me that Roy Moore is in fact not the new junior senator from Alabama, or that Hillary Clinton was actually not Yoko Ono’s sidepiece when she was married to John Lennon.
Actually, on the very morning of this piece’s publication, I learned from TopBuzz’s “Politics” and “Science” channels that Barack Obama travels with a demon in his entourage and that NASA is hiding a secret alien base on the dark side of the moon:
But what is perhaps more concerning is the very nature of the platform, its algorithm, and its incentive structure. Bytedance’s core competency is its AI-based system for content creation, curation, and dissemination, which incentivizes content creators through financial rewards often based on clicks, and curates the content according to the preferences of each individual reader.
In other words, it seems as though Bytedance is basing its entire business model and global expansion on a turbo-charged version of the tools that Chamath Palihapitiya says are “ripping apart how society works.” What Facebook is moving away from, Bytedance is diving into head-first.
In China, Bytedance censors their platform in response to pressure from authorities. In December 2017, the Beijing bureau of China’s top regulator accused Toutiao of “spreading pornographic and vulgar information” and “causing a negative impact on public opinion online,” and ordered the temporary suspension of some popular sections of the app. In response, Bytedance took down or suspended the accounts of more than 1,100 bloggers that it said had been publishing “low-quality content” on the app. It also replaced its “Society” section with one called “New Era,” which primarily pushes state media coverage of government decisions.
Outside of China, in Bytedance’s target overseas markets like North America and Europe, there is far less pressure from government regulators, but the consequences of the media the public consumes are arguably greater. After all, these people are voters. The information they receive has a direct impact on how their societies govern themselves, and social media has played a critical role in hampering the effectiveness of their democratic systems. The world does not need more “curated” information, it needs accurate information.
The globalization push of many Chinese firms is inspiring: What they offer can potentially meet genuine global needs. Mobike and Ofo are helping to ease traffic congestion, lessen pollution, and provide affordable transportation. Alibaba and JD are introducing pathways for foreign businesses to access Chinese consumers. Many of the state-owned firms are developing infrastructure which can spur economic development in underdeveloped regions of the world. But what exactly does Bytedance offer?
Democracy and civil discourse are in rough shape these days, suffering from an unhealthy diet of inaccurate, misleading, and heavily biased information fed to citizens through their social media platforms. As Bytedance expands globally, its answer to this global “information health crisis” seems to be a bigger, greasier cheeseburger.
TechNode does not necessarily endorse the statements made in this article.