TikTok says it is testing ways to avoid pushing too much content on a single topic, such as extreme dieting, sadness or breakups, to individual users.
Popular video-sharing app TikTok said it would adjust its recommendation algorithm to avoid showing users too much of the same content, as social-media platforms globally come under scrutiny for their potential harm to younger users.
TikTok said on Thursday that it is testing ways to avoid pushing too much content on a single topic, such as extreme dieting, sadness or breakups, to individual users, in order to protect their mental well-being.
The buzzy app, whose monthly user numbers surpassed 1 billion in September, said it was taking such measures to protect against users “viewing too much of a content category that may be fine as a single video but problematic in clusters.”
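The idea of diluting "problematic clusters" can be illustrated with a minimal sketch: re-order a ranked feed so that no more than a fixed number of consecutive videos share a topic. This is a hypothetical illustration, not TikTok's actual (proprietary) algorithm; the `topic` field and `max_run` parameter are assumptions for the example.

```python
def disperse_feed(videos, max_run=2):
    """Re-order a ranked feed so at most `max_run` consecutive videos
    share a topic. Hypothetical sketch, not TikTok's real system."""
    out = []
    pending = list(videos)
    while pending:
        for i, v in enumerate(pending):
            tail = out[-max_run:]
            # Accept the video unless the last `max_run` picks all share its topic.
            if len(tail) < max_run or any(x["topic"] != v["topic"] for x in tail):
                out.append(pending.pop(i))
                break
        else:
            # Only same-topic videos remain; append in original order.
            out.append(pending.pop(0))
    return out


# Three back-to-back dieting videos get broken up by the cooking video.
feed = [
    {"id": 1, "topic": "diet"},
    {"id": 2, "topic": "diet"},
    {"id": 3, "topic": "diet"},
    {"id": 4, "topic": "cooking"},
]
print([v["topic"] for v in disperse_feed(feed)])
# → ['diet', 'diet', 'cooking', 'diet']
```

The sketch preserves the original ranking as much as possible and only swaps a later video forward when a run would exceed the limit.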
A Wall Street Journal investigation found that TikTok needs only one important piece of information to figure out what you want: the amount of time you linger over a piece of content. Every second you hesitate or rewatch, the app is tracking you.
TikTok, owned by Chinese technology giant ByteDance Ltd., serves up content ranging from viral dance videos to short cooking demonstrations and is wildly popular in the U.S., where it shot to fame during the early days of the pandemic when many Americans were locked down at home. Since then, U.S. policy makers and their global counterparts have been scrutinizing TikTok and its peers, particularly Meta Platforms Inc.’s Instagram, over data-privacy concerns and the possible psychological damage these platforms may cause to younger users.
In September, The Wall Street Journal published an investigation that illustrated how TikTok’s algorithm could push young users into a rabbit hole of content about sex and drugs when they browsed the app’s For You feed, the highly personalized home page that serves up an endless stream of content when a user first opens the app.
TikTok also said Thursday that it would give users more control over which videos they see. One of the measures the app is working on is a feature that would let users pick words or hashtags associated with content they don’t wish to see on their video feed.
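The muting feature described above amounts to a keyword filter over each video's metadata. Here is a minimal sketch of how such a filter could work; the `description` and `hashtags` fields and the normalization rules are assumptions for illustration, not TikTok's actual implementation.

```python
def filter_feed(videos, muted_terms):
    """Drop videos whose description or hashtags contain a muted term.
    Hypothetical sketch of a user-chosen keyword/hashtag mute filter."""
    muted = {t.lower().lstrip("#") for t in muted_terms}
    kept = []
    for v in videos:
        words = {w.strip("#.,!?").lower() for w in v["description"].split()}
        tags = {t.lower().lstrip("#") for t in v.get("hashtags", [])}
        # Skip the video if any muted term matches a hashtag or a description word.
        if (muted & tags) or (muted & words):
            continue
        kept.append(v)
    return kept


feed = [
    {"description": "my extreme dieting journey", "hashtags": ["#diet"]},
    {"description": "easy pasta recipe", "hashtags": ["#cooking"]},
]
print(len(filter_feed(feed, ["#dieting"])))
# → 1  (only the cooking video survives)
```

A production filter would likely also handle stemming and multi-word phrases, but the core idea is a set intersection between user-muted terms and per-video metadata.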
Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.