Threads can remove or lower the visibility of a post or a profile if it violates its community standards.
If users think Threads has made a mistake, they can submit a request to have the content reviewed.
Threads on Friday rolled out a new feature to help users better understand the status of their posts and profile. Account Status, which was previously only available on Instagram, has been brought to the Meta-owned microblogging platform. With this feature, users will be able to see when their Threads posts are removed or demoted for violating its community standards. If they think that their post's removal or demotion is not justified, Threads also provides an option to request a review.
Threads detailed the new Account Status feature in a post. It is aimed at helping users keep track of the current status of their posts and profile directly within the app. They can see any actions that have been taken on posts or replies that violate the microblogging platform's community standards. Users can check their account status by heading over to Settings > Account > Account status.
As per the Meta-owned platform, a total of four actions can be taken on any post or profile: removed content, what can't be recommended, content lowered in feed, and features you can't use. This means Threads might not just remove content altogether, but also stop posts from being recommended, lower a post or profile's visibility, and even limit some of the features available to users.
If users think that the moderation process has made a mistake, they can submit a request to have the post or the profile reviewed. Threads will alert them via a notification once the submitted request has been reviewed. As per the company, its community standards apply to everyone globally and to all types of content, including content generated with artificial intelligence (AI).
The company emphasises that while expression is paramount, it can limit expression in the service of factors such as authenticity, dignity, privacy, and safety. The microblogging platform can allow content that would otherwise go against its community standards if it is in the public interest or newsworthy. That is done by weighing its value against the risk of harm, keeping international human rights standards in mind. Meanwhile, there are cases where it may remove content featuring ambiguous or implicit language when additional context makes the violation clear.