Reddit's Latest Privacy Update Lets Users Go Dark—Moderators Push Back
Some Redditors fear it could lead to more abusive activities on the platform

Reddit has introduced a major new privacy feature, allowing users to control how their content appears across the platform—a move that has sparked both praise and concern among its online community.
Reddit Gives Users Greater Control Over Profile Visibility
On Tuesday, 3 June 2025, Reddit Inc. announced an upgrade to its profile settings that gives Redditors greater autonomy over what content is displayed publicly.
'We're rolling out updates to profile settings, making it easier for Redditors to manage their presence and control what appears on their Reddit profile,' the company stated.
While the update is intended to enhance user privacy and customisation, some worry that it may be exploited by malicious users.
How Reddit's New Privacy Feature Works
According to 9to5Mac, the new feature lets users choose how their activity appears in different subreddits. Redditors now have three content visibility options:
- Default: Keep all comments and posts public.
- New Option: Hide all posts and comments from public view.
- New Option: Selectively display posts and comments in chosen communities, while hiding them in others.
These options are accessible via a new setting titled 'Content and Activity', now found under a broader section named 'Curate Your Profile'. This updated section includes:
- Content and Activity: Controls post and comment visibility.
- NSFW Toggle: Enables or disables visibility of NSFW content.
- Followers Toggle: Manages whether followers are publicly shown.
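The three visibility options behave like a simple three-state rule: show everything, hide everything, or show only in an allow-listed set of communities. A minimal sketch of that logic (the names `ContentVisibility` and `is_visible` are illustrative, not Reddit's actual API):

```python
from enum import Enum


class ContentVisibility(Enum):
    """Hypothetical model of the three options described above."""
    PUBLIC = "public"        # Default: all posts and comments visible
    HIDDEN = "hidden"        # Hide all posts and comments
    SELECTIVE = "selective"  # Visible only in chosen communities


def is_visible(setting: ContentVisibility,
               allowed_subreddits: set[str],
               subreddit: str) -> bool:
    """Return True if activity in `subreddit` should appear on the profile."""
    if setting is ContentVisibility.PUBLIC:
        return True
    if setting is ContentVisibility.HIDDEN:
        return False
    # SELECTIVE: visible only in the communities the user has chosen
    return subreddit in allowed_subreddits
```

Under this model, a user could keep a hobby subreddit visible while hiding everything else, which is exactly the per-community control the update describes.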
While many will welcome the update as a privacy boost, it has prompted anxiety among users who fear it could make it easier for trolls and scammers to operate unchecked.
Concerns Grow Over Reduced Transparency
Many Reddit users have voiced alarm over the potential misuse of this feature. One highly upvoted comment by Baruch_S captured the general sentiment: 'This seems like it's going to lead to bad actors being able to more easily hide their behaviour from the average user.'
Baruch_S continued, 'Unless you're a mod, you won't be able to see that they're posting inflammatory misinformation across a dozen different subs, for instance. Less transparency isn't good.'
Another user, Kahzgul, pointed to the risk of fake charity scams, sharing that subreddits he frequents are often targeted by spambots promoting fraudulent GoFundMe links. Visibility into user histories helps communities spot such scams, a check that becomes harder if profiles turn private.
What Is Reddit Doing About It?
In response to these concerns, Reddit outlined safeguards to ensure that rule-breakers do not slip through the cracks. The platform clarified that if a user interacts in certain ways—such as posting, commenting, requesting post approval, or asking to join a private subreddit—moderators will gain full access to that user's profile content for 28 days.
'If you comment on another user's profile, that user will have 28 days of access to your full profile content,' Reddit added.
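The safeguard amounts to a time-limited access window: a qualifying interaction (posting, commenting, requesting post approval, or asking to join a private subreddit) grants full profile visibility for 28 days. A sketch of that rule (the function name and signature are illustrative, not Reddit's implementation):

```python
from datetime import datetime, timedelta

# The 28-day window quoted in Reddit's announcement
ACCESS_WINDOW = timedelta(days=28)


def can_view_full_profile(last_interaction: datetime, now: datetime) -> bool:
    """True if the user's last qualifying interaction (post, comment,
    approval or join request) happened within the 28-day window."""
    return now - last_interaction <= ACCESS_WINDOW
```

On this reading, access is not permanent: once 28 days pass without a new qualifying interaction, the profile goes dark again even for moderators.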
However, critics remain unconvinced. Many argue that expecting unpaid moderators to sift through hidden activity, even within the 28-day window, places an unfair burden on an already stretched community.
As Reddit continues to refine its platform, the challenge remains: balancing user privacy with the need for transparency and accountability in an increasingly complex digital landscape.
© Copyright IBTimes 2025. All rights reserved.