TikTok Censorship Glitch or Power Outage? Users Report Anti-ICE Content Blocks
- Olivia Johnson


It is January 2026, and the digital town square feels smaller than it did a week ago. Over the weekend, a wave of content creators found themselves staring at frozen progress bars. The issue wasn't universal. It didn't affect dance trends, recipe vlogs, or sponsored haul videos. The technical paralysis seemed to target a very specific slice of political discourse: criticism of U.S. Immigration and Customs Enforcement (ICE).
What the platform calls a technical error, users are calling a targeted TikTok censorship glitch. The timing is suspicious. With protests erupting over the death of nurse Alex Pretti during an ICE raid in Minneapolis, social media is the primary conduit for organization and outrage. Yet, users attempting to engage with this topic are hitting a wall. The company blames a data center power outage, but the user experience suggests the lights went out only on specific viewpoints.
If you are struggling to get your voice heard, you aren't alone. Before diving into the analysis of why this is happening, let’s look at what users have found actually works to get around these blocks.
Immediate Workarounds for the TikTok Censorship Glitch

If you are currently staring at a video that refuses to process, standard troubleshooting like clearing your cache or reinstalling the app likely won't work. Based on community feedback from creators who have successfully navigated this blackout, here is how to handle the situation.
Verify the Block
First, determine if you are facing a connection error or a content block. Try uploading a benign, non-political video (like a 5-second clip of your floor). If that uploads instantly while your anti-ICE content stalls, you are likely triggering a moderation filter, not suffering from a server outage.
The Platform Pivot
The most effective solution reported by users is immediate migration. When comedian Megan Stalter couldn't get her video discussing Alex Pretti to upload on TikTok, she moved to Instagram, where the video posted successfully and garnered over 12,000 shares. If your goal is dissemination, stop fighting the TikTok algorithm and move the file to Instagram Reels or X (formerly Twitter). The audience overlap is significant, and currently, those platforms are not exhibiting the same specific upload freezes.
Keyword Sanitization
Some users suggest that the TikTok censorship glitch is triggered by metadata. If you must stay on the platform, try removing hashtags like #AbolishICE, #ICE, or specific names from the caption. Burn the text into the video file itself rather than putting it in the description. This makes it harder for automated text-scrapers to flag the upload immediately, though it limits discoverability.
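To see why caption metadata is such low-hanging fruit for a moderation system, consider a minimal sketch of a keyword filter. Nothing below reflects TikTok's actual code; the blocklist, field names, and function are hypothetical, purely to illustrate why plain-text captions and hashtags are trivially machine-readable while text burned into the video frames would require OCR or computer-vision processing to catch.

```python
# Hypothetical sketch of a metadata keyword filter; NOT TikTok's real pipeline.
# The blocklist and function name are invented for illustration.

BLOCKED_TERMS = {"#abolishice", "#ice", "epstein"}  # hypothetical blocklist

def caption_triggers_filter(caption: str, hashtags: list[str]) -> bool:
    """Return True if the upload's plain-text metadata matches the blocklist.

    Captions and hashtags arrive as plain strings, so matching is a cheap
    set lookup. Text burned into the video frames is invisible here;
    catching it would require running OCR on every frame, which is far
    slower and more expensive.
    """
    tokens = {tag.lower() for tag in hashtags}
    tokens.update(caption.lower().split())
    return not BLOCKED_TERMS.isdisjoint(tokens)

# The same video, two different captions:
print(caption_triggers_filter("They raided the clinic", ["#AbolishICE"]))  # True
print(caption_triggers_filter("Watch to the end", []))                     # False
```

That asymmetry is the entire logic of the workaround: the filter can only act on text it can read cheaply.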
The Pattern Behind the TikTok Censorship Glitch
To understand if this is a glitch or a feature, we have to look at the data. A standard server outage—the reason cited by TikTok spokespeople—is usually indiscriminate. When a data center loses power, everyone in the affected region loses access, regardless of whether they are posting about kittens or politics.
That is not what is happening here.
How the TikTok censorship glitch affects political discourse
Reports indicate a "surgical" failure rate. Users trying to upload videos containing anti-MAGA sentiment or footage of recent immigration raids are experiencing indefinite "processing" states. Meanwhile, their feeds remain active, and they can view other content without issue.
The effect resembles "shadow-banning," but in a more aggressive form. Shadow-banning usually means your video uploads but nobody sees it; this new phenomenon prevents the upload entirely, freezing the conversation during critical windows of public interest. By the time the "glitch" is fixed, the news cycle has moved on, and the momentum for organizing protests or raising awareness for victims like Alex Pretti has dissipated.
Keyword Blocking in DMs
The suspicion that this is intentional is bolstered by user reports regarding Direct Messages. Several users have documented an inability to send specific words, such as "Epstein," in private chats. If the backend architecture is set up to filter keywords in private communications, it is highly probable similar filters are applied to public uploads. A power outage does not selectively delete the name of a deceased financier while allowing other messages to pass through.
The Megan Stalter Incident: A Case Study

The most visible face of this controversy is comedian Megan Stalter. Her experience serves as a control group for testing the TikTok censorship glitch. Stalter attempted to post a video specifically calling for action regarding the shooting of Alex Pretti.
The video was not graphic. It was a piece of direct-to-camera advocacy. For hours, she attempted to upload it, only to be met with constant failure. There was no error code indicating a bad connection—just a refusal to publish.
This incident highlights the asymmetry of the "glitch." Stalter is a verified user with a significant following. Typically, accounts of her stature get priority bandwidth and processing. That she was completely roadblocked suggests the filter was placed on the content of the file, not the status of the user. Her subsequent success on Instagram proved the video file itself was not corrupted. The variable was the platform.
Why users believe the TikTok censorship glitch is intentional
The skepticism from the user base is rooted in pattern recognition. We have seen this before. When protests erupted in previous years, "technical errors" frequently coincided with spikes in hashtags used by organizers.
This time, however, the political backdrop is different. The discussion on Reddit threads surrounding this issue points to the new ownership structure of TikTok. Following the platform's restructuring—approved by the Trump administration—there has been a noticeable shift in what is permissible. Users are connecting the dots between the upload failures and the new management's alignment with specific political figures. It is difficult to accept a "glitch" narrative when the "glitch" serves the exact political interests of the platform's stakeholders.
Algorithm Shifts: From Neutrality to Bias?
Beyond the TikTok censorship glitch preventing uploads, there is a separate but related issue regarding what is being pushed to viewers. The recommendation engine—the "For You" Page (FYP)—appears to be undergoing a recalibration.
Fresh accounts are acting as canaries in the coal mine. Users creating brand-new profiles, with zero watch history to inform the algorithm, report being immediately inundated with pro-Trump and pro-administration content. In a neutral algorithm, a new user usually sees a mix of viral, broadly appealing content (comedy, sports, animals).
The aggressive push of specific political narratives to new users suggests a "hard-coding" of bias. If the algorithm is tuned to amplify one side, it stands to reason it is tuned to suppress the other. The "glitch" preventing anti-ICE videos from uploading may simply be the blunt-force version of the subtle suppression happening in the recommendation feed.
Some users push back, saying they can still see anti-ICE content on their FYP. That discrepancy, however, is consistent with how algorithmic throttling works: it is rarely a 100% blackout, which would be too obvious, but a reduction in reach. The fact that some videos remain visible does not negate the fact that new videos are being blocked from uploading during a breaking news event.
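A rough way to picture the difference between a blackout and throttling: suppression can be a multiplier on a ranking score rather than a hard block. The sketch below is purely illustrative; the scoring model and the 0.1 demotion factor are assumptions, not anything documented about TikTok's recommendation system.

```python
# Illustrative sketch of reach throttling vs. an outright blackout.
# The scoring model and demotion factor are invented for this example.
import random

def feed_impressions(base_score: float, demotion: float, viewers: int) -> int:
    """Count how many of `viewers` would be shown a video whose effective
    ranking score is base_score * demotion. A demotion of 1.0 means no
    throttling; 0.0 would be a total blackout."""
    score = base_score * demotion
    return sum(1 for _ in range(viewers) if random.random() < score)

random.seed(42)
normal    = feed_impressions(base_score=0.6, demotion=1.0, viewers=10_000)
throttled = feed_impressions(base_score=0.6, demotion=0.1, viewers=10_000)
print(normal, throttled)  # roughly 6,000 vs 600: still visible, just far less
```

Under a model like this, the video genuinely "exists" and some users genuinely see it, which is exactly why the presence of scattered anti-ICE clips proves little either way.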
The Official Narrative: Data Centers vs. Content Moderation

TikTok’s official stance regarding the weekend of January 26, 2026, is that a power outage at a US data center caused latency and upload failures.
It is a convenient explanation, and technically possible: data centers do fail, and processing delays do happen. However, a data center failure that produces a content-specific upload error is an anomaly that demands more explanation. Video files are binary data. A server does not know whether a video contains a cat or a protest sign unless it runs that video through computer vision and audio recognition software during the upload.
If the "power outage" slowed down the content moderation AI, the standard protocol for most tech companies is to either queue everything or let everything through and moderate later. For the system to reject specific political topics while the "power is out" implies that the keyword filters were the only thing that stayed online.
The community’s lack of trust is not unfounded. Users are demanding transparency. If it was a power outage, show the uptime logs. Show the geographic distribution of the outage. Until then, "technical difficulties" serves as a catch-all shield against accusations of political interference.
Privacy Concerns and the New Management
The TikTok censorship glitch has reignited broader conversations about privacy and safety. The Reddit threads discussing these upload failures are filled with users expressing regret over maintaining their accounts.
The concern is no longer just about what you can watch, but what is being watched about you. With the reported changes in management involving figures like Larry Ellison’s son and the general "MAGA-fication" of the corporate structure, users are wary of how their location data is being used.
The fear is that if the platform is willing to block content about ICE, it may be willing to share user data with ICE. While this is speculative, the proximity of the platform’s leadership to the administration makes it a valid concern for activists. The recommendation from the most security-conscious users is clear: if you are involved in sensitive political organizing, TikTok is no longer a secure or reliable channel.
Conclusion
The events of January 2026 mark a turning point for TikTok. Whether the TikTok censorship glitch is a result of a literal power failure or a metaphorical power grab by new ownership, the result is the same: the silencing of dissent during a critical human rights conversation.
For years, the internet was viewed as an unshakeable archive. Now, it feels more like a curated broadcast. The Megan Stalter incident proves that virality is permitted only within certain parameters. As we move forward, users must verify whether a "glitch" is truly a bug, or if the system is working exactly as intended. The solution for now is diversification. Do not rely on a single app to document history, especially when that app decides which history is allowed to buffer.
FAQ
Q: Is the TikTok censorship glitch affecting all users?
A: No, the reports are inconsistent. While many users attempting to post anti-ICE or anti-MAGA content face blocks, others report normal functionality. The issue seems targeted at specific keywords and content types rather than a universal account blackout.
Q: Did TikTok admit to censoring anti-ICE videos?
A: No. The company stated that the upload failures were caused by a power outage at a US data center. They attribute the inability to post videos to technical infrastructure problems rather than moderation policies.
Q: Can I fix the upload failure by switching to Wi-Fi?
A: Likely not. If you are experiencing this specific glitch, it is not a bandwidth issue. Users have reported that switching connections does not resolve the "processing" freeze if the content itself is flagged.
Q: Is it safe to post political content on TikTok in 2026?
A: Caution is advised. With the new ownership structure and user reports of location tracking concerns, activists suggest using encrypted platforms for organization. For public posting, be aware that your reach may be throttled.
Q: Why can't I send the word "Epstein" in TikTok DMs?
A: Users have gathered evidence suggesting specific keyword blacklists exist in the Direct Message feature. This indicates an intentional filter is active within the app’s code, independent of any external server power outages.


