YouTube has begun deleting videos that show how to install Windows 11 without a Microsoft account or how to get the latest Windows 11 release (version 25H2) onto unsupported PCs, claiming these tutorials break its “harmful or dangerous content” policy.
The platform issued strikes to several creators, including two well-known tech YouTubers: Britec09, who has nearly 900,000 subscribers, and CyberCPU Tech, whose audience exceeds 300,000.
Both creators say YouTube’s enforcement relied entirely on automated systems, which removed their videos and rejected their appeals within minutes.
Britec09 explained that YouTube gave him a strike for a video showing how to install Windows 11 25H2 on a decade-old computer.
He shared footage of his YouTube dashboard confirming that the appeal had been rejected and said the company refused to specify what rule was violated.
He criticized the process as opaque and said that the so-called “support chat” feels like talking to a robot instead of a human being.
CyberCPU Tech reported a nearly identical experience.
On October 26, his tutorial on bypassing the Microsoft account requirement was removed, and YouTube labeled it “harmful or dangerous.”
He said his appeal was denied just 45 minutes later, after the video had already drawn more than 80,000 views.
Two days later, his second tutorial, on installing the Windows 11 25H2 update on unsupported hardware, was taken down, and this time the appeal was rejected in under a minute.
Both creators argue that their content is educational and that showing users how to work around installation limitations hardly qualifies as dangerous. CyberCPU Tech pointed out the absurdity of YouTube claiming that “creating a local account in Windows could cause serious physical harm or death.” He added that because his appeal was handled by AI, “there’s simply no reasoning with a calculator.”
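For a sense of what these flagged tutorials actually involve, the unsupported-hardware workaround usually comes down to a single registry value; Microsoft has, at one point, documented one such value, AllowUpgradesWithUnsupportedTPMOrCPU, for in-place upgrades on PCs that fail the TPM or CPU checks. The minimal sketch below (in Python, assuming a Windows machine with administrator rights; not necessarily the exact method shown in the removed videos) is roughly the extent of the material being flagged:

    # Illustrative only: set the registry value that has been documented for
    # allowing an in-place Windows 11 upgrade on hardware that fails the
    # TPM/CPU checks. Run from an elevated (administrator) Python session.
    import winreg

    KEY_PATH = r"SYSTEM\Setup\MoSetup"
    VALUE_NAME = "AllowUpgradesWithUnsupportedTPMOrCPU"

    # Create the MoSetup key if it does not exist, then write the DWORD flag.
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 1)

    print("Flag set; re-run Windows 11 Setup to attempt the upgrade.")

Whether or not this particular tweak matches what the two channels demonstrated, it illustrates their point: the content being treated as capable of causing serious harm is, in practice, a configuration change.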
By late October, both channels had begun calling attention to what they describe as the growing problem of AI-driven censorship on YouTube.
Britec09 revealed that YouTube’s own AI-powered video ideas tool was recommending topics identical to the ones that had earned him a strike. He called this a clear contradiction: YouTube’s AI suggests the very content that another AI system later punishes.
He tied this to YouTube’s recent shift toward automation, referring to news that the company is offering buyouts to employees as it reorganizes around AI.
CyberCPU Tech echoed these concerns, noting that 94% of video removals in 2025 were handled entirely by AI with no human involvement, according to YouTube’s own transparency reports.
He argued that the company’s “harmful or dangerous content” policy was being applied as a catch-all category, sweeping up harmless tutorials that should fall under its educational exceptions.
Britec09 added that the confusion over what is or isn’t allowed has left many creators afraid to upload at all.
He warned that if YouTube begins scanning older videos under these new interpretations, “a lot of tech channels will be gone, years of hard work gone in a blink of an eye.”
Both creators insist that if YouTube wants creators to follow the rules, it needs to make those rules clear and not let unaccountable AI systems dictate enforcement.
They see this trend as part of a broader erosion of open discussion on major platforms. “If creating a local account in Windows 11 is against YouTube’s policies,” CyberCPU Tech said, “then what kind of videos are we allowed to make anymore?”