For a small price, Malus.sh will use AI to ingest any piece of software you give it and spit out a new version that “liberates” it from any existing copyright licenses. The result is a new piece of software that serves the same function but doesn’t have to honor, for example, the kind of copyright licenses that ensure open source software remains free to use and modify, a process that could upend the already fragile open source ecosystem.

The site is an elaborate bit of satire designed to bring attention to a very real problem in open source, but it also does exactly what it advertises: it is a real LLC that is making money by using AI to produce “clean room” clones of existing software.

“It works,” Mike Nolan, one of the two people behind Malus, who researches the political economy of open source software and currently works for the United Nations, told me. “The Stripe charge will provide you the thing, and it was important for us to do that, because we felt that if it was just satire, it would end up like every other piece of research I’ve done on open source, which ends up being largely dismissed by open source tech workers who felt that they were too special and too unique and too intelligent to ever be the ones on the bad side of the layoffs or the economics of the situation.”

Malus’s legal strategy for bypassing copyright is based on a historically pivotal moment for software and copyright law dating back to 1982. Back then, IBM dominated home computing, and competitors like Columbia Data Products wanted to sell products that were compatible with software that IBM customers were already using. Directly copying IBM’s BIOS would have infringed on the company’s copyright, so Columbia Data Products came up with what we now know as a “clean room” design. It tasked one team with examining IBM’s BIOS and creating specifications for what a clone of that system would require.
A different “clean” team, one that was never exposed to IBM’s code, then created a BIOS that met those specifications from scratch. The result was a system that was compatible with IBM’s ecosystem but didn’t violate its copyright, because it did not copy IBM’s technical process and counted as original work. This clean room method, which has been validated by case law and dramatized in the first season of Halt and Catch Fire, made computing more open and competitive than it would have been otherwise.

But it has taken on new meaning in the age of generative AI. It is now easier than ever to ask AI tools to produce software that is identical in function to existing open source projects and that, some would argue, is built from scratch and is therefore original work that can bypass existing copyright licenses. Others would say that software produced by large language models is inherently derivative, because, like any LLM output, it is trained on the collective output of humans scraped from the internet, including specific open source projects.

Malus (pronounced “malice”) uses AI to do the same thing. “Finally, liberation from open source license obligations,” Malus’s site says. “Our proprietary AI robots independently recreate any open source project from scratch. The result? Legally distinct code with corporate-friendly licensing. No attribution. No copyleft. No problems.” Copyleft is a type of copyright license that ensures reproductions or adaptations of the software keep it free to share and modify.

Malus’s pitch is naked contempt for the open source community, which believes in developing software collaboratively and providing it for free to everyone. Normally, copyright licenses for open source projects only ask that anyone who uses the work give credit to maintainers and that any derivative works continue to use the same license, which hopefully grows the community of people who contribute back into the project and keep it going.
“Some licenses require you to contribute improvements back. Your shareholders didn’t invest in your company so you could help strangers,” Malus’s site says. “Is your legal team frustrated with the attribution clause? Tired of putting ‘Portions of this software…’ in your documentation? Those maintainers worked for free—why should they get credit?”

The site gained some incredulous attention when it was posted to Hacker News recently, but it didn’t take people long to realize that it was an elaborate bit of satire, even if the tool can still replicate open source projects as advertised.

Malus was born out of a talk that open source developers Dylan Ayrey and Michael Nolan gave at the open source conference FOSDEM 2026. The AI-slop-heavy presentation is a whirlwind history of copyright and software, how the two have always had an uneasy but necessary relationship, and how that relationship is fundamentally changed now that AI tools can produce clean room designs at the click of a button.

“Even if the courts ruled that maybe this is legal, and maybe there aren’t legal restrictions to doing this, is it ethical?” Ayrey asked.

“The question we should be asking is, can we get rich off of this?” Nolan said. And so Malus was born.

Malus is satire, but it will actually take your money and do what it advertises. It is modeled after the IBM case and uses one AI agent to write the specifications and a different agent to produce the code, creating that “clean room” effect. Malus will also do performance testing and scan for common vulnerabilities to make sure the output is functional. Nolan didn’t tell me exactly how much money the company is making but said it is a real LLC with a bank account and is profitable, with “probably hundreds” of dollars at this point. The service charges $0.01 for each KB of data across the project’s various dependencies.

The pricing for using Malus.

What Malus is satirizing is also really happening.
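The mechanics Malus describes, one agent drafting a specification from the original and a second, isolated agent implementing from that spec alone, plus the $0.01-per-KB pricing, can be sketched roughly in Python. This is purely an illustrative outline, not Malus’s actual code (which is not public); the agent functions are stubs standing in for LLM calls, and every name here is hypothetical.

```python
# Hypothetical sketch of a two-agent "clean room" pipeline as the article
# describes it: one agent sees the original and emits only a functional
# spec; a second agent, which never sees the original, codes from the spec.

def spec_agent(original_source: str) -> str:
    """'Dirty' agent: reads the original code, returns only a spec."""
    # Stub: a real system would prompt an LLM here.
    return f"SPEC: reimplement a program with {len(original_source)} bytes of behavior"

def clean_agent(spec: str) -> str:
    """'Clean' agent: sees only the spec, never the original source."""
    # Stub: a real system would prompt a second, isolated LLM here.
    return f"# generated from: {spec}\n"

def clean_room_rewrite(original_source: str) -> str:
    spec = spec_agent(original_source)
    # The spec must not leak the original source verbatim.
    assert original_source not in spec
    return clean_agent(spec)

def price_usd(dependency_bytes: int, rate_per_kb: float = 0.01) -> float:
    """Malus charges $0.01 per KB of data across a project's dependencies."""
    return (dependency_bytes / 1024) * rate_per_kb

# A project with 5 MB of dependencies would cost about $51.20.
print(round(price_usd(5 * 1024 * 1024), 2))
```

The assertion gestures at the property that makes a clean room a clean room: the implementing side can only ever see the specification, never the original source.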
For example, in March Ars Technica and The Register covered an incident around a widely used Python library called chardet. Originally it was released under the LGPL license; then a version was rereleased under the more permissive MIT license. Dan Blanchard, who used Claude to produce the MIT-licensed version of chardet, argued that it was a complete rewrite of chardet, and not derivative, because only a small percentage of the code looked and functioned similarly. Mark Pilgrim, who originally released chardet, disagreed and complained about Blanchard using this method to shed the more restrictive LGPL license.

“This concern is legitimate. AI has made clean-room style reimplementation dramatically cheaper,” Blanchard wrote in response to Pilgrim. “What used to require months of work by expensive engineering teams can now, as Armin Ronacher put it, be done trivially.”

Blanchard also conceded that Claude, which, like all LLMs, was trained on vast amounts of data scraped indiscriminately from the internet, was exposed to the original chardet in its training, but he maintains his version is not derivative.

“I have seen Malus.sh, and like many people, I wasn’t sure it was satire at first, because I’m sure someone will probably make that for real eventually,” Blanchard told me in an email. “I think the reality of the situation is that traditional software licenses (open source and commercial) weren’t the real barrier against these sorts of rewrites in the past (see WINE, Linux, and IBM PC BIOSes long ago), and the main obstacles were time and money. A rewrite that would’ve taken a team of people months or years can be done in days with AI.
As a professional software engineer, I don’t love that much of the business model around selling software is in danger, but I don’t think there’s any putting the genie back in the bottle at this point.”

After the backlash, Blanchard changed the license on his version of chardet from MIT to the 0BSD license, which he told me “was a change that satisfied many in the community’s concerns about AI-generated code not even being copyrightable in the first place.” The 0BSD license is very permissive and allows anyone to “use, copy, modify, and/or distribute this software for any purpose with or without fee.”

“Much of our law was designed with human-scale inefficiencies in mind,” Meredith Rose, a senior policy counsel with Public Knowledge who focuses on copyright, the DMCA, and intellectual property reform, told me. “Clean rooms worked because courts kind of looked at the whole clean room methodology and were like, ‘there’s a lot of labor that goes into this.’ That’s part of the calculus. You had a couple human beings recreating this very big source package essentially from nothing but high level specs. The idea of collapsing that into something where you can press a button and get an entire package recreated is kind of wild, even though it is technically correct under the law as far as I can tell.”

Others in the open source community say that regardless of the legal implications of AI-generated clean room versions of existing software, the reality and impact of the practice are here, and not good for the open source community.

“Whether or not Malus is satire, the concept it describes is already happening in practice. The legal theory that an AI can ‘clean room’ reimplement things was arguably made inevitable by the approach companies like OpenAI and Anthropic have taken to copyright: treat the entire internet as training data, then claim the output is a new, unencumbered work,” Mike McQuaid, developer of the popular open source package manager Homebrew, told me.
“Even if you accept the legal argument, the ethics fucking suck. Open source isn’t just source code you download once. It’s an ongoing relationship: security patches, bug fixes, adaptation to new platforms, accumulated expertise from years of triage and review. A ‘clean room’ reimplementation fucks all of that. You get a snapshot with none of the maintenance. It’s basically just a fork where nobody knows how the code works, nobody is watching for CVEs, and nobody knows what to do when it breaks. That’s not liberation, it’s just technical debt.”

Nolan told me that he made Malus to make developers feel this danger.

“I’ve been publishing research on these [open source] communities for over a decade now, and consistently, what I hear over and over again is that open source has won because 80 or 90 percent of all software applications rely upon us, but what they’re relying upon is the wholesale exploitation of massive communities of workers who convince themselves that they’re winning because Google uses them, and what they end up doing instead is pretending that because their software is licensed under a certain license, that that means they’re ethical,” Nolan said. “It doesn’t matter if they’re in the supply chain of weapons that are committing war crimes. It doesn’t matter that their friends suddenly get the rug pulled out from under them when a CTO decides to change strategy and no longer wants to support that library anymore […] They just keep on saying everything’s okay as the tech sector essentially will collapse down upon them, and they keep saying they’re winning, even when they’re not. And so my hope, with Malus, was to make people think critically about their position.”