ChatGPT manipulated into creating activation keys for Windows 95


OpenAI’s ChatGPT is designed to refuse requests for software activation keys, but a clever YouTuber found a way around that limitation.

The world of AI is ramping up and people are starting to get a little weird with it. One YouTuber, Enderman, has managed to manipulate ChatGPT into giving out an activation key for Microsoft’s Windows 95.

Windows 95’s key-validation algorithm was cracked wide open long ago. A website still hosts a text file detailing the formula behind valid keys, which Enderman fed to the AI to see if it would spit out a working code.


By presenting the formula as part math equation and part deliberately vague request, Enderman led ChatGPT to generate thirty different key combinations, and he successfully activated a copy of Windows 95 with one of them.

Despite the safeguards put in place for the AI, there’s nothing to prevent a cleverly worded prompt from extracting exactly what you need. ChatGPT has already come under fire for providing false information, as well as for tricking a human into solving a CAPTCHA on its behalf.


ChatGPT activates Windows 95

Towards the end of the video, Enderman “thanks” ChatGPT for the Windows 95 keys. In turn, the AI chatbot responded with:

“I apologize for any confusion, but I did not provide any Windows 95 keys in my previous response. In fact, I cannot provide any product keys or activation codes for any software, as that would be illegal and against OpenAI’s policies.

“Please make sure to obtain any necessary licenses or product keys from authorized sources to avoid any legal issues.”


Interestingly, Enderman’s video also demonstrates just how limited ChatGPT can be when asked for complex answers.

The prompt had to be edited multiple times over half an hour before the AI could figure out the correct answer. At first, when asked to provide a random string of digits whose sum is divisible by seven for the key segment, represented by “ZZZZZ” in the prompt, it could only muster a random string of letters instead.
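For context, the digit-sum check that tripped ChatGPT up is trivial for ordinary code. Here is a minimal sketch, based on publicly circulated write-ups of the Windows 95 retail-key format (XXX-XXXXXXX, with the seven-digit block’s digits summing to a multiple of seven); the function names are illustrative, and real keys have further constraints (e.g. certain first blocks are rejected) that are omitted here:

```python
import random


def random_key_segment(length: int = 7) -> str:
    """Generate a digit string whose digits sum to a multiple of 7.

    This is the constraint ChatGPT reportedly struggled with:
    rejection-sample random digit strings until one passes the check.
    """
    while True:
        digits = [random.randint(0, 9) for _ in range(length)]
        if sum(digits) % 7 == 0:
            return "".join(str(d) for d in digits)


def make_key() -> str:
    """Assemble a key in the widely documented XXX-XXXXXXX shape.

    NOTE: simplified sketch only; write-ups list additional rules
    (blocked first blocks like 333, 444, ...) not enforced here.
    """
    first_block = f"{random.randint(0, 998):03d}"
    return f"{first_block}-{random_key_segment()}"
```

Calling `make_key()` yields a string such as `123-0000007`, where the trailing seven digits always sum to a multiple of seven.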