In context: Most, if not all, large language models censor responses when users ask for things considered dangerous, unethical, or illegal. Good luck getting Bing to tell you how to cook your ...
But that would require a lot of effort, something the makers of chatbots do not want to do, damn the consequences.

Oh no, somebody might find out that if you have the right type of ...