Bro, even GPT-4o can translate this.
Llama 3.1 will give you instructions on how to cook meth and build explosives, but won't translate calling a whore a whore.
@sun Yeah, first time I've seen that happen.
ChatGPT (the web UI) will block it too, but because of an external check, not because the model itself refuses.