

Seriously? A chatbot is one function of an AI, not the other way around. So when you give the AI a different task or set of instructions, it's no longer a chatbot; it's whatever function is needed for that task.
I weep for humanity if you're any indication of the general level of education on AI…
If you ask it to create an image, are you seriously expecting it to hold a conversation and point out where you messed up? That's not how any of this works lmfao. "Hey, I need to point out that ducks don't have scales, and the sky isn't green." No, it does what it's asked. But now suddenly it's different? Why?



Yes, I've already addressed your asinine view.