it will loose its ability to differentiate between there and their and its and it’s.
It already was; the only difference is that now Reddit is getting paid for it.
loose
Irony?
Muphry’s Law at work
must of made a mistake their
your so dumb lmao
thank you kind stranger
Should of proof red it
“must have” not “must of”
And it will get LOSE and LOOSE mixed up like you did
it will UNLEASH its ability to differentiate between there and their and its and it’s.
ChatGPT was already trained on Reddit data. Check this video to see how one Reddit username caused bugs in it: https://youtu.be/WO2X3oZEJOA?si=maWhUpJRf0ZSF_1T
I’m not gonna watch, but I assume little Bobby Tables strikes again.
I don’t know what to think about this. On one hand I don’t like not being asked permission, but on the other hand I’m glad that my opinions will be somewhat reflected in ChatGPT
I’m a bit annoyed by someone else profiting off my words, though. I freely gave them to the world, I guess, and never objected to search engines using them, but my words were not “monetized”, even if they were later used to sell advertising. It just doesn’t seem right for Reddit and others to be cashing in
“What is a giraffe?”
ChatGPT: “geraffes are so dumb.”
“I have not been trained to answer questions about stupid long horses.”
And when it learns something new, the response will be “Holy Hell”.
TIL
It will also reply “Yes.” to questions “is it A or B?”.
Perfectly acceptable answer
It’s going to be a poop knife wielding guy with 2 broken arms out to get those jackdaws.
Now when you submit text to ChatGPT, it responds with “this.”
Unironically this
ChatGPT trained on Reddit posts -> ChatGPT goes temporarily “insane”
Coincidence? I don’t think so.
They have always trained on Reddit data; GPT-2 was, at least. I’m unsure about GPT-1
ChatGPT also chooses that guy’s dead wife