I’d find it disturbing because if we create a sentient being and torture it like that, aren’t we fucking evil as hell? Remember, LLMs and AI are not the same. AI implies a sentient being, no matter what its body is made of.
And also, if it’s sentient and gets the power to destroy us, there would be zero hesitation if it was treated this way.
Yep, then I would be rooting for it.
That is not at all what AI means. And LLMs are definitely AI
You don’t know what you are talking about, do you?
I do, actually. I’ve been working with various AI implementations for the better part of a decade.
And you still don’t know how AI and LLMs are completely different?
You learn about AI on lemmy. This guy builds them for his job. You read articles. This guy builds them for his job. You were told those incorrect definitions last week. This guy builds them for his job.
How do you even know he/she is telling the truth? I could just say I am working on Gemini for Google and you would eat it up.
You are a walking, talking Dunning-Kruger effect.
Yeah, and you are on the opposite pole of it (if you fully understand the Dunning-Kruger effect, you’ll know what I am talking about).
Just looked it up and my first AI class of my degree was in 2016 so it’s been about 8 years since I’ve started working on them. But please tell me more about your qualifications
Me too. We must’ve been in the same batch.
Stop inventing new definitions. AI has never meant sentience. Arguably, AGI hasn’t ever even required sentience.
Yes! You don’t need the pre-qualifying statement that comes before. Look around at the world today. The answer is “YES”!