But in a separate Fortune editorial from earlier this month, Stanford computer science professor and AI expert Fei-Fei Li argued that the “well-meaning” legislation will “have significant unintended consequences, not just for California but for the entire country.”
The bill’s imposition of liability for the original developer of any modified model will “force developers to pull back and act defensively,” Li argued. This will limit the open-source sharing of AI weights and models, which will have a significant impact on academic research, she wrote.
Same energy as PirateSoftware’s “If AAA companies can’t kill games due to always online DRM then small indie devs have to support their games forever, thus bankrupting them” argument.
They should be doing the exact opposite and making it incredibly difficult not to open source it. Major platforms open sourcing much of their systems is basically the only good part of the AI space.
Also, they used our general knowledge and culture to train the damn things. They should be open sourced for that reason alone. LLMs should be seen and treated like libraries, as collections of our common intellect, accessible by everyone.
Holy shit this is a fucking terrible idea.
I read that as “incentivizing keeping AI in labs and out of the hands of people who shouldn’t be using it”.
That said, you’d think they would have learned from piracy by now: once it’s out there, it’s out there. Can’t put it back in the jar.
Not open-sourcing it is a terrible idea, it just creates more black boxes and gives corporations a further upper hand.
Yeah, what do I care if Jimmy down the street enjoys using his Ollama chatbot? I’m too busy worrying about Terminator panning out.
Exactly, so you agree that this bill is shit?
Yes, but apparently that didn’t come across according to the votes lol
Damn straight. I don’t fear AI, I fear an even more uneven playing field
I haven’t read Li’s editorial yet, but I’m generally more inclined to trust her take on these issues than Hinton’s and Bengio’s.