I always feel a bit weird when people ask me what I do in my own spare time and my answer is basically fixing my shit, then pushing it just hard enough that it breaks again.
Unfortunately, copyright protection doesn’t extend that far. AI training is almost certainly fair use, if it even counts as copying at all. Styles and the like can’t be copyrighted, so even if an AI creates a work in the style of someone else, it’s extremely unlikely the output would be similar enough to infringe. I do feel it’s unethical to intentionally try to reproduce someone’s style, especially for commercial gain, but that’s not illegal unless you claim to be that artist.
https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0
So at the bare minimum, a mechanism needs to be provided for retroactively removing works that would have been opted out of commercial usage if the option had been available and the rights holders had been informed about the commercial intentions of the project.
If you do this, you limit access to AI tools exclusively to big companies. They already employ enough artists to create a useful AI generator; they’ll simply add a clause to the employment contract saying the artist agrees to have their work used for training. After a while, the only people with access to reasonably good AI are those major corporations, and they’ll leverage that to depress wages and control employees.
The WGA’s idea that the direct output of an AI is uncopyrightable doesn’t distort things so heavily in favor of Disney and Hasbro, and it’s more legally workable. You don’t credit Microsoft Word as the editor of a novel because you used spell check, even if it corrected the spelling and grammar of every word; likewise, you don’t credit generative AI as an author or creator.
Though the above argument only really applies when you have strong unions willing to fight for workers, and with how gutted they are in the US, I don’t think that will be the standard.
I replaced my brakes last weekend. Did the pads, then realized I also needed to do the discs and brake fluid. It ended up being a lot more work than I wanted, mostly because I was missing tools.
e.g. wind generator parks take up a lot of space
Though the vast majority of this space can still be used. I live near a wind farm and the area under the turbines is still ranchland; there are cows just chilling under them. The wind company pays farmers for the land in a long-term lease agreement: https://www.wri.org/insights/how-wind-turbines-are-providing-safety-net-rural-farmers.
Well, the typical way of measuring Q does measure the energy it takes to get the boulder up the hill, but not the inefficiency of the machine pushing it up there, nor the inefficiency of extracting its energy as it rolls back down.
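To put rough numbers on that distinction (the efficiencies below are placeholders I’m assuming for illustration, not figures from any real machine): the commonly quoted “scientific” Q only compares fusion power to the heating power delivered to the plasma, while an engineering gain also has to pay for the wall-plug losses of the heaters and the losses in converting fusion heat back into electricity.

```python
# Sketch of scientific vs. engineering gain for a fusion plant.
# All efficiency numbers are illustrative assumptions, not real data.

def scientific_q(p_fusion_mw: float, p_heating_mw: float) -> float:
    """Q as usually reported: fusion power / heating power absorbed by the plasma."""
    return p_fusion_mw / p_heating_mw

def engineering_gain(p_fusion_mw: float, p_heating_mw: float,
                     heater_efficiency: float, thermal_to_electric: float) -> float:
    """Gain once the 'machine' losses on both ends are included:
    heater_efficiency:   wall-plug electricity -> heating power reaching the plasma
    thermal_to_electric: fusion heat -> electricity (steam cycle, etc.)"""
    wall_plug_in = p_heating_mw / heater_efficiency    # electricity drawn to heat the plasma
    electric_out = p_fusion_mw * thermal_to_electric   # electricity recovered from fusion heat
    return electric_out / wall_plug_in

# Example: a plasma with Q = 10 (500 MW of fusion from 50 MW of heating),
# assuming 40% heater efficiency and 35% thermal-to-electric conversion.
print(scientific_q(500, 50))                    # 10.0
print(engineering_gain(500, 50, 0.40, 0.35))    # 1.4 -- barely breaking even
```

So even a plasma gain of 10 doesn’t leave much margin once you pay for the boulder-pushing machinery on both ends.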
There’s a lot of unsexy research that could make fusion come a whole lot sooner: more efficient, more powerful lasers; better cooling methods and designs for superconducting electromagnets; more efficient containment; more thought on how to extract energy from the plasma efficiently; and making reactors cheap enough to build and maintain that we can actually afford them.
You’re right, copyright won’t fix it; copyright will just enable large companies to activate more of the work they already own and extract more from the creative space.
But who will benefit the most from AI? The artists seem to be getting screwed right now, and I’m pretty sure that Hasbro and Disney will love to cut costs and lay off artists as soon as this blows over.
Technology is capital, and in a capitalist system its benefits flow to the holders of that capital. No matter how you cut it, laborers, including artists, are the ones who will get screwed.
Hopefully, it can also be used to prove that someone was not at the scene of a crime, enabling prosecutors to rule out suspects and innocent people to be cleared.
True, I wrote this from a US law perspective, where that kind of behavior is expressly protected. US law was also written specifically to shield search engines and aggregators, so that services like Google can’t get sued over their blurbs, and that protection likely doubles as a defense for AI.
Regardless of whether it should be illegal, I feel that AI training and use is legal under current US law. And since OpenAI is a US company, dragging it into UK courts and extracting payment would be difficult for all but the most monied artists.