As someone who’s been in CG animation for 15 years, I love AI and the creative possibilities it opens up. But I also worry about how it could displace artists. My idea is that AI models should only be trained on open-source or copyright-free material, or that companies should hire artists to create work specifically for training. I think this could help protect artists and create a more responsible AI industry. What are your thoughts on this? Is there a better way?
The problem with limiting AI training to only ethical, consent-based data is that it will concentrate the power to create foundational AI models into the hands of megacorps. Small teams like the ones behind Stable Diffusion wouldn’t be able to compete. I don’t think this would help progress.
Yeah, that’s a fair point. But shouldn’t there still be some regulations in place to make sure creators are compensated or at least acknowledged?
I’m with you on supporting ethical models, but I don’t think most people actually care whether AI is ‘ethical.’ It’s just the criticism of the moment and will be replaced by another. People are more worried about the technology itself than the ethics, tbh.
True, but I think having clear rules might make people feel better about AI and reduce some of the backlash. Plus, it could create new jobs for creators!
Enforcing ‘ethical’ AI models could also stifle innovation. If these rules make training data so expensive that only big corporations can afford to build models, open-source development would suffer. And there’s a practical problem: we can’t really tell what data a model was trained on. Models are opaque.
I get that. But if companies like OpenAI won’t say what data they used, shouldn’t they at least be held accountable for transparency?
In theory, yes, but in practice, it’s really hard to regulate AI in that way. And, as you said, it might just slow down progress without really solving the problem.
I don’t think it’s fair to compare human artists referencing work with AI. Artists have ethical boundaries and use references to create something unique. AI can replicate without those boundaries, which is why it’s important to regulate.
Exactly! AI doesn’t have the same ethical compass that a human artist does, and that’s why there should be more restrictions on how it’s trained.
I think the real issue is job displacement. AI is going to take jobs, no doubt about it. But isn’t that just part of progress? Society doesn’t owe anyone their dream job forever. New technologies always disrupt industries.
Fair point. But wouldn’t a more gradual transition—where artists are still part of the process—help soften the blow? That way, AI and artists can coexist more sustainably.