this post was submitted on 12 Dec 2023
Asklemmy
you are viewing a single comment's thread
A bit of both, to be honest. With the current fuss around AI artwork, I don't want to steal someone else's work, or, in the very unlikely event that mine becomes popular, have mine taken. The second one is much less of a concern though.
Fair enough. I'm pretty sure most of the fuss around AI is way overblown. But we'd need a few more legal disputes and a few new laws to settle this for good.
I'm alright with using AI tools. I think it's wrong that these companies just take everything they can get hold of without licensing it. (I mean, I would get in trouble if I downloaded illegal torrents of every novel out there without paying the authors, yet the companies that develop ChatGPT seem to get away with similar things.) But in the end it's like a teacher copying textbooks or showing a pirated version of a movie: that doesn't make the things the students learned from it 'illegal' or forbidden knowledge. I think the same applies to AI. (Provided they don't copy things 1:1, which they usually don't. AI can regurgitate its training data in some special cases, so there is some substance to this worry, but I've mainly seen the issue come up when generating computer code, less so with images.) But that's just my opinion.
It could cut into your side of the copyright, however. But then you're also embedding the images into a context, adding your own text and story around them. Even if the individual images turn out not to be copyrightable, the combined work definitely is.
I don't want to talk you into using AI. Just, before you end up not doing it at all, maybe reconsider using it. Or use it as a preliminary step to draft something you like; you can still send that version to an artist afterwards.
And I bet all those issues will be solved in a few years' time. Everyone and their grandma is already using AI, and AI is not going away. There is no way around that.
That being said, I think AI also has downsides. Sometimes I generate images or text, but I don't think it's proper art. It copies styles well and does what it's told. Sometimes it is super creative and does hilarious things; sometimes the output is a bit bland. But it can't choose a style for a reason, or embed things into a context, or hide little details or a second level of meaning in its output. It doesn't choose colors or other details to underline a certain thing the way a proper human artist would.
And using AI for real-world purposes is hard. You have to learn how to force it to draw the same character in the next image and not a completely different woman, to add the details that fit, to keep a consistent style. As I understand it, people work on their prompts for quite some time before they get the exact results they want, and even then they generate hundreds of images and go through them to end up with a single good one. It's way harder to put it to good use than to draw some astronaut on a horse for some quick fun.