The use of AI-generated content in TAS products

There seem to be a lot of emotional responses grounded in misconceptions about how AI-assisted art is actually produced.
For a more nuanced perspective, you can read my full article:
“The Artist, the AI, and the Future of Traveller TAS Products” — it explores authorship, ethics, and the production methods I use, along with suggestions for how TAS could better support indie creators working with AI.

https://www.cyborgprime.com/travell...e-ai-and-the-future-of-traveller-tas-products
Do you also advocate for AI-generated text in Traveller TAS products?
 
No, I don't advocate for any AI-generated content - I am making a distinction between AI-generated and AI-assisted.
AI as part of a process - not raw AI output.
For personal use, sure, but for commercial use, there need to be quality standards and a human subject-matter expert to curate and edit the output so that it can be copyrighted.
 
I don't advocate for any AI-generated content - I am making a distinction between AI-generated and AI-assisted.
I noticed in your post it seems AI-assisted could include the idea of AI creating the initial version of the art and then the artist drawing over it or refining it. My university students argue that using ChatGPT to write their research papers is the same sort of process. They tell ChatGPT to craft a paper that fulfills the research prompt I gave them, and then they look over what ChatGPT came up with and then they edit it. As they note, writing a paper is time consuming and it is more efficient for ChatGPT to write the initial draft and they do the equivalent of drawing over it. I think they would call that process AI-Assisted since they did some editing of what ChatGPT did. Would you support a similar process for TAS products?

Say for example, I have an idea. What if there were an animal that was like a platypus but adapted to be native to a planet like Mars? But let's say that writing is time consuming and organizing or realizing thoughts is difficult, and just because I might not be good at those less important parts of writing, less important than my creativity, shouldn't mean I should be gatekept out of making art, right? So ChatGPT comes up with the initial write-up based on my idea, and then I do the editing and I put out that new creative work. That seems to fit within your concept of AI-Assisted creative works, yes?
 

It really comes down to the amount of human involvement and expertise.

If a student or creator hands their work over to someone else, or to an AI, and lets them produce the entire piece, that’s no longer their work. In school, that’s plagiarism; in publishing, it’s misrepresentation. The real issue is who did the actual thinking and whether that person has genuine subject-matter understanding.

AI can be a great tool for gathering ideas, organizing thoughts, or polishing a draft, but the human still has to guide, edit, and make it their own. Without that engagement, there’s no real learning or authorship.

The same principle applies to Traveller TAS projects. AI can help brainstorm or visualize, but the final content needs to be directed and finished by a human subject-matter expert. AI should assist, not replace, and the creative authority must always stay with the human.

To your “Martian platypus” example, yes, that’s a perfect case of AI-assisted creativity. You came up with the idea and guided the process, using AI as a tool to help organize and express your concept. You’re still the subject-matter expert because this imaginary creature comes from your own creativity. The AI didn’t invent it; it just helped you articulate and visualize it. You put work into this; the AI didn’t just poof it into existence on its own.

That’s really where the line is for me. If AI is part of a guided process where the human is curating, shaping, and refining the results, that’s AI-assisted creativity. But if the AI is doing most of the heavy lifting and the human just copies and pastes whatever it spits out without much thought or editing, that starts sliding into what I’d call “slop.”

The key is to stay involved at every step, directing the idea, guiding the refinement, and giving it that final human touch. AI can absolutely be part of the workflow, but it should never take the place of real human judgment or craftsmanship.
 
The worst part about it is that over time, after it has put artists and writers out of work by basically stealing their content and refurbishing it for the unwitting masses, it will eventually just be copying from other copiers! That's right. Eventually the AI slop will get even worse, as it plagiarizes the plagiarizers — the other AIs — regurgitating boring tropes over and over again, mixing and matching them in a swirling mass of vomited goop.

I'm going to hope against all hope that eventually people wake up and notice how dull content has become, and they rebel, asking for good material again.

I'll have to disagree with you there, @CyborgPrime. Unless one's AI creation draws only from a pool of content that the creator made or otherwise owns, it will be theft from the pool that is the entire internet. The moment you step outside the boundaries of your own work, you risk committing theft. It is still theft even if the creator does not know which parts and inspirations their mishmash creation is built from. I read your article, and I respect your process, but to be truly ethical, great care would have to be taken every step of the way, including the careful use of search terms when asking AI to flesh out a concept so that the work of artists is not inadvertently stolen. I can't imagine that most users will abide by the stringent rules you set for yourself.

If someone goes into their ChatBot art creator and asks for a starship, or the bridge of a starship, a plasma gun, or any other sci-fi element, and they do not limit its choices to their own work, chances are, it is stealing from objects labeled as such in various public forums. People use AI because it's easy, and getting them to follow a careful process like the one you outlined is a hard bargain to drive.
 

Goodness gracious! We're doomed! ☠️

Your post disheartens me, but I've got two sons, one of whom just completed a doctorate in linguistics and another who is halfway through his doctorate in bioinformatics, neither of whom has used a ChatBot to write a paper. Hopefully most students refrain as well.

If you use Google to find something now, you cannot help interacting with AI, but to let a ChatBot write your paper! My goodness. Putting your thoughts down on a subject is how you learn about something and retain it.
 
Say for example, I'm out of shape and suck at sports. Practice and body building are difficult, and just because I might not be good at those things, I shouldn't be gatekept out of professional sports. So bionics, steroids and other cyber-enhancements provide me the kick I need to be competitive with people who are, you know, ACTUALLY talented.

This is how I see the pro-AI in professional humanities argument. Pardon me if my contempt for the participation trophy crowd is too up front.
 

There is a problem here in having three separate threads on exactly the same subject, where people have to deal with the same points each time.

So we end up in a circular discussion. We've already had a long back and forth on the problem with the position you raise in one of the other threads: humans and LLMs both train on "the internet" as you put it, including art, writing and more.

Nobody springs forth, fully formed, from the forehead of Zeus. It is trivial to list examples: Schoenberg uses Brahms's motifs in his Verklärte Nacht; Brahms includes elements of Beethoven's 9th in his own 1st Symphony; Beethoven re-uses Mozart's theme from Bei Männern; Mozart uses a theme from Bach in his Martern aller Arten. The same goes for the visual arts. From the other thread:

Berg didn't "rip off" Bach: he was a stupendous artist who took inspiration (“stole” as the simplistic version would put it). Warhol wasn't "ripping off" Mr Campbell, famed soup manufacturer. Picasso and Braque were never "ripping off" those whose works they incorporated in their collage pieces. Damien Hirst isn't "ripping off" Koons or Bacon or Duchamp. Come to that, Duchamp wasn't "ripping off" the J.L. Mott Ironworks Company.

The reason that people are up in arms this time is not in fact a principle that "training on the art of others is bad," because there are always excuses made for humans doing it ("Good artists copy; great artists steal"). It boils down to the industrial-revolution problem: there is a place for artisan weavers or potters even today, but mediocre potters cannot sell wonky mugs because people have more convenient and cheaper alternatives.

But if you are right and slop will be the outcome then what have you to worry about: human art will clearly stand out! And I agree: it will. Just not as a source of side-income for swathes of mediocre artists. Why we should protect mediocre artists but not programmers, weavers, car workers, bricklayers or even manual harvesters with scythes is something I have never seen convincingly explained.
 
There is a difference between art and manufacturing. If you make each basket different, you are an artist. If you make a million baskets all the same, then you are a manufacturer, not an artist. This is illustrated best by the quote below.

"I want AI to do my laundry and dishes so that I can do art and writing, not for AI to do my art and writing so that I can do my laundry and dishes,"
 

Thanks for the response, @Paltrysum. Just to clarify - my position isn’t in opposition to you personally. It’s aligned with the U.S. Copyright Office’s stance on AI-assisted work. The law already draws a clear line between AI-generated and human-authored content, and that’s the same distinction I emphasize in my article.

If a human artist directs, edits, and finalizes the work, then authorship and copyright still belong to the human. That’s not just my opinion — that’s the official position of the U.S. Copyright Office and current case law. My entire workflow is built around maintaining a verifiable chain of human authorship.

The “stolen art” argument keeps coming up, but it’s based on a misunderstanding of how modern AI models work. They don’t store or copy pictures; they learn patterns (color, shape, and composition) the same way a human artist studies thousands of paintings to understand form and style. Ethical AI systems like Adobe Firefly are trained on licensed and opt-in datasets and even offer legal indemnity to users.

To put it in perspective, the Stable Diffusion models I use are about 2 gigabytes in size. It's not physically possible for "every image on the internet" to be stored in a 2 GB file. These models don't contain JPEGs; they contain mathematical relationships: descriptions of how light, texture, and shape interact. It's like learning to paint clouds without memorizing any specific painting of clouds.
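As a rough sanity check on that size argument, here's a back-of-envelope calculation. The figures are approximate assumptions, not exact numbers: a ~2 GB fp16 checkpoint, and the roughly 2-billion-image LAION-scale training sets commonly cited for Stable Diffusion 1.x.

```python
# Back-of-envelope check of the "models can't store their training set" point.
# Both figures below are approximate, commonly cited assumptions, not exact
# dataset or checkpoint sizes.

checkpoint_bytes = 2 * 1024**3    # ~2 GB model file (assumed)
training_images = 2_000_000_000   # ~2 billion training images (assumed)

bytes_per_image = checkpoint_bytes / training_images
print(f"{bytes_per_image:.2f} bytes of model weight per training image")
# On the order of one byte per image: far too little to store even a
# heavily compressed thumbnail, let alone a copy of the original picture.
```

Whatever the exact numbers, the ratio lands around a single byte per training image, which is the point: the weights encode statistical patterns, not an archive of images.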

Also, there’s some irony here. There’s a popular Traveller Third Imperium video on YouTube that uses unattributed art scraped from Google throughout — including pieces from my own website, and nobody said a word. Where are the pages and pages of outraged forum posts? Yet when artists like me use AI responsibly, trained on licensed models and finished by hand, suddenly it’s called “theft.”

When a human artist uses references, we call it inspiration. When an AI model studies references, people call it theft. That’s a double standard. AI doesn’t copy; it learns relationships between forms, just like we do. The only difference is that I can tell you exactly which tools, datasets, and processes I use, which is far more transparent than most traditional art practices.

If we’re going to have a serious conversation about ethics in art, it should be based on process and transparency, not fear or emotion. My work is AI-assisted, artist-directed, and human-finished, and I have demonstrated that every step of the way.

At the end of the day, I think we’re both after the same thing: protecting real creativity, respecting artists, and keeping human vision at the center of the process. We just happen to use different tools to get there.

For me AI is part of the process; it doesn't replace it.
 
Not saying anything is not a double standard. People assume that permission is given in YouTube videos because of how easy it is to copyright-strike any video, sometimes without even proving you have rights to the content. If something from your site is there, and you know about it, the assumption is that you are OK with it.

Adobe and ethics probably should not be used in the same sentence. Frequently, you opt in to whatever merely by using the product, and opting out is hidden or incurs fees for too many things. Sort of like how uploading anything to YouTube lets them use your content on any of their accounts on other platforms without any monetization for you. One user pointed out a video of theirs on the YouTube account on Facebook that had millions of views, but on YouTube, the algorithm did not push it, and he only had 300k monetized views.
 