Traveller, TAS, and AI

MongooseMatt

Hey everyone,

We have just made a change to the TAS programme on Drivethru and wanted to let you all know about it, and the reasons why this change has been made.

First off, we are aware that there are many Referees out there using AI in their home games to quickly generate encounters, adventure ideas and the likes of character portraits and landscapes. This is a good use of AI, especially for those of us with zero artistic talent - the ability to quickly flash up a screen or printout with what the Travellers see in a one-off encounter is obviously useful.

Problems start popping up when this moves to publishing.

We have (as of yesterday) instructed Drivethru to disallow all AI art in new TAS publications. AI-generated text was already prohibited, and existing titles that use such material will be 'grandfathered in', in the interests of fairness. However, as of now, AI cannot go anywhere near the TAS programme.

We are not technophobes, and have fond hopes for the future of AI in areas of research, analysis and repetitive labour. At this time, for example, we are aware of a party trying to bring back some (very) old Traveller titles, looking at the use of AI to recover PDFs of some very poor scans done back in the day. This is a great use of AI, as doing it manually would be time-consuming and an absolute pain in the rear end, and the work involves no real artistic input.
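
To be clear about the sort of thing we mean, here is a purely hypothetical sketch (we do not know exactly which tools that party is using): clean up each scanned page, run OCR over it, and have a human proofread every word of the result.

# Hypothetical illustration only - the tools actually being used are not specified.
# Clean up one badly scanned page and OCR its text; there is no generative step,
# and a human still proofreads the output.
# Requires: pip install opencv-python pytesseract (plus the Tesseract binary).
import cv2
import pytesseract

def recover_page(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # Reduce scanner noise, then binarise so faded print becomes legible again.
    img = cv2.fastNlMeansDenoising(img, None, 30)
    img = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                cv2.THRESH_BINARY, 31, 15)
    return pytesseract.image_to_string(img)

print(recover_page("old_scan_page_001.png"))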

The problem lies in creation - specifically, in the creation of art. Or 'art', if you prefer. As we have stated several times in the past, art needs an artist.

We foresee a future, one that may be just around the corner, where AI is solely responsible for the creation of TV series, films, video games and, yes, RPGs. The text/script is AI-generated, then AI creates art/video based on that. And when this happens at scale, there will simply be a tsunami of 'slop' crashing over us all. Picture, if you will, people coming home after work and effectively plugging themselves into a streaming service, video game or novel, and everything - everything - they are consuming is machine-generated. And there will certainly be those in positions of power who will be altering algorithms for their own ends. It is easy to imagine this reaching a point where vast quantities of material are being put out with no human seeing it until it hits a consumer.

To us, that sounds like a nightmare. The effective death of art. And perhaps the start of Idiocracy, but I digress.

We believe that, at this time, this is an unavoidable fate. It will happen.

But not to Traveller. Or Paranoia, or Shield Maidens, or any other RPG from Mongoose.

We may dream of the day that AI is able to run our accounts, admin, and mail orders (but not customer service!), freeing all the creatives at Mongoose to, well, create.

But AI will not be nosing its way into our games. For this reason, we have to draw a line not only with our own titles but also with anything that has a Traveller logo attached to it.

This may be a forlorn hope, with us desperately skidding towards the edge of the cliff, but we feel that nothing about Traveller's art will be improved by AI, and there is an awful lot to lose - not just for a single RPG, but for society as a whole.

As always, we would welcome any comments, criticisms or concerns about this policy.
 
I can agree as far as wholly AI-created work goes, but I don't believe a blanket ban is needed. What about AI-assisted work? Say someone is using the AI features within an image editor such as Photoshop, for example?

Or what about using it for research? Say, asking for a list of possible navy ship names for a sci-fi RPG, or for character tropes - not just copying and pasting text straight from the AI.
 
I think that this is a sensible stance to take overall, but there is a corner case on which I'd like some clarification:

There are a few of us who have used Nvidia's StyleGAN2 (trained on the Flickr-Faces-HQ dataset, which only includes images published under permissive licences) to generate photographic reference that's then used as the basis for line art character portraits. I used it when I produced the lost DGP book Manhunt for Marc Miller back in 2022 (this was the first use of genAI in a published Traveller book that I'm aware of), and Tom Mouat has subsequently used the same approach in several of the Moon Toad books.

For both of us, the process by which we get to the final line art character portrait is essentially manual; there's a great deal of hand-editing and hand-drawing. We could have used non-genAI photographs as input to the process, but even if we'd chosen photographs published under a suitably permissive licence, we'd still have the issue that the photographs would have depicted a real person. Using the output of StyleGAN2 avoids that issue.
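
For anyone curious what that first step looks like in practice, here is a minimal sketch of sampling a single synthetic face as reference. It assumes NVIDIA's stylegan2-ada-pytorch code is importable and that a pretrained FFHQ generator (called ffhq.pkl here) has already been downloaded; everything after this point in the workflow is manual drawing.

# Minimal sketch: sample one synthetic face to use as photographic reference.
# Assumes NVIDIA's stylegan2-ada-pytorch repo is on the Python path and that
# ffhq.pkl is a downloaded pretrained FFHQ generator; no real person is depicted.
import pickle
import torch
import PIL.Image

with open('ffhq.pkl', 'rb') as f:
    G = pickle.load(f)['G_ema'].cuda()      # pretrained generator network

z = torch.randn([1, G.z_dim]).cuda()        # random latent code
img = G(z, None, truncation_psi=0.7)        # NCHW float32, range [-1, +1]

# Convert to 8-bit RGB and save as the reference for the hand-drawn line art.
img = (img.permute(0, 2, 3, 1) * 127.5 + 128).clamp(0, 255).to(torch.uint8)
PIL.Image.fromarray(img[0].cpu().numpy(), 'RGB').save('reference_face.png')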

Would this use of genAI fall foul of Mongoose's new policy for TAS?
 
Art produced by AI sampling and copying other works is AI-created.
A person making a drawing from a stock or permissively licensed image, whatever the source, is producing a drawing created by a human. You know that the source was OK with its use in commercial projects, and you then create your own rendering based upon that image.

Edit: But Mongoose's mileage may vary...
 
I wholeheartedly agree with this. Since all AI art and text is based on algorithms, there is no true creativity to it. Maybe one day we will see true AI creativity, but I don't see it in the near future.
 
I think the answer is more nuanced.

A ban on AI-generated art is fine by me for commercial publications; what I may choose to generate in support of my own game is up to me.

The nuance is with written content. I have found AI to be unreliable with facts, and in my field one needs to be a Subject Matter Expert to validate returned content. The same will be true for generated Traveller content, where careful reading and editing is required. Where I see a use for AI-generated content is in the early stages of creating adventures, with the clear understanding that all AI-generated content will be rewritten by a human.
 
As the post says, Mongoose is not banning AI for personal use. Just publishing.
 
All right, Matt - something to ponder. I've posted this elsewhere, but it is a good question, I think, for you and others to consider.

Proof of life

A sincere question for the community, especially those firmly against AI-generated art in publications:

I’ve worked as an illustrator across multiple mediums: traditional, digital, CGI, photography, animation, and yes, more recently, I’ve been experimenting with incorporating AI tools and plugins to speed up certain aspects of my process. That said, I also continue to produce fully human-made works — no AI fills, no upscaling, no generative shortcuts—just good old-fashioned work, whether it’s painting, digital brushwork, or compositing.

But here’s the challenge:

Even when I go out of my way to avoid AI tools entirely, my work can still get flagged by automated “AI art detectors” as being likely AI-generated. I’ve heard from folks in the industry (including the head of ops at Roll20) that these detectors are notoriously unreliable, falsely tagging human-made art as AI, and missing actual AI art just as often.

I’m not trying to be flippant —

What is an artist supposed to do to prove their work is not AI-generated if a community or platform demands that assurance?

Do I need to start maintaining behind-the-scenes WIP screenshots or time-lapse recordings just to defend my own process? Is that now part of the cost of doing business?

I ask this with respect and genuine curiosity — especially as I’ve seen talented human artists (myself included) get side-eyed or questioned simply because their work looks “too clean” or “too polished.”

What does proof look like in this space, and how do we avoid putting artists in a position where they’re assumed guilty until proven innocent?

Would love to hear thoughts from both sides.
 
My guess is that a simple Statement of Authenticity, signed by you, would cover it. If it came out later that you had lied, it would be fraud and a crime, so that should keep most people honest. Then artists can just be taken at their word, with no need for any fancy AI detectors. :P
 
AI art detectors are pretty terrible and I would not recommend using them to screen out TAS products. Human AI detectors are often even worse, because most of them are just halfwit randos on the internet.

As someone who will not intentionally buy a product using AI generated art, I am nevertheless fine with a statement saying "don't do this" and trusting people not to do so. Because the alternative is definitely worse.

Art should have its artist or other source credited, and if those credits turn out to be lies, that will eventually come out. But throwing the baby out with the bathwater by using garbage tools to try to detect cheating is worse than doing nothing.
 
A statement like "However, as of now, AI cannot go anywhere near the TAS programme" is very unhelpful, as it does not address what constitutes using "A.I." Is the litmus test 0%? If I use A.I. to help brainstorm ideas for content, am I using A.I.? What if I write something and then use A.I. to proofread it - was that created with A.I.? What if I use A.I. to copyedit something, but make the edits manually? What if I use A.I. to fact-check my ideas and, where they are scientifically inaccurate, to give me some more realistic alternatives? What if I use A.I. to research current scientific achievements to enhance what I write?

The point is that A.I. is becoming an incredibly powerful tool, not just for mindlessly producing slop but for assisting in the creative development process. If you take an absolutist stance of no A.I. use, period, you will start losing talented contributors who use A.I. to enhance their human-created output. If you instead create strict guidelines for acceptable and unacceptable A.I. use, you run into a quagmire, because people use A.I. differently. This does not even get into the issue of demonstrating A.I. use: if you ban A.I., you create incentives for people to stop reporting their use of it, which could lead to witch hunts and "internet A.I. experts" levying accusations at suspected heretics.

Overall, I think this stance takes a needlessly antagonistic position towards content creators who use A.I., especially as a creative assistant. As time goes on and A.I. becomes more common (which people seem to think it will), you will be drawing from a smaller and smaller group of people who do not use A.I., while those who do will seek alternative outlets for their creativity.

Just my thoughts on the matter.
 
I think some people who want to justify using AI to "create" are trying to muddy the waters.

Europe has some scary rules about copyright. AI has a bad habit of 'sampling' the work of others, in much the same way that too many rappers a few decades ago 'sampled' the works of others by including far too much of those works in their 'art' and then not wanting to pay royalties.
They don't want AI-generated images. They want YOUR images, or those of a paid or volunteer human. If you cannot make or buy art, don't try to sell 'art'. They don't want AI-generated plots. They want YOUR plots. If you can't generate plots, then you should not make adventure modules your line of work, or even a side hustle. They don't want AI writing your text. They want YOUR text. If you can't write, do something else. It really isn't that hard.
Trying to claim they want to eliminate proofreading or pre-project research or googling something is just a strawman argument.
 