Portrait Packs

I was discussing with a professor their policy of "no AI work in my class; you will write everything yourself, no AI help. No Grammarly either." But they refused to say whether their students could use the grammar and spelling check in Word, which is based on the same set of algorithms.
Not to mention a significant number of professors, TAs and teachers generally are THEMSELVES using AI for curricula, syllabi, and assignments - which is just a hair short of naked hypocrisy.
Me? I'm cheerfully using AI as much as I can to provide NPC images, setting images, etc. (I have a big digital screen behind me as I GM). If I didn't take it from AI, I'd be snipping it from movies, DeviantArt, or ArtStation anyway.
 
Not to mention a significant number of professors, TAs and teachers generally are THEMSELVES using AI for curricula, syllabi, and assignments - which is just a hair short of naked hypocrisy.
I think you misunderstand the nature of learning.

The students need to do the thing they are learning for themselves in order to learn how to do it. Having AI do it means you are not learning to do it. Remember back in elementary school when they wouldn't let you use a calculator until you'd learned to do arithmetic? That.

You are also overestimating the technical abilities of professors, who are not generally aware of the possibilities for using AI. Students THINK professors are using AI; students labour under many misapprehensions about professors, but in general professors are not using it.

There is also the fact that AI gets stuff wrong. Professors are generally experts in their fields, notice that AI gets it wrong, and so stay away from it. AI looks stuff up on the internet, more or less, but not all information is available on the internet, or even known to humans; sorting through all that is what professors do, and what professors try to teach to students (usually with little success). But AI produces pretty plausible-sounding answers in many cases, so non-experts are generally taken in. Professors generally want to continue trying to build an edifice of knowledge around actual facts and truths and realities, instead of just giving in and assuming the AI answer is right even when it is wrong. This is a losing battle, as there are too many people who just haven't got a clue, but we have to try.

Maybe that won't matter soon, as there won't be any jobs any more because AI is doing them (badly, but cheaply), so it won't matter that the students aren't learning anything as long as they continue to pay their tuition. Student loans will be key here.

One day soon, it will be mandatory to allow AI in teaching. From that day forward, I can have AI write my lectures, the students can have an AI listen to the lectures and then write their essays, and then I can get an AI to read the essays.

Not long after, they will fire all the professors. Students can have Professor AIs which teach their Student AIs, and then they just need to come in once at the end of the program to pick up their degrees. Or just have it emailed to them.
 
I think you misunderstand the nature of learning.
Nah, I think you just aren't reading closely. Note I said "just a hair short of naked hypocrisy" - sure, there's a difference between using AI to generate a syllabus and a student using it to write a paper. But IMO the difference is pretty small - thoughtful pedagogy isn't just vomiting a low-effort reading list EITHER. A thoughtful curriculum is tuned specifically to the course, the class, even the method of teaching in a way I doubt AI can deliver.

You are also overestimating the technical abilities of professors, who are not generally aware of the possibilities for using AI. Students THINK professors are using AI; students labour under many misapprehensions about professors, but in general professors are not using it.

Sure. I'd put it at 60%, generally, but your statistics may vary.
It's sad but listen to the NPR piece about it https://www.npr.org/2025/05/21/1252663599/kashmir-hill-ai

Personally? I think (what we call) AI is mostly-awful. (I think we can all agree it's not actual AI, but that's modern idiom)
My company is PUSHING AI down everyone's throats; routinely, maybe once a week, I have conversations with people who clearly used AI to 'summarize' something longer than they cared to read and have drawn, if not grossly wrong, at least meaningfully wrong conclusions.
Then again, let's be honest: before "AI", 46% of adults reported not reading a SINGLE BOOK in the previous year. Children are learning almost nothing in schools, social promotion is rife, education generally is in a parlous situation, and few people can write more than a 280-character sustained thought.
I'm not defending AI, in case I seem to be implying that. I simply admit that I am not averse to using it when it is helpful, in the grey area of fair use. I'd never, for example, try to sell anything with AI generative art that I paid someone for.
 
Personally? I think (what we call) AI is mostly-awful. (I think we can all agree it's not actual AI, but that's modern idiom)
My company is PUSHING AI down everyone's throats; routinely, maybe once a week, I have conversations with people who clearly used AI to 'summarize' something longer than they cared to read and have drawn, if not grossly wrong, at least meaningfully wrong conclusions.
Then again, let's be honest: before "AI", 46% of adults reported not reading a SINGLE BOOK in the previous year. Children are learning almost nothing in schools, social promotion is rife, education generally is in a parlous situation, and few people can write more than a 280-character sustained thought.
I'm not defending AI, in case I seem to be implying that. I simply admit that I am not averse to using it when it is helpful, in the grey area of fair use. I'd never, for example, try to sell anything with AI generative art that I paid someone for.
I guess we are talking about LLM-type AIs; I know there are other kinds, but these are the ones I encounter in my work. The problem seems not really to be AI per se, but the way it presents as a reliable expert when it is not. So students think it has written them a quality, fact-based essay when it hasn't. Or, to go back to topic, people think they've produced a good gaming product when they haven't.

As a professor, I do use AI to translate lectures from Finnish to English, or vice versa. It is not reliable, but doing it automatically for a first pass and then fixing it is faster than doing it all myself. The key here is that I speak both languages, so I can use it this way. If I didn't, the result would sometimes be OK, but it would be very risky indeed. I know that in the natural sciences, they often work with AI to sift big data. There are definitely useful AI tools; the issue only arises if the user is not knowledgeable enough, or too lazy, to use the tool in the right way.

Someone who had never read a book would, in the past, likely have been aware of his lack of knowledge on topics outside his experience. Not so much any more. The availability of AI makes people feel like they have mastered a task by using the current top-of-the-line tool to get it done, but they haven't. Someone who has never mastered intellectual tasks before doesn't know the difference.

I can only imagine the pile of AI-written garbage the editors at book publishing companies must be facing now. My deepest sympathies to them.

As far as children not learning anything in school: I live in Finland, and my children sure as hell learned stuff in school. But in other countries, sure. And not all kids take to it, even here. You can raise the average level, but you can't educate the stupid out of some people.
 