Right now, if you make an event on a calendar hosted by churchofjesuschrist.org, and you click the "public event" box (meaning people can see the event without logging in as a church member), and then you start typing a description, a troubling thing pops up on your screen:
I have so many issues with AI and each of them is amplified when placed in a church setting.
First, no matter where you stand on AI (setting aside the people who imagine they can make money from it), the average person fumbles along in this direction:
(Incidentally, I found that observation via Anne Lutz Fernandez's solid three-parter on why AI should keep out of schools to which I amen loudly and with enthusiasm.)
But hey, you ask a large-language model something and it gives you what sounds like a reasonable answer and that's really neat! But if you stick with them long enough, one of two things will happen. You will either grow skeptical of AI's ability to give you anything worth having. Or you will lose the ability to recognize when such skepticism is warranted. Which makes sense. A"I"-generated writing has only two uses that I can see: to do things that shouldn't be bothered with anyway, and to embrace laziness.
I have a friend who works for OpenAI and he was suggesting that AI would be terrific for teachers, a huge time saver. To which I can only answer, how? I suppose I could make it do things for me but to what end? I honestly cannot think of a single use case that wouldn't damage my credibility, my relationship with my students, the quality of the work I assign, the quality of work I receive back, and/or the quality of the grading. The only way AI could do any of those things as well as I can do them is if I'm already not doing them well. Yes, LLMs can do rote crap fine if you don't care much about the outputs, but if a teacher is teaching students to produce rote crap that does not require much attention to grade, I would propose that that sounds like a crappy education.
Anyway, AI companies are desperately looking for ways to make money and school districts are suckers who delight in wasting money on magical fixes, so I'm sure I'll be getting access to free crap soon enough. People who don't teach always know how to make our jobs better. (My district went more than $40,000,000 over budget on consultants last year. That's fun. And exactly why they'll be excited to pay for AI. Why not pay less for crappy work that never needed to be done?)
Speaking of, it's worth mentioning that these companies are far from profitable as of today. For instance, OpenAI, which is doing better than most, loses $2.50 for every $1 it makes. That's overall, but they're even losing money on their highest-priced corporate accounts. The new version of their question-answering bot? The one they say is a huge leap forward and will blow our minds and that they haven't let any reporters check out yet? The cost in computing and energy for every single answer it produces is $10,000.
$10,000.
No matter how good those answers are, I don't see anyone paying OpenAI $10,001 per answer so they can turn a profit. Do you?
Anyway, I was borderline rude in my outrage a couple months ago during stake council. I'd volunteered to choose some pullquotes from the recent stake conference for use in the stake's social media. Somebody said I could just have an AI do this and I was shocked at the suggestion. How hard is it to read some talks and pull out some choice quotes? How could an LLM do that better than I can? And once it gave me quotes, unless I'm grossly irresponsible, I would have to check that they were even in the talk! And if I'm the least bit concerned with doing a good job, I would either have to get it to give me multiple options to choose from (each of which I'd have to check) or compare it against the talk myself. How does an LLM save me any time here? How?
It only saves me time if I don't care about doing a good job. What I said at the time was something like, "If it's worth doing, it's worth doing," and I stand by that. (Although my tone of voice should probably be rescinded.) Even granting that this kind of task is very much in my, Theric's, wordsmith wheelhouse, it's not a hard task. So if it's worth doing, I should do it. Even if I let an AI take a crack, if it's worth doing, I'll still have to do it.
And that gets to why I'm so distressed that the Church is apparently paying for AI services. This is church. How can we rely on the Spirit if we're relying on an artificial intelligence?
The best argument I can come up with here is that some people are intimidated by writing; it's hard for them. And an AI-written calendar item lets them succeed at their calling.
It's an argument. But is it a good one?
Besides, giving tithing money to companies that plagiarize and pollute, while we believe in honesty and environmental stewardship, is troubling.
(Incidentally, if you want something in-depth about the waste and nonsense that is the modern AI business, check out this newsletter.)
I appreciate that the Church isn't trapped in a moribund past, but while this always upsets me—
—in a living faith, it also depresses me.
Are we shoving God into the machine?