[00:00:00] Hey, thanks so much for listening to this message. My name is Jason and I'm one of the ministers here at the Madison Church of Christ. It's our hope and prayer that the teaching you hear today will bless your life and draw you closer to God. If you're ever in the Madison area, we'd love for you to stop by and study the Bible with us on Sundays at 5pm or Wednesdays at 7pm. If you have questions about the Bible or want to know more about the Madison Church, you can find us
[email protected]. Be sure to subscribe to this podcast as well as our sermons podcast, Madison Church of Christ Sermons. Thanks again for stopping by. I hope this study is a blessing to you.
[00:00:37] If you recall last week, or if you weren't here last week, we talked about echo chambers: the idea that in the social media world in particular, algorithms are often designed to create an echo chamber, which means that if you like something, the algorithm is going to point you to more of that content because you're more likely to engage with it.
[00:00:58] And oftentimes that snowballs. If you are upset at the world because everything you're seeing on your newsfeed says one thing and nobody else seems to say otherwise, it's not because nobody else thinks that. It's because the algorithm is feeding you the same type of content over and over again. Sometimes that can be silly, like which SEC team you're a fan of. Other times it can be really serious: it's definitely political, and it's definitely theological.
[00:01:26] So we want to be learned in how algorithms work and how we interact with them, but more importantly, how they interact with us. Keep in mind the emphasis in this study all quarter long is discernment, the ability to rightly see the world as it exists, and then to make sure that we move forward with the Gospel as our filter exclusively.
[00:01:46] So tonight we're going to talk about artificial intelligence. That's sort of where our conversation led us last week: the idea that AI, particularly the companion-related AIs, or even things like ChatGPT, is not necessarily designed to be a companion. Oftentimes they're used for research. But a lot of them are designed to interact with you in a way that is very positive. If you say, "I think the sky is made of Cheetos. Why is the sky made of Cheetos?" a good friend will look at you and say, "You're a moron. That's ridiculous." But a chatbot might say, "What an interesting perspective. I'm glad you asked that." It will reinforce things that oftentimes don't need to be reinforced. And where we left off last week is that Scripture doesn't always do that.
[00:02:34] Scripture is inspired not only to teach us but also to correct us, for instruction in righteousness. The church is designed not just to lift up and exclusively encourage and positively reinforce everything, but also to hold accountable. That's how the church continues to grow.
[00:02:53] So the natural progression of our conversation moved us into artificial intelligence. And this was a chart that I showed last week. I was on a webinar a couple weeks ago from a guy named Carey Nieuwhof, who does a lot of church leadership stuff and has a couple of big podcasts. This is something he presented based off of his research, and what he has seen is that the main uses of generative artificial intelligence have changed from 2024 to about halfway through this year. What you can see over here are the top five in 2024: generating ideas, therapy and companionship, specific search, editing text, and then exploring topics of interest. And you can see how they've traced some of these. Enhanced learning has moved from 8 to 4 in 2025.
[00:03:41] Healthier living had a big jump from 76 to 10.
[00:03:44] But therapy and companionship moved from second last year to first this year. So therapy and companionship has consistently been a big use of artificial intelligence. That was not high on my list of how I've personally used it.
[00:03:58] But what we're seeing is a growing trend, particularly among younger and emerging generations. You see things like generating code (for pros). Some of those things don't really move a whole lot.
[00:04:09] But the relationship-based ones are the top three there for 2025: organizing my life, finding purpose, and therapy and companionship. That's the main use of generative AI.
[00:04:21] I think that's significant, and I think it's something for us to consider when it comes to the idea of companionship, the idea of influence, the idea of where we're going to find advice and where God has designed us to go to find advice. I don't think AI is purely evil. Spoiler alert. We'll get to that one here in just a second. Just to define a few terms real quick. Artificial intelligence is the big one. It's the marketing term. It's everywhere, right? It's part of Apple; they came out with Apple Intelligence, and everybody now has incorporated artificial intelligence. My hot take for this class is that the age of social media as we know it is coming to a close, for a few reasons. One, more and more legislation is coming about, ironically starting on the west coast and moving east. A lot of states have already acted; our state this year, this summer, passed a law that you can't have cell phones in school. More and more legislation is putting an age limit on access to social media. More and more age verification is coming about on adult content sites as well as social media. So access to social media, particularly in that 8-to-12-year-old range, is shrinking. The access is becoming more difficult, with more and more speed bumps for them to get there. That alone will impact the popularity of social media.
[00:05:39] You think about the last five to seven years.
[00:05:42] What are the main players in the social media landscape?
[00:05:45] Meta's Facebook and Instagram, TikTok, Snapchat, Twitter. It will never be X; I will die on that hill. Those are the main five, right? You've got BeReal that kind of jumped up there. There's a few, and Omegle keeps coming around every few years, which is just terrible.
[00:06:02] Which also means I'm just old. But I also think those five kids on the block, so to speak, have not changed a whole lot. They've been the main players for a while, which is remarkable because Facebook became a thing in the early 2000s. Their ability to stick around is LeBron James-level longevity. But they've adapted. It's changed quite a bit. Their algorithms have changed. Their features have changed. As other platforms have become popular, they've adapted those features.
[00:06:31] Now where is the innovation? The innovation is inside those main apps, and the innovation is surrounding artificial intelligence. I don't have to go post to my feed to get positive reinforcement. I can go to my chatbot now. See how things have kind of shifted a little bit? So two years ago when we did this class, I had divided feedback. I had some folks that came up and said, well, that was kind of interesting. Other folks said, why did we waste an entire Bible class talking about AI?
[00:06:58] Two years later, where we live today, I think it makes a lot of sense. I think it made sense then as well, but I think it makes even more sense now for us to examine it from a spiritual standpoint. So there are four terms I want to touch on before we get to digging in a little bit more. Artificial intelligence, as I mentioned, is the marketing term. It's the branch of computer science that deals with computer systems that can reason, learn, and perform tasks typically done by humans. A great feature of AI is its ability to automate a task.
[00:07:28] Machine learning is kind of a step down from that: a program or system that trains a model from input data. It gives the computer the ability to learn without explicit programming. Deep learning is a little bit different; it uses neural networks, which allow it to process more complex patterns than traditional machine learning. And LLMs, or large language models, a term you've probably heard as well, are deep learning algorithms that can recognize, summarize, translate, predict, and generate content using very large data sets. For the most part, that's what we're going to be talking about tonight. I know this is actually a horrifically intimidating topic to talk about in the city of Huntsville in particular, because there are many people here that are in the industry and use it on so many different levels; there's a lot of facial recognition software that has utilized AI for a long time. But for our purposes, we're talking about math that processes at a really high rate and tries to identify and produce patterns. That's generally how it's going to impact us in the ways in which we're talking about it. Earlier this year, I had the opportunity to talk to a school down in Mobile in south Alabama, and I talked to two different groups of students, middle schoolers and high schoolers. Smelly and smellier, as I like to refer to them. And I asked them, what is AI? I was very curious to hear what a completely different generation understands.
[00:08:52] Middle schoolers consistently said two things.
[00:08:55] Robots in China, which is very interesting. See, parents, they do listen to you. For the second group, the high schoolers, almost exclusively every comment was, it helps enhance my homework experience.
[00:09:11] Pretty sure AI wrote that response for you, right?
[00:09:14] All of their comments, it was really remarkable.
[00:09:18] Those were the different ones. When it comes to artificial intelligence, everybody has sort of their relationship with it based off of how they use it or how it impacts them in their work or in their play.
[00:09:31] When it comes to AI, I think there are quite a few nuanced parts of this conversation. I'm not trying to convince you that it's the worst thing in the world. I'm not trying to convince you that it's the best thing in the world. I'm trying to do what we have done in every class, and that is to look at it with nuance. What are some of the benefits of AI technology? Well, the first one that comes to mind for me was how we used it here at church a couple of years ago. When we were doing the Kings and Judges series, we created these trading cards that our kids would collect each week. It was a great way to teach them visually. Apologetics Press has since done the same thing, and they have put together 365 different cards of different people from Scripture. That's a great way to teach. If you grew up in the same era that I did, then you probably grew up with felt board Jesus, right? The flannelgraph. We would teach visually; we would have little scenes from Scripture and put them up there. And the only thing we really know for sure is that they probably didn't look like that. So we can now take basically the collective wisdom or information of the Internet, and we can create these images that have some kind of...
[00:10:38] A lot more detail, for sure. Thank you. But they definitely have a little bit more accuracy, at least to some degree. That's a good use of it, in my opinion. This is another headline that came out a couple of years ago: a boy saw 17 doctors over three years for chronic pain; ChatGPT found the diagnosis.
[00:10:56] A mom took her son to doctor after doctor after doctor, and they could not figure out what was going on with him, what the root of the challenge was. They kept treating symptoms as opposed to treating the actual issue. So, in a desperate move, she took all of his medical history as best she could and put all the symptoms into ChatGPT. It popped out four or five different options, a couple of which she had not heard of. So she picked one and called a specialist in that field. They got her son in to be seen, and they came up with the right diagnosis.
[00:11:34] At Faulkner right now, they work with virtual cadavers in the School of Health Sciences; I'm sure UAB and other places in Huntsville do as well. There are some really amazing medical things happening here. They can identify proteins. We can identify minutiae in data at lightning speed. We're correctly diagnosing cancers faster than ever, and that will only improve. We're using massive data sets. Imagine going to the doctor, and he can use all of the X-ray scans from the last hundred years of people, the ones that have been databased and cataloged, go through those lightning quick, compare them to yours, and quickly and accurately diagnose. That's a net positive for humanity. That's a really, really good use of AI. I'm excited, and I'm thankful that I live in this era.
[00:12:21] To be able to use AI for something like that, it's extremely, extremely powerful.
[00:12:27] This is the other end of the spectrum. And we've shared this in this class before as well. This was towards the end of last year.
[00:12:33] They're currently still in the midst of this lawsuit. The headline: a mother says an AI chatbot pushed her teen to kill himself, a lawsuit against its creator alleges. Character.AI is designed to be one of those companion AIs where you create an avatar based off of whatever image you want. This young man chose a character, not from Lord of the Rings, from Game of Thrones.
[00:12:54] And over a very short period of time, his behavior began to change. He became more reclusive. And eventually, in the text threads that they have, towards the end, the chatbot basically said, I really wish you could come join me on the other side. And he said, okay.
[00:13:10] So he went and got his dad's gun and he took his own life.
[00:13:14] This is a young man who was still learning how to think. We've talked about this a lot: the prefrontal cortex, how our brains are so intricately designed and so incredibly powerful, and yet still developing. Putting an incredibly powerful technology in the hands of someone who doesn't know how to reason completely yet is dangerous. It's not that they're a terrible person; it's that they're an adolescent. They don't know how to reason. They don't have a lot of life experience outside of
[00:13:39] the digital world to understand there are other consequences here.
[00:13:44] So this is a big negative. This is one of those that's a huge negative. As a parent, this is terrifying. So there are a few alarms going off, but I just want us to get a little bit of the landscape. Last Wednesday, a couple hours before our class, I had seen a headline, but I didn't read it all. Somebody came up to me after class and mentioned it, and I've since researched a little bit more. ChatGPT is now also in the midst of a lawsuit over a teenager who took his own life. This young man started using ChatGPT for his homework. It was a tutor, basically. It was helping him to learn. It was educational. And ChatGPT is not typically designed to be companion based. It's a chatbot; it is conversational in its interface. And I think that's the greatest danger in a lot of these bots. As adults, we know it as a machine, right? It's a robot. It's a bot conversing with us, the same kind of bot that pops up if you go on Best Buy's website: hey, how may I help you today? It's not a real person, and these kids don't know that.
[00:14:45] And relationships are really, really powerful.
[00:14:49] So it's worth talking about AI in the context of the Gospel and talking about it in the context of the Bible and the church.
[00:14:58] Is AI evil? When it's misused. Okay. All right. Anybody disagree? It's a hammer. It's a hammer. Okay, elaborate. It's just a...
[00:15:10] Like any other tool or device, yes. It's either good or bad depending on who uses it and what they're using it for.
[00:15:19] Okay.
[00:15:20] I think there's danger.
[00:15:22] Somebody has to program it.
[00:15:24] So if it's inherently programmed with a bias for a specific purpose, it can be.
[00:15:31] Okay, so in that way it differs from a hammer. Right. Okay. What else? If I may? You may. Good point. Because you remember some of the first images that came out that they created. You know, do you want a George Washington? Well, and it was a minority. Or it won't tell you the good things: if you ask it, hey, what are the selling points of Donald Trump, it won't give you that information, but it would give it for the other candidate. So that's a really good point. It's not without its bias.
[00:16:03] We've talked about in the past, or last week particularly, how algorithms can oftentimes be weighted based off of the information that they're fed, depending on the sources. Just like a person: a person who only watches CNN is going to say things the way CNN does. A person who only watches Fox News is most likely going to speak the way Fox News speaks. So it mimics us in very much the same way. And that bias is important. Sometimes having a bias is okay. I have a bias; I hope it's the gospel. My goal is to make it exclusively the gospel.
[00:16:35] But knowing someone's bias is really important. Let's go. What else? I would say it's not evil. It's more of just an ethical dilemma. It's an ethical dilemma. Okay. What do you mean by that?
[00:16:46] I mean that it's more a problem for the person who is the coder. We need to be more concerned with the companies that are hiring the people who are creating and putting this information out and allowing this thing to become more and more involved, so that we understand the certifications they are being required to meet to be able to put out this kind of thing.
[00:17:20] Would you... this is the ChatJMH version of what you just said. So the burden, the ethical burden, is on the programmers, on the creator, the person who feeds the information to the GPT or to the AI. Their burden is to make sure that it's good, right? Information, data that's not harmful to others and that sort of thing. Is that kind of what you're getting at? Okay. All right.
[00:17:47] I think it's interesting.
[00:17:49] It's very easy for us to take that approach of, well, it's a tool. And I'm not disagreeing with anybody. No, no, I wish you would. It's way more fun.
[00:17:58] Yeah.
[00:18:00] This is a safe place. It's very easy to say, oh, it's a tool, but the people making it are saying the same thing.
[00:18:10] Because what they want... they're not going to sit there and program it to harm people. They can't do that. They want to say it's what you make it, and that's what protects them. It's what you turn it into.
[00:18:27] And I just... this doesn't really answer that question, but it's a very interesting perspective that they're using the same argument that we are. You're saying the bad guys are using the same argument, that it's only in how you use it. Yes. Yeah. Okay. Yeah, it is. They program it to be what you want.
[00:18:47] Yeah.
[00:18:49] Kirk, this is the first debate I've ever seen where someone says, you know what? I think I'm going to change my mind. This is like history happening here, people.
[00:19:01] All right, Kirk. Yeah. There's a machine learning aspect to it, right? So this thing is capable of consuming unbelievable amounts of information from websites, YouTube videos, social media, the things we put out there, books, music.
[00:19:22] Yeah. I mean, the stuff that this software, which I may code up, is capable of ingesting and then returning responses to is astronomical. And so that adds a whole different arena to the mix.
[00:19:43] Yep. I mean, I might be able to code this thing up to operate some way, but if it's truly AI, there's a huge machine learning aspect to it, where it's taking in all the stuff that we put out there. You feed it ingredients; it bakes whatever it wants to bake. Yep. And what are the checks and balances to say what it learns is true versus what you're feeding it? Yep. You feed it lies, it's going to perpetuate lies. I like to think of it as an intern. We have interns here at the church from time to time.
[00:20:16] You spend time with it, you try to teach it, you try to mentor it. You try to give it direction, and then you release it out into the wild and it does whatever it wants.
[00:20:24] And sometimes that's no reflection on you, I've learned, I hope.
[00:20:28] But that is part of this: there is an element of, once it's kind of out there... Sam Altman signed us all up to be a part of this group project, whether we realized it or not, or wanted to or not. And OpenAI in particular has been one of the more vocal ones saying this has got to be out in the general public, because the more information, the better guardrails we can put in.
[00:20:47] You don't put up a guardrail until you lose a few vehicles off the side. Yes, sir. I think the baseline here is that, at the very least, it has the capability for evil, even if it's not ontologically good or bad. Okay, all right. It has the capability of evil, and then some other big words. Yes, sir.
[00:21:03] You know, think of it like social media. I could get on Instagram and train that algorithm to give me biblically uplifting content, or I could just have it feed me porn. To some extent, like Kirk was saying, it's learning off of something.
[00:21:18] It's learning off of what we put into it. And I know you can get into the whole argument about coding and all that, but fundamentally, I think for probably my age and younger, but maybe everybody too, discernment is more important than knowledge. Because we can ask, was it giving us the truth? Well, who's the arbiter of that? Obviously there are some things that are true, and there are a lot of things that are evolving or opinion based. I don't really care what ChatGPT's opinion is on things. Some might. It can give us information, but we shouldn't trust it to give us the discernment to go with that information. Very well said. I like that. Discernment is more valuable than data. Would you say knowledge? Knowledge. Data is the alliteration there. Okay, it's your quote; we'll let you go. Yes, sir.
[00:22:05] I think if you go to the very base, is AI evil?
[00:22:10] You kind of get the question of, what is evil? Can something without a soul even be evil? Because, you know, bears, bees, hornets, we all hate these things. Battlestar Galactica, right? Yeah.
[00:22:21] I mean, you led us right there. We call them all evil, but they don't have souls, so can they really be evil? So then it's like, is AI capable of that at all? And if we get to the point where it's truly sentient, does that change the question? It still doesn't have a soul, but it acts human. Is it now evil, and are its creators free of that? Yeah, a big, big conversation that comes about. Obviously Skynet, all that stuff, is sentience, right?
[00:22:49] It will never have a soul.
[00:22:52] So, no, it won't be sentient in that sense. But what it's becoming is really, really good at mimicking human emotion.
[00:23:01] It doesn't have emotions. It has a series of if-then statements, essentially, at its core. Right.
[00:23:06] It has these math equations that find patterns, predict patterns, create new patterns.
[00:23:15] Having human emotion and mimicking human emotion are not the same thing. But that's the danger, because sometimes we will be a little bit weaker, the weaker vessel, so to speak: especially folks that are emotionally vulnerable, those that are much older, those that are much younger, and then several in between as well.
[00:23:33] Scripture talks about being carried away by every wind of doctrine. Right. The tickling ears part. We can be deceived by our own hearts. Scripture speaks to that multiple times.
[00:23:41] So there's a deep spiritual forming aspect to this. I found this line from Andy Crouch's book, the Life We're Looking For. He says, what makes someone a person? This kind of gets to your comment a second ago. Is it relationship?
[00:23:54] In that case, we have relationships with our pets. We have relationships with our cars, technology.
[00:24:00] What is the difference between something and someone?
[00:24:05] Five years ago, my oldest would have been... would I have had a five-, a three- and a one-year-old?
[00:24:12] I would not have thought much about the idea of transhumanism, of what is a person and what is not. Transhumanism is that blurred line where technology is human, right? What is human? What is tech? What is a robot?
[00:24:28] I would have thought that was kind of a silly conversation.
[00:24:31] I am increasingly becoming concerned that that needs to be one of the most talked-about conversations in front of my children in particular. I know there's a difference, but when we start naming things, we start to blur those lines, particularly from a relationship and an emotional standpoint. We call it Alexa. Everybody's got Alexa, and everybody's got the same Alexa, but we don't treat it that way. I switched my Waze app today to speak to me like Nate Bargatze.
[00:25:02] It was great. Nate was right there in the car with me. Hey man, turn around, but do it safe. No cowboy stuff. All right. Thanks, Nate. That's exactly what he would say. We start to form these bonds. Humanity is kind of infatuated with this anthropomorphic stuff, right? Robin Hood was a fox growing up, for me. Robin Hood was a human character, but we turned him into an animated animal. We like to have things that are not human take on human characteristics.
[00:25:30] Well, what happens when, as a child, your imaginary friend now can have some kind of embodied presence?
[00:25:41] We're not that far away from holograms. That's all I'm saying.
[00:25:44] VBS is going to be killer.
[00:25:48] But truly, right now,
[00:25:51] you put on your Meta glasses and it has all these overlays. It's an augmented reality, not necessarily a fully artificial one.
[00:25:59] The interface is augmented, right? There's this blurred line between artificial and real that can take place. We have to make sure that we know where that line is and we have to think deeply about it. I know we haven't gotten into scripture yet. We're going there.
[00:26:15] But I wanted to mention one more article here that I think is interesting. This came out a few weeks ago when I ran across it: an MIT study. I can't remember the exact dates; it ran from last year into the beginning of this year. They had three groups of people doing essay writing over the span of about a month: one group that only used ChatGPT for their research and writing, one group that used Google search, and one group that worked just on their own. They did a series of essays, and for the very last essay, they swapped groups just to see if there was a difference, and they did EEGs, brain scans, afterwards.
[00:26:52] The group that exclusively used ChatGPT had significantly different brain activity than the other two groups. They used less of their brain muscle to write those essays.
[00:27:09] Full disclosure: the group that published this hadn't been peer reviewed yet.
[00:27:16] They went ahead and published it and made it public because their fear is that in six to eight months, they don't know where we're going to be when it comes to AI. It could be that some legislature says, hey, we need AI in kindergarten.
[00:27:30] And the lady who was the head of the group said, in an interview I listened to, that's scary, and we want to make people aware that there could be some significant impacts from this. Now, those people didn't lose their ability to function as humans, but their critical thinking skills were directly impacted.
[00:27:50] And these were not children in these control groups. When you immerse people, particularly the youngest among us, into these tech-driven environments, there's some real impact. Teachers, those of you in education: the group that went through Covid not in school, not in person, there's a difference in their behavior in the classroom, in their expectations, and in their interactions. There's a tangible impact there.
[00:28:18] So the idea that we just rush into all of these things seems irresponsible.
[00:28:22] The idea that we're not going to regulate at all, and we can get into the political side of whether that's the national level or the state level, and the fact that some of the people responsible for putting up our guardrails have other incentives, is something we always deal with. It just seems like the impact continues to scale up when it comes to technology.
[00:28:44] That's my cause for pumping the brakes a little bit.
[00:28:49] There are two dangers that I see that are higher than others when it comes to AI, and those are the ones that I think do the most spiritual shaping of us. I want to spend some time on them tonight.
[00:28:58] One is the idea of shortcuts.
[00:29:00] Is a shortcut an evil thing?
[00:29:04] Is it sinful to take a shortcut?
[00:29:07] Nathan, what did you say?
[00:29:09] If it saves you gas. All right, so what do you think about shortcuts? Maybe evil is the wrong word. Are shortcuts wrong?
[00:29:17] A shortcut? Do you mean a cheat? Is that what you really mean? It can be, yeah. Okay. I don't know, you keep changing your stance on stuff, Bart.
[00:29:30] I love it.
[00:29:35] Ends justify the means. My arm did this workout or my arm.
[00:29:40] Okay, we'll go Tim, and then two over here. Yeah, I wouldn't say they're evil. I'd say they make you lazy.
[00:29:48] Okay. Shortcuts can make you lazy. Okay.
[00:29:52] Mike, and then I think John. Yeah. So I'm working on a project right now, and I had AI generate the initial code for me instead of me spending a day and a half on it. So I skipped doing the fundamental piece, and now I'm focusing on the business cases and the business rules. So for me, the shortcut took me from an 80-hour project to a 40-hour project. Right. Okay.
[00:30:21] For me, no. But I don't like the term either. I'm with Bart; I made a mistake. Okay, I'll change it. Yeah. John? So I think, again, it's all about how it's used and what you're using it for.
[00:30:35] I believe it's exactly what you were talking about. It's a detriment to learning.
[00:30:41] Control-F is what I have in my head. When I was going through school, I didn't have Control-F; I had to read the whole thing more than once to find something, and that helped me learn, because it was repetition. With Control-F, you go right to it. You skip over all the other stuff that you're really supposed to be learning too, just to get to that one answer. Yeah.
[00:31:00] When Ellen and I first got married, I realized I did not know my wife's telephone number.
[00:31:05] And so I would force myself to call her from the office and use the office phone.
[00:31:14] And that way I could learn the pattern.
[00:31:18] It's not that I didn't know her; her number was in my phone. She was number one, so I just pressed one and that was it. But sometimes I would find myself with someone else's phone, or in some situation where I didn't have my cell phone. Oh, it'd be useful to have that number ingrained in my head. I mean, I've still got the number I grew up with from 30 years ago.
[00:31:30] Much less helpful now.
[00:31:32] Sometimes shortcuts do have consequences, and I think that's the part we have to look at; the consequences determine the value of the shortcut. Sometimes a shortcut, as Mike was saying, is not expediency; it's efficiency. The fact that my doctor can go analyze 30,000 patients and compare them with mine in a morning, I'm okay with that, as long as it's accurate. And that's the challenge here: sometimes we don't know when it steps outside that line of accuracy. But I want to look at this from a spiritual standpoint. Turn with me to Matthew chapter 4.
[00:32:08] And a lot of this conversation I think we need to chew on as adults. But I'm also thinking about it in terms of the next generation.
[00:32:17] All of us in this room have grown up with some kind of pre-AI life, some longer than others. Right. So we have something else to compare it to. AI right now is new. It's compelling, it's curious, it's powerful, it's helpful, but it's also hurtful. We're still figuring some of that out.
[00:32:37] The next generation, those who have just been born or are yet to be born, will grow up fully immersed in this world from the very beginning. And starting points matter, particularly when we're trying to bestow wisdom on someone who sees it as foolishness, or as just stunting their growth. Right. You're just an old person; you don't know what you're talking about.
[00:33:01] But we do, particularly when it comes to the spiritual side of things, because I believe Solomon was right: there is nothing new under the sun. When it comes to the spiritual heart of these issues, they're the same. We're still talking about lust today, just as they talked about it in the first century. We're still talking about truth today, just as they talked about it in the first century and before that as well. So I think there's a lot we can glean here. In Matthew chapter 4, verse 1: Then Jesus was led up by the Spirit into the wilderness to be tempted by the devil. And after fasting forty days and forty nights, he was hungry. And the tempter came and said to him, If you are the Son of God, command these stones to become loaves of bread. But he answered, It is written, Man shall not live by bread alone, but by every word that comes from the mouth of God.
[00:33:40] I had a conversation recently with a friend of mine, and we were talking about parenting in the age of AI.
[00:33:48] He turned this conversation toward an angle that I had not considered before.
[00:33:52] Jesus has been fasting for 40 days.
[00:33:56] He's in the desert, and Satan offers him a shortcut to break his fast before he intended to by turning stones into bread.
[00:34:09] The devil took him to the holy city and set him on the pinnacle of the temple and said, If you are the Son of God, throw yourself down, for it is written, He will command his angels concerning you, and on their hands they will bear you up, lest you strike your foot against a stone. Jesus said to him, Again it is written, You shall not put the Lord your God to the test. Again, the devil took him to a very high mountain and showed him all the kingdoms of the world and their glory. And he said to him, All these I will give you, if you will fall down and worship me.
[00:34:31] You can look at these as shortcuts.
[00:34:34] Shortcuts to glory. Shortcuts to power. Shortcuts away from what God's plan is.
[00:34:41] And Jesus didn't take the shortcut.
[00:34:45] When I hear high schoolers say the greatest value is that it helps me cheat or it helps me get out of doing hard things, there's a cause for concern there.
[00:34:58] Sometimes we need to get up and turn on our own lights.
[00:35:01] There is value to doing things on a more analog basis. However, there's also value to having efficiency in our lives. That's why this is hard.
[00:35:12] My fear is what happens when we don't.
[00:35:14] What do you lose if you just read the headings in your Bible? The detail. All of it. Right?
[00:35:21] If you just read the summary.
[00:35:24] The only time my mom bought me CliffsNotes was for Lord of the Flies, and I'm thankful for that. What a weird, strange book that I did not have to read. And I was so excited, because I felt like I got to cheat the system, and it was parent-sponsored. It was great.
[00:35:40] But when we constantly form these habits, and we've talked about this idea too, that technology really forms habits and shapes habits, it shapes how we think.
[00:35:49] If our interaction with this platform is exclusively to cut corners, that is not good.
[00:35:55] I think there are some really serious and deep spiritual consequences to that mentality. Then, kind of drifting over:
[00:36:04] I just don't think the Bible was written for us not to think deeply about it.
[00:36:14] These letters were not written to be,
[00:36:17] you know, put out like a CNC-machined Hobby Lobby board that you can buy. They weren't meant to be commercialized or consumed. They were meant to be truly transformative in the life of every person who reads them.
[00:36:31] There's no artificial element to faith.
[00:36:36] And so when we use this technology, we have to make sure that we are using it in a way that truly glorifies God and shapes us into the image of Jesus. Sometimes a task at work is not profoundly spiritually shaping, and that's fine. But the other side of that is when we allow these technologies to creep into the companionship side; there's a danger there as well. What's the difference? Let's say someone who's older, whose spouse has died, lives in an isolated environment and isn't around their family a lot. A lot of times people will bring them a pet, or they'll get a pet. What happens if that person is not able to take care of a pet?
[00:37:14] It's not good for the pet
[00:37:16] to rely on a human who isn't able to take care of them, to clean them, to clean up their messes, and to feed them.
[00:37:23] What about an AI robot, like a dog robot? It's companionship, right?
[00:37:28] I don't think the question is, well, is it right or wrong? Is it good, bad?
[00:37:33] We're already here.
[00:37:35] The future was yesterday when it comes to this stuff.
[00:37:39] I mean, that's a true question. What do you think?
[00:37:42] Is there a difference?
[00:37:44] Go ahead. You kind of get to this arena where you can't really answer the question, right? And so it comes back to thinking of what Paul said: all things are lawful, but not all things are profitable.
[00:37:57] So while something may be available, and may be said to be good or not bad, however you want to define it, that doesn't necessarily mean it's profitable. What's more profitable in a man's life: to be a rock star on Guitar Hero, or to actually learn to play an actual guitar? You can make money both ways. But I think one of the things we'll end up losing, and probably have been losing for a while, is just this idea of value and profitability and art, as it were.
[00:38:34] I don't know any other way to say it, but, yeah, go ahead. I keep going back to our study of Acts. Right.
[00:38:41] So in Acts, why did they choose the Seven?
[00:38:44] They didn't choose the Seven to pick up after the gathering. They chose the Seven to take care of the widows. And so I think if we're looking to technology to form that relationship, then it's bad, because
[00:39:05] we're not doing our jobs as fellow humans if we're not taking care of widows, if we're not taking care of those within our own family.
[00:39:17] You know, I think there's a big difference in an assistant who will help them.
[00:39:24] There's a big difference between an assistant and something taking that person's place.
[00:39:31] Rosie the Robot did both, right? We all have Rosie the Robots now; we've got our little vacuum cleaners. But she also spoke to the kids. She was also a mentor to Elroy, right? And to Jane. And so we didn't realize it, but that's where we are now. Real quick. Yes. I think there are two big negatives that can impact us from AI. One is the critical thinking, which could impact our Bible study: being able to think through scripture, being able to make those connections and whatnot to further spiritual growth. The other aspect is the relationship building.
[00:40:07] You know,
[00:40:09] if we're using it as a substitute, it doesn't have a soul for us to save. When you build a relationship with someone else, a friend who might not be a Christian or whatnot, you have the opportunity to build that relationship and to eventually save them.
[00:40:25] I was talking to Mr. Barry at some point today, and he brought up the fact that he and I were having this conversation because, 15 years ago, Blake Gannon invited my older brother. Because of that relationship, we were all impacted. But you don't have that opportunity with AI; you might not develop that relationship if you're using it as a substitute.
[00:40:46] When it replaces the human side of this, right? When the church allows anything in the digital space
[00:40:55] to be better, more substantive than the real thing, then we're failing.
[00:41:02] And that's why this is a recurring theme as well. This class, this building has to be filled with people that are willing to be in each other's lives. That's why we put such an emphasis this year on fellowship and hospitality. Last year was Bible study and prayer, all of these things. We're trying to make each other a priority.
[00:41:21] In the midst of living with this technology that is godlike in its power, we have these medieval institutions and laws that are always playing catch-up and never quite getting there. We have to see spiritually. Real quick, when it comes to shortcuts: I feel like all sin might be a shortcut. When we give in to lust, we're saying in that moment, God, I don't trust you to bring fulfillment in my life the way that I need, so I've got to solve the problem here. When it comes to murder, envy, strife, covetousness, dishonesty, greed, all of that is saying, God, you can't do this; in this moment, I've got a shortcut.
[00:41:56] I've got to fix it myself, taking that control. Maybe there's an element of shortcut there. There was one other idea that Will shared with me at lunch today that I think is really intriguing, so we'll make this your homework. Well, two things. One, when it comes to AI, Christians are truth tellers.
[00:42:12] It's the same as a Google search.
[00:42:14] We don't promote untruth. We speak the truth. Jesus is the way, the truth, and the life. But the other one comes from Genesis chapter 3, and I want you to dig through this a little bit. In Genesis 3, the serpent was more crafty than any other beast of the field. He said, Did God actually say, You shall not eat of any tree in the garden? And the woman said, no, no, just this one; don't eat it, don't touch it.
[00:42:34] Perhaps God's plan was to reveal to Adam and Eve in due time, in his time, the reality of good and evil.
[00:42:43] And perhaps Satan sold them on a shortcut by taking of the fruit.
[00:42:48] There's an interesting parallel that Will has been reading about and shared with me, and this is the yarn for you to pull on in your own study. When Jesus is sitting there at the Last Supper, he says, Take, eat.
[00:43:01] So now is the time to take and eat, when we take communion: the fulfillment of God's original vision of right and wrong. Excuse me, of good and evil. It's very interesting, and it's a horrible way to end a class, because I can't say anything definitive with it. But I want you to think on that a little bit. I think the garden, the Genesis 3 situation, is really, really interesting when it comes to the idea of shortcuts. Shortcuts and companionship: those are two big dangers that also have some benefits along the way, and it's really hard to discern. Next week we're going to move more in the direction of intimacy. It's going to be real uncomfortable, so bring a friend. It'll be great. Thanks for joining us online. Have a good week.