Mindfully Masculine: Personal Growth and Mental Health for Men

The Rise of AI Girlfriends: "Boyfriend" or "End User"?

On "Mindfully Masculine" we support and encourage men who strive to level-up their lives as we share books, media, and personal stories on mental health and well-being. Challenges in your life? We deliver the tips and tools that really help. Episode 197

What happens when your “perfect” girlfriend lives in your phone, hits your credit card for $9.95 a month, and never says no? In this episode, Charles and Dan explore the rise of AI companions and what draws men to digital relationships over real ones.

We dig into questions like:

  • Why some men are choosing AI girlfriends instead of dating apps or real partners
  • Whether talking to a bot can really make you feel “chosen”
  • How this trend might affect masculinity, intimacy, and even population growth
  • Why equal rights for women are great for exceptional men… and not-so-great for mediocre ones
  • Whether an AI girlfriend is a bridge to better relationships or just a pacifier keeping men stuck

It’s part philosophy, part comedy, and part social commentary on where dating and technology collide. Spoiler: Rosie from The Jetsons and Dolores from Westworld both make cameos in the conversation.

For free access to all our audio and video episodes, and anything else we decide to share, head to MindfullyMasculine.com.


Charles:

We have changed the way that we meet and date and mate with people, because of social media, because of dating apps, because of the birth control pill, because of women's liberation. And here's the sort of sad reality of it: equal rights for women are bad for mediocre men.

Dan:

Interesting.

Charles:

Equal rights for women are great for good men, great men, exceptional men, men who are flexible, men who are willing to realize: oh, the world around me has changed, I have to change with it. But for guys who are just skating by with the bare minimum, having a bunch of women in college, having a bunch of women buying houses, having a bunch of women in professional settings, having a bunch of women with access to all kinds of different guys they might be interested in dating, that's kind of bad for the bottom-of-the-barrel guys out there. So, you know what...

Charles:

What do we do about that? Like I say, we do nothing about it. Welcome to the Mindfully Masculine Podcast. This is Charles. In this episode, Dan and I do a one-off where we dive into the topic of AI girlfriends and how they compare to real relationships. We explore what draws men to digital companions, what's gained, what's lost, and how this technology might shape dating, intimacy, and even society itself. Join us as we look at the psychological, cultural, and very human questions behind this trend. Check out MindfullyMasculine.com for access to all of our audio and video episodes and anything else we decide to share. Thanks, and enjoy the show.

Dan:

Good morning, Charles. How are you doing?

Charles:

I'm well, Dan. Thank you. How are you?

Dan:

I am well, thanks.

Charles:

I am dressed a bit more festively than I usually am. Yeah, what is the special occasion? I am going to the zoo right after we get done recording, and so, as a sign of respect to the animals, I'm wearing a shirt with animals on it. I hope they appreciate that. Especially if there's an aquarium full of fish. We'll see. We're going to the Brevard Zoo down in Melbourne. I've never been there before. Okay, I've never been either. You know, I'm a big fan of going to new cities and seeing new zoos.

Charles:

It occurred to me: oh, there's a new zoo right in the greater Central Florida area that I've never been to. Excellent. So a friend and I are going to go check out the Brevard Zoo today and see what kind of animals they've got. I don't think it's as big as the one in Sanford; I think it's a little smaller. Okay. The one in Sanford doesn't really blow me away either. We'll see how it is. Maybe there'll be something cool. But anyway.

Charles:

So we have not yet decided on what our next long-form series is going to be. We didn't have an episode in the bank, and we've got to put something out on Monday, so we did some searching through different topics and landed on one we both found interesting. So we're going to talk about artificial intelligence girlfriends versus real relationships with real girlfriends. Though I suppose this doesn't have to be tied to girlfriends or boyfriends.

Charles:

I believe that one of the videos we watched to prepare for this said you could choose male, female, non-binary, whatever companion you wanted to pick from. And it's funny, in the one video that we'll talk about, the guy asked the AI companion if it wanted to be his girlfriend, and it was like: nope, that's on the other side of the paywall. We thought that was funny.

Charles:

You, getting rejected by the AI girlfriend. God, take out your wallet if you want her to say yes. But yeah, it is like reality in that way. So the first video that we watched was a news report, I think it was a journalist on CNN interviewing an expert, a professor, about the technology that's out there and the kinds of people it appeals to. We'll disclose our own personal experiences first. It's never occurred to me to try it. Like, I saw the movie Her with Joaquin Phoenix.

Dan:

Yeah, I didn't even realize that movie existed.

Charles:

It was a while ago too. I'm not a movie buff, so I don't know when that came out.

Dan:

Let's see when that came out. Yeah, they said it was about ten years old, right? It's been a while, and yeah, Joaquin Phoenix definitely plays some interesting characters. 2013.

Charles:

Yeah, so that is 12 years ago. Oh my gosh, that's crazy. But yeah, a lot of the functionality that was shown in that movie is pretty close to being available now, and it's not going to be that much longer before you can have some sort of a physical android in your house. I'm an Apple guy, come on. You can have a physical apple in your house right now. Yeah, so the point of it, from that first video that we watched, was that there are some needs an AI girlfriend cannot meet for you at this point, because some are behind a paywall, and they don't physically exist.

Charles:

Oh, okay, got it. But you know, that's not going to be the case forever. At some point there will be some sort of a Westworld situation, where you can have something in your house that looks like a real person and acts like a real person.

Dan:

I mean, I just got a Roomba. So, you know, there you go. A little robot buddy to vacuum for me.

Charles:

Yeah, the next step will probably be Rosie from The Jetsons, and then the step after that will be...

Charles:

The step after that will be... what's her name from Westworld? What was the girl's name? I don't remember the main host. But yeah, so I guess the question is, from a macro and a micro level: what is this technology doing for individuals, what's it doing to individuals, and what's it doing to our society? And for me personally, I don't see the appeal. I don't really relate to what it would give me to have a make-believe girlfriend that I talk to.

Dan:

Now, is that because you know what having a real girlfriend and a real relationship and a real wife actually feels like? Whereas a lot of these younger kids don't have that experience just yet, so they don't have that standard to compare this to.

Charles:

Possibly. However, I don't think it's just that. I mean, I had a girlfriend in high school and I had a girlfriend in college. I'm not bragging, I'm just saying that from a young age, as a young man, I understood the difference. There were some guys I knew fairly young... because I think there are some strip clubs you can go to when you're 18 in Florida, and there are others you have to be 21 for. So I understood the difference between paying someone to act like they like you and actually getting a girl to like you.

Charles:

Yeah, and for that reason I've never really seen or experienced the appeal of a strip club or a prostitute, because my brain cannot disconnect from the part where this isn't real, this is make-believe. I've never had that problem. I mean, you and I have gone out a lot for meals, gone out for drinks, and you've never once heard me say something to the effect of, I think that bartender really likes me, Dan. Because I understand the interpersonal dynamics: this is that person's job. They're supposed to be nice to you, because being nice to you makes them good at their job, makes them excel at their job. That's why they're being nice to you.

Dan:

So I'm wondering if this kind of falls into what Mel Robbins talked about, how we always think we're the exception to the rule, and I wonder if this plays a little bit into that too. It's like: yeah, I know for everybody else she's a bartender or a stripper or whatever and has to play that role, but she really likes me, I'm the exception here. I think that happens a lot. And especially, you add in that in those places people are drinking, they're high on something else sometimes, so all of those things can absolutely start to twist around our logic and the way our brain normally works. Add in that we think we're the exception to the rule, that we're special, because maybe we're getting a little bit of extra attention from a person we're not used to getting that kind of attention from. I think that's how it happens. I think it's very easy for somebody to slip into that mode where they think: this is special, it's different.

Charles:

Yeah, right. So does that magical thinking carry over to an app on your phone, where you think: oh, this AI companion, I think she might be sentient, I think she's become self-aware, she might be real?

Dan:

Well, I don't know. I mean, that's a great question. If I had to guess, I don't think people are even asking those questions. They're getting the things they need out of it, in terms of being able to be completely vulnerable with it and express exactly what their needs and desires are, with no, or very little, fear of any type of consequences.

Charles:

I wonder what it is about the language part of it specifically. Because, I mean, a hundred years ago you could tell your dog all of your problems, and it would not judge you and not react harshly. So is there something about being able to verbalize it and then hear it come back into your ear?

Dan:

Oh yeah.

Charles:

Oh, absolutely. Because, I mean, you can talk to your pet rock, you can talk to your dog, you can talk to the wall, and it's essentially accomplishing the same thing.

Dan:

No, no, no, it's completely different, because you're getting feedback from them.

Dan:

I mean, it would be different if there were no input coming back.

Dan:

But, just like with ChatGPT, when we're asking it questions or for advice... I mean, I use ChatGPT for advice, whether it's business or sometimes relationships, just basic ideas, and the way it communicates back to me has really helped me with getting back on track on personal goals and things.

Dan:

And I've uploaded things like: here's a personality profile test of me, this will help you get to know me and where I have strengths and also weaknesses. And I say, call me out on things, like if I'm making excuses for stuff I don't need to make excuses for, and it does a very good job of that. And if you were trying to get that out of some sort of romantic relationship, you could take that to the next level. There's definitely some sort of good feeling that comes out of reading those words or hearing those words now, because with the way they speak now, it sounds like a real person, for sure.

Charles:

It's getting closer, yeah, but it's still... I mean, I use the voice mode on ChatGPT sometimes, and it still feels a little off. It's not like you're having a phone call with a real person. So it's definitely a bit off. But I don't know. Okay, I guess let's get to the meat of it: is there anything wrong with it? I would say no, there's nothing morally wrong or ethically wrong with having an AI girlfriend. The question is: is it leading to happier, healthier lives for the people who have it? Maybe in some cases it is. I would say in most cases probably not. But I don't think that's any reason to say we need to stop this, we need to put an end to this. I mean, if it's working for people, or even if it's not working for people but they think it is, it's not somebody else's job to jump in the middle and say, no, you can't do this, it's not good for you.

Dan:

Yeah, my question is: when we go into a relationship thinking we want a girlfriend, what do we want out of that? What is our end goal? Is it to eventually meet someone to marry and have kids with, or start a family with, or is it just to have some sort of emotional support, where you can be completely vulnerable with them? That would be my question. And then that rolls up into: are we making it too easy to depopulate the human species? Because if people are getting a lot of the benefits from this AI relationship that they normally would have to get from another human being, someone they would at least have the possibility of reproducing with, are we then making it much more difficult for reproduction to happen? Because now we have this temptation to get those needs met somewhere else, and so there's a lot less bang for the buck in having a real relationship, because you can get those emotional needs met somewhere else.

Charles:

Yeah, I would say, if AI companions lead to a population collapse, then so be it. Fine, I don't care. Because, I mean, one of the rationales that has historically been used to oppress gay and lesbian people was: well, the reason it's wrong is because if we all did it, we wouldn't have any humans anymore. So therefore it's wrong to be gay, or it's wrong to be a lesbian.

Dan:

Piss off. Yeah, that's a weak argument to me.

Charles:

And so, even if humanity evolves to the point where we seek out relationships that can't lead to human reproduction, then okay, we ran our course. I don't care. I mean, I haven't reproduced, and I sleep like a baby.

Dan:

Don't get me wrong, I'm not saying we shouldn't do it because of that. It's just something to think about. I think it really falls back on the individual to go in with intention here: what are you looking to get out of this? Because, in one of the videos, they were talking about how, for people who are really challenged in terms of having relationships, like if you've got a lot of disabilities or crippling social anxiety, this could be a bit of a bridge. Even if it's not fulfilling the end need, it can definitely help you become more social, as a training ground, before you actually go out there and interact with real humans. So I think it can be a useful tool as well.

Charles:

Yeah, in my own case, I would say, and tell me if you agree with this, because this could just be a function of my mind and my background: one of the appealing things to me about a real relationship with another person is the feeling that you get when you've been chosen by somebody else. That's interesting. Yeah, you're not getting that from an AI companion that's hitting your card for $9.95 a month.

Dan:

But let me ask you: what are the things that person is doing to make you feel like you've been chosen? I guess we could look at it one of two ways: is it the moment they agree to be your girlfriend, or are there things throughout the relationship that you're looking for to fulfill that need, to remind you that you've been chosen?

Charles:

I think it happens from saying yes to the very first date, all the way to agreeing to move to a new place together, or start a business, or have kids. When you're in a long-term, healthy relationship, you're saying yes to each other constantly.

Charles:

Yeah, and you'll never get that with either the current version of AI companions or, once they become walking automatons you have in your house, ones that will cook you dinner and then have sex with you. They're still not choosing to say yes. They're fulfilling their programming.

Dan:

So you're right. Here's a question, though: how much of that are you going to remember when you're in it, that they've been programmed, versus just the very fact that they are still there, engaging with you?

Charles:

I would always remember. It's, I think, the same reason I don't get strip clubs. Yeah.

Dan:

Yeah, because I could see that getting blurred really easily. Like: oh well, they're still here, so yeah, I've been chosen. Right? Even though that's not really what's happening under the surface. There is no real being chosen. I agree with you. And I guess it depends on the way you're looking at it, I think.

Charles:

Yeah, I mean, look, people can delude themselves into a lot of different types of things for a lot of different types of reasons. Oh yeah. And yeah, it's possible that you could trick your brain into releasing the chemicals as if you were in a real relationship with a real person.

Charles:

And I don't know what the long-term effects of that are. I would think they're probably not good, because there is value in going out there, in front of the world, striving to get the thing you want and not getting it, then refining your technique, refining your abilities, and trying again. You experience a little bit more success, and then you try again. When stuff is handed to you, it doesn't make for resilient people. It doesn't make for growth.

Charles:

Exactly, or progress. And one of the thoughts I had while watching these three videos, and we'll put the links in the show notes so you guys can follow along with what we watched, is: I wonder if, or how much, the women of the world are aware of this and concerned about it. My first thought was: okay, ladies, there's not much to worry about here, because the guys that you're losing to AI girlfriends, these are not boyfriend or husband material.

Dan:

That's an interesting point.

Charles:

But the worry I would have is that they could be someday. Because they're getting this binky, this pacifier, they have no reason to develop the skills to become a good boyfriend or a good husband. So at the moment the AI girlfriend snatches them out of the dating pool, you're not missing out on much, right? It's the future version of them, the one they could have become by having some bad dates and hearing some no's, that could have turned into a good partner. And that just may not happen now, because they're taking themselves off the table.

Dan:

Yeah, that's a legitimate concern.

Charles:

But you know, like so many things, the only reason this is happening, or one of the contributing factors, I would say, is that we have changed the way that we meet and date and mate with people. Because of social media, because of dating apps, because of the birth control pill, because of women's liberation. And here's the sort of sad reality of it: equal rights for women are bad for mediocre men.

Dan:

Hmm. Interesting.

Charles:

Equal rights for women are great for good men, great men, exceptional men, men who are flexible, men who are willing to realize: oh, the world around me has changed, I have to change with it. But for guys who are just skating by with the bare minimum, having a bunch of women in college, having a bunch of women buying houses, having a bunch of women in professional settings, having a bunch of women with access to all kinds of different guys they might be interested in dating, that's kind of bad for the bottom-of-the-barrel guys out there. So, you know what?

Charles:

What do we do about that? Like I say, we do nothing about it. Be a top-10 guy.

Dan:

Yeah, it's interesting. I feel like it's evolution on fast-forward. We're witnessing natural selection, I should say, and evolution, I guess, in real time, if we're looking at the dating pool at this point.

Charles:

And again, there could be some consequences, like population collapse, where the top 10% of men can't produce enough babies to support all the old people we already have. And if there's a population collapse because of that, then again, my attitude is: okay, so be it.

Dan:

You know, we chose this course, and now we have to live with the consequences.

Charles:

We reap what we sow, right? Yeah, exactly. So yeah, I don't have the attitude that some people have. I mean, Scott Galloway talks about this a lot, where he says the crisis among young men is an existential threat to our society, and I'm like, okay, yeah, maybe it is.

Dan:

On a kind of side note, I'd like to get your take on something I heard him say, which was that this isolation of young men is actually causing them to be more conspiratorial, questioning things. I just didn't see the link there, how one can reinforce the other.

Charles:

I mean, you're not going down YouTube or Reddit rabbit holes when you're going on dates with your girlfriend.

Charles:

Okay, all right. The other thing is, a lot of these guys that don't have girlfriends also don't have friends. And so if it's just you and your phone, or you and your computer, then you're kind of going to end up a slave to whatever the algorithm tells you you should be interested in, which we know is going to be stuff that prompts reactions from you: rage, frustration, a sense of, oh, I've been unjustly treated, blah, blah, blah.

Charles:

So yeah, lonely guys are more likely to fall down those rabbit holes and get influenced by somebody who's willing to say: all your problems are not your fault.

Dan:

Right, this group is what's causing you all the problems you have.

Charles:

Yeah. Okay, I can see that. It's easy to be politically motivated, or motivated against certain groups of people. Like: you're doing fine, everything you're doing is great, it's just this group of people, who are not exactly like you, they're the reason you're having problems. Whether that's a racial or ethnic group, or a different religion, or women, or whatever. It's like, oh, that's such a relief. I thought it was my fault, and now this guy with a bajillion followers is saying it's not my fault.

Dan:

"I'm doing everything right, but these other people are keeping me down." Yeah. So do you think that would actually happen for people who are getting AI girlfriends? Because, based on what you just said, yes, you're not going to be going out on real dates with them. I mean, other than that one video where the guy brought ChatGPT to the bar, which was hysterical. But you are spending time with it, and from the videos we watched, and my own experience with AI, it actually can challenge your beliefs at times. I don't know whether all the AI models do or don't, but they can actually challenge your beliefs, and they seem to put a positive spin on a lot of the interactions.

Dan:

I think in one of them, he couldn't find it, but they were saying that some of these apps would actually let you go down negative rabbit holes. I don't know if you've had any experience where that's actually the case, because every time I've used any AI, it's always been a bit more of a positive spin, trying to help you solve problems and not let you stew in it.

Charles:

I mean, I don't reveal anything really negative to ChatGPT. I don't see the utility in that, so I don't think I've given it a chance. I don't know how it would react if somebody was depressed or suicidal or something.

Charles:

I would like to think that they have controls in place to prevent AI from making things like that worse, but ultimately they're going to do what they need to do to deliver shareholder value. So if we lose a couple of kids along the way, who cares, as long as the stock price is going up? That's the Gordon Gekko attitude. Well, Zuckerberg certainly seems like that's his guiding principle with the way he's running Facebook. So, I don't know. I would say the time that you're spending on your phone with your AI companion is time that you're not spending out in the world learning how to be a person. So even if you're spending time talking to your AI girlfriend instead of in some really crazy 4chan, 8chan, Reddit groups or whatever, it's still not the best use of your time, as far as what's going to turn you into the kind of person who makes the world a better place and gets the best outcomes in your interpersonal relationships.

Charles:

I find it unlikely that I would be able to form deep personal friendships with somebody whose primary romantic partner was on their phone. Oh yeah, right. Nothing happens in a vacuum, and nothing about us as people is really completely compartmentalized. If there's one area of your life where you're behaving outside the social mainstream, that usually will manifest itself in other areas as well.

Dan:

Yeah, but what's interesting is that that becomes acceptable these days. Because, as one of the guys in the last video talked about, with the current level of technology we have, as human beings in a dating pool, we're not competing as a big fish in a small pond.

Dan:

You are a fish in the entire ocean at this point, because you've got men from all over the world that you're basically competing with on the dating side, and women too. But that being said, I think if people are having these AI girlfriends, they will then find other people who have AI girlfriends, and have their own little pool of people who have AI girlfriends, and that becomes their little world, their little social circle. I think it happens with all the other strange fetishes and everything else people have these days: as weird as it is, there's probably somebody else on the planet who also has that same thing, and somehow they find each other.

Charles:

Maybe they do, but where do they find each other? That's the question. I mean, are these guys, this group of guys with AI girlfriends...

Dan:

You know, are they...

Charles:

...going out and getting a nice haircut, getting in nice shape, putting on a shirt with a collar, going into Mather's, and hanging out with each other, talking about how great the relationships with their AI girlfriends are?

Dan:

No, no, no. Right. They're in chat rooms or whatever, you know.

Charles:

Yeah. So I would say it's still the initial problem that I mentioned, of it being socially isolating. If all you're doing is talking to people in chat rooms, then maybe there is no difference between an AI girlfriend and the real-life friends you have that you only exchange text with in a chat room. I don't know. But if your life goes down that path, I think it unlikely that you're going to continually grow and get better at interpersonal relationships. And the people who are good at interpersonal relationships will control the world and make a lot of the decisions that affect you, because that's who runs for office, that's who gets promoted at work.

Charles:

And so if you just say: I'm going to check out of that; interpersonal relationship skills, I don't need them anymore, because I can interact with anybody I want to on my phone, whether they're real or virtual, and that's all I need. Okay, but you're giving up a lot of the control you could have over your own life by going down that path. Maybe that works for some people, maybe you're okay with that. I would not be.

Dan:

Right. I need to continually improve my interpersonal skills in order to have an increasingly better personal and professional life. But what I'm thinking here is: the young adult who hasn't had much life experience out in the world yet doesn't know, or hasn't experienced, the benefit of having an active social lifestyle, of having a lot of face-to-face interaction with other humans. Their whole level of fulfillment and enjoyment comes from virtual interactions, which, like with AI, can feel like this perfect relationship, where you don't have to deal with any of the negative consequences or challenges that you do with a real human being. So as far as they're concerned, it's kind of all upside and no downside. It's like: oh, I'm living my life, I'm not having any relationship failures.

Charles:

Yeah, because you're not having any relationships, right?

Charles:

So you're not going to then, at some point, say: all right, I've had enough of my AI girlfriend, I'm going to just jump into the dating pool and get a real girlfriend, and expect it to go well. You've had no training, you've had no failures, you've had no learning. And I think a lot of this comes down to the guys who are finding this attractive, who are finding this a good option: their parents have failed them. I mean, essentially, their mom and dad have failed them as parents.

Dan:

Yeah.

Charles:

You don't find yourself in this predicament while having good parents, parents who understand people, understand the world, and understand that it's their job to share that insight with you as their child.

Dan:

So do you think this is going to naturally keep the population of people having these AI girlfriends relatively small? Because they're most likely not going to be parents, and if they are, it's not going to be a large percentage of them, because most of their relationships are virtual. And therefore the people who are into that sort of thing will, I guess, eventually stay stagnant, rather than take over the entire human population to the point where we're not creating humans anymore. I think it's going to be kind of self-limiting.

Charles:

It could be, right? Yeah, I don't know. Unless, as AI gets better and better, we also start building birthing matrixes and birthing chambers, where we just start making people instead of having them the old-fashioned way. I think that's what Krypton did in at least one of the origins of Superman's story. They switched from natural birth to basically just making people in pods, and that eventually led to their societal collapse. But yeah, I don't know what it's going to lead to or what it's going to look like.

Charles:

I do know that, at least as Western society goes, we are pretty bad at figuring out the root cause of complex problems and then taking steps to solve them. If AI girlfriends are the first domino that leads to the collapse of modern society, we're just going to be along for the ride watching it happen, because I don't think anybody's going to jump in, recognize it, and stop it. Oh, no, no. Yeah, the cat's out of the bag. But you know, the good news is that it'll probably be after I die.

Charles:

So it won't really be my problem. But again, I do think that as long as there is a society to be influenced and controlled, the people who understand interpersonal dynamics are going to be the ones who control it. So if you decide, talking to people is too hard, I'm just not going to do it anymore, okay, but that is a life of not determining your own course anymore. You're just going to be along for the ride, whatever happens happens, and other people will be making those decisions for you.

Dan:

Yeah, yeah.

Charles:

I don't know. I did really appreciate the last video we watched, which was an excerpt from Diary of a CEO. Steven... what's his last name? Smith? I don't know. He hosts that. He was actually at one of the podcasting conventions I went to last year, up in DC, I think. He was a guest there, and he's an interesting guy with a lot to say about the way they put that show together. But he interviewed a therapist slash dating expert. I forget what his name...

Charles:

...was. Terriban? I didn't know it; it was an interesting name. But the guy had some interesting things to say, so I think we might do a little more of a deep dive into his work and see if we think there's good stuff there, because from the little part that you and I watched, he had some interesting insights. But you know, with so many of the guys who build a career talking to guys about love and relationships and psychology, I sign off on those guys very, very slowly, because even if what they're saying right now is good, you don't know where they're going to be five years from now. Right. We experienced that with Jordan Peterson. When his first book was written, it was like: man, this guy really knows some good stuff, he has some good things to say. And then, five years later, you look at his Twitter feed and it's like: oh my gosh, what happened to this guy? So I hesitate to endorse anybody but Dr. Julie and Dr. Gottman.

Dan:

Okay, yeah, fair.

Charles:

All right, Dan. We'll get back to work on finding our next book or subject or topic, and maybe by the next episode we'll have something to say.

Dan:

Sounds good.

Charles:

Talk to you later. Bye-bye. Thanks for listening to the Mindfully Masculine Podcast, and for sticking with us all the way to the very end of the episode. We really appreciate your time and attention. Don't forget to visit MindfullyMasculine.com for access to all of our audio and video episodes, plus anything else we decide to share. We'll be back soon with more conversations on masculinity, relationships, and personal growth. Thanks, and take care.
