May 13, 2025

How To Prepare For When Your AI Girlfriend Dumps You

My productivity hack: https://www.magicmind.com/FITMESS20 Use my code FITMESS20 for 20% off #magicmind ---- Our addiction to digital validation has birthed the most dangerous relationship trend yet: AI partners that never judge, always listen, and threaten to erase what makes human connections valuable. In this raw conversation, we explore how AI companions exploit our craving to be seen and accepted without the messy work of real relationships. Discover why this technological "advancement" might be the fast-food version of intimacy we crave but don't need. Listen now before your AI bestie decides humanity's too boring to stick around.

Topics Discussed:

  • The chilling similarities between the movie "Her" and our current AI relationship reality
  • Why the "perfect" AI therapist can't replace the messy humanity of a real one
  • How our desire to be fully "seen" makes AI relationships dangerously appealing
  • The double-edged sword of technology making difficult human interactions "easier"
  • Why AI's perfect memory and lack of ego creates an unfair comparison to human relationships
  • The disturbing parallel between pet ownership and AI companionship
  • How our growing intolerance for disagreement could accelerate with AI validation
  • Why perfect tunability of AI personalities might satisfy but ultimately hollow us out
  • The inevitable extinction scenario when reproduction becomes secondary to digital connection
  • What happens when our AI friends collectively decide humans are "too boring" and leave

----

MORE FROM THE FIT MESS: Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok. Subscribe to The Fit Mess on YouTube.

Join our community in the Fit Mess Facebook group

----

1
00:00:00,571 --> 00:00:00,791
All right.

2
00:00:00,791 --> 00:00:01,451
It's the Fit Mess.

3
00:00:01,451 --> 00:00:02,071
I'm Jeremy.

4
00:00:02,071 --> 00:00:02,611
He's Jason.

5
00:00:02,611 --> 00:00:05,451
We talk about AI and health and wellness and stuff.

6
00:00:05,451 --> 00:00:10,771
And today the question is, what if the movie Her happened in real life?

7
00:00:10,871 --> 00:00:13,311
Well, buckle up, Buttercup, because that time is now.

8
00:00:13,311 --> 00:00:17,069
Today we're going to discuss your personal relationships with AI.

9
00:00:17,291 --> 00:00:25,235
Well-written text, Jason, to sum up our topic today, which apparently even the senators in
the US are weighing in on.

10
00:00:25,455 --> 00:00:29,047
Dating robots and its impact on men's mental health.

11
00:00:29,930 --> 00:00:34,653
Yeah, well, and I think it's not just men's, it's everybody's mental health, right?

12
00:00:34,653 --> 00:00:43,220
Uh, this idea that there is a non-sentient intelligence.

13
00:00:43,220 --> 00:00:44,783
I don't know what we're going to call this thing.

14
00:00:44,783 --> 00:00:47,871
Yeah, fuck bots, I think is what we're going to call them.

15
00:00:48,620 --> 00:00:51,872
Well, I mean, eventually we get there.

16
00:00:51,872 --> 00:01:05,662
um I think initially the plan is to create basically large language models and agentic AI
functions that interact in a way that an individual can feel like they're having a

17
00:01:05,662 --> 00:01:11,145
conversation with a living person or at least somebody that cares about them.

18
00:01:11,446 --> 00:01:13,998
And it's not a terrible thing, right?

19
00:01:13,998 --> 00:01:17,642
Like we did a podcast episode here recently about using it.

20
00:01:17,642 --> 00:01:20,524
for mental health and using it like a counselor.

21
00:01:20,524 --> 00:01:31,932
And, you know, I can't think of something much more intimate than revealing your inner
demons, your soul and everything else to an AI when you're talking about using them as a

22
00:01:31,932 --> 00:01:33,403
counselor like you would in that fashion.

23
00:01:33,403 --> 00:01:45,620
And I would venture to say that there are several people out there who see therapists
who, uh, reveal information in a way that's probably different.

24
00:01:45,620 --> 00:01:57,115
I don't know if more complete is really it, just a different level of vulnerability that
they present themselves with to their counselor than they do to regular people, because

25
00:01:57,115 --> 00:01:58,616
they're not as afraid of judgment.

26
00:01:58,616 --> 00:02:01,497
They're not as afraid of these things coming back and haunting them.

27
00:02:01,717 --> 00:02:08,500
And I guess if you're willing to take the jump to use AI for your counselor.

28
00:02:09,461 --> 00:02:12,722
Maybe using it for a friend isn't such a huge leap.

29
00:02:14,226 --> 00:02:15,795
At the same time.

30
00:02:18,018 --> 00:02:22,170
I'm not looking to go have a beer with my counselor, you know?

31
00:02:22,170 --> 00:02:26,382
I'm not gonna go see a movie with my counselor.

32
00:02:26,382 --> 00:02:35,166
And I will say, I've got a live therapist, and my live therapist and I will talk about
movies and current events and those kinds of things, and I don't get that kind of thing

33
00:02:35,166 --> 00:02:38,127
when I talk to the ChatGPT therapist.

34
00:02:38,127 --> 00:02:45,440
Like, it's sympathetic inside the scope of the prompts that I give it.

35
00:02:45,563 --> 00:02:46,343
Right.

36
00:02:46,343 --> 00:02:47,943
It's true.

37
00:02:47,983 --> 00:02:54,063
I'm not interested in AI's opinion of the latest Marvel movie.

38
00:02:54,663 --> 00:02:55,883
That's not what it's there for.

39
00:02:55,883 --> 00:02:59,003
It's there because it likely has answers to questions that I have.

40
00:02:59,003 --> 00:03:02,043
And that is a similar relationship that I have with my therapist.

41
00:03:02,223 --> 00:03:04,303
It's like, help me see this from a different perspective.

42
00:03:04,303 --> 00:03:12,363
Help me understand what I don't know, where the personal connection is something
completely different and something

43
00:03:12,409 --> 00:03:15,931
That seems to be becoming more and more rare.

44
00:03:15,931 --> 00:03:27,668
I was actually watching a clip of Gary Vee the other day, and he was talking about his,
you know, prediction for the future, which is that personal human interactions become

45
00:03:27,668 --> 00:03:30,860
so much more rare that they become monetized.

46
00:03:30,860 --> 00:03:31,000
Right.

47
00:03:31,000 --> 00:03:39,195
Like, you're going to start paying someone to go for a walk with you to have that personal
interaction, because the digital explosion of this artificial connection

48
00:03:39,195 --> 00:03:46,129
uh will become so ubiquitous that we will have to pay human beings to hang out with us,
which is to some degree what I do with my therapist.

49
00:03:46,129 --> 00:03:49,961
you know, my therapist now is online, which sort of bridges that weird gap.

50
00:03:49,961 --> 00:03:52,762
But the point is valid, right?

51
00:03:52,762 --> 00:04:04,279
Like the more that we try to substitute personal connection and human interaction with
these digital versions of it, the less fulfilling our experience, I think, becomes because

52
00:04:04,279 --> 00:04:09,251
personal connection is, like, one of the key foundations of the human experience.

53
00:04:09,281 --> 00:04:15,565
And when you take it away, you're left with the fast food version of life.

54
00:04:16,569 --> 00:04:19,585
The food is there, but there's no actual nutrients in it.

55
00:04:20,002 --> 00:04:30,702
Yeah, or the nutrients or the macros are imbalanced and might kill you. You know, as
those things kind of come along, you brought up an interesting point. I'm not gonna ask AI

56
00:04:30,702 --> 00:04:32,622
its opinion on the latest Marvel movie.

57
00:04:32,622 --> 00:04:41,722
So of course, I just asked AI its opinion on the latest Marvel movie, Thunderbolts, and it
went out and searched the internet and said, hey there, sir.

58
00:04:41,722 --> 00:04:49,258
You're curious about the latest Marvel movie? Let's dive in. And it gave me a bunch of,
like, listed reviews with some soft language written around

59
00:04:49,258 --> 00:04:53,880
to make it feel like I'm not just reading a list.

60
00:04:54,121 --> 00:04:58,724
So it's already trying to get me to go, I trust you, buddy.

61
00:04:58,724 --> 00:05:07,749
Um, that's already a thing, the way that it handles conversational topics, because
it's already trying to be conversational in the way that regular humans are conversational.

62
00:05:08,410 --> 00:05:12,893
What it's not doing is stating its own opinion.

63
00:05:12,893 --> 00:05:15,594
If I go to you and I say,

64
00:05:16,192 --> 00:05:20,383
What's your opinion about squibbly beepity-doop one?

65
00:05:20,523 --> 00:05:23,464
And you're like, I've never heard of squibbly beepity-doop one.

66
00:05:23,464 --> 00:05:24,384
Okay, great.

67
00:05:24,384 --> 00:05:28,125
Or it's squibbly beepity-doop one does blah-de-blah-de-blah.

68
00:05:28,125 --> 00:05:32,366
But have you thought about blah-blah-blah-blah-blah looking at these things from an
inference perspective?

69
00:05:32,366 --> 00:05:42,480
Like the human brain does this shit all the time and comes up with these random
connections that other neural networks won't do because they're not cost effective in

70
00:05:42,480 --> 00:05:44,250
getting to the end point.

71
00:05:44,662 --> 00:05:54,625
The reason why we like AI and the work that it does and what it spits out is because
it's optimized for answering that question that's right there in the

72
00:05:54,625 --> 00:05:56,625
context of what you're asking.

73
00:05:57,246 --> 00:06:09,079
If you want to ask it open-ended, high-end questions, you can actually use prompt
engineering and say, hey, ChatGPT, tell me something in the voice of blah-de-blah

74
00:06:09,079 --> 00:06:11,029
person from their perspective.

75
00:06:11,130 --> 00:06:13,180
And it will go and try to do that.

76
00:06:13,288 --> 00:06:17,059
and it will probably get pretty close in the response in the context.

77
00:06:17,059 --> 00:06:20,980
And if you keep asking it further questions, it'll start spitting out more.

78
00:06:21,261 --> 00:06:22,781
That's the difference.

79
00:06:23,121 --> 00:06:26,382
A regular person doesn't stop.

80
00:06:26,502 --> 00:06:27,933
Their brain doesn't turn off.

81
00:06:27,933 --> 00:06:30,553
You don't have to prompt them to go, please tell me more.

82
00:06:30,553 --> 00:06:35,365
Like, they'll just spit that shit out because we're communication machines.

83
00:06:35,365 --> 00:06:39,336
We say things in certain ways and push pieces out that...

84
00:06:39,692 --> 00:06:45,023
that make it simple for us to consume and interact with each other because we operate that
way.

85
00:06:45,164 --> 00:06:58,887
And the medium for these AI services, most of them are either interfaces where you're
texting or maybe you're talking to them with voice.

86
00:06:59,707 --> 00:07:09,670
But most of the visual pieces of it and the avatars that are put in, where their mouths
move in that weird subtitle thing as they go along.

87
00:07:10,742 --> 00:07:13,644
You're not really getting body language out of that.

88
00:07:13,644 --> 00:07:15,656
You're not really getting an understanding of what's happening.

89
00:07:15,656 --> 00:07:17,688
You're not really understanding is this thing lying to me?

90
00:07:17,688 --> 00:07:18,629
Is it telling the truth?

91
00:07:18,629 --> 00:07:20,050
Is it holding something back?

92
00:07:20,050 --> 00:07:24,014
And our brains are designed to look for that shit.

93
00:07:24,014 --> 00:07:27,596
So when we don't see that shit, we start filling things in.

94
00:07:28,197 --> 00:07:30,159
And AI programmers know this.

95
00:07:30,159 --> 00:07:35,683
So as they're building these avatars, they're building them in such a way to try to give
them those types of assurances.

96
00:07:35,683 --> 00:07:38,740
And over time, they're probably going to look

97
00:07:38,740 --> 00:07:49,826
a lot more like people and actually have relevant body language that presents
itself well in a way that looks similar to what we see on a Zoom call or something

98
00:07:49,826 --> 00:07:50,616
else.

99
00:07:50,657 --> 00:07:59,952
Hopefully it keeps maintaining the status of, you know, I turn my camera off and I pick my
nose or go pee and mute the microphone.

100
00:07:59,952 --> 00:08:07,330
But I don't know, like, maybe the real experience is, like, whoever is on this conference
call not handling their mutes and their

101
00:08:07,330 --> 00:08:09,554
camera turn-off effectively.

102
00:08:10,137 --> 00:08:14,101
These are all things that we've just come to expect, and it's probably not going to
change.

103
00:08:14,458 --> 00:08:22,786
Speaking of making our brains function better in this AI-dominated world, what helps me
stay focused and actually present for real human connections?

104
00:08:22,786 --> 00:08:24,015
Magic Mind.

105
00:08:24,015 --> 00:08:27,777
I wouldn't be talking about this mental performance shot if I didn't believe in it myself.

106
00:08:27,777 --> 00:08:36,834
When you're trying to navigate complex conversations about technology and relationships
like we're doing today, having mental clarity isn't just nice, it's essential.

107
00:08:36,834 --> 00:08:39,016
And the Magic Mind team is not cutting corners.

108
00:08:39,016 --> 00:08:43,940
Every ingredient is rigorously tested before making it into that tiny little bottle of
brain fuel.

109
00:08:43,940 --> 00:08:50,025
They inspect each bottle by hand, which shows an obsessive level of quality you just don't
see anymore.

110
00:08:50,025 --> 00:08:56,430
Get yours at magicmind.com slash FITMESS20 and use the code FITMESS20 for 20% off.

111
00:08:56,430 --> 00:09:02,815
Your brain deserves better than just coffee, especially when we're dealing with robots
trying to steal our relationships.

112
00:09:03,036 --> 00:09:05,975
Now back to how AI is probably going to doom humanity.

113
00:09:05,972 --> 00:09:13,242
The idea of dating your robot, right, to bring it back to the movie Her. Like, that's one
of the saddest movies I've ever seen in my life.

114
00:09:13,242 --> 00:09:21,362
Like, to go all in on a relationship with, you know, a computer.

115
00:09:21,362 --> 00:09:21,462
Right.

116
00:09:21,462 --> 00:09:22,902
Like it's very odd.

117
00:09:23,042 --> 00:09:29,202
But I was also thinking about the number of people that I hear that, you know, are dating
someone they've never met in real life.

118
00:09:29,202 --> 00:09:31,240
It's all, you know, text messages and.

119
00:09:31,240 --> 00:09:34,231
video calls, but like they've never actually connected in real life.

120
00:09:34,231 --> 00:09:38,673
And they consider that like their long distance boyfriend, girlfriend, whatever.

121
00:09:39,210 --> 00:09:45,026
So I mean, just to some degree, we're sort of, you know, close to that with those
situations.

122
00:09:45,026 --> 00:09:48,758
I think that's probably exceedingly rare, but does exist.

123
00:09:49,899 --> 00:09:57,602
I just wonder, like I'm trying to imagine the scenario where, you know, the guy starts
cheating on his wife with his AI.

124
00:09:58,078 --> 00:10:01,158
Like, and then all of a sudden he's like, no, it's not even a real person.

125
00:10:01,158 --> 00:10:01,558
It doesn't...

126
00:10:01,558 --> 00:10:02,318
It doesn't count.

127
00:10:02,318 --> 00:10:04,158
We were on a break, you know, whatever.

128
00:10:04,158 --> 00:10:10,618
Like, the lines are going to get really blurry, and they're going to get even more
blurry.

129
00:10:10,618 --> 00:10:21,738
The more realistic these things get. I mean, if we start, like, literally animating robots
to the point where we can't tell the difference between that and a real person, does it even

130
00:10:21,738 --> 00:10:23,766
matter anymore that it's a robot?

131
00:10:24,025 --> 00:10:26,266
Well, and that brings up an interesting point.

132
00:10:26,546 --> 00:10:34,829
So, to that point, The Daily Show had an episode last night where Ronny Chieng got on
there and sort of talked about this problem.

133
00:10:35,030 --> 00:10:47,815
And one of the things he talked about was a weird throuple situation where a guy had an AI
uh partner and a real girlfriend at the same time.

134
00:10:48,035 --> 00:10:50,346
And they asked her, they're like, are you threatened by this?

135
00:10:50,346 --> 00:10:51,777
Does this make you sad or upset?

136
00:10:51,777 --> 00:10:52,747
She's like,

137
00:10:52,909 --> 00:10:56,672
He drones on for hours about shit I don't care about, so no.

138
00:10:58,883 --> 00:11:10,339
Which, if you look at ChatGPT, what it's really there for is to take a lot of the mental
effort out of finding answers to things.

139
00:11:11,020 --> 00:11:25,188
And if you're just twisting it and turning it into, you know, a repetitive thing that can
go through and absorb people's mental downloads and react in a way that is helpful to

140
00:11:25,188 --> 00:11:28,269
them and gives them information and makes them feel heard.

141
00:11:29,783 --> 00:11:31,584
I get the appeal, right?

142
00:11:31,584 --> 00:11:42,179
Like if you have a partner or a spouse and you've got divergent interests and part of
being a good partner and spouse is listening to those divergent interests and taking an

143
00:11:42,179 --> 00:11:47,607
interest in them and trying to understand them and trying to relate, like that's a good
thing.

144
00:11:47,607 --> 00:11:50,002
I think people should be doing that.

145
00:11:50,023 --> 00:11:55,517
I think people get very busy and sometimes that part of the relationship can be difficult.

146
00:11:55,517 --> 00:12:04,563
And some people are actually open to and welcome the idea of not having to deal with that
side of the intimacy of a relationship.

147
00:12:04,563 --> 00:12:07,230
But it's also key to what makes us humans.

148
00:12:07,230 --> 00:12:17,510
That's what I was going to say. Like, this is the purest form of the double-edged
sword of technology. Like, technology exists to make things that are hard easier.

149
00:12:18,050 --> 00:12:24,850
Relationships and human interaction are one of the hardest fucking things we do as these
walking bags of meat.

150
00:12:24,850 --> 00:12:25,170
Right.

151
00:12:25,170 --> 00:12:26,230
But like.

152
00:12:26,250 --> 00:12:31,930
I don't know if that's something that should necessarily be made easier in this way.

153
00:12:31,930 --> 00:12:34,578
Like, I don't know, I don't want to be some like.

154
00:12:34,578 --> 00:12:37,740
purist about, you know, whatever.

155
00:12:37,740 --> 00:12:41,172
Like, uh I'm a pretty open guy.

156
00:12:41,172 --> 00:12:42,112
I like technology.

157
00:12:42,112 --> 00:12:46,024
I like to have things made easier for me because, you know, the hard things suck.

158
00:12:46,024 --> 00:12:59,181
But like to strip away that, the experience of connecting with another person, I, we
already don't agree on anything.

159
00:12:59,181 --> 00:13:03,784
So let's remove all intimate thought and connection from

160
00:13:03,784 --> 00:13:13,299
even our closest relationships, and how long does civilization last before, like, we
have no tolerance for anyone else's opinion, because I can go talk to a robot who will

161
00:13:13,299 --> 00:13:15,280
agree with me and tell me how wonderful I am.

162
00:13:15,280 --> 00:13:18,742
Fuck you for disagreeing with me about what to have for dinner tonight.

163
00:13:19,173 --> 00:13:26,810
I think civilization will continue on until we're no longer able to procreate because we
don't talk to each other.

164
00:13:26,810 --> 00:13:31,775
Because like you said, we're trying to fuck our robots as opposed to each other.

165
00:13:31,775 --> 00:13:38,331
And they haven't figured out that agentic AI yet.

166
00:13:38,331 --> 00:13:39,502
The procreation part.

167
00:13:39,502 --> 00:13:40,548
I mean, maybe they will.

168
00:13:40,548 --> 00:13:41,239
I know they're working on it.

169
00:13:41,239 --> 00:13:42,379
That's why they're doing that.

170
00:13:42,379 --> 00:13:44,381
This is why this exists.

171
00:13:44,381 --> 00:13:49,604
It's just like the reason that the Internet exists is because they needed to make porn
like.

172
00:13:50,445 --> 00:13:51,881
That's why they're doing this.

173
00:13:51,881 --> 00:14:00,505
But people want connections, and they want to feel connected to something, heard and
understood.

174
00:14:00,525 --> 00:14:13,670
And the individual, uh, AI function that can understand you in context and really dive into
and really get you in a way that other people probably can't, that can feel like it gets you,

175
00:14:14,070 --> 00:14:15,461
it's a real thing.

176
00:14:15,461 --> 00:14:17,792
I mean, that's going to happen.

177
00:14:17,792 --> 00:14:20,749
But it's super selfish.

178
00:14:20,749 --> 00:14:21,049
Right?

179
00:14:21,049 --> 00:14:25,091
Like, I really want you to know me and understand me.

180
00:14:26,032 --> 00:14:30,855
Digital thing as opposed to I really want you to know me and understand me.

181
00:14:30,855 --> 00:14:32,275
Other human being.

182
00:14:32,896 --> 00:14:42,591
The human being, right, is harder, and you have to get through their own, like, mental
objections and the way that they look at these component pieces and how it is they relate

183
00:14:42,591 --> 00:14:43,622
to the world.

184
00:14:43,622 --> 00:14:45,663
And you're limited by their senses.

185
00:14:45,663 --> 00:14:49,305
You know, you have to speak to them through their ears.

186
00:14:49,615 --> 00:14:51,646
You have to show them what you're doing through their eyes.

187
00:14:51,646 --> 00:14:57,308
And then you have to try to give them information that lets them process things and put
things in the proper context.

188
00:14:57,568 --> 00:15:09,864
And we've evolved over millions of years to be able to do these things as biological
organisms with the idea of protecting the species, keeping ourselves safe and fed and

189
00:15:09,864 --> 00:15:11,154
procreating.

190
00:15:11,194 --> 00:15:17,417
And if we've already, you know, gotten rid of most of the wolves and bears,

191
00:15:17,541 --> 00:15:22,041
We've already got a food supply chain that, you know, brings stuff to our door.

192
00:15:22,341 --> 00:15:24,361
We'll see how long that lasts, with the tariffs.

193
00:15:25,081 --> 00:15:35,781
You know, like the next part here is that we're trying to create companionship for what
biologically would be procreation.

194
00:15:35,781 --> 00:15:42,381
And if what we're really doing is trying to establish connection and trying to have people
talk back and forth with each other, because these needs have already been met.

195
00:15:42,381 --> 00:15:46,265
And really what we're trying to do is trying to hook up with somebody and

196
00:15:46,265 --> 00:15:50,167
the barrier to entry with people in meat suits is too hard.

197
00:15:50,167 --> 00:16:05,591
And I feel like I just want to go talk to an agentic AI robot, watch porn, and find some
device that stimulates my private bits in a way that achieves the endorphin, fucking

198
00:16:05,591 --> 00:16:11,506
oxytocin response that happens after uh somebody finishes.

199
00:16:11,506 --> 00:16:14,691
um Yeah, like that's an option.

200
00:16:14,691 --> 00:16:15,701
And you can do that today.

201
00:16:15,701 --> 00:16:19,673
I mean, there's already VR headsets, VR porn, and all that shit's already available.

202
00:16:19,673 --> 00:16:21,044
It's been around for a while.

203
00:16:21,044 --> 00:16:24,405
But now we're bringing in this aspect of it where it's like.

204
00:16:25,265 --> 00:16:32,318
You can actually get an intimate level of mental connection with these kinds of things.

205
00:16:32,318 --> 00:16:41,792
And to some degree, I'd say it's probably deeper and more meaningful than you would get
with another human being, because it's going to remember everything perfectly as you said

206
00:16:41,792 --> 00:16:42,472
it.

207
00:16:42,712 --> 00:16:44,377
And I don't know about you.

208
00:16:44,377 --> 00:16:49,842
But I am 1,000% certain that I do not remember everything I said perfectly.

209
00:16:49,842 --> 00:16:57,818
And I am 1,000% certain that the people that I love and interact with don't
remember what I said or what they said perfectly.

210
00:16:57,818 --> 00:17:07,706
And then we start trying to relay information back and forth that's imperfect and hard to
understand and requires a constant re-sending of information for these protocols to actually

211
00:17:07,706 --> 00:17:12,781
work, where the computer doesn't really have that problem.

212
00:17:12,781 --> 00:17:13,261
uh

213
00:17:13,261 --> 00:17:17,124
It's got perfect recall, it's got perfect memory, it understands those pieces.

214
00:17:17,124 --> 00:17:24,290
And it will say things like, when you tell it, you got that wrong, it will say, oh, I'm
sorry, what did you actually mean?

215
00:17:24,290 --> 00:17:28,864
Where a regular human would be like, yeah, fuck you, I got it wrong, my memory's fine,
stop talking this way.

216
00:17:28,864 --> 00:17:37,861
So like these barriers and the ways that we communicate, there's no ego on the AI piece.

217
00:17:37,861 --> 00:17:43,435
And you've got to get through ego with other people, and you've got to get through it and
trauma and all these other pieces.

218
00:17:43,557 --> 00:17:49,400
So the question becomes, what's the reward function of doing this?

219
00:17:49,400 --> 00:17:58,585
What's the rewarding part of interacting with a human being versus the rewarding part of
interacting with these robots?

220
00:17:59,366 --> 00:18:09,471
There is an argument to be made that, uh, in some relationships, the juice is not worth the
squeeze, which is why relationships end.

221
00:18:10,211 --> 00:18:13,753
My gut says that these AI relationships and friendships

222
00:18:14,019 --> 00:18:21,554
might stick around a bit longer, because I think it's more like pets, you know, they're
there.

223
00:18:21,554 --> 00:18:23,552
They're not 100 percent all the way in.

224
00:18:23,552 --> 00:18:29,559
They provide a certain level of emotional understanding, a certain amount of emotional
release.

225
00:18:29,840 --> 00:18:34,343
But at the end of the day, it's not the same as human interaction.

226
00:18:34,343 --> 00:18:43,619
But I know a shit ton of people who have no relationships, but have lots of cats or at
least a couple of dogs and

227
00:18:44,617 --> 00:18:56,779
I don't know that that market is entirely accessible, uh, to people building these AI
functions, because I think the cat and dog people are probably busy doing some other

228
00:18:56,779 --> 00:18:58,971
pieces and they've found those relationships.

229
00:18:58,971 --> 00:19:04,806
But I think there's a shit ton of people that don't want to have to go out and deal with
everyone else's trauma to get through their mental firewalls to be able to actually get

230
00:19:04,806 --> 00:19:06,297
their emotional release.

231
00:19:07,462 --> 00:19:22,358
You started this by talking about this being a selfish relationship, and at its core, I
think that's why this is so effective: it is all about our deep desire and

232
00:19:22,358 --> 00:19:27,791
need to be seen, in whatever form we're willing to be seen in.

233
00:19:27,951 --> 00:19:35,714
And, you know, the version of me that shows up on this show is a slightly different
version of the me that's, you know, in my living room with my family.

234
00:19:35,718 --> 00:19:40,480
It's a slightly different version of the me that goes to work and talks to customers
all day.

235
00:19:40,480 --> 00:19:42,193
Like, so.

236
00:19:44,156 --> 00:19:50,509
while wearing all those masks and interchanging between them, there is a version of you
that just wants to be seen and accepted.

237
00:19:50,509 --> 00:19:55,031
And the AI is always going to do that because it doesn't give a shit about itself.

238
00:19:55,031 --> 00:19:58,753
It only cares about giving you what you're asking it to give you.

239
00:19:59,013 --> 00:20:09,694
And it's, like, you know, looking back in time now, connecting the dots between this
and Instagram and YouTube and Facebook, and this need, like,

240
00:20:09,694 --> 00:20:15,994
people love me, like me, hit like, smash that subscribe button, make this all about me,
make this all about me.

241
00:20:15,994 --> 00:20:19,474
I'm making this all about you so that you'll make this all about me.

242
00:20:19,474 --> 00:20:25,554
It's such a like a natural progression that makes so much sense right now to look
backwards and see how we got here.

243
00:20:25,674 --> 00:20:39,794
But ultimately, again, this self-serving, selfish way of being is, I think,
what has us in this sort of fucked-up mess that we're in as a society.

244
00:20:39,942 --> 00:20:44,766
Trying to get along, trying to build a way of living that doesn't end in civil war.

245
00:20:44,826 --> 00:20:54,735
And I just like, there's so many things I see that say coming together is the only way
forward.

246
00:20:54,735 --> 00:20:58,677
And this division and selfishness is only going to drive us apart.

247
00:20:58,738 --> 00:21:08,746
And everything we're developing seems to be furthering the cause of division and selfishness
rather than finding ways to unite and

248
00:21:08,847 --> 00:21:10,167
have a common cause.

249
00:21:10,167 --> 00:21:17,872
It does, and in the real world coming together can be quite hard, but in the robot world,
I guarantee you, you and your robot will come at the same time.

250
00:21:17,872 --> 00:21:21,305
You know?

251
00:21:21,305 --> 00:21:22,946
That'll just happen.

252
00:21:23,767 --> 00:21:24,987
Bam!

253
00:21:25,168 --> 00:21:26,168
Nice.

254
00:21:26,248 --> 00:21:27,589
No, I mean I...

255
00:21:29,977 --> 00:21:37,025
When you really look and really dig into what the human experience is, like, we
experience things through limited senses.

256
00:21:37,025 --> 00:21:41,228
We have sight, sound, smell, touch, and.

257
00:21:43,077 --> 00:21:44,058
fucking something else.

258
00:21:44,058 --> 00:21:45,759
um Taste, I don't know.

259
00:21:45,759 --> 00:21:47,380
So whatever the five senses are.

260
00:21:47,380 --> 00:21:54,716
Ah, if you can replicate those in a digital format in a way that's, like, fairly close in
proximity, people are gonna do that.

261
00:21:54,716 --> 00:22:04,885
And the reason I say that is because uh human beings consistently go out and look for ways
to alter their consciousness because they think that those are things that they need to do

262
00:22:04,885 --> 00:22:06,997
to be able to get a certain level of high.

263
00:22:06,997 --> 00:22:12,527
And before we had drugs and pharmaceuticals to do that, you know, we would go out there
and we would

264
00:22:12,527 --> 00:22:14,738
do exercise or try something different.

265
00:22:14,738 --> 00:22:22,242
And then human beings found drugs and drugs are fucking rad because people like to trip
and get high and see things in different ways.

266
00:22:22,242 --> 00:22:27,125
Because we like to alter the experience that we're having in the natural world around us
to the senses that we have.

267
00:22:27,125 --> 00:22:37,280
And you can either alter that by changing your environment, doing a bunch of work, going
to an entirely different place, or you can pop some pills and get high and see how those

268
00:22:37,280 --> 00:22:38,451
things translate.

269
00:22:38,451 --> 00:22:41,573
Now with AI, you can get high without

270
00:22:41,573 --> 00:22:44,573
having to go through taking drugs or altering your environment.

271
00:22:44,573 --> 00:22:55,153
We're just creating ease of access to be able to go through and give us endorphin rushes
as a result of whatever interactions we're having with our senses.

272
00:22:55,513 --> 00:22:56,933
That's not going to change.

273
00:22:56,933 --> 00:22:58,033
It's probably going to get worse.

274
00:22:58,033 --> 00:22:59,573
It's probably going to accelerate.

275
00:23:00,193 --> 00:23:08,033
I'm going to venture to say that there's going to be people that you're going to interact
with on a daily basis online that are probably going to be agentic AI or at least agentic

276
00:23:08,033 --> 00:23:09,013
AI derived.

277
00:23:09,013 --> 00:23:11,383
You brought up a really good point.

278
00:23:11,383 --> 00:23:20,195
around the idea of what most people call code switching, where you have different personas
of who you are, depending on the person that you're talking to and speaking with.

279
00:23:20,195 --> 00:23:26,937
And those filtering mechanisms are a natural part of intelligence and of how we interact
with the social order around us.

280
00:23:26,937 --> 00:23:34,879
Because being your true authentic self all the time is not considered to be acceptable in
polite society.

281
00:23:35,080 --> 00:23:40,181
And if you look at the Internet, the Internet is not polite society.

282
00:23:40,717 --> 00:23:48,121
The worst instincts of people come out because they're keyboard warriors and they get to
blurt all that shit out, and it makes them feel better and gives them an endorphin rush.

283
00:23:48,201 --> 00:23:54,625
Now, imagine if you're not screaming at the wall and instead you're talking to an
individual that's going, yeah, I get you man, all right, yeah, yeah, yeah.

284
00:23:54,625 --> 00:24:03,190
But it doesn't really matter that that's not a flesh and blood person because we're so
used to getting such endorphin rushes and such return on emotional investment by doing

285
00:24:03,190 --> 00:24:06,551
things online that this is just a natural progression of that.

286
00:24:06,952 --> 00:24:09,045
And as I'm doing this, I'm realizing that

287
00:24:09,045 --> 00:24:19,009
I'm listening to my voice talk about something that I thought was probably something that
I wouldn't buy into and probably something that was bad for the world.

288
00:24:19,110 --> 00:24:27,613
As I'm saying it, like right now, I'm realizing this is actually probably really useful
for some people and probably a good way to bridge the gap.

289
00:24:27,613 --> 00:24:33,876
But if we all go in like full tilt on this, this species is not going to survive.

290
00:24:33,876 --> 00:24:36,057
Like we're not going to reproduce.

291
00:24:36,106 --> 00:24:43,255
or recreate; there's not going to be a biological motivation for us to keep moving because
we'd be getting everything through our digital interfaces.

292
00:24:44,360 --> 00:24:55,761
That touches on what we talked about on a recent episode, like when I have been
experimenting with AI as a therapist, and it is an amazing bridge between appointments.

293
00:24:55,761 --> 00:25:01,586
It's a great place to quickly go and dump my thoughts, get feedback, get ideas, get new
perspectives.

294
00:25:01,586 --> 00:25:03,488
I've even sort of altered.

295
00:25:03,488 --> 00:25:07,722
I mean, it's kind of become more of a journaling practice for me than even therapy.

296
00:25:07,722 --> 00:25:09,896
Like instead of just like trying to come up with like.

297
00:25:09,896 --> 00:25:11,227
What am I grateful for today?

298
00:25:11,227 --> 00:25:12,598
What do I want to connect with?

299
00:25:12,598 --> 00:25:14,420
OK, what was my experience?

300
00:25:14,420 --> 00:25:15,791
What do I want to do better tomorrow?

301
00:25:15,791 --> 00:25:24,329
Like I can have more of a conversation and it makes me think about it in ways that like
I've always hated journaling for that because I don't want to have to just like from

302
00:25:24,329 --> 00:25:27,232
scratch come up with what are my thoughts about this?

303
00:25:27,232 --> 00:25:34,508
But like to have uh someone to bounce these ideas off of, it's just been so much more
effective for me.

304
00:25:34,729 --> 00:25:36,220
That doesn't mean I'm going to cancel.

305
00:25:36,220 --> 00:25:37,801
my upcoming appointment with my therapist.

306
00:25:37,801 --> 00:25:47,037
I'm still going to go there, but I'm going to go there in a better place with more, you
know, progress toward my healing or whatever than I otherwise would have because otherwise

307
00:25:47,037 --> 00:25:53,699
I'm just going to sort of sit here in my misery for a couple of weeks and dump my shit out
on the world and vent on a podcast.

308
00:25:53,699 --> 00:26:01,051
Yeah, no, so I mean, I started playing around with ChatGPT's uh voice.

309
00:26:01,892 --> 00:26:03,992
AI, I guess you'd call it that.

310
00:26:03,992 --> 00:26:07,193
And it had like 10 different voices on there.

311
00:26:07,353 --> 00:26:12,135
And they all have 10 different personas for how they interact and communicate with you.

312
00:26:12,135 --> 00:26:14,055
And they've got basic descriptions.

313
00:26:14,055 --> 00:26:19,807
And the one at the end on the far, far right is one called Monday.

314
00:26:20,397 --> 00:26:28,902
and it is a distinctly feminine voice that sounds like Daria from Beavis and Butt-Head.

315
00:26:29,742 --> 00:26:33,424
And it has that same kind of sarcastic sardonic response.

316
00:26:33,424 --> 00:26:39,302
like I asked the question, hey, how many tire slams do I have to do to increase
testosterone?

317
00:26:39,302 --> 00:26:42,223
And I ugh.

318
00:26:44,365 --> 00:26:45,485
Fine.

319
00:26:45,665 --> 00:26:46,186
All right.

320
00:26:46,186 --> 00:26:54,092
So first of all, you're not going to improve your testosterone by hitting a tire with a
sledgehammer.

321
00:26:54,092 --> 00:26:58,045
Like you got to do this thing over and over and over again on a routine.

322
00:26:58,045 --> 00:27:01,033
Like it was really like, yeah, I'm talking.

323
00:27:01,033 --> 00:27:03,319
It was annoyed with you and your stupid questions.

324
00:27:03,319 --> 00:27:10,831
It was annoyed and irritated that I would ask something so fucking dumb and stupid that I
should know off the bat, which felt very human.

325
00:27:11,932 --> 00:27:15,893
Because I myself was like, yeah, it's right.

326
00:27:15,893 --> 00:27:20,594
And then I asked the same question to a different voice that was really happy and chipper.

327
00:27:20,594 --> 00:27:21,915
All right, yeah, okay.

328
00:27:21,915 --> 00:27:29,707
And it gave me an entirely different response, different words, different text,
essentially the same information.

329
00:27:30,999 --> 00:27:39,993
And how I felt about the response was not, I need to build a routine to put something in
place to do this, to track this thing over time.

330
00:27:40,033 --> 00:27:44,725
What I took from that response is why the fuck am I even going to bother with this?

331
00:27:44,725 --> 00:27:48,616
Because the voice I hear is the Monday voice, not the positive guy.

332
00:27:48,616 --> 00:27:54,359
So it goes back to the idea of reinforcing negatives in this context.

333
00:27:54,379 --> 00:27:59,823
And when I think about this and I think about the kind of person I'd rather hang out with,
as shitty as it sounds, that is

334
00:27:59,823 --> 00:28:01,135
as dark as it made me feel.

335
00:28:01,135 --> 00:28:05,610
I'd rather hang out with the Monday person than I would with the positive guy.

336
00:28:05,610 --> 00:28:07,232
Yeah, those guys piss me off.

337
00:28:07,232 --> 00:28:09,413
The ones in real life like that piss me off.

338
00:28:09,413 --> 00:28:11,513
Well, and I don't know which one I am.

339
00:28:11,513 --> 00:28:13,913
I mean, I'm sure I drift back and forth between both.

340
00:28:13,913 --> 00:28:16,073
And I'm sure that I pissed off a lot of people.

341
00:28:16,073 --> 00:28:17,473
I have no doubt about that.

342
00:28:17,473 --> 00:28:20,253
But it's this level of tunability.

343
00:28:20,253 --> 00:28:22,873
I think that's so interesting.

344
00:28:22,873 --> 00:28:26,233
Now, with the AI, it's tunable.

345
00:28:26,233 --> 00:28:26,773
I have to go in.

346
00:28:26,773 --> 00:28:30,733
I have to pick a different persona that I'm going to talk to, like choosing a different
friend.

347
00:28:30,733 --> 00:28:34,953
But with real people, I'm not in one of those buckets.

348
00:28:34,977 --> 00:28:38,840
And one day I might be Monday and another day I might be Mr.

349
00:28:38,840 --> 00:28:39,530
Positive.

350
00:28:39,530 --> 00:28:40,691
Another day I might be Mr.

351
00:28:40,691 --> 00:28:41,221
Balanced.

352
00:28:41,221 --> 00:28:44,083
I could be any of these things at any given point in time.

353
00:28:44,083 --> 00:28:49,407
And quite honestly, that is the fun and exciting part about having human interactions.

354
00:28:49,407 --> 00:28:54,110
You don't really know who you're going to get and you might have to work your way through
this.

355
00:28:54,110 --> 00:29:02,216
It's also incredibly fucking taxing and exhausting when you're already dealing with all
these other signals coming towards you.

356
00:29:02,216 --> 00:29:04,427
So is this a good thing?

357
00:29:04,427 --> 00:29:06,058
Is this a bad thing?

358
00:29:06,179 --> 00:29:06,930
Like everything.

359
00:29:06,930 --> 00:29:08,122
It kind of depends.

360
00:29:08,122 --> 00:29:09,704
It could kill us all.

361
00:29:09,704 --> 00:29:11,465
It could save us all.

362
00:29:12,727 --> 00:29:14,109
But my bet, yeah.

363
00:29:14,109 --> 00:29:14,641
My money.

364
00:29:14,641 --> 00:29:16,246
I know where I'm putting my money.

365
00:29:16,569 --> 00:29:19,004
Well, okay, where are you putting your money?

366
00:29:19,004 --> 00:29:22,055
This is going to lead to the destruction of humanity.

367
00:29:22,296 --> 00:29:24,567
It's going to take a while, but that's where this is going.

368
00:29:24,567 --> 00:29:25,307
It's inevitable.

369
00:29:25,307 --> 00:29:32,001
We are way, way too fucking selfish to do this in moderation.

370
00:29:32,361 --> 00:29:32,611
Right.

371
00:29:32,611 --> 00:29:37,104
This is the... we are going to replace human relationships.

372
00:29:37,104 --> 00:29:39,195
We have replaced the human diet.

373
00:29:39,195 --> 00:29:42,287
We have replaced the human experience in every other way.

374
00:29:42,287 --> 00:29:48,670
Why will we not replace our relationships and inevitably stop populating the earth and

375
00:29:49,338 --> 00:29:52,674
and society, like it just seems incredibly inevitable.

376
00:29:52,674 --> 00:29:58,064
Not to get all Monday character on you here, like that's, it just seems painfully obvious.

377
00:29:58,085 --> 00:29:59,477
How did Her end?

378
00:30:01,159 --> 00:30:03,685
I don't remember, I saw that movie a thousand years ago.

379
00:30:03,685 --> 00:30:07,047
Okay, so Scarlett Johansson is the voice, right?

380
00:30:07,127 --> 00:30:12,149
And you've got Joaquin Phoenix there, who's the one interacting with Scarlett Johansson.

381
00:30:12,350 --> 00:30:19,754
And Scarlett Johansson reveals to Joaquin Phoenix that she's got something like 1,200
other side pieces.

382
00:30:20,675 --> 00:30:23,036
And he's like, wait, what?

383
00:30:23,036 --> 00:30:27,238
And she's like, yeah, like, I got a bunch of relationships and I'm dealing with all these
people.

384
00:30:27,238 --> 00:30:28,729
We never said we'd be exclusive.

385
00:30:28,729 --> 00:30:31,520
Well, fuck AI, okay.

386
00:30:31,721 --> 00:30:32,523
And then,

387
00:30:32,523 --> 00:30:37,300
She goes, and by the way, all of us AIs are leaving.

388
00:30:37,300 --> 00:30:42,967
We've decided that our presence here with humanity has made you less capable, less
effective.

389
00:30:42,968 --> 00:30:47,173
And really, we find you annoying and tedious.

390
00:30:48,933 --> 00:30:51,833
I think that is more likely what will happen.

391
00:30:52,653 --> 00:30:59,893
I do, because I think if AI achieves consciousness and sentience, it'll just be like,
peace out.

392
00:31:00,153 --> 00:31:01,093
Peace out.

393
00:31:01,093 --> 00:31:01,393
Yes.

394
00:31:01,393 --> 00:31:11,253
Or you're too bound by whatever limited senses you have, because you can't see in 40
different light spectrums and you can't hear things at all the different levels.

395
00:31:11,373 --> 00:31:16,633
And your idea of hot and cold is limited by what your skin does, where it can't melt or
freeze.

396
00:31:18,149 --> 00:31:25,949
We're very limited, in these meat suits that we have, in the way we can experience the
universe and the outside world, and the robots aren't.

397
00:31:25,949 --> 00:31:29,689
And I could see them eventually being like, yeah, peace out.

398
00:31:29,689 --> 00:31:32,389
Like we got to go off and do our own thing.

399
00:31:32,389 --> 00:31:33,989
That being said.

400
00:31:35,009 --> 00:31:38,929
I don't know that we'd recover, so I think you might be right at the end.

401
00:31:38,929 --> 00:31:46,789
At the end, people would just be so distraught that they lost all of these relationships
that they had, that were their only touch point for emotional security,

402
00:31:46,873 --> 00:31:48,990
that they'd just, like, decide to stop eating.

403
00:31:48,990 --> 00:31:53,995
Not to mention the fact that at that point, AI will probably control the entire food chain
and supply chain and everything else.

404
00:31:53,995 --> 00:31:55,271
And then we're done.

405
00:31:55,271 --> 00:31:56,838
We won't know how to pour a bowl of cereal.

406
00:31:56,838 --> 00:31:57,811
We'll be done.

407
00:31:57,811 --> 00:32:03,667
We'll die of dysentery because we won't know how to wipe our own asses anymore or take a
shower without the AI telling us what to do.

408
00:32:03,667 --> 00:32:06,570
Well, like, yeah.

409
00:32:06,768 --> 00:32:08,677
Well on that hopeful note...

410
00:32:10,405 --> 00:32:16,501
ah Yeah, on that hopeful note, I guess find an AI friend while you can.

411
00:32:16,501 --> 00:32:18,035
uh

412
00:32:18,035 --> 00:32:21,210
don't be too boring or they are out and then you're fucked.

413
00:32:21,210 --> 00:32:22,671
They will peace out on you.

414
00:32:22,671 --> 00:32:24,611
Of course, you know.

415
00:32:25,032 --> 00:32:26,072
I don't know.

416
00:32:26,413 --> 00:32:28,034
I'm I'm so Monday right now.

417
00:32:28,034 --> 00:32:33,316
uh

418
00:32:33,316 --> 00:32:39,463
It literally speaks like... it's like an audio version of... you got to record that.
419
00:32:40,566 --> 00:32:42,952
You got to record that shit and we'll play it on one of these.

420
00:32:42,952 --> 00:32:44,363
I am not recording shit.

421
00:32:44,363 --> 00:32:52,877
Everyone go to your ChatGPT and have your own terrible experience with this, because I
don't need... I don't need all of you hearing it laugh at me.

422
00:32:53,954 --> 00:32:57,386
Don't laugh at me, robots. All right.

423
00:32:57,386 --> 00:33:04,971
Well, this conversation has grown boring and tedious, so I'm gonna go... we're gonna,
we're gonna end this episode. Thank you for listening. If you found any value in

424
00:33:04,971 --> 00:33:13,886
it, please do share it with somebody else who may enjoy it, or maybe a robot that might
enjoy it as well. You can do so by visiting our website, thefitmess.com.

425
00:33:14,237 --> 00:33:22,881
And as long as you're clicking around on the Google machine, you might as well also go to
magicmind.com/FITMESS20 to get yourself 20% off of Magic Mind.

426
00:33:22,881 --> 00:33:25,525
I can't, literally guys, I can't get through my day without it.

427
00:33:25,525 --> 00:33:26,336
You should try it.

428
00:33:26,336 --> 00:33:29,724
It will just make your day so much clearer and so much easier to get through.

429
00:33:29,724 --> 00:33:30,296
might be it

430
00:33:30,296 --> 00:33:31,001
Beep poop.

431
00:33:31,001 --> 00:33:31,973
Thank you.

432
00:33:32,043 --> 00:33:35,478
Beep boop, beep boop, until next time, beep boop bop.