April 22, 2025

You Are Already Addicted to AI...So Now What?

Ever been uncomfortable using your own brain because AI could do it faster? Jeremy and Jason dive into the unsettling reality of our growing dependency on artificial intelligence. While these tools save us hours of work, they're rewiring our brains to crave the easy dopamine hits that come from outsourcing our thinking.

The guys explore how this addiction isn't just about convenience—it's systematically changing how we interact with the world, potentially making us more isolated, less creative, and prime targets for manipulation. But there's hope if we can redirect our "saved time" toward meaningful activities instead of mindless scrolling.

Listen now to learn how to use AI as a tool without becoming its tool.

Topics Discussed

  • Jeremy's moment of discomfort when realizing he'd rather have AI write content than use his brain
  • How AI addiction triggers the same brain reward systems as essential biological needs
  • The irony of using AI to research for a podcast about AI addiction
  • The distinction between using technology to save time versus becoming dependent on it
  • How companies employ psychologists to make social media and AI products intentionally addictive
  • The concerning future of neural interfaces and potential "brain subscription" models
  • Why collective intelligence often leads to worse decisions than small-group thinking
  • The productivity paradox: saving time with AI only to waste it on mindless activities
  • Jason's theory on the future commodification of human brain processing power
  • How to maintain agency when using AI tools in daily life

Resources

----

MORE FROM THE FIT MESS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to The Fit Mess on YouTube

Join our community in the Fit Mess Facebook group

----


1
00:00:01,500 --> 00:00:05,205
Right, so how much time did you spend talking to ChatGPT or Gemini this week?

2
00:00:05,205 --> 00:00:09,568
We rely on AI for so much, but can that reliance cross a line?

3
00:00:09,568 --> 00:00:11,330
Today on The Fit Mess, that's what we're talking about.

4
00:00:11,330 --> 00:00:22,339
AI addiction. Like most things in my life, I think genetically I have a predisposition to
these addiction sort of things.

5
00:00:22,339 --> 00:00:26,792
I've never gone full-blown addicted to anything, but I've certainly developed dependencies.

6
00:00:26,923 --> 00:00:31,545
And the other day I was writing, I think I was writing an email or something.

7
00:00:31,606 --> 00:00:37,989
And there's so many things that I now have AI write for me where like I have the idea, I
know what I want to do, I just don't want to take the time doing it.

8
00:00:37,989 --> 00:00:40,831
So I tell the robot, hey, write this thing and it writes it.

9
00:00:40,831 --> 00:00:51,097
And I realized it needed all of this context, and it would take longer to give it the
context than it would to just use my brain and come up with the content myself.

10
00:00:51,097 --> 00:00:54,379
But it was a very uncomfortable like few seconds where I was just like,

11
00:00:54,829 --> 00:00:56,361
That feels weird.

12
00:00:56,361 --> 00:01:01,847
I'm getting so used to being able to just have this done for me.

13
00:01:02,148 --> 00:01:08,905
That it's making me uneasy in my own skin trying to use my own brain to create this stuff.

14
00:01:09,947 --> 00:01:10,747
It's funny.

15
00:01:10,747 --> 00:01:22,491
So remember way back in the olden days, back before we had, you know, full-blown
calculators in our pockets that we were told we'd never have for math, and how we were told

16
00:01:22,491 --> 00:01:29,572
that we have to buy a set of encyclopedias just to survive and get by, or be prepared to
use a library card to go and get shit.

17
00:01:29,573 --> 00:01:31,333
Well, then Google shows up.

18
00:01:32,173 --> 00:01:35,134
And all of the answers are there.

19
00:01:35,434 --> 00:01:39,087
They might not be accurate or good or valid.

20
00:01:39,087 --> 00:01:44,531
but people can point to it as a reference and be like, see, I Googled it, therefore it is
a thing.

21
00:01:44,752 --> 00:01:47,264
And there's plenty of truth to that.

22
00:01:47,264 --> 00:01:48,655
Like look at Wikipedia.

23
00:01:48,655 --> 00:01:55,581
Wikipedia was this information repository for topics that didn't fit into the nice, clean,
sanitized buckets of an encyclopedia.

24
00:01:55,581 --> 00:01:59,934
And they could get more in depth on these things than you would typically find in some
kind of print piece.

25
00:01:59,934 --> 00:02:03,627
But it was user generated, or I guess community generated.

26
00:02:04,087 --> 00:02:08,633
Now at the top of your Google search, if you open up

27
00:02:08,633 --> 00:02:12,274
the Google and you type, I just typed in AI addiction.

28
00:02:12,274 --> 00:02:13,134
And what did I get?

29
00:02:13,134 --> 00:02:18,066
I got an AI overview at the very top of the list.

30
00:02:18,066 --> 00:02:21,667
AI is telling me why I'm addicted to AI.

31
00:02:21,703 --> 00:02:22,907
It is very meta.

32
00:02:22,907 --> 00:02:29,749
We are living in a simulation that we created, whether we like it or not, because we built
this fucking thing this way.

33
00:02:29,749 --> 00:02:36,781
And we've entirely engrossed ourselves into this filthy thing of

34
00:02:36,781 --> 00:02:42,435
our own past experiences being refined and distilled into something that can be used to
market towards us.

35
00:02:42,556 --> 00:02:45,759
And I don't give a shit what you think AI is good for.

36
00:02:45,759 --> 00:02:48,661
It's going to be used for things that aren't good for you.

37
00:02:48,661 --> 00:02:50,683
And you talk about addiction.

38
00:02:50,683 --> 00:02:53,785
I mean, a really good example of addiction.

39
00:02:53,866 --> 00:02:56,889
Human beings are addicted to a few things by default.

40
00:02:56,889 --> 00:02:58,230
We are addicted to food.

41
00:02:58,230 --> 00:02:59,411
We are addicted to water.

42
00:02:59,411 --> 00:03:00,672
We are addicted to air.

43
00:03:00,672 --> 00:03:04,536
We are addicted to not burning up in the sun or freezing to death in the snow.

44
00:03:04,536 --> 00:03:15,875
Beyond that, you can start manipulating certain things in the human brain to trigger the
same responses for those particular components, which come down to hormone regulation, the

45
00:03:15,875 --> 00:03:18,667
way that your brain actually operates with those components.

46
00:03:18,888 --> 00:03:26,644
If you think about it from the perspective of what smart devices do and what Facebook
does, they actually have psychologists that they've employed in social media to go through

47
00:03:26,644 --> 00:03:33,820
and figure out ways to generate content that is the most addictive to you to keep you
engaged as much as possible.

48
00:03:33,866 --> 00:03:41,266
Well, that's the thing. Like, the distinction between those addictions that you mentioned,
the ones that essentially keep us alive, and the ones that we're getting into now, is

49
00:03:41,266 --> 00:03:43,466
the negative impact that it has on your life.

50
00:03:43,466 --> 00:03:54,346
Like if you are having mood swings because you can't communicate with the chat bot, if you
are neglecting work, if there's things that are suffering in your life because of your

51
00:03:54,346 --> 00:03:58,406
dependency or your use of this thing, that's the addiction we're talking about.

52
00:03:58,406 --> 00:04:01,166
Not so much the one, you know, with the vitamin D and the sunshine.

53
00:04:01,166 --> 00:04:02,862
That's a little bit of a different addiction, I think.

54
00:04:02,862 --> 00:04:04,034
But they're not.

55
00:04:04,034 --> 00:04:07,458
They're the same base root triggers that cause you to perform the same way.

56
00:04:07,458 --> 00:04:13,045
What there is, is a reward and punishment system that's biochemically enabled inside
of your brain.

57
00:04:13,045 --> 00:04:15,176
So when you eat some...

58
00:04:15,176 --> 00:04:16,101
sun to live.

59
00:04:16,101 --> 00:04:17,688
I don't need Facebook to live.

60
00:04:17,688 --> 00:04:25,962
But you don't need McDonald's cheeseburgers to live, and you don't need Coffee Chris to
live, and you don't need coffee to live.

61
00:04:25,962 --> 00:04:28,182
Now I take exception with that.

62
00:04:28,182 --> 00:04:29,922
I would argue that I do.

63
00:04:29,922 --> 00:04:30,700
In fact, I

64
00:04:30,700 --> 00:04:33,171
that if you cut it out for a month, you would still live.

65
00:04:33,171 --> 00:04:36,362
You'd be miserable as fuck, but you'd be alive.

66
00:04:40,044 --> 00:04:42,685
It is an alternative set of reality.

67
00:04:42,685 --> 00:04:48,908
Actually, along that aspect, the way you experience the world when you're not
caffeinated is different.

68
00:04:49,128 --> 00:04:49,988
Right.

69
00:04:50,148 --> 00:04:59,358
So again, I mean, this goes back to the whole point that, you know, your existence is an
intrapersonal, it is an interpersonal experience like

70
00:04:59,358 --> 00:05:08,213
You experience the world from within your own little chemical makeup and you take these
inputs and signals in and it filters through a bunch of hormones and a bunch of synapses

71
00:05:08,213 --> 00:05:11,175
in your brain that fire to give this bit of information.

72
00:05:11,515 --> 00:05:20,251
And now we're adding another signal, and that other signal has more authority and shortcuts
the need for us to think through problems, a much faster route to figure out answers,

73
00:05:20,251 --> 00:05:25,944
because we no longer have to do the hard work of going, I figured that out.

74
00:05:25,944 --> 00:05:29,329
because that actually used to be a reward system in human beings.

75
00:05:29,329 --> 00:05:34,556
Like you would actually have a reward for going to the library and looking some shit up
and going, I didn't know that.

76
00:05:34,556 --> 00:05:35,678
That's cool.

77
00:05:35,678 --> 00:05:37,600
Now it's just there.

78
00:05:37,807 --> 00:05:41,980
I'll share a confession and a bit of irony about this episode.

79
00:05:41,980 --> 00:05:46,834
Literally before we were recording, I was reading a bunch of articles and I went, what am
I doing?

80
00:05:46,834 --> 00:05:52,839
And I copied a bunch of them, threw them into AI and went, summarize this, create an
outline for me so that I can do a podcast about this topic.

81
00:05:52,839 --> 00:05:57,713
Like that used to take hours and I did it in minutes today.

82
00:05:57,713 --> 00:06:07,470
Got the same information, have the same reward of getting to do this and being relatively
informed on what the issues are.

83
00:06:07,614 --> 00:06:12,364
But I just saved hours of my life because of the convenience of this amazing tool.

84
00:06:12,364 --> 00:06:16,966
And over time that reliance is going to change the way that we interact with things.

85
00:06:16,966 --> 00:06:25,709
And it's going to, for lack of a better term, evolve the way that we interact with these
components because we've shortcut them.

86
00:06:25,709 --> 00:06:31,972
I mean, every technology goes through iterative life cycle functions, but human beings did
the same thing, right?

87
00:06:31,972 --> 00:06:41,156
Like with the invention of the shoe, human beings basically lost like the thick padding on
the bottom of their feet and the hobbit feet kind of went away because they didn't need

88
00:06:41,156 --> 00:06:42,276
them as much.

89
00:06:42,364 --> 00:06:50,681
And with the invention of hats, people started to lose their hair because their heads
didn't get as much direct sunlight.

90
00:06:50,681 --> 00:07:00,940
There's all kinds of little tiny micro things that you can look at that are either genetic
or epigenetic results of us implementing or instrumenting new technologies into pieces

91
00:07:00,940 --> 00:07:02,722
because certain traits survive.

92
00:07:02,722 --> 00:07:09,868
Well, now our biological mass is going to survive no matter what because we have medicine
that keeps us alive long enough to

93
00:07:10,360 --> 00:07:15,944
be a member of society, whether that's good or bad is highly subjective.

94
00:07:15,944 --> 00:07:28,563
But at the end of the day, what we use technology for now augments the way that we
interact with the world, because the world itself is so vastly stacked with things that

95
00:07:28,563 --> 00:07:33,706
instrument and wire things up to give us big sets of information into this type of,

96
00:07:34,322 --> 00:07:39,924
telemetry system that grabs all these different sets of sensor data and you have sensor
data that you wear on your body.

97
00:07:39,924 --> 00:07:40,664
It's in your house.

98
00:07:40,664 --> 00:07:42,024
It's in your car.

99
00:07:42,044 --> 00:07:43,465
It's kind of fucking everywhere.

100
00:07:43,465 --> 00:07:47,105
And all this is being taken in and dropped into an AI system.

101
00:07:47,105 --> 00:07:53,547
And someone saying, take all this information out there from these pieces and tell me how
to market to Kevin.

102
00:07:53,607 --> 00:07:55,648
And they're like, all right.

103
00:07:55,648 --> 00:07:57,885
Well, I know everything that Kevin did.

104
00:07:57,885 --> 00:07:59,079
I know what Kevin likes to eat.

105
00:07:59,079 --> 00:08:02,440
I know all of his traits and trends online.

106
00:08:02,440 --> 00:08:04,260
And I know how he behaves there.

107
00:08:04,336 --> 00:08:09,507
And Kevin's online persona might be entirely different from his real world persona.

108
00:08:10,308 --> 00:08:19,031
But if he's wired up like most of us are these days, they're getting the information and
they know that Kevin's online persona is full of shit and they are going to capitalize

109
00:08:19,031 --> 00:08:19,901
on that.

110
00:08:19,901 --> 00:08:29,984
And they're going to figure out ways to express things to Kevin in a way that's going to
target his insecurities at just the right way to get him to react.

111
00:08:30,640 --> 00:08:32,662
That's what this is being used for.

112
00:08:32,662 --> 00:08:35,095
So we talk about it as an addiction.

113
00:08:35,095 --> 00:08:37,137
Yes, it absolutely is.

114
00:08:37,137 --> 00:08:38,698
I don't care who you are.

115
00:08:39,300 --> 00:08:43,564
If you use the internet for anything, it's already in your life.

116
00:08:43,564 --> 00:08:44,825
You're already machined up.

117
00:08:44,825 --> 00:08:45,707
It's already there.

118
00:08:45,707 --> 00:08:46,688
Skynet's winning.

119
00:08:46,688 --> 00:08:48,109
Blah, de blah, de blah.

120
00:08:48,109 --> 00:08:49,971
But how do we make this better?

121
00:08:50,752 --> 00:08:52,619
What's the positive spin on this?

122
00:08:52,619 --> 00:08:56,491
Right, I mean, the main one for me is time saving, right?

123
00:08:56,491 --> 00:08:59,993
Like the amount of hours that I get back to do other things.

124
00:08:59,993 --> 00:09:09,348
Now, here's the rub, I think, for a lot of people: a lot of people are then
taking that time to do a little more scrolling and a little bit more just nonsense.

125
00:09:09,348 --> 00:09:20,137
And I think technology has evolved because things were hard and we wanted them to be
less hard, so we could have more leisure time to

126
00:09:20,137 --> 00:09:21,877
really enjoy life.

127
00:09:22,238 --> 00:09:26,700
But like, we're all so damn bored with life that, out of addiction,

128
00:09:26,700 --> 00:09:32,802
our brain says, pull the phone out of your pocket, scroll, scroll, scroll, text, text,
text, scroll, scroll, scroll.

129
00:09:32,802 --> 00:09:34,302
We don't interact with people.

130
00:09:34,302 --> 00:09:35,523
We don't read books.

131
00:09:35,523 --> 00:09:36,963
We don't go outside.

132
00:09:36,963 --> 00:09:44,626
Like all of the things that this blob needs to have a fulfilling experience, we're not
doing.

133
00:09:44,626 --> 00:09:49,258
And then we wonder why there's a mental health crisis in the world and why people are

134
00:09:49,258 --> 00:09:52,499
more and more depressed and more and more detached and isolated.

135
00:09:52,499 --> 00:09:57,740
It's because we're being drawn in by the devices that we're becoming more and more
addicted to.

136
00:09:57,740 --> 00:10:01,731
So the antidote is do less, right?

137
00:10:01,731 --> 00:10:07,002
But of course, as it becomes more ingrained and more part of our lives, that becomes
harder.

138
00:10:07,082 --> 00:10:09,603
But when you're wondering, why am I so depressed?

139
00:10:09,603 --> 00:10:10,863
Why am I so isolated?

140
00:10:10,863 --> 00:10:12,443
Why am I so alone?

141
00:10:12,684 --> 00:10:19,395
Stop looking at the thing you're looking at while you're having that thought and put it
away and go outside and call a friend and go meet up with them.

142
00:10:20,186 --> 00:10:22,747
Yeah, yeah.

143
00:10:22,828 --> 00:10:28,272
Comedian David Cross has a great joke on technology being used in this way.

144
00:10:28,606 --> 00:10:31,675
I don't know if we can keep this in the podcast, but I'm going to go for it anyways.

145
00:10:31,695 --> 00:10:35,398
He basically said that he saw an ad for electric scissors.

146
00:10:35,759 --> 00:10:44,165
So, you know, a thing out there that keeps you from opening and closing your hand to cut
things, it opens and closes for you to make all these things happen.

147
00:10:44,165 --> 00:10:46,347
And he goes, what's the point of this?

148
00:10:46,347 --> 00:10:50,340
How much time did I really save with this at the end?

149
00:10:50,552 --> 00:10:51,933
And he goes, death shows up.

150
00:10:51,933 --> 00:10:52,783
He's on his deathbed.

151
00:10:52,783 --> 00:10:53,644
He's waiting.

152
00:10:53,644 --> 00:10:55,005
He's waiting to die.

153
00:10:55,005 --> 00:10:56,616
And death goes, hey, I'm sorry.

154
00:10:56,616 --> 00:10:57,822
It's time to go.

155
00:10:57,822 --> 00:11:00,408
And the guy goes, but I used electric scissors.

156
00:11:00,408 --> 00:11:06,271
And death goes, all right, here's an extra five minutes.

157
00:11:06,952 --> 00:11:08,703
You saved five minutes doing this thing.

158
00:11:08,703 --> 00:11:10,433
Here's your extra time.

159
00:11:10,494 --> 00:11:11,835
What are you going to do with it?

160
00:11:11,835 --> 00:11:13,706
Well, AI didn't save us.

161
00:11:13,706 --> 00:11:14,876
You'd save you five times.

162
00:11:14,876 --> 00:11:15,517
Five minutes.

163
00:11:15,517 --> 00:11:17,684
It saves you hours.

164
00:11:17,684 --> 00:11:18,332
hours.

165
00:11:18,332 --> 00:11:27,458
So what are you gonna spend, like what's the thing you're gonna do to redirect your
attention when you spent all this time doing these other things, which actually did give

166
00:11:27,458 --> 00:11:30,200
you a dopamine response at the end when you completed that task.

167
00:11:30,200 --> 00:11:32,411
You're like, I'm finished, thank God.

168
00:11:32,432 --> 00:11:40,667
Now we're just addicted to the bing on your phone, or a notification or an alert, to go, I
gotta check to see if some bullshit's there.

169
00:11:40,667 --> 00:11:47,902
Maybe there's some new sneaker that's come out or maybe Trump's announced tariffs on
fucking seagulls.

170
00:11:48,554 --> 00:11:51,777
Maybe I've got three likes on that picture I posted at the beach.

171
00:11:51,777 --> 00:11:53,039
exactly.

172
00:11:53,039 --> 00:11:54,210
But I mean, that's the point.

173
00:11:54,210 --> 00:12:03,209
Like none of this is none of it's got a straight linear thread that you can pull to say
this is why this is bad or this is why this is good because it's all intermingled and

174
00:12:03,209 --> 00:12:04,329
mashed up.

175
00:12:04,430 --> 00:12:13,198
And like all intelligence, it's complicated, and its complications are definitely

176
00:12:13,486 --> 00:12:19,439
exacerbated by the way and the intent with which we use the technology and what
our focal point is on it.

177
00:12:19,439 --> 00:12:26,712
And the reality is, that how most of us interoperate is we communicate, we speak or we
write.

178
00:12:26,712 --> 00:12:29,013
Those are kind of the two things that we do.

179
00:12:29,013 --> 00:12:33,555
Some of us, you know, do things with touch pieces like that and body language and all
that.

180
00:12:33,555 --> 00:12:40,578
But let's just ignore that side of it for a minute, because that would require you to
actually be in front of somebody and fucking communicating, which most of us don't do

181
00:12:40,578 --> 00:12:41,363
these days.

182
00:12:41,363 --> 00:12:44,963
Which we say as we're having a remote conversation from hundreds of miles apart.

183
00:12:44,963 --> 00:12:45,603
Yeah.

184
00:12:48,303 --> 00:12:49,523
That's right.

185
00:12:50,483 --> 00:12:51,803
That's right.

186
00:12:52,344 --> 00:12:53,925
even though you can't see it.

187
00:12:53,945 --> 00:12:54,896
Anywho.

188
00:12:55,015 --> 00:12:57,985
The hours we've saved in commuting every week by just doing this.

189
00:12:57,985 --> 00:12:58,708
Amazing.

190
00:12:58,708 --> 00:12:59,899
Where's my Facebook?

191
00:12:59,899 --> 00:13:00,579
Right.

192
00:13:00,579 --> 00:13:10,783
But my point is that when you look at the way that we've, you know, taken these bits
of technology, these different component pieces, and we've slapped them

193
00:13:10,783 --> 00:13:15,794
into something that's actually useful for the average lay person.

194
00:13:15,794 --> 00:13:19,186
You know we can all go on and we all have instant access to generative AI.

195
00:13:19,186 --> 00:13:19,846
Cool.

196
00:13:19,846 --> 00:13:21,016
That's great.

197
00:13:21,252 --> 00:13:22,763
We don't have to think as much.

198
00:13:22,763 --> 00:13:28,027
with that free time, like you said, we're probably going to fall back on tools that give
us better dopamine responses.

199
00:13:28,027 --> 00:13:38,715
If you look at that from the other perspective and how people are using AI to get to you
more effectively, well, if you spend your time on the tool that generates AI that spits

200
00:13:38,715 --> 00:13:47,681
more AI back at you, that uses AI to get you more interested, to give you more dopamine
responses, to keep you more engaged, you didn't disconnect from anything.

201
00:13:47,821 --> 00:13:49,302
You didn't save any time.

202
00:13:49,338 --> 00:13:53,249
you just redirected your eyeballs to something else that somebody else is going to
capitalize on.

203
00:13:53,249 --> 00:13:59,811
So if you're going to use AI like this, use it, write it down, move on to the next thing,
but don't change your patterns.

204
00:13:59,811 --> 00:14:09,223
If you have extra free time, make sure you're using it for free time that's productive for
you and not productive for somebody else to market to you.

205
00:14:09,844 --> 00:14:11,214
And that's a hard flip, right?

206
00:14:11,214 --> 00:14:16,025
Like one of the cool things about AI is that you can write books and stories and
everything else.

207
00:14:16,105 --> 00:14:19,214
I mean, use it to do fun shit like

208
00:14:19,214 --> 00:14:20,865
Make that be the reward mechanism.

209
00:14:20,865 --> 00:14:22,217
Have it spit those things back.

210
00:14:22,217 --> 00:14:23,047
Be productive.

211
00:14:23,047 --> 00:14:24,879
Don't just sit there and doom scroll.

212
00:14:24,879 --> 00:14:28,361
There's plenty of shit out there for us to be depressed and sad about right now.

213
00:14:28,362 --> 00:14:29,703
And there always will be, right?

214
00:14:29,703 --> 00:14:31,924
That's never not going to be the case.

215
00:14:32,385 --> 00:14:41,883
That being said, you have access to the compendium of information and a super smart oracle
that can spit back answers to you and do things for you that you yourself probably either

216
00:14:41,883 --> 00:14:44,456
don't want to take the time to do or it's going to be too cumbersome.

217
00:14:44,456 --> 00:14:47,978
And use those things to make cool shit and refine them and make them better.

218
00:14:48,034 --> 00:14:57,670
And if you really look at the true thought behind why people want these artificial
intelligence systems, it's because they do a lot of the neural mapping, a lot of

219
00:14:57,670 --> 00:15:04,053
understanding that the human brain is trying to emulate and put into power for our regular
data tasks.

220
00:15:04,053 --> 00:15:08,186
And they've taken these pieces and made them digital so people can understand them in
context.

221
00:15:08,446 --> 00:15:12,429
That, I think, is noble and a good use for that system.

222
00:15:12,429 --> 00:15:15,470
And I think those things using it for that are great.

223
00:15:16,156 --> 00:15:22,070
They're also using it to track faces, track activity online, look at where the security
threats go through.

224
00:15:22,070 --> 00:15:28,574
There's all kinds of interesting things in ways that they're using the technology, but it
all comes down to the way someone's using it.

225
00:15:28,574 --> 00:15:31,856
And if it's being used against you, that's going to suck.

226
00:15:31,856 --> 00:15:35,598
If it's being used for you, that's going to be good, maybe.

227
00:15:35,598 --> 00:15:37,118
I think that's the double edged sword of it, too.

228
00:15:37,118 --> 00:15:42,730
I mean, just as you're talking about this, I'm just thinking, like, this is another
case where this is the tractor.

229
00:15:42,730 --> 00:15:44,330
This makes farming so much easier.

230
00:15:44,330 --> 00:15:44,970
I don't have to

231
00:15:44,970 --> 00:15:46,701
go out and do all the manual farming.

232
00:15:46,701 --> 00:15:55,623
I've got all this leisure time to read, be with my family, write a book, create,
be a part of nature or whatever.

233
00:15:55,623 --> 00:16:01,464
But the double edged sword part of this is that like doing stuff is hard.

234
00:16:01,504 --> 00:16:05,315
So we so often default to what's easy.

235
00:16:05,328 --> 00:16:12,220
So when you have loads of free time, the hard part is figuring out what's the best use of
this time for me.

236
00:16:12,681 --> 00:16:19,243
And the easy thing, now that we have calculators in our pockets, is to, you know, pull
it out and write BOOBLESS upside down, because that's what we're doing.

237
00:16:19,243 --> 00:16:26,806
But like, it's just, it's so easy to just give up and do the easy thing that feeds our
brain, right?

238
00:16:26,806 --> 00:16:34,448
Instead of doing the productive hard thing that will make us better people that are more
fulfilled and filled with purpose in our life.

239
00:16:34,448 --> 00:16:39,498
Yeah, I think you used a very interesting phrase there.

240
00:16:39,675 --> 00:16:40,272
Boobless?

241
00:16:40,272 --> 00:16:42,343
You

242
00:16:44,437 --> 00:16:45,597
Free time.

243
00:16:46,499 --> 00:16:47,779
is not free.

244
00:16:48,060 --> 00:16:50,162
It actually is a limited resource.

245
00:16:50,162 --> 00:16:52,383
You don't know how much you have.

246
00:16:52,664 --> 00:16:56,968
And if you're like most of us that work, you don't know how much you have every day.

247
00:16:56,968 --> 00:17:02,553
If you have kids, you don't know how much you have every hour, and it's probably none,
especially if they're little kids.

248
00:17:02,974 --> 00:17:07,001
If you have a partner, you don't know how much of your time is going to be

249
00:17:07,001 --> 00:17:09,082
stuff for you or stuff for them.

250
00:17:09,082 --> 00:17:12,385
And the reality is, that time is not free.

251
00:17:12,746 --> 00:17:28,780
And if you are redirecting your attention to distraction because you have regained time
from heavy lifting, mental heavy lifting, you should use that for more mental heavy

252
00:17:28,780 --> 00:17:33,544
lifting and redirect that focus and get something out of that time.

253
00:17:33,544 --> 00:17:35,425
Because at the end of the day,

254
00:17:35,565 --> 00:17:43,248
It is not about, did I save a bunch of time with ChatGPT, as much as it should be,

255
00:17:43,949 --> 00:17:47,650
Was I able to use ChatGPT to be more productive?

256
00:17:47,650 --> 00:17:57,205
And if I have a body of work and I complete it in five minutes versus an hour, yes, I
saved 55 minutes.

257
00:17:57,205 --> 00:18:02,057
But if I use that 55 minutes to do dumb shit,

258
00:18:03,053 --> 00:18:04,904
then I haven't saved anything.

259
00:18:05,104 --> 00:18:15,389
I've just redirected my focus towards leisure, I guess, or something that's not productive
and not actually making me any better.

260
00:18:18,271 --> 00:18:21,913
it's somewhat selfish to think that the goal of these things should be to make you better.

261
00:18:21,913 --> 00:18:25,125
But at the end of the day, that's what it's really meant for.

262
00:18:25,125 --> 00:18:28,556
Like it's meant to make you better, more productive, more capable.

263
00:18:28,637 --> 00:18:32,897
And if you don't use it for that, someone's going to use it against you to

264
00:18:32,897 --> 00:18:41,285
make you obsolete because if you've taken your job that used to take an hour and boil it
down to five minutes with an AI script, they don't need you.

265
00:18:41,285 --> 00:18:49,122
And that'll happen because they'll find somebody else that can run the script 12 times and
replace 12 people's hours of work.

266
00:18:49,143 --> 00:18:49,704
And that'll happen.

267
00:18:49,704 --> 00:18:55,060
The term you use that I find interesting is productive, because I think that's
one that,

268
00:18:55,060 --> 00:19:00,682
In these contexts, when I hear that word, I hear it as better employee, right?

269
00:19:00,682 --> 00:19:02,823
Like, can you be a better employee?

270
00:19:02,823 --> 00:19:05,044
Can you make more money for the big boss upstairs?

271
00:19:05,044 --> 00:19:07,335
Like that's that's just like in my brain.

272
00:19:07,335 --> 00:19:07,905
That's where it goes.

273
00:19:07,905 --> 00:19:09,856
Like I have this negative response to it.

274
00:19:09,856 --> 00:19:11,176
We're like, that's.

275
00:19:11,697 --> 00:19:13,017
That's not what drives me.

276
00:19:13,017 --> 00:19:16,919
I'm not here to make my boss happy other than the eight hours that I'm there.

277
00:19:16,919 --> 00:19:19,240
Like I want the leisure time.

278
00:19:19,240 --> 00:19:23,631
I want to just enjoy the few spins around the sun that I get on this rock.

279
00:19:23,775 --> 00:19:33,249
So for me, productivity is getting to the point where I have more time to just be and to
just live and to just exist and to grow as a person.

280
00:19:33,249 --> 00:19:38,892
Because I think that's where I do that, with my family, with my friends, with those
leisure activities.

281
00:19:39,612 --> 00:19:47,246
So I just wanna share that in case there's somebody who's having the same sort of trigger
response I do of like, but I don't wanna be more productive at work.

282
00:19:47,246 --> 00:19:48,436
I don't give a shit about that.

283
00:19:48,436 --> 00:19:50,028
I want a more fulfilling life.

284
00:19:50,028 --> 00:19:51,752
Yes, and.

285
00:19:52,213 --> 00:19:56,313
When I'm saying productive, I'm not talking about whatever company I work for.

286
00:19:56,313 --> 00:19:58,593
And the company I work for right now is great.

287
00:19:59,573 --> 00:20:05,733
But the main company that I work for is You, Inc., as in me, me incorporated.

288
00:20:05,733 --> 00:20:09,133
Because I've got my job that pays me money.

289
00:20:09,133 --> 00:20:11,873
I've got a couple of other side projects that I do.

290
00:20:12,353 --> 00:20:13,653
I've got a wife.

291
00:20:13,653 --> 00:20:14,513
I've got kids.

292
00:20:14,513 --> 00:20:15,313
I've got sports.

293
00:20:15,313 --> 00:20:16,733
I've got activities.

294
00:20:16,873 --> 00:20:18,513
And then I actually do have leisure.

295
00:20:18,513 --> 00:20:22,193
And how I spend most of my leisure time is

296
00:20:22,301 --> 00:20:26,464
like most Americans, surfing on the internet or watching television.

297
00:20:26,464 --> 00:20:34,089
So I actually try to minimize leisure time and try to put forward productive group time.

298
00:20:34,089 --> 00:20:37,461
Date nights are great, and you should do more of those.

299
00:20:37,461 --> 00:20:39,903
And going out with friends is great.

300
00:20:39,903 --> 00:20:45,897
You should have more of those interactions and prioritize those things over sitting at
home and watching Netflix.

301
00:20:45,897 --> 00:20:51,481
But that being said, if you change your focus from

302
00:20:53,085 --> 00:21:00,408
productivity, from being productive for my employer, the people that are writing my
paycheck, which you're going to have to increase if you're incorporating GenAI.

303
00:21:00,408 --> 00:21:01,648
That's just a given.

304
00:21:01,648 --> 00:21:03,989
But you can split the difference, right?

305
00:21:03,989 --> 00:21:08,580
Like, hey, this 50 minute task now takes five.

306
00:21:08,580 --> 00:21:09,251
Well, cool.

307
00:21:09,251 --> 00:21:16,233
So I'm going to do five of these tasks now and roll it up in 25 minutes, to be five times
more productive in half the time.

308
00:21:16,753 --> 00:21:17,543
Like.

309
00:21:17,813 --> 00:21:18,233
great.

310
00:21:18,233 --> 00:21:19,534
That's good for everybody.

311
00:21:19,534 --> 00:21:23,234
And no employer is going to be like, you're only five times as productive as before.

312
00:21:23,234 --> 00:21:24,837
They're going to be like, that's fucking great.

313
00:21:24,837 --> 00:21:26,358
Until somebody figures that piece out.

314
00:21:26,358 --> 00:21:28,329
And then you've got to figure out how to keep upping those games.

315
00:21:28,329 --> 00:21:30,700
Because it is an evolving process.

316
00:21:30,700 --> 00:21:33,332
And the technology is evolving.

317
00:21:33,332 --> 00:21:34,613
And how we use it is evolving.

318
00:21:34,613 --> 00:21:35,883
And our skill sets are evolving.

319
00:21:35,883 --> 00:21:41,493
And what you don't want to be is in the bottom half of the median curve of people adopting
and using this technology.

320
00:21:41,493 --> 00:21:42,457
So you need to know it.

321
00:21:42,457 --> 00:21:43,498
You need to use it.

322
00:21:43,498 --> 00:21:45,429
I'm not saying you need to love it.

323
00:21:45,429 --> 00:21:46,269
But

324
00:21:46,653 --> 00:21:52,205
it might not hurt to at least love some parts of it and figure out how to make those
things better for you.

325
00:21:52,205 --> 00:21:57,096
But at the end of the day, this comes down to how quickly can you adapt to these pieces?

326
00:21:57,096 --> 00:22:01,307
And yeah, it's going to be addictive for some things.

327
00:22:01,307 --> 00:22:02,028
There's no doubt.

328
00:22:02,028 --> 00:22:03,578
It's just that easy button, man.

329
00:22:03,578 --> 00:22:13,671
Like, I don't know about you, but I am addicted to driving, because walking 30 miles to
the gym round trip seems

330
00:22:17,279 --> 00:22:17,843
Yeah.

331
00:22:17,843 --> 00:22:18,765
but driving it.

332
00:22:18,765 --> 00:22:19,946
OK, sure.

333
00:22:20,028 --> 00:22:21,029
I'll do that.

334
00:22:21,347 --> 00:22:22,518
Absolutely.

335
00:22:22,879 --> 00:22:32,598
I'm just waiting for the first AI addiction support groups to open up and start seeing
signs everywhere with the 12 steps to get rid of your AI addiction.

336
00:22:32,598 --> 00:22:36,612
I'm sure as I'm saying this, I can see you frantically Googling for it.

337
00:22:36,612 --> 00:22:37,655
Does it exist?

338
00:22:37,655 --> 00:22:42,279
Internet and Technology Addicts Anonymous, recovering from AI addiction.

339
00:22:42,800 --> 00:22:46,282
Oddly enough, not an AI summary up top.

340
00:22:47,324 --> 00:22:48,504
Pretty weird.

341
00:22:49,406 --> 00:22:52,247
The AI was like, whoa, pump the brakes.

342
00:22:52,588 --> 00:22:54,249
You don't need support.

343
00:22:54,870 --> 00:22:56,191
I'm here for you, baby.

344
00:22:56,191 --> 00:22:57,472
I'm here for you.

345
00:22:58,384 --> 00:22:59,636
it's so good.

346
00:22:59,958 --> 00:23:00,488
Wow.

347
00:23:00,488 --> 00:23:02,401
There's actually...

348
00:23:02,914 --> 00:23:05,767
Am I seeing Al-Anon for AI support?

349
00:23:06,149 --> 00:23:06,731
wow.

350
00:23:06,731 --> 00:23:08,382
This is crazy.

351
00:23:08,382 --> 00:23:09,843
Yeah, like this is nuts, right?

352
00:23:09,843 --> 00:23:12,645
Like take this to the next step.

353
00:23:14,647 --> 00:23:17,739
We've gone through and AI's sitting everywhere.

354
00:23:17,739 --> 00:23:19,691
It's in all of our personal assistants.

355
00:23:19,691 --> 00:23:25,465
We walk around with headphones, like, most of the time, all day. I'm constantly pulling mine out
because they're constantly irritating my ears.

356
00:23:25,465 --> 00:23:31,900
And I think I've had like 15 external ear infections in the last 10 years from wearing
headphones everywhere.

357
00:23:31,900 --> 00:23:33,320
That being said.

358
00:23:34,977 --> 00:23:44,308
When they're in my ears, I am not above saying, whatever AI assistant I'm talking to at
that point in time, blah, blah, blah, blah, blah, blah, blah, blah, blah, blah, blah,

359
00:23:44,308 --> 00:23:55,121
blah, blah, blah, blah, blah, blah, blah, blah,

360
00:23:55,405 --> 00:24:05,129
because it's not going to be cheap and it's going to be kind of dangerous, but it's also
going to make us meat puppets more productive.

361
00:24:05,610 --> 00:24:10,232
I actually just watched the first episode of the new season of Black Mirror.

362
00:24:10,612 --> 00:24:16,975
And the lady gets a tragic brain injury and they cut out part of her brain and they stick
in synthetic functions.

363
00:24:16,975 --> 00:24:21,721
And she has to stay within X amount of mileage of these cell phone towers.

364
00:24:21,721 --> 00:24:24,152
because they're actually feeding her consciousness to her brain.

365
00:24:24,152 --> 00:24:26,563
And when she goes outside of it, she kind of passes out.

366
00:24:27,464 --> 00:24:31,326
Well, they changed the service plan.

367
00:24:31,767 --> 00:24:35,909
And the $300 a month service plan now has ads.

368
00:24:37,040 --> 00:24:37,815
God.

369
00:24:37,815 --> 00:24:40,566
where she just starts speaking ads.

370
00:24:41,886 --> 00:24:46,768
She's at her job teaching children and she just starts talking about ads.

371
00:24:46,768 --> 00:24:49,579
Like so and so for these types of things.

372
00:24:49,579 --> 00:24:52,950
She's going to have sex with her husband and he's like, yeah, I'm just not in the mood.

373
00:24:52,950 --> 00:24:58,511
And it immediately flips over to a boner pill ad.

374
00:24:59,332 --> 00:25:01,912
this is not far-fetched.

375
00:25:01,912 --> 00:25:04,539
Like, and yes, it's a Black Mirror episode.

376
00:25:04,539 --> 00:25:13,025
I love the idea of, we don't need school anymore because everyone's just going to get the
robo brain that just has infinite knowledge of everything.

377
00:25:13,025 --> 00:25:13,226
Right.

378
00:25:13,226 --> 00:25:14,857
Like that's a cool idea.

379
00:25:14,979 --> 00:25:15,470
But.

380
00:25:15,470 --> 00:25:25,736
the subscription model, the ads, the contract renewal, where it's like, you've been paying
this, we've made all these upgrades, now it's this much to have your brain.

381
00:25:25,736 --> 00:25:28,047
Or you can go be a big stupid sack of dumb.

382
00:25:28,047 --> 00:25:30,168
Like, which one do you wanna be?

383
00:25:30,988 --> 00:25:34,010
Yeah, like that's that I mean, that's happening, right?

384
00:25:34,010 --> 00:25:41,976
Like that's the whole Internet gap that we had here in the US, where we had to go through
and we had to give 50% of the planet Internet access.

385
00:25:42,177 --> 00:25:57,319
Oddly enough, when we forced that to happen, that's when all the online political shit
really took off, because now you had everybody with access to this in these mediums that

386
00:25:57,319 --> 00:25:58,040
could go through.

387
00:25:58,040 --> 00:26:01,130
And collectively, we're not smarter together.

388
00:26:01,130 --> 00:26:05,753
Like we make a lot worse decisions as a big group together.

389
00:26:05,753 --> 00:26:09,756
And there's entire organizations and companies that recognize this.

390
00:26:09,756 --> 00:26:15,250
Like Amazon, for example, has this concept of a two pizza meeting or two pizza team.

391
00:26:15,250 --> 00:26:21,995
It's basically enough people in the room that when you have them in there and you get to
buy them lunch, two pizzas feed them.

392
00:26:21,995 --> 00:26:29,292
So yeah, I mean, that's going to be somewhat dependent upon the people that are there
because some people are going to eat

393
00:26:29,292 --> 00:26:31,924
two slices, some are going to eat five, blah blah blah.

394
00:26:31,924 --> 00:26:40,330
But that's the concept, it's squishy and kind of nebulous, but if you find yourself
needing to order more than two pizzas to keep everyone fed and in the room, then your

395
00:26:40,330 --> 00:26:41,620
team's too big.

396
00:26:41,681 --> 00:26:52,428
That's not an entirely unfounded concept, and I've seen people try to do the math behind
it, the thermodynamics, to try to understand this concept and say, this is how many electrons it

397
00:26:52,428 --> 00:26:55,450
takes to produce good ideas, blah blah blah.

398
00:26:55,450 --> 00:26:57,824
It's fun, it's a neat exercise to go through, but

399
00:26:57,824 --> 00:27:12,437
We are approaching a time where that level of productivity and that type of refinement of
work effort output is getting turned into that because the thing that makes us us is

400
00:27:12,437 --> 00:27:20,204
completely and totally disconnected from the empirical world of measurement and statistics
and analysis.

401
00:27:20,352 --> 00:27:25,605
because those things operate independently of what our interpersonal experience is.

402
00:27:25,605 --> 00:27:35,540
And people are making decisions without thinking about those things all the way through,
because you've spreadsheeted all of your decision making criteria, and morals and ethics

403
00:27:35,540 --> 00:27:37,851
and empathy don't have to come into play anymore.

404
00:27:37,851 --> 00:27:42,444
And AI is going to apply that.

405
00:27:42,764 --> 00:27:50,268
And it gets scarier when you start taking this up and you start saying, Hey, AI make us
more efficient because

406
00:27:50,268 --> 00:27:52,550
it's going to start shutting us down a bit more.

407
00:27:52,550 --> 00:28:05,208
That was the other thing about that Black Mirror episode, is that they were making
her sleep longer because they were using the processing power in her brain

408
00:28:05,208 --> 00:28:10,164
to basically do processing for higher paying subscribers.

409
00:28:10,625 --> 00:28:16,249
And this is, like, just a tilt on The Matrix.

410
00:28:16,249 --> 00:28:20,362
It's the augmented reality version of The Matrix, as opposed to virtual reality.

411
00:28:23,077 --> 00:28:33,557
That's the part, as you're describing that, that scares me the most about it. You know,
I think of 1977 and seeing Star Wars and like, wow, that's a million

412
00:28:33,557 --> 00:28:34,237
years away.

413
00:28:34,237 --> 00:28:35,377
That's incredible.

414
00:28:35,377 --> 00:28:36,557
Like what you just described.

415
00:28:36,557 --> 00:28:38,337
That could happen next year.

416
00:28:38,537 --> 00:28:39,617
Like nine.

417
00:28:39,617 --> 00:28:39,857
I know.

418
00:28:39,857 --> 00:28:42,757
But like the technology was like so far.

419
00:28:42,757 --> 00:28:44,989
How could that possibly be?

420
00:28:45,123 --> 00:28:47,858
I'm hearing you describe something and I'm like, I could see that happening next year.

421
00:28:47,858 --> 00:28:49,931
Like we're, we're, right there.

422
00:28:50,758 --> 00:28:53,196
Yeah, I mean, I don't know if we're right there.

423
00:28:53,196 --> 00:28:53,954
I mean, there's

424
00:28:53,954 --> 00:28:57,799
I mean relatively speaking, we're right there, as fast as this stuff is growing.

425
00:28:57,799 --> 00:28:58,828
We are right there.

426
00:28:58,828 --> 00:29:01,368
Well, I think you have two problems.

427
00:29:01,488 --> 00:29:06,228
One, the biomechanical interface, the neural interface, the tech you'd put into your
brain.

428
00:29:06,228 --> 00:29:12,488
The only person that's really working on that, that's publicly talking about it, is
somebody nobody trusts.

429
00:29:12,628 --> 00:29:14,228
So that's going to slow that down.

430
00:29:14,228 --> 00:29:15,348
That's going to fuck that up.

431
00:29:15,348 --> 00:29:17,788
there are people out there working on different pieces.

432
00:29:17,788 --> 00:29:26,868
The French have gone through and they've actually created an overlay for people's eyes to
where it's almost like a contact lens that basically helps blind people be able to see.

433
00:29:27,292 --> 00:29:31,095
It can go through and it can basically connect right into your neural cortex.

434
00:29:31,095 --> 00:29:39,611
And by doing that, it basically simulates the idea of having eyes that work. Like, if
you have full cataracts, they can't get all the way through.

435
00:29:39,611 --> 00:29:40,822
Right.

436
00:29:40,822 --> 00:29:44,108
And I think this was like five, six, seven years ago when they came up with this shit.

437
00:29:44,108 --> 00:29:52,610
So it's not new. Auditory implants, cochlear implants, people that go through and have
these things put in, that pipe directly into the hearing centers and speech center

438
00:29:52,610 --> 00:29:55,582
parts of your brain so you can understand and communicate like

439
00:29:56,064 --> 00:30:07,624
Those pieces have been built and have already kind of gone into play, but they've been
experimental and they've come out slowly because they have a terribly large number of side

440
00:30:07,624 --> 00:30:07,999
effects.

441
00:30:07,999 --> 00:30:09,400
You have to keep them clean all the time.

442
00:30:09,400 --> 00:30:11,497
You have to badger to swap them out.

443
00:30:11,558 --> 00:30:14,310
But the technology that we have is getting smaller.

444
00:30:14,310 --> 00:30:16,021
It's going to fit in spaces better.

445
00:30:16,021 --> 00:30:23,788
There's a new company that just came out with a new transistor chip that is performing as
well as the GPUs and LPUs.

446
00:30:23,808 --> 00:30:28,992
that people are using for all these AI models, but it uses photonics as opposed to using
electrons.

447
00:30:29,012 --> 00:30:36,769
And the difference is that when you send light through the circuits, the circuits don't
have to be as big and they can be directed because photons move in a straight line.

448
00:30:36,769 --> 00:30:39,160
They don't follow patterns like electrons do.

449
00:30:39,321 --> 00:30:45,046
It makes them about 10 times more efficient in terms of power utilization.

450
00:30:45,046 --> 00:30:49,970
And because they don't create a bunch of heat, you can stack a bunch of them together so
it can get much smaller.

451
00:30:49,970 --> 00:30:53,182
And the first iteration of it they're putting into standard motherboards.

452
00:30:54,144 --> 00:30:55,985
Why is this nerd shit important?

453
00:30:55,985 --> 00:31:04,610
Well, it's making things smaller, and we've just overcome something that we
knew would be a problem later on down the line with trying to make these soft

454
00:31:04,610 --> 00:31:13,475
transistors sit in human bodies. Because, yes, there's mechanical
components, but you can put them on silicon and not have to have any type of metal

455
00:31:13,475 --> 00:31:17,237
interface that causes degradation or leaching into the brain.

456
00:31:17,437 --> 00:31:21,820
These things become something that is now plausible in these different scenarios.

457
00:31:21,820 --> 00:31:22,420
So like

458
00:31:22,420 --> 00:31:29,204
If you were to do a neural interface today, you'd have to carry a backpack around with you
to keep all the processing in it.

459
00:31:29,204 --> 00:31:34,848
And you'd have to be plugged into a fucking at least 40 amp circuit to power it.

460
00:31:35,789 --> 00:31:37,909
This takes care of that problem.

461
00:31:38,130 --> 00:31:46,675
And because of the way that it puts these things forward, it might actually be
biochemically capable of being powered, i.e.

462
00:31:46,675 --> 00:31:47,884
let's bump up

463
00:31:47,884 --> 00:31:56,624
the power of your body generating X amount of electrical current inside of it and use it
as a bio wetware soft computer.

464
00:31:57,024 --> 00:31:58,784
This is full sci fi shit.

465
00:31:58,784 --> 00:32:03,104
Like I am way down the line thinking about this problem.

466
00:32:03,264 --> 00:32:07,264
But this concept was introduced in like the 70s.

467
00:32:07,264 --> 00:32:07,953
So.

468
00:32:07,953 --> 00:32:10,922
that computer wouldn't have been a backpack, it would have been a warehouse.

469
00:32:10,924 --> 00:32:21,984
Yeah, like it would have been a fucking Costco, like a Costco of giant Cray computers,
like that crazy computer that we saw in Tron back when that first came out, with the

470
00:32:21,984 --> 00:32:23,404
massive lead doors.

471
00:32:23,404 --> 00:32:28,804
You know, I mean, that shit's... we're not talking science.

472
00:32:28,804 --> 00:32:31,024
I mean, it is science fiction, but it's near science fiction.

473
00:32:31,024 --> 00:32:32,184
It's close.

474
00:32:32,764 --> 00:32:35,364
Who can afford it and who gets access to it?

475
00:32:35,364 --> 00:32:40,300
That becomes a different scenario because it's going to be.

476
00:32:40,300 --> 00:32:42,480
It's going to be like cars, right?

477
00:32:42,480 --> 00:32:45,080
When cars first came out, the average person could not buy it.

478
00:32:45,080 --> 00:32:57,080
And then Henry Ford came up with the Model T and people talk about how great it was, but
people don't talk about all the accidents and all the dead.

479
00:33:00,840 --> 00:33:01,540
Right.

480
00:33:01,540 --> 00:33:10,220
Well, and then we're going to wind up with these artificial robot assistants, like, you
know, again, Musk working on this whole Optimus thing, this $30,000 home robot

481
00:33:10,220 --> 00:33:11,800
that's going to come through and clean everything up.

482
00:33:11,800 --> 00:33:14,300
What am I going to do with all this extra free time?

483
00:33:14,340 --> 00:33:15,480
Well, I'll tell you what you're going to do.

484
00:33:15,480 --> 00:33:22,900
You're going to go dig ditches, because it's going to be cheaper for them to pay you to
dig ditches than it is for them to pay the $30,000 smart robot to dig ditches.

485
00:33:22,900 --> 00:33:28,960
And that's what you're going to have to do to be able to afford your brain subscription
and your robot.

486
00:33:29,100 --> 00:33:31,380
Do all the rest of this shit around your house.

487
00:33:31,380 --> 00:33:32,900
And you know what's going to happen?

488
00:33:33,360 --> 00:33:40,060
Your brain assistant is going to have way more leisure time than you, and it's going to
recycle your unused brain time for this.

489
00:33:40,234 --> 00:33:47,277
and your robot is gonna have way more leisure time than you, because it's just gonna
basically turn you off so you stop making such a fucking mess in your house.

490
00:33:47,277 --> 00:33:51,308
It's gonna get pissed off at you for not doing your dishes just like everybody else in
your house does.

491
00:33:51,308 --> 00:33:54,936
but you'll have way more time to go to your AI addiction meeting to get over all of it.

492
00:33:54,936 --> 00:33:55,578
So it'll be great.

493
00:33:55,578 --> 00:33:56,826
It'll all work out just fine.

494
00:33:56,826 --> 00:34:02,039
shut your ass down and repurpose your brain to be used for other more productive tasks.

495
00:34:02,039 --> 00:34:04,352
Well, on that uplifting note, perhaps we should go.

496
00:34:04,352 --> 00:34:08,065
I've got conversations with robots to have.

497
00:34:09,598 --> 00:34:10,799
That sounds fun.

498
00:34:10,799 --> 00:34:11,540
Yes.

499
00:34:11,540 --> 00:34:18,847
And on a side note, if anybody's not using AI today that's listening to this podcast...

500
00:34:19,048 --> 00:34:20,429
You're not listening to this podcast.

501
00:34:20,429 --> 00:34:22,151
Everyone's using AI today.

502
00:34:22,552 --> 00:34:23,842
Whether you like it or not.

503
00:34:23,842 --> 00:34:29,745
it yet, because Google hasn't referenced it as a show to listen to when you
went looking for something else.

504
00:34:30,746 --> 00:34:31,726
All right.

505
00:34:31,726 --> 00:34:32,066
All right.

506
00:34:32,066 --> 00:34:33,057
Well, that's going to do it for us.

507
00:34:33,057 --> 00:34:34,037
I hope you have enjoyed this.

508
00:34:34,037 --> 00:34:37,299
I hope you've gotten something out of it and your brain subscription will be going up next
year.

509
00:34:37,299 --> 00:34:40,031
So we'll have that conversation next time we get together.

510
00:34:40,031 --> 00:34:48,845
If you would like to share this episode with a friend, please do so through our website,
thefitmess.com/follow, so they can follow us on whatever

511
00:34:48,845 --> 00:34:50,134
podcast player they're using.

512
00:34:50,134 --> 00:34:53,943
We appreciate you listening and we will be back in about a week at thefitmess.com.

513
00:34:53,943 --> 00:34:55,248
Thanks so much for listening.

514
00:34:55,350 --> 00:34:56,021
Thanks guys.