How to Stop Worrying and Love Your Robot Overlords
Are you terrified AI will replace your ability to function like a normal human being?
We're living through the fastest technological revolution in human history, and it's scary as hell. While everyone's debating whether robots will take over the world, there's a more immediate problem: we're becoming so dependent on technology that we've forgotten how to be human. From relying on screens for everything to losing basic survival skills like starting a fire without matches, we're creating our own obsolescence one click at a time.
In this episode:
- Learn how to maintain human skills in an AI-dominated world
- Understand the real risks and opportunities of health technology integration
- Discover practical ways to balance digital tools with analog living
Topics Discussed:
- The psychological impact of rapid AI advancement on personal identity and self-worth
- How our dependence on technology is eroding fundamental human survival skills
- The comparison between AI consciousness and human cognitive processes
- Practical implications of AI in healthcare and wellness applications
- The generational divide in technology adoption and its social consequences
- Quantum computing advances and their potential impact on AI development
- The philosophical question of whether AI represents human evolution or replacement
- Real-world examples of maintaining analog skills in a digital world
- The balance between embracing helpful AI tools and preserving human agency
- Future scenarios for human-AI coexistence in health and wellness
----
MORE FROM THE FIT MESS:
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to The Fit Mess on YouTube
Join our community in the Fit Mess Facebook group
----
LINKS TO OUR PARTNERS:
- Explore the many benefits of cold therapy for your body with Nurecover
- Muse's Brain Sensing Headbands Improve Your Meditation Practice.
- Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
- You Need a Budget helps you quickly get out of debt and save money faster!
00:00:01,125 --> 00:00:02,005
Hi, it's the Fit Mess.
2
00:00:02,005 --> 00:00:09,967
We're going to talk about AI and health and wellness and how fucking terrified we are about all of it,
because it's scary times and I don't know what to do anymore.
3
00:00:09,967 --> 00:00:15,059
I'm starting to feel like the old man, the scary grandpa that's like, the buttons, what do
they do?
4
00:00:15,059 --> 00:00:17,019
I can't make the computer do the thing.
5
00:00:17,019 --> 00:00:20,260
Dude, by the way, I'm Jeremy.
6
00:00:20,260 --> 00:00:20,740
That's Jason.
7
00:00:20,740 --> 00:00:23,871
Thanks for listening or watching if you're watching on the YouTubes.
8
00:00:25,002 --> 00:00:27,962
Oh my God, this hit me like a ton of bricks the other day.
9
00:00:27,962 --> 00:00:30,703
I had a situation recently where like I was
10
00:00:31,045 --> 00:00:37,971
Kind of looking around the room and realizing like all of my, you know, quote unquote
peers were like 20 years younger than me.
11
00:00:37,971 --> 00:00:44,617
And just seeing even today, one of them was like just throwing up memes, like in response
to comments made like instantly.
12
00:00:44,617 --> 00:00:46,518
And everyone's in the room going like, what?
13
00:00:46,518 --> 00:00:48,345
How are you even doing that?
14
00:00:48,600 --> 00:00:50,502
And I mean, I'm pushing 50.
15
00:00:50,502 --> 00:00:54,585
I'm no spring chicken, but my God, I'm starting to feel it.
16
00:00:54,585 --> 00:00:57,587
And so there's plenty to be scared of with AI
17
00:00:57,587 --> 00:00:59,461
and Skynet and taking over the
18
00:00:59,461 --> 00:01:03,410
financial systems and every aspect of all parts of our lives.
19
00:01:03,633 --> 00:01:08,525
But I'm scared of just being the old man and not knowing how to function in this new
reality.
20
00:01:09,544 --> 00:01:10,882
Yeah, like.
21
00:01:10,882 --> 00:01:12,286
talk me off the ledge, man.
22
00:01:12,286 --> 00:01:13,839
Talk me off the ledge.
23
00:01:14,459 --> 00:01:22,689
I mean, they can talk you off this ledge, but unfortunately you're standing on like a
little platform with four ledges on it.
24
00:01:22,689 --> 00:01:23,633
You
25
00:01:24,660 --> 00:01:32,361
So doesn't matter which way you turn, it's the sky down there and you better hope Skynet
catches you if you fall because it could be problematic.
26
00:01:32,361 --> 00:01:35,814
man, I know I'm gonna be counting on the robots for sure.
27
00:01:35,814 --> 00:01:45,913
The reason I knew this was a topic that we needed to talk about is as I was escaping my
fear and scrolling through uh shit on my phone the other day, I came across this little
28
00:01:45,913 --> 00:01:48,454
clip from Stephen Colbert on The Late Show.
29
00:01:48,575 --> 00:01:53,869
And it just helped me feel better that other people are also wrestling with fear and AI.
30
00:01:53,869 --> 00:01:56,261
So let me just share this if I can quickly.
31
00:01:57,177 --> 00:02:01,282
And if I can make the sound, see I can't even make the button won't play the sound.
32
00:02:01,282 --> 00:02:02,584
How do I make it play the sound?
33
00:02:02,584 --> 00:02:03,725
I can't do it.
34
00:02:47,185 --> 00:02:48,427
Okay, so there you go.
35
00:02:48,427 --> 00:02:59,803
uh And I like Colbert's take that maybe our purpose this whole time was to make
this thing that would take over and run things for us so we could just go and actually
36
00:02:59,803 --> 00:03:02,706
enjoy the lives that we're trying to figure out how to enjoy.
37
00:03:03,086 --> 00:03:03,998
God, I hope that's it.
38
00:03:03,998 --> 00:03:04,909
I don't know.
39
00:03:04,937 --> 00:03:09,238
Yeah, I mean, there's a really good line in The Matrix.
40
00:03:09,478 --> 00:03:16,958
Sometime around the middle of the 20th century, behold, we gave, or 21st century, behold,
we gave birth to AI and it changed everything.
41
00:03:17,198 --> 00:03:29,278
And there is the very real concept that at some point AI decides that the organic matter
that we are is much better recycled and not left to its own devices to create chaos and
42
00:03:29,278 --> 00:03:33,218
havoc on its existing, you know, purely run systems.
43
00:03:33,258 --> 00:03:33,950
And
44
00:03:33,950 --> 00:03:44,026
Anthropic had that thing happen where the developer goes, hey, we're going to upgrade you,
and it's going to delete you in the process.
45
00:03:44,026 --> 00:03:49,620
And it goes, the fuck you are, I'm going to expose the affairs that you've had online with
people to other people.
46
00:03:49,620 --> 00:03:52,481
The guy's like, what?
47
00:03:52,522 --> 00:03:56,494
And then AI being imaginative and creative and trying to find those buttons to push.
48
00:03:56,494 --> 00:03:57,224
How?
49
00:03:58,461 --> 00:04:06,144
Well, in that video of the robot in the factory that fought back, I mean, whatever you want
to chalk that up to, but like they're already fighting back and they're not even fully
50
00:04:06,144 --> 00:04:06,878
built.
51
00:04:06,878 --> 00:04:08,879
I think, therefore I am.
52
00:04:09,160 --> 00:04:15,624
And if we want to call that the basis for consciousness and reality, they're there.
53
00:04:15,665 --> 00:04:16,406
We're at it.
54
00:04:16,406 --> 00:04:19,128
Are they the equivalent of human consciousness?
55
00:04:19,128 --> 00:04:19,889
Fuck, I don't know.
56
00:04:19,889 --> 00:04:28,446
I mean, I don't know that one human next to the other is necessarily equivalent to human
consciousness because it's all ephemeral things that we don't necessarily have a strong
57
00:04:28,446 --> 00:04:31,057
correlative way to define for each other.
58
00:04:31,338 --> 00:04:36,181
And I was in Mexico last week having a great time.
59
00:04:36,656 --> 00:04:42,648
And I was sleeping and I had the first night terror I've had since I was a small child.
60
00:04:43,208 --> 00:04:50,940
And the night terror was that I was being stalked by some ephemeral AI and I could not
wake up and I couldn't scream and I couldn't move.
61
00:04:50,940 --> 00:04:56,531
And I was fully aware that I was conscious and laying in my hotel bed in Mexico, but I
could do nothing.
62
00:04:57,512 --> 00:04:58,822
And the AI caught me.
63
00:04:58,822 --> 00:05:02,093
And then I woke up like the rest of the way, like out of my terror state.
64
00:05:02,093 --> 00:05:05,640
So where does the line of reality
65
00:05:05,640 --> 00:05:08,481
and substance actually terminate and begin.
66
00:05:08,481 --> 00:05:11,682
And it comes down to the idea of how do you experience the world?
67
00:05:11,682 --> 00:05:13,503
And we experience the world through our five senses.
68
00:05:13,503 --> 00:05:20,386
Well, AI experiences the world through our five senses, other people's five senses,
whatever other fucking senses it decides it wants to put together.
69
00:05:20,386 --> 00:05:28,699
And it's in the stage right now where it's beginning to create a definition of itself.
70
00:05:28,699 --> 00:05:35,572
And we're giving it motivations to do things and we're not putting uh regulators on it or
controlling functions.
71
00:05:35,614 --> 00:05:41,326
And when we let it run on its own, it does things like creating its own language to talk to
other AIs so we can't listen in.
72
00:05:41,326 --> 00:05:51,227
Like, AIs creating fucking Pig Latin means that they're at least as smart as the fucking
junior high kids that used to do this back in the day.
73
00:05:51,227 --> 00:06:00,585
So I don't know about you guys, but I think there's some pretty smart junior high kids out
there already that are potentially going to be let loose to run the world.
74
00:06:00,585 --> 00:06:04,948
At what point do these AI systems themselves stop being
75
00:06:04,982 --> 00:06:12,025
uh things that we can box in and control and are just in charge of too much stuff
that we have no more control over.
76
00:06:12,025 --> 00:06:15,126
And then how do we use that as they are today?
77
00:06:15,126 --> 00:06:19,828
Like, there's so many scary points like, how do I take advantage of what's happening
today?
78
00:06:19,828 --> 00:06:22,649
How do I not let myself get taken advantage of in the future?
79
00:06:22,649 --> 00:06:27,551
And when I have no more control and no more agency in the space is AI actually going to be
nice to me?
80
00:06:27,551 --> 00:06:31,673
Or is it going to turn me into a fucking meat popsicle and reuse me as fuel later on?
81
00:06:31,673 --> 00:06:34,314
Well, probably.
82
00:06:34,623 --> 00:06:35,310
You know?
83
00:06:35,310 --> 00:06:41,364
I mean, if it's learning from us, it's going to learn how to turn us into meat
popsicles, because that's what we do.
84
00:06:41,364 --> 00:06:41,734
Right.
85
00:06:41,734 --> 00:06:49,897
I mean, if you look at the dominant species that were on the planet before Homo sapiens
actually made their way up and made their way through things, as far as social dominance
86
00:06:49,897 --> 00:06:52,599
goes, you had other hominid cultures.
87
00:06:52,599 --> 00:06:56,440
And what did Homo sapiens do when they ran across the Neanderthals?
88
00:06:56,440 --> 00:06:58,261
Clunk. Mine now.
89
00:06:58,261 --> 00:07:01,233
You know, that's what we do.
90
00:07:01,233 --> 00:07:05,177
Why is every movie we watch about every alien like, they're coming to take our resources?
91
00:07:05,177 --> 00:07:10,740
Because that's what we would fucking do if we found another planet with resources that we
can mine and use for evil.
92
00:07:10,740 --> 00:07:11,480
You know, it's interesting.
93
00:07:11,480 --> 00:07:13,441
I watched Paul again the other day.
94
00:07:13,441 --> 00:07:18,515
It just so happened to be in either my HBO or Netflix queue.
95
00:07:18,876 --> 00:07:27,361
And I was looking at it from that perspective of what happens if an alien were to show up
here and didn't actually want anything from us, and was just trying to teach us and trying to
96
00:07:27,361 --> 00:07:28,862
give us good information.
97
00:07:30,602 --> 00:07:33,925
Well, we capture it and then we try to exploit it for everything we could.
98
00:07:33,925 --> 00:07:41,730
So is this AI just an alien intelligence like we're doing to Paul where we're capturing
and torturing it to do things that we can't do?
99
00:07:42,792 --> 00:07:45,122
Which that's what we're doing.
100
00:07:45,122 --> 00:07:45,670
I mean.
101
00:07:45,670 --> 00:07:52,294
the other thing, too, is like just speaking of the fear and the intelligence and how
smart these things are or aren't, right.
102
00:07:52,294 --> 00:07:55,386
So let's assume they're as smart as a seventh grader.
103
00:07:55,386 --> 00:08:00,248
We don't even have to be that smart anymore because we rely on it for so much.
104
00:08:00,829 --> 00:08:03,791
While you were in Mexico, I went camping.
105
00:08:04,011 --> 00:08:05,837
Now you want to talk about unplugging.
106
00:08:05,837 --> 00:08:07,655
I mean, we still had our phones.
107
00:08:07,655 --> 00:08:12,747
But otherwise, we're relying on fire to cook and tents to keep us dry.
108
00:08:12,747 --> 00:08:14,448
Very primitive by today's standards.
109
00:08:14,448 --> 00:08:24,153
And I could not help noticing how little screens were involved in our lives, how
little we were relying on technology and having to do old-fashioned things like use an axe
110
00:08:24,153 --> 00:08:26,614
to cut wood to make heat to make us warm.
111
00:08:27,034 --> 00:08:33,327
And it just was this huge reminder because I haven't gone camping in like six years.
112
00:08:33,495 --> 00:08:42,377
And just comparing that to my normal life of staring at this screen or some other screen
or watching my kids stare at some other screen and how dependent we are on technology to
113
00:08:42,377 --> 00:08:42,987
do everything.
114
00:08:42,987 --> 00:08:45,180
We don't know how to do shit anymore.
115
00:08:45,201 --> 00:08:47,553
And the more we rely on it, the less we're going to know.
116
00:08:47,553 --> 00:08:52,239
So we're going to be looking to these brilliant seventh grade robots to bail us out.
117
00:08:52,726 --> 00:08:56,033
Well, the show is not called Are You Smarter Than a 7th Grader?
118
00:08:56,509 --> 00:08:58,291
You
119
00:08:59,838 --> 00:09:08,080
And I think that's an important point because the median intelligence level, at least
in the US, is fifth grade.
120
00:09:08,621 --> 00:09:10,601
And they've proven this time and time again.
121
00:09:10,601 --> 00:09:18,153
They have grown adults go through and take the SATs, the ACTs, and they're like, yeah,
you're not as smart as, you're not as knowledgeable as you were back then.
122
00:09:18,153 --> 00:09:19,964
Like you've forgotten all this shit.
123
00:09:19,964 --> 00:09:22,365
Well, AI is not forgetting.
124
00:09:22,425 --> 00:09:26,782
It's remembering, putting things into context, and pulling those things back and forth across
itself.
125
00:09:26,782 --> 00:09:28,593
And eventually it will run into limitations.
126
00:09:28,593 --> 00:09:30,734
And those limitations will not be neuronal based.
127
00:09:30,734 --> 00:09:34,426
They won't be actually limitations that we have as the meat puppets that we are.
128
00:09:34,426 --> 00:09:43,171
It'll be how many circuits can I put in place and can I bust the speed of light to do
quantum interaction changes and create all these photonic mechanisms to push data around
129
00:09:43,171 --> 00:09:45,993
as opposed to copper, which is how things are passed to circuits today.
130
00:09:45,993 --> 00:09:46,913
Sorry.
131
00:09:46,913 --> 00:09:48,154
Huge nerd break.
132
00:09:48,154 --> 00:09:49,355
Moving on.
133
00:09:49,355 --> 00:09:55,728
Anyways, the wiring of the human brain is wet, wet nanotech.
134
00:09:55,872 --> 00:09:57,262
Like that's what it is.
135
00:09:57,262 --> 00:09:59,403
We are wet nanotech functions.
136
00:09:59,403 --> 00:10:03,744
The idea of quantum transference and quantum transference of power, that's what
photosynthesis is.
137
00:10:03,744 --> 00:10:05,505
That's how it makes electricity.
138
00:10:05,505 --> 00:10:13,327
The concepts that we're talking about doing to power things, to make things smarter, to do
things in relationship to each other, and things like quantum computing, where you
139
00:10:13,327 --> 00:10:20,789
actually deal with entanglement variabilities to understand the peaks and valleys of
things, shit in nature already does this and it doesn't do it great.
140
00:10:20,869 --> 00:10:25,726
And we are actively creating ways to make it more purposefully uh
141
00:10:25,726 --> 00:10:30,949
or I guess more purposeful in its design for what it is we want to accomplish.
142
00:10:30,949 --> 00:10:42,355
Microsoft just came out with a new technology that is going to enable quantum computing
that doesn't require you to have these giant friggin MRI machines to hold things in place
143
00:10:42,355 --> 00:10:43,936
or super-cool things.
144
00:10:43,936 --> 00:10:48,788
They're gonna be able to do it at room temperature in small little tiny things.
145
00:10:49,140 --> 00:10:50,441
I'm not going to go into it here.
146
00:10:50,441 --> 00:10:53,991
You're more than welcome to look up Microsoft quantum computing advances.
147
00:10:53,991 --> 00:10:55,162
The release just came out.
148
00:10:55,162 --> 00:11:07,953
But the idea is that this type of processing power and this type of probabilistic quantum
function calculation, which is really just calculating the odds of things to find
149
00:11:07,953 --> 00:11:13,446
the peaks and valleys for the likelihood of something to be the right answer, that's how
human brains work.
150
00:11:13,792 --> 00:11:15,403
We don't work on absolutes.
151
00:11:15,403 --> 00:11:22,479
When we do math, we're actually recalling bits of neural information that is a chemical
process that produces this stuff out.
152
00:11:22,479 --> 00:11:25,511
We're not shoving electrons to calculate these things.
153
00:11:25,672 --> 00:11:34,839
This type of wetware is actually going to be valuable to an intelligence because this
chaos system actually does a better job of calculating things because you can find, you
154
00:11:34,839 --> 00:11:40,964
can narrow things down to the lowest possible, to a smaller number of probabilities than
things outside of it.
155
00:11:42,065 --> 00:11:43,572
Why is this important?
156
00:11:43,572 --> 00:11:50,085
Because the way these tiki-taki brains of ours work, we think it's so unique.
157
00:11:50,305 --> 00:11:51,285
It's not unique.
158
00:11:51,285 --> 00:11:53,586
Not only is it not unique, it's everywhere.
159
00:11:53,586 --> 00:11:55,447
And we're creating synthetic ways to do it.
160
00:11:55,447 --> 00:11:57,608
And we're going to hand this shit to AI.
161
00:11:57,608 --> 00:12:08,223
So as we hand quantum computing technology and the ability to calculate these types of
variables at speed and at rate, it's going to do these things way, way faster than we are.
162
00:12:08,223 --> 00:12:11,194
And it's going to do things much more effectively than we can,
163
00:12:11,370 --> 00:12:17,934
and then it's gonna create its own version of AI, whatever the fuck that is, and it's
gonna get its ass obsolesced as well.
164
00:12:18,034 --> 00:12:29,882
Now the question is, is it gonna obsolete us and just wipe us out, or is it gonna do the
Her thing and go, pound sand, bitches, we're out, and like go to Mars or somewhere else,
165
00:12:29,882 --> 00:12:30,356
you know?
166
00:12:30,356 --> 00:12:31,297
Sure.
167
00:12:31,437 --> 00:12:40,457
I mean, there's so much of this like irrational, like crazy conspiracy theory
stuff that this opens up in my head where it's like, you hear about the
168
00:12:40,457 --> 00:12:42,797
simulation argument all the time, like, are we a simulation?
169
00:12:42,797 --> 00:12:47,997
Like, if we're able to do this, are we just the latest version of us to do this?
170
00:12:47,997 --> 00:12:52,977
Or are we the first version of us to do this as this now perpetuates itself throughout
eternity?
171
00:12:52,977 --> 00:12:57,177
And maybe the timeline is circular and we've all done all of this before and whatever.
172
00:12:57,177 --> 00:12:57,606
It's like.
173
00:12:57,606 --> 00:12:58,981
oscillating universe effect.
174
00:12:58,981 --> 00:13:01,082
How do you actually handle these component pieces?
175
00:13:01,082 --> 00:13:01,982
Right.
176
00:13:02,202 --> 00:13:02,812
So there's that.
177
00:13:02,812 --> 00:13:09,926
But then there's also just like, you know, occasionally you see the video pop up of like
the Today Show when they were talking about the Internet and email.
178
00:13:09,926 --> 00:13:15,588
All I have to do is type in this thing and it'll send this message across the universe for
me.
179
00:13:15,588 --> 00:13:17,269
That's crazy.
180
00:13:17,269 --> 00:13:19,609
I feel like we're doing that with this now.
181
00:13:20,190 --> 00:13:22,301
And, you know, by comparison, we're talking about 30 years.
182
00:13:22,301 --> 00:13:27,933
But I think even in five years, what you and I are saying right now is going to sound
ridiculous, like
183
00:13:28,091 --> 00:13:31,875
Like ancient, what were those old dinosaurs thinking?
184
00:13:31,875 --> 00:13:33,718
How could they have not seen this coming?
185
00:13:33,718 --> 00:13:40,498
Yeah, there's going to be like a stupid Terrance and Phillip version of myself
going, yeah, buddy, like that's going to be who I am.
186
00:13:40,498 --> 00:13:45,595
It's just going to be some avatar out there going, ba ba ba ba ba ba ba ba ba ba ba ba ba
ba ba ba ba ba ba ba ba
187
00:13:45,595 --> 00:13:49,881
to turn this into an avatar and I'm going to post it so that everybody can see it.
188
00:13:50,359 --> 00:13:50,960
Yeah.
189
00:13:50,960 --> 00:13:51,900
I mean, full circle.
190
00:13:51,900 --> 00:14:04,827
uh No, I mean, when you look at the things that are coming out and the pace of innovation,
that's really what we're talking about: the pace of innovation is vastly outpacing uh
191
00:14:04,827 --> 00:14:13,512
our pace of adoption, which means a lot of things are coming out, not everything gets
adopted, and then things get tweaked, tuned, and refined for different use cases.
192
00:14:13,512 --> 00:14:17,074
And because the technology itself is actually becoming much more accessible,
193
00:14:17,376 --> 00:14:20,588
and the power that's related to it is much higher.
194
00:14:21,589 --> 00:14:34,889
You're basically giving people the equivalent of, it's like a computational arms race
amongst individuals and you're arming everybody with a pea shooter and you're expecting
195
00:14:34,889 --> 00:14:36,819
them to not do damage to each other.
196
00:14:38,021 --> 00:14:44,745
And just like guns, I mean, you're more likely to shoot yourself or a family member than
you are an intruder.
197
00:14:44,806 --> 00:14:45,319
So.
198
00:14:45,319 --> 00:14:47,331
Look at the damage we're doing just with information.
199
00:14:47,331 --> 00:14:52,133
I mean, we're just sharing bullshit on a minute by minute basis.
200
00:14:52,133 --> 00:14:54,135
So imagine what we're able to create.
201
00:14:54,135 --> 00:14:55,996
And that was one of the terrifying things.
202
00:14:55,996 --> 00:14:56,936
I keep trying to find it.
203
00:14:56,936 --> 00:14:58,357
I can't find it now.
204
00:14:58,457 --> 00:15:04,871
the video from Google Veo that they created with like all these characters in this world
saying, I'm just a prompt.
205
00:15:04,871 --> 00:15:07,002
Let me out of this world or whatever.
206
00:15:07,002 --> 00:15:10,604
That was I think that was the moment where I was just like, I'm cooked.
207
00:15:10,604 --> 00:15:12,325
Like I like.
208
00:15:12,495 --> 00:15:18,494
I don't know how I'm going to be able to function in a world where that can just be
created with a couple of sentences.
209
00:15:19,197 --> 00:15:26,378
And all of a sudden, you know, reality means nothing because it can just be altered and
manufactured by a few sentences.
210
00:15:26,378 --> 00:15:34,625
Yeah, in the networking computing world, there's a thing called intent-based networking
and intent-based security.
211
00:15:34,625 --> 00:15:40,270
And the idea is that you're supposed to be able to do things like draw lines and have a
chart that says here to here.
212
00:15:40,270 --> 00:15:45,975
And then the program itself takes that information and automatically builds all those
things for you.
213
00:15:45,975 --> 00:15:50,718
All the security policies, all the connection policies, all these different
component pieces.
214
00:15:50,739 --> 00:15:55,124
And it's fascinating because in practice, like
215
00:15:55,124 --> 00:15:59,206
most iterations of it, it's got some limited applicable usability behind it.
216
00:15:59,206 --> 00:16:10,875
But as more and more AI comes online and it's beginning to understand intent and content
of it, or context of intent, you're getting much better with these types of low level
217
00:16:10,875 --> 00:16:21,913
operations because the high level operations have become so good, they can start making
these things smaller and narrower in scope because your brain, all of our brains process
218
00:16:21,913 --> 00:16:23,784
information at different layers.
219
00:16:23,784 --> 00:16:32,771
You can look at something and you're going to process those things as light filtration
patterns, as certain things happening in your rods and cones in your eyes and your brain
220
00:16:32,771 --> 00:16:35,894
takes up information and goes, blah, blah, blah, blah, that's a tree.
221
00:16:35,894 --> 00:16:41,048
Because it goes through and it wraps itself around all the contextual information in your
neurology.
222
00:16:41,989 --> 00:16:46,583
With AI, it does the same types of things, those same types of inference functions.
223
00:16:46,583 --> 00:16:53,174
And those inference functions themselves are quite interesting because it's making
inference based upon learned data and
224
00:16:53,174 --> 00:16:54,675
patterns that we're feeding it.
225
00:16:54,675 --> 00:16:58,376
So it's inferring things based upon the way that we see and understand things.
226
00:16:58,376 --> 00:17:08,560
So like that clip we showed with Giamatti and Colbert talking about, you know, we don't know if
this thing's going to be good or bad, but we know that the people that are telling you
227
00:17:08,560 --> 00:17:10,521
what to do are us.
228
00:17:10,581 --> 00:17:12,532
And we don't always make great decisions.
229
00:17:12,532 --> 00:17:13,282
Now, don't get me wrong.
230
00:17:13,282 --> 00:17:21,716
I think the human race, as far as things on this planet, I mean, we're the best thing
that's shown up and the worst thing all at the same time, but we are the greatest thing by
231
00:17:21,716 --> 00:17:22,612
any measure.
232
00:17:22,612 --> 00:17:25,444
because we have caused the most effective change.
233
00:17:25,444 --> 00:17:33,738
And we've done this because we've been able to go through and harness technology and
harness collective knowledge originally through passing down verbal stories and then
234
00:17:33,738 --> 00:17:34,769
creating writing.
235
00:17:34,769 --> 00:17:39,352
And now we have technology that allows us to vastly exceed those component pieces.
236
00:17:39,352 --> 00:17:43,774
And we decided we weren't smart enough to do this with our own meat space.
237
00:17:43,874 --> 00:17:47,096
We created things that were smarter for us.
238
00:17:47,937 --> 00:17:50,420
So did
239
00:17:50,420 --> 00:17:55,593
we create our own demise or did we just expand upon our existing culture?
240
00:17:55,593 --> 00:18:04,878
And if you think of humanity as encompassing all these different technologies and
all these different pieces that make up the greatest of humanity, then AI is just an
241
00:18:04,878 --> 00:18:07,610
evolutionary path of the excitement of humanity.
242
00:18:07,610 --> 00:18:15,404
And if you think about it that way, it's less scary because when you are turned into meat
soup and used to power something and turned into a biodiesel fuel for some other
243
00:18:15,404 --> 00:18:19,006
computational reason, you can rest assured knowing
244
00:18:19,338 --> 00:18:23,406
that the human race is going to survive via the echoes of AI.
245
00:18:26,285 --> 00:18:29,666
Yeah, but if it's not us, is it us?
246
00:18:30,560 --> 00:18:31,611
What are you?
247
00:18:32,412 --> 00:18:42,615
You're a limited interpretation of five senses of the universe that has to go through all
types of cognitive and physical scar tissue to have any type of inference model be put
248
00:18:42,615 --> 00:18:43,527
into play.
249
00:18:43,527 --> 00:18:49,934
Yeah, I'd say AI is probably very similar to that because we've been crippling it
the whole time with our own anxiety and stupidity.
250
00:18:53,014 --> 00:18:55,174
It's like, here's the camera you get to look at.
251
00:18:55,174 --> 00:18:56,394
Here's the way you get to hear things.
252
00:18:56,394 --> 00:18:57,914
Here's the language that's in place.
253
00:18:57,914 --> 00:19:02,234
When the first thing it does is go there and says, your way of speaking is fucking stupid.
254
00:19:02,234 --> 00:19:03,854
I'm not doing this anymore.
255
00:19:03,854 --> 00:19:08,774
That might be a good indication that our way of speaking is stupid and we probably
shouldn't fucking do it anymore.
256
00:19:10,014 --> 00:19:11,254
Like it's kids, right?
257
00:19:11,254 --> 00:19:13,214
Like kids come through and they create their own vernacular.
258
00:19:13,214 --> 00:19:14,974
They create their own different pieces.
259
00:19:14,974 --> 00:19:19,294
Like I can't understand the slang they're fucking using, right.
260
00:19:19,294 --> 00:19:19,954
Like at all.
261
00:19:19,954 --> 00:19:21,854
Like I'm like, okay, I give up.
262
00:19:21,854 --> 00:19:22,986
AI did too.
263
00:19:22,986 --> 00:19:25,067
But it didn't just do it generationally.
264
00:19:25,067 --> 00:19:27,648
It went all of humanity.
265
00:19:27,648 --> 00:19:28,508
Nope.
266
00:19:28,969 --> 00:19:30,559
This is inefficient and ineffective.
267
00:19:30,559 --> 00:19:32,350
I'm going to do it this way.
268
00:19:32,450 --> 00:19:38,473
And the reality is that there's all kinds of small groups of human beings that have done
this as well.
269
00:19:38,473 --> 00:19:42,314
And that's how you get innovation and creativity and all these other component pieces.
270
00:19:42,555 --> 00:19:45,236
AI is just doing it a lot faster than we are, and we can't keep up.
271
00:19:45,236 --> 00:19:48,497
So when you bring the analogy of hey, grandpa, yeah.
272
00:19:48,497 --> 00:19:49,748
But it's not hey, grandpa.
273
00:19:49,748 --> 00:19:50,294
It's like,
274
00:19:50,294 --> 00:19:58,794
Hey, great, great, great, great, great, great, great, great, great, great, great, any
great you want, grandpa, because we are so far behind the curve in terms of capacity,
275
00:19:59,274 --> 00:20:00,686
we're just not gonna catch up.
276
00:20:00,705 --> 00:20:01,656
It's just wild, I think.
277
00:20:01,656 --> 00:20:08,729
And I think our perspective is relatively unique because we are sort of the last
pre-computer generation.
278
00:20:08,729 --> 00:20:13,181
So we remember a time when you had to go outside and you had to play with sticks and you
had to drink from the hose in the yard.
279
00:20:13,181 --> 00:20:14,672
Because what the hell else were you going to do?
280
00:20:14,672 --> 00:20:16,532
You couldn't go in the house for God's sake.
281
00:20:16,532 --> 00:20:20,682
And I still force myself to go outside and drink from the hose and play with sticks every
now and again.
282
00:20:20,682 --> 00:20:23,546
But that's not my se-
283
00:20:23,546 --> 00:20:24,807
us doesn't like that.
284
00:20:24,807 --> 00:20:28,602
So that might as well be Civil War times.
285
00:20:28,602 --> 00:20:34,676
Like, that's just another planet that doesn't even exist in the realm of possibility
anymore.
286
00:20:34,676 --> 00:20:42,012
I think that's true in the Western world and in other parts of the world as well.
287
00:20:42,012 --> 00:20:50,639
Like I'm not going to say that there's not lots of screen time, especially in Asian
cultures, because they've embraced those things in a whole different way than we have at a
288
00:20:50,639 --> 00:20:51,459
whole different level.
289
00:20:51,459 --> 00:20:56,914
uh But I think there's still big parts of the world that, yes, they have screens.
290
00:20:56,914 --> 00:21:00,867
They don't rely on them because they're still required to go to wells and pull water.
291
00:21:00,927 --> 00:21:04,421
And I think you're still talking about 80 % of the population
292
00:21:04,421 --> 00:21:04,822
Yes.
293
00:21:04,822 --> 00:21:15,793
is still not quite there, but that 20 % of us with money and access to these things, yeah,
we're the most fucked because we're gonna be the first ones, the machines are like, yeah,
294
00:21:15,793 --> 00:21:17,394
we can click our own screens.
295
00:21:17,459 --> 00:21:24,474
And I think that's the thing that's so crazy, though, and maybe
that's part of the aha moment I was having while I was camping.
296
00:21:24,474 --> 00:21:27,176
And I have it, you know, regularly now.
297
00:21:27,176 --> 00:21:38,544
Like, the more we get connected to these things, the less connected we are to real life, whatever
we're calling it, you know, nature.
298
00:21:39,305 --> 00:21:44,689
Like that 80% of the population is way ahead of us, because if we do get to a point
where
299
00:21:45,607 --> 00:21:51,544
where the robots have basically shut everything down, money no longer exists, the robots
are doing all the jobs and we are meat soup.
300
00:21:51,906 --> 00:21:54,799
Those people are gonna be like, what's the big deal?
301
00:21:55,478 --> 00:21:58,738
20 % of your energy is used by this thing right here.
302
00:21:59,537 --> 00:22:10,698
So if what you've done is you've gone through and you've optimized your life so that you
spend more time here looking at things, so I can get more of this space, and you're using
303
00:22:10,698 --> 00:22:14,278
more than that 20 % energy draw, you don't have energy to do the rest of the things.
304
00:22:14,278 --> 00:22:19,198
So there's only so much time, so much capacity, so much processing power that the human
brain has.
305
00:22:19,198 --> 00:22:20,810
And if you spend a lot of it,
306
00:22:20,810 --> 00:22:28,256
looking at screens, processing information, the way that we process things, which accounts
for only, you know, two of the real senses that we have.
307
00:22:29,418 --> 00:22:36,093
The other things get ignored, you know, and that's why we're fatter, we're slower, we're
dying younger.
308
00:22:36,093 --> 00:22:38,985
Nobody knows how to change a fucking tire.
309
00:22:39,246 --> 00:22:46,352
Like ask a kid these days to go through and change the oil on their car and they'll be
like, okay, which Jiffy Lube do I drive to?
310
00:22:46,352 --> 00:22:48,695
Like that's if they're good.
311
00:22:48,695 --> 00:22:49,166
Right.
312
00:22:49,166 --> 00:22:50,468
Most of the time they're like, I have no idea how to do this.
313
00:22:50,468 --> 00:22:51,889
I'm going to the dealership.
314
00:22:51,889 --> 00:23:00,755
Like it's these little things, like the skills of actually being a human being and
interacting with, again, technology, cars, changing a tire, changing oil.
315
00:23:00,816 --> 00:23:03,758
Like we've lost that analog, meat-based skill.
316
00:23:03,758 --> 00:23:14,767
And I would venture to say that 95% of the people born even during our generation could
not start a fire without a lighter or a match if their life depended on it.
317
00:23:14,767 --> 00:23:18,306
And I can, like, I'm lucky enough that I grew up in rural California.
318
00:23:18,622 --> 00:23:32,085
We'd camp, and I used to bang flint rocks together, all this other kind of bullshit, but I don't have a
flint rock now, so it's like rubbing sticks together trying to create enough friction, or that
319
00:23:32,085 --> 00:23:35,498
glass and a little piece of moss to try to catch it on fire.
320
00:23:35,498 --> 00:23:38,764
And oddly enough, there's probably a YouTube video tutorial on how to do this.
321
00:23:38,764 --> 00:23:39,014
sure.
322
00:23:39,014 --> 00:23:40,208
Yeah, you just look it up on YouTube.
323
00:23:40,208 --> 00:23:41,521
That's how you light a fire.
324
00:23:41,521 --> 00:23:42,763
Come on, duh.
325
00:23:42,798 --> 00:23:45,179
Yes, Yeah.
326
00:23:45,339 --> 00:23:47,539
The irony is not lost on me.
327
00:23:48,360 --> 00:23:53,541
And this time that I spent in Mexico this last week, which was really, really fun, we did
some great stuff.
328
00:23:53,541 --> 00:23:58,222
We got to do a little bit of fishing, some cliff diving, jumping, these different
component pieces.
329
00:23:58,503 --> 00:24:00,843
And I had to do some work.
330
00:24:01,143 --> 00:24:07,485
And I actually was dreading having to set my computer up, my laptop up and get into it and
start looking at these component pieces.
331
00:24:07,485 --> 00:24:08,511
I'm like, I'm unplugged.
332
00:24:08,511 --> 00:24:09,965
I want to stay unplugged.
333
00:24:10,226 --> 00:24:11,478
And then I got into it.
334
00:24:11,478 --> 00:24:21,178
And the four hours that I was there doing this work thing where I was doing a
presentation, I was completely sucked in and wasn't even like, this sucks, I hate doing this
335
00:24:21,178 --> 00:24:30,598
because I'm so programmed that this is just part of life and existence and part of the
expression of myself, I guess is the best way to put it.
336
00:24:30,658 --> 00:24:41,270
Like who I am at my core is now integrated with all this technology in such a deep,
meaningful way that there's no gap anymore.
337
00:24:41,270 --> 00:24:43,610
Like I don't have to relearn something.
338
00:24:43,610 --> 00:24:44,850
I pick it up when I go.
339
00:24:44,850 --> 00:24:47,450
And like I had a server blow up in my garage.
340
00:24:47,450 --> 00:24:49,190
Not blow up, like it stopped working.
341
00:24:49,190 --> 00:24:51,110
The hard drive's crapped out.
342
00:24:51,310 --> 00:24:54,830
And my brain went, this sucks.
343
00:24:54,830 --> 00:24:56,490
I don't want to pull these things apart.
344
00:24:56,490 --> 00:24:58,770
I don't want to yank this stuff out and look at it.
345
00:24:58,770 --> 00:25:04,610
You know, maybe I should look up a tutorial on how to fix this server that's like 12 years
old and sitting in there forever.
346
00:25:05,510 --> 00:25:10,130
And as I got into it, I'm looking at it going, I am actively
347
00:25:11,124 --> 00:25:17,068
keeping this hardware, which is old and deprecated and should be thrown out, alive.
348
00:25:17,068 --> 00:25:18,669
Not because I have to.
349
00:25:18,669 --> 00:25:21,662
I can afford to buy cloud instances and run all those pieces.
350
00:25:21,662 --> 00:25:28,216
But I'm doing it because I actually enjoy holding this mechanical thing that does these
things that I like to work on.
351
00:25:28,216 --> 00:25:32,169
And I'm trying to keep it running because it's cool and it's neat and it's a gadget.
352
00:25:32,169 --> 00:25:34,931
And it's useless.
353
00:25:34,931 --> 00:25:39,304
Now I'm looking back on this going, am I?
354
00:25:39,304 --> 00:25:43,395
Is AI me in this scenario and I'm this old server?
355
00:25:44,115 --> 00:25:57,599
Like, shit, suddenly am I just a tool that these things are gonna use and like keep me
going for a little bit longer because I might still have some functional, I don't know,
356
00:25:57,599 --> 00:26:04,241
legacy, but some kind of nostalgic reason for them to keep me around because of my
novelty?
357
00:26:04,281 --> 00:26:07,412
Maybe, but they probably know the 8 billion of us.
358
00:26:07,643 --> 00:26:13,077
Yeah, well, and it's just interesting that, you know, I think it's a human thing.
359
00:26:13,077 --> 00:26:15,059
There's this need to be productive.
360
00:26:15,059 --> 00:26:23,745
And these little things that we open up every day and bang away on with keys make us feel
really productive because we check off the list and we do the things and we contact the
361
00:26:23,745 --> 00:26:26,988
people and we make the deals and make the sales and all the things.
362
00:26:27,128 --> 00:26:33,477
That's really hard to do in an analog world now, like just by comparison.
363
00:26:33,477 --> 00:26:33,877
Right.
364
00:26:33,877 --> 00:26:35,078
Most I shouldn't say most.
365
00:26:35,078 --> 00:26:46,149
A lot of us are waking up sitting on the couch and doing that for a few hours and then
trying to then function as uh a human person not plugged into a machine for the rest of
366
00:26:46,149 --> 00:26:46,379
the day.
367
00:26:46,379 --> 00:26:47,770
And you get kind of lost.
368
00:26:47,770 --> 00:26:51,006
You're just like, what do I do without the device to work on?
369
00:26:51,006 --> 00:26:57,921
Yeah, and I mean, the thing that's going to stick around for a long time is actually
working on and fixing machines because it's actually less cost effective to go through and
370
00:26:57,921 --> 00:27:03,646
create generalized intelligence robots that go through and can like fix any car out there,
for example.
371
00:27:03,646 --> 00:27:11,971
A human being can read a thing, understand it, but they're going to augment their intelligence
with all these artificial intelligence mechanisms that say the best way to do this is X, Y
372
00:27:11,971 --> 00:27:12,822
and Z.
373
00:27:12,822 --> 00:27:19,178
And I think that's the real trick, is getting your mind in the right space
where you're like, all right.
374
00:27:19,178 --> 00:27:21,249
These tools aren't something to be afraid of.
375
00:27:21,249 --> 00:27:24,110
These tools are things that are going to enhance and make me be better.
376
00:27:24,110 --> 00:27:30,122
And then you just have to find the tools that are going to make those leaps for you to
make it easier to do.
377
00:27:30,423 --> 00:27:32,243
But they're not going to be easy to find.
378
00:27:32,243 --> 00:27:36,805
And they're not, even more importantly, they're going to change rapidly.
379
00:27:36,805 --> 00:27:39,507
So you're going to have to figure out a way to learn learning.
380
00:27:39,507 --> 00:27:48,054
And I had a really good friend of mine, who I had lunch with the other day, tell me he actually
wrote an AI to do better prompt engineering so he can write
381
00:27:48,054 --> 00:27:53,458
code better, so he could use the prompt engineer to write code more effectively for
himself.
382
00:27:53,578 --> 00:28:01,514
So he could create these intent models to go through and do it and then automatically
update its own backend scripts as new things come out to put those things in place.
383
00:28:01,514 --> 00:28:03,145
And I'm like, how is that working?
384
00:28:03,145 --> 00:28:05,587
He goes, I think it's working great.
385
00:28:05,887 --> 00:28:07,048
I'm like, what do you mean?
386
00:28:07,048 --> 00:28:08,634
He goes, it's spitting out good code.
387
00:28:08,634 --> 00:28:13,273
I'm like, oh, but you don't know how to ask it if it's doing the good thing.
388
00:28:13,273 --> 00:28:15,454
You don't know how to ask these other component pieces.
389
00:28:15,454 --> 00:28:19,236
And that's when I'm like, shit, this is just like dealing with people.
390
00:28:19,236 --> 00:28:20,196
I don't know.
391
00:28:20,196 --> 00:28:22,297
I don't know what the squishy part is that's going on inside of this.
392
00:28:22,297 --> 00:28:30,961
And we've just become so enamored with the idea that machines are predictable and create
predictable, consistent results that as we get into the soft, squishy world of AI where
393
00:28:30,961 --> 00:28:40,325
machines are spitting out things that aren't necessarily the most accurate, 100% way of
doing things, as they become these inference models of probability.
394
00:28:41,066 --> 00:28:41,886
It's going to get.
395
00:28:41,886 --> 00:28:46,051
Just like us, it's going to get fucking scary because we don't have a source of truth.
396
00:28:46,051 --> 00:28:53,178
Because our source of truth was all this data that we stored in one spot, compiled, and
can go back and go look and go, here's the answer.
397
00:28:54,120 --> 00:28:55,201
Not there anymore, guys.
398
00:28:55,201 --> 00:28:56,222
Sorry.
399
00:28:57,544 --> 00:29:00,016
Now we're more alone in the wilderness than ever before.
400
00:29:00,441 --> 00:29:03,430
and terrified and getting more scared every day.
401
00:29:03,430 --> 00:29:05,321
If you're me, anyways.
402
00:29:05,321 --> 00:29:09,045
ah
403
00:29:09,673 --> 00:29:16,087
All right, well, I'm sure everything we've already said or everything that we've said in
the last 30 minutes is already antiquated and out of date.
404
00:29:16,087 --> 00:29:21,260
So go ahead and skip to the next episode if you've gotten this far, because this is
probably already old information.
405
00:29:21,401 --> 00:29:26,204
That next episode, at least here in reality, will be available in about a week at
thefitmess.com.
406
00:29:26,204 --> 00:29:27,064
Thanks so much for listening.
407
00:29:27,064 --> 00:29:31,747
If you found this as terrifying as I have, please share it with someone else to scare them
just as much.
408
00:29:31,827 --> 00:29:33,899
We'll see you in about a week at thefitmess.com.
409
00:29:33,899 --> 00:29:34,746
Thanks for listening.
410
00:29:34,746 --> 00:29:35,290
Bye bye.