March 25, 2025

Why Doctors Using AI Might Be Hazardous to Your Health

Does your doctor seem more focused on their screen than on you? 

Two-thirds of physicians are now using AI for everything from billing to diagnosis - but here's the kicker - only 38% actually trust the technology. This brewing healthcare disaster isn't just about robots taking doctors' jobs; it's about who's accountable when AI makes a potentially life-altering mistake with your medical information, and why insurance companies are salivating at the thought of using algorithms to deny your claims even faster.

Listen as Jeremy, Jason, and Zach dive into the alarming rise of AI in healthcare, how to find meaning in your fitness journey when depression makes everything feel pointless, and why sometimes treating your workout like wiping your ass is the perfect mental approach to consistency.

Topics Discussed

  • Why 66% of physicians are using AI despite only 38% trusting it - creating a dangerous accountability gap in healthcare
  • How insurance companies will likely use AI to deny claims faster rather than improve patient care
  • The potential for errors in medical records when AI is used to generate and update patient information
  • The challenge of determining liability when AI contributes to medical errors
  • Using workout data tracking to combat depression's lie that "nothing is working"
  • How Jeremy discovered he'd doubled his strength despite feeling like he wasn't making progress
  • The value of consistency over intensity when building healthy habits
  • Why treating workouts like mundane necessities (like wiping your ass) helps with consistency
  • Finding meaning in fitness when battling existential crises
  • The importance of "showing up" rather than focusing solely on specific fitness goals

Resources

----

MORE FROM THE FIT MESS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to The Fit Mess on YouTube

Join our community in the Fit Mess Facebook group

----

Transcript

1
00:00:03,388 --> 00:00:04,188
Welcome to the Fit Mess.

2
00:00:04,188 --> 00:00:05,549
My name is Jeremy.

3
00:00:05,589 --> 00:00:11,012
Joined today by my co-hosts, co-hosts, that's the word, Zach and Jason.

4
00:00:11,252 --> 00:00:12,273
Glad to see you guys here.

5
00:00:12,273 --> 00:00:13,753
Thanks for being here today.

6
00:00:14,354 --> 00:00:19,887
I had a moment the other day where, you know, I tend to live in sort of a depressed state
most of the time.

7
00:00:19,887 --> 00:00:23,979
And I had one of those moments the other day where I was like, the fucking gym.

8
00:00:23,979 --> 00:00:25,000
I don't want to go today.

9
00:00:25,000 --> 00:00:25,880
What's the point?

10
00:00:25,880 --> 00:00:27,201
It's not doing anything.

11
00:00:27,201 --> 00:00:28,042
I look like shit.

12
00:00:28,042 --> 00:00:29,162
I feel like shit.

13
00:00:29,162 --> 00:00:32,636
And I had this like, aha, duh moment.

14
00:00:32,636 --> 00:00:34,836
that relates to AI and we're going to talk about it a little bit.

15
00:00:34,836 --> 00:00:38,156
So I want to tell you that story a little bit later because it's not our top story.

16
00:00:38,156 --> 00:00:46,336
It's just a kind of a funny thing that happened to me that is a good reminder for a better
way for me to consider all these little things that I'm trying to do to improve my life.

17
00:00:46,496 --> 00:00:58,076
But the big story I want to talk about today is why two in three physicians are using
health AI, which is up 78 % from 2023.

18
00:00:58,576 --> 00:01:01,892
And on the surface, what's the problem?

19
00:01:01,892 --> 00:01:02,152
Right?

20
00:01:02,152 --> 00:01:03,495
They're saving all kinds of time.

21
00:01:03,495 --> 00:01:05,195
This is going to make healthcare better.

22
00:01:05,195 --> 00:01:05,975
It's going to make it cheaper.

23
00:01:05,975 --> 00:01:10,586
It's going to make it more affordable, especially in the US where it's a finely tuned
machine.

24
00:01:10,627 --> 00:01:12,300
Jason, you stumbled across this story.

25
00:01:12,300 --> 00:01:16,276
Your reaction to more doctors using more AI to do more.

26
00:01:16,856 --> 00:01:28,784
Well, so essentially the whole point of AI is to provide a level of intelligence that
augments human intelligence in some sort of way.

27
00:01:28,784 --> 00:01:30,455
So that's how we're using it today.

28
00:01:30,455 --> 00:01:36,449
But eventually, you know, it'll get to the point where it could subsume certain human
intelligence functions.

29
00:01:36,449 --> 00:01:40,800
The artificial part of it suggests that these are things that are being generated by
larger sets of data.

30
00:01:40,800 --> 00:01:47,300
And if you look at how physicians interoperate with things to become a doctor, you have to
go to school for a certain period of time.

31
00:01:47,300 --> 00:01:48,660
Sometimes that's seven years.

32
00:01:48,660 --> 00:01:51,700
I mean, some people go to community college for seven years.

33
00:01:51,700 --> 00:01:56,913
I may have been one of them to go through and get their level of education.

34
00:01:56,913 --> 00:01:57,014
it.

35
00:01:57,014 --> 00:01:59,087
That was it, you just really, you were having a good time.

36
00:01:59,087 --> 00:02:01,101
You wanted to stretch that out as long as you could.

37
00:02:01,101 --> 00:02:01,780
I did.

38
00:02:01,780 --> 00:02:02,312
I did.

39
00:02:02,312 --> 00:02:09,506
I mean, most people that end college after 14 years with my BA, they are not just doctors.

40
00:02:09,506 --> 00:02:10,373
They have multiple doctorates.

41
00:02:10,373 --> 00:02:12,408
I didn't go down that path.

42
00:02:12,408 --> 00:02:17,970
Anywho, yes, exactly.

43
00:02:18,571 --> 00:02:27,408
The strange thing is that when you look at the way the data is actually collected and
pulled in, they basically pull data in from multiple different sources.

44
00:02:27,408 --> 00:02:37,155
And AI systems use large language models and then large learning principle functions to go
through and create things that are verifiably usable for different use cases.

45
00:02:37,155 --> 00:02:44,440
And physicians and healthcare in general has a whole battery and set of information for
its use cases.

46
00:02:44,440 --> 00:02:50,745
And the data that they bring in is not limited to your medical records.

47
00:02:50,745 --> 00:02:55,988
It's not limited to your behavioral records or the inputs the doctor has.

48
00:02:56,318 --> 00:03:02,112
The information they use to make decisions can be broadly based upon macro information.

49
00:03:02,112 --> 00:03:06,635
So it could be, you know, people in your area react this way to certain health criteria.

50
00:03:06,635 --> 00:03:10,398
It could be you react these ways to different types of things in the environment.

51
00:03:10,398 --> 00:03:15,261
All these things are ways and mechanisms that feed these AI systems.

52
00:03:15,261 --> 00:03:17,602
And doctors are trying to take advantage of it.

53
00:03:17,602 --> 00:03:26,152
So two in three doctors out there are using artificial intelligence in this way to go
through and help them make decisions about patients' health.

54
00:03:26,152 --> 00:03:31,155
about billing systems, about record systems, all these different component pieces.

55
00:03:31,276 --> 00:03:49,248
At the same time, only 38 % of doctors feel that AI is good enough from a privacy
perspective, a security perspective, and a reliability perspective to say they trust it

56
00:03:49,348 --> 00:03:53,171
and its capabilities more than the potential downsides.

57
00:03:53,171 --> 00:03:55,262
So what that means is you've got an inverted model.

58
00:03:55,262 --> 00:04:02,237
You have 66%, about two-thirds of doctors out there using AI, and about a third of doctors saying they trust AI.

59
00:04:02,237 --> 00:04:10,562
So that means you've got a third of doctors using something that they're not sure they trust for your health.

60
00:04:11,663 --> 00:04:13,524
And it's, it's scary.

61
00:04:13,690 --> 00:04:16,422
so Zach, you're the IT security professional.

62
00:04:16,422 --> 00:04:17,453
I want to get your take on this.

63
00:04:17,453 --> 00:04:22,376
But I just want to sort of explain a little bit about how these doctors are using AI.

64
00:04:22,376 --> 00:04:32,383
So, I mean, according to this article from the American Medical Association, documentation of
billing codes, creation of discharge instructions, translation services, summaries of

65
00:04:32,383 --> 00:04:35,665
medical research and standards, assistive diagnosis.

66
00:04:35,665 --> 00:04:37,806
That one scares me a little bit.

67
00:04:38,167 --> 00:04:39,760
Generation of chart summaries.

68
00:04:39,760 --> 00:04:41,721
So, I mean, it does sound like it's a lot of admin work.

69
00:04:41,721 --> 00:04:46,247
They're trying to free up a little bit of admin space to focus on other things.

70
00:04:46,247 --> 00:04:48,439
But what's the risk here?

71
00:04:48,439 --> 00:04:49,870
What's the danger?

72
00:04:49,950 --> 00:04:55,763
Zach, what red flags are going off for you as you hear doctors are using AI to do part of
their jobs?

73
00:04:55,763 --> 00:05:05,308
I wouldn't, I mean, so the diagnosis part doesn't scare me as much as all the other parts do because, again, we all Google our own symptoms, right?

74
00:05:05,308 --> 00:05:09,109
That cough is definitely a brain tumor, like all those things.

75
00:05:09,130 --> 00:05:18,474
But, and I've, I've used AI to write performance reviews.

76
00:05:18,474 --> 00:05:20,539
I've used AI to like, you know,

77
00:05:20,539 --> 00:05:26,080
It really does help if you're one of those people who like, you're looking at a blank
canvas, you can't do anything.

78
00:05:26,441 --> 00:05:29,882
You can generate something to edit and then you can go edit it.

79
00:05:30,642 --> 00:05:40,925
Now, if my medical charts and my, my information that other doctors and other
professionals are going to make decisions on is based on AI stuff that could be wrong,

80
00:05:40,925 --> 00:05:41,805
could be right.

81
00:05:41,805 --> 00:05:47,925
That's where it gets scary is like how long before my medical chart is like way off base,
right?

82
00:05:47,925 --> 00:05:54,260
from what it should be because AI was like, we'll just put this in there because I think
this sounds cool.

83
00:05:54,260 --> 00:05:58,564
you know, the doctor told me to write this in, you know, Donald Trump style.

84
00:05:58,564 --> 00:06:00,706
So it's all about like, it's going to be great.

85
00:06:00,706 --> 00:06:02,707
It's going you know, like, who knows?

86
00:06:04,349 --> 00:06:05,349
Exactly.

87
00:06:05,349 --> 00:06:15,007
So I think that doctors will more than likely, if they use it for diagnosis assistance,
right, they're probably going to double check that work.

88
00:06:15,007 --> 00:06:17,289
I wouldn't be as concerned about that.

89
00:06:17,289 --> 00:06:20,860
because they're using other sources for that anyway.

90
00:06:20,860 --> 00:06:30,764
It's more like the, you know, again, my personal records not being updated correctly,
errors that people aren't catching because I know everyone else does this.

91
00:06:30,764 --> 00:06:37,246
You ask AI to write something and you skim it and you say it looks good and you don't read
it in detail.

92
00:06:37,246 --> 00:06:40,412
I mean, I do read it in detail, but most people don't read it in detail.

93
00:06:40,412 --> 00:06:41,232
Yeah.

94
00:06:41,232 --> 00:06:49,492
Well, that's what I was thinking too, as I was sort of considering the different pros and
cons of this, is that this could open up an entire new industry in healthcare of basically

95
00:06:49,492 --> 00:06:50,332
proofreaders.

96
00:06:50,332 --> 00:06:56,932
So if the AI is going to create the content, we need somebody who can actually, with an
eye for detail, figure out is this correct?

97
00:06:56,932 --> 00:06:59,252
Is this what the doctor said?

98
00:06:59,332 --> 00:07:07,438
But I guess the other thing that scares me, and I think this is maybe what you're getting
to, Jason, is the idea that this information is being fed through some external tool.

99
00:07:07,438 --> 00:07:09,511
And again, what kind of data is being collected?

100
00:07:09,511 --> 00:07:10,652
How is it going to be used?

101
00:07:10,652 --> 00:07:11,763
Who has access to it?

102
00:07:11,763 --> 00:07:13,596
I mean, that's the thing that scares me.

103
00:07:13,596 --> 00:07:15,208
Aside from, you know, Dr.

104
00:07:15,208 --> 00:07:24,298
Google telling me I have brain cancer because I coughed, what does Elon Musk suddenly know
about me because AI was used to figure out that I don't really have brain cancer?

105
00:07:24,839 --> 00:07:27,760
Yeah, so there's a couple of pieces here.

106
00:07:27,760 --> 00:07:36,062
So you can use AI and ML to collect information and to enrich it and to put the
information into context.

107
00:07:36,163 --> 00:07:38,363
And when you do that, you create a data set.

108
00:07:38,363 --> 00:07:40,084
And now you have a new data set in place.

109
00:07:40,084 --> 00:07:45,826
And that data set gets reused and updated and changed over time, just like regular human
intelligence.

110
00:07:45,826 --> 00:07:49,347
Like as a kid, you go through and you learn basic math.

111
00:07:49,347 --> 00:07:52,928
And then you build upon those skills to learn more advanced math.

112
00:07:52,928 --> 00:07:53,778
Well,

113
00:07:53,810 --> 00:08:03,395
If you were shit at addition and subtraction when you were a little kid and it didn't get
any better and you came up with the wrong answers and then you start doing more higher

114
00:08:03,395 --> 00:08:11,809
order math later on, you still haven't solved the fact that you were shit at doing
addition and subtraction and that's going to affect all future calculations.

115
00:08:12,170 --> 00:08:13,691
The same is true with AI.

116
00:08:13,691 --> 00:08:21,014
We are not aware necessarily yet of how good the information and data they collected is.

117
00:08:21,155 --> 00:08:23,782
We're just assuming that it's okay.

118
00:08:23,782 --> 00:08:34,600
And the reality is that the people that are writing these models, doing these checks, most
of them are not medical professionals and are not professionals in those industries.

119
00:08:34,600 --> 00:08:38,503
So you have human augmented intelligence and human augmented auditors.

120
00:08:38,503 --> 00:08:41,966
Like there's entire companies out there, like Scale AI, for example, that does this.

121
00:08:41,966 --> 00:08:44,268
And there's the Mechanical Turk from Amazon.

122
00:08:44,268 --> 00:08:50,472
All these pieces in place where they say, we're going to take these models and then we're
going to learn on them.

123
00:08:50,598 --> 00:08:55,151
And I think it's called real human augmentation validation, something like that.

124
00:08:55,151 --> 00:09:01,535
But the idea is that you're going to take experts to review these things to try to make
these models better.

125
00:09:02,216 --> 00:09:05,358
That's great for companies that do that.

126
00:09:05,819 --> 00:09:09,291
A lot of companies aren't going to do that because it's expensive.

127
00:09:09,291 --> 00:09:10,142
It's not cheap.

128
00:09:10,142 --> 00:09:15,366
And going through and validating these pieces and level of validation you put into place
is hard.

129
00:09:15,366 --> 00:09:18,592
And if you start taking this content and you start learning from it,

130
00:09:18,592 --> 00:09:22,404
creating collateral material as learning material, spread these things out.

131
00:09:22,404 --> 00:09:29,016
Well, now you've got a disinformation problem where the AI itself could be putting out
false information and bad things.

132
00:09:29,016 --> 00:09:37,710
And it's hard to correct that whenever we become so addicted, complacent, and compliant
with what these things are saying.

133
00:09:37,710 --> 00:09:48,764
Cause if we turn over our sense of authority from doctors into the hands of AI, well now
who actually knows and who's doing the right thing.

134
00:09:48,864 --> 00:09:53,624
It's difficult, if not impossible, to be able to extract those things from each other.

135
00:09:53,624 --> 00:10:10,064
And over a prolonged period of time, like you guys said, WebMD is going to become the
thing that you're going to use to validate the thing that your doctor said.

136
00:10:10,204 --> 00:10:14,224
And we're going to have the Dunning-Kruger effect all over again.

137
00:10:14,224 --> 00:10:16,574
And we're going to have the Dunning-Kruger effect with our health,

138
00:10:16,574 --> 00:10:18,005
our belief system, everything else.

139
00:10:18,005 --> 00:10:21,766
And it's just going to create so many competing signals that a lot of people are just
like, fuck it.

140
00:10:21,766 --> 00:10:23,347
I'm not listening to any of you.

141
00:10:23,347 --> 00:10:24,967
People already don't trust doctors.

142
00:10:24,967 --> 00:10:26,878
People already don't trust medical institutions.

143
00:10:26,878 --> 00:10:29,709
People already don't trust the authorities in this space.

144
00:10:30,390 --> 00:10:38,063
And we have doctors who are supposed to be authorities saying, I don't trust this, but I'm
going to have to use this because I know that in order for me to stay alive in the

145
00:10:38,063 --> 00:10:43,055
business world and be able to do the things that I need to do at scale, I have to have
these things in place.

146
00:10:43,055 --> 00:10:44,756
And it's incredibly dangerous.

147
00:10:44,756 --> 00:10:46,056
It's incredibly

148
00:10:46,172 --> 00:10:52,955
misguided, it's ineffective from a costing perspective in terms of going through and using
these tools to make these things faster.

149
00:10:52,955 --> 00:10:56,456
AI is going to help us do a lot of things much, much quicker, including fucking up.

150
00:10:56,456 --> 00:10:59,157
We're going to fuck up quicker and at scale.

151
00:10:59,157 --> 00:11:00,818
And we're going to go, that was great.

152
00:11:00,818 --> 00:11:05,610
It only cost us, you know, the earth in the process because we're going to burn everything
up.

153
00:11:05,610 --> 00:11:06,120
Yeah.

154
00:11:06,120 --> 00:11:08,201
I mean, you mentioned cost a couple times there.

155
00:11:08,201 --> 00:11:15,364
If in the end, the goal here is to reduce the cost of American health care, okay, there's
a case to be made.

156
00:11:15,364 --> 00:11:17,204
There's no way that's gonna be the end result.

157
00:11:17,204 --> 00:11:22,543
They're gonna argue that this costs more and that's why all of a sudden your costs need to
go up, which is why insurance premiums need to go up.

158
00:11:22,543 --> 00:11:24,386
This is not going to solve that problem.

159
00:11:24,386 --> 00:11:30,076
And just to your point about the doctors themselves, I mean, according to this article, nearly half of physicians surveyed,

160
00:11:30,076 --> 00:11:34,680
47 % ranked increased oversight as the number one regulatory action that needs to happen.

161
00:11:34,680 --> 00:11:44,780
So again, now we're bringing in more people to oversee the AI doing the job that humans have been doing pretty badly for a really long time already.

162
00:11:44,780 --> 00:11:47,352
So do we really need robots to also do it badly?

163
00:11:47,360 --> 00:11:55,580
Well, not just that, you mentioned a really salient point and that's the idea of having
insurance companies involved in this and the point of using AI.

164
00:11:55,580 --> 00:11:59,200
So the point of using AI is not to make things cheaper.

165
00:11:59,200 --> 00:12:05,560
The point of using AI is to not do anything other than make things faster.

166
00:12:05,680 --> 00:12:06,960
That's all it's really meant to do.

167
00:12:06,960 --> 00:12:10,920
It's meant to shortcut these pieces so you can execute things quicker and go much quicker.

168
00:12:10,920 --> 00:12:14,336
And I guarantee you, insurance companies are going to use it to deny your claims faster.

169
00:12:14,336 --> 00:12:20,851
because they're going to use it to go through, scrape this bit of information, understand
these things in context and say, now I don't need someone to go through and read these

170
00:12:20,851 --> 00:12:21,271
charts.

171
00:12:21,271 --> 00:12:26,790
I can say no fast because they're going to use keyword searches and indexing functions and
everything else.

172
00:12:26,790 --> 00:12:31,528
And they're going to stick these things together and then hope people don't look too hard.

173
00:12:31,608 --> 00:12:43,134
And we're not exactly in a position right now as consumers and as citizens where we're
going to have the US government try to help us and protect us.

174
00:12:43,134 --> 00:12:44,730
These are things that are just going to fall apart.

175
00:12:44,730 --> 00:12:49,286
of now hiring jobs up in the federal government.

176
00:12:50,428 --> 00:12:50,785
Right.

177
00:12:50,785 --> 00:12:57,160
I mean, the government is not, the current administration does not want the government to
try to help consumers.

178
00:12:57,160 --> 00:13:02,394
They want the administration to enable businesses to be able to do more business like
things.

179
00:13:02,394 --> 00:13:07,537
And we already know that businesses in general, especially corporations, you know, are
sociopathic.

180
00:13:08,044 --> 00:13:10,239
they're, they're there to extract money from you and pull it in.

181
00:13:10,239 --> 00:13:12,851
And I'm not saying that businesses are the problem here.

182
00:13:12,851 --> 00:13:17,428
I'm saying that our adoption of a technology that we don't understand at scale, at rate,

183
00:13:17,428 --> 00:13:27,701
going into actual critical infrastructure, your body, and saying, now we're going to start making decisions based upon this, and then punch it into a database, and then hope

184
00:13:27,701 --> 00:13:32,967
for the best in the future, is probably not a great way to work on health.

185
00:13:33,835 --> 00:13:41,290
I think, I mean, you guys hit on a couple of points where, you know, the healthcare system
in America is for sure for profit, right?

186
00:13:41,290 --> 00:13:43,475
And they make money on it.

187
00:13:43,581 --> 00:13:51,245
And even in the business I'm in, AI is the buzzword that executives are hearing too, right?

188
00:13:51,245 --> 00:13:55,127
I have been told, get AI into our systems.

189
00:13:55,728 --> 00:13:57,169
I don't know what that means.

190
00:13:57,169 --> 00:13:58,950
They don't know what that means.

191
00:13:59,270 --> 00:14:01,341
And it, you know,

192
00:14:01,591 --> 00:14:09,375
I always hesitated and was like, I'm not going to put AI into our product, but I'll put AI
into the backend part of it and make lives easier for us.

193
00:14:09,375 --> 00:14:14,678
But at the end of the day, you know, Jeremy, you said make healthcare cheaper.

194
00:14:15,339 --> 00:14:15,899
No way.

195
00:14:15,899 --> 00:14:16,659
Jason's right.

196
00:14:16,659 --> 00:14:20,302
It's going to be faster and prices won't go down.

197
00:14:20,302 --> 00:14:24,444
Their cost of doing business is going to go down and they're just going to keep more
profits.

198
00:14:24,444 --> 00:14:27,155
But where are they going to choose to put that AI?

199
00:14:27,155 --> 00:14:30,645
Because some executive in the healthcare system is going, Hey,

200
00:14:30,645 --> 00:14:33,659
put AI into the system, put AI into the system.

201
00:14:33,659 --> 00:14:42,049
And you've got people who don't know what they're doing putting it in where it doesn't
belong or where it's not necessary, where it's dangerous, where it could cause so many

202
00:14:42,049 --> 00:14:42,589
problems.

203
00:14:42,589 --> 00:14:51,088
like on top of all the other things that Jason's talking about, like it could be a huge
clusterfuck with AI all over the place where it shouldn't be.

204
00:14:51,088 --> 00:14:57,291
Well, and I can imagine a scenario too where, you know, right now when you go to the doctor,
you fill out the intake form and list out all the symptoms you've had.

205
00:14:57,291 --> 00:14:59,422
That's essentially what we're doing now with WebMD.

206
00:14:59,422 --> 00:15:05,715
So now I'm doing it in my doctor's lobby while I'm waiting to be seen, probably on some
tablet that's going to be telling AI, these are my symptoms.

207
00:15:05,715 --> 00:15:14,069
And the doctor is going to basically validate whatever decision the AI came up with rather
than using the, you know, the brain they have that they developed from all those many

208
00:15:14,069 --> 00:15:16,340
years of community college sitting next to Jason.

209
00:15:16,340 --> 00:15:17,144
And,

210
00:15:17,828 --> 00:15:20,121
And now my diagnosis is coming from Dr.

211
00:15:20,121 --> 00:15:20,521
Google.

212
00:15:20,521 --> 00:15:23,524
And I would hope that's not the end result.

213
00:15:23,524 --> 00:15:33,654
But in a push to do more faster to reduce cost, I see that as a pretty likely ending
scenario here.

214
00:15:34,104 --> 00:15:48,004
Yeah, and what we also have to take into consideration is that using AI to get information and to diagnose, to do treatments, I mean, all these pieces, it's not necessarily a bad

215
00:15:48,004 --> 00:15:48,784
thing.

216
00:15:48,784 --> 00:15:58,524
I mean, doctors today use Google and they use research magazines and they use industry
trade publications to go through and try to understand what medical journals are saying

217
00:15:58,524 --> 00:15:59,380
nowadays.

218
00:15:59,380 --> 00:16:02,702
These are all things that they're already doing, and they're already using the Google to
do this.

219
00:16:02,702 --> 00:16:12,786
It's the summarization of this information and putting these things into these augment or
these really generative AI capabilities to go through and actually give you a generative

220
00:16:12,786 --> 00:16:15,127
response, and that's where it becomes scary.

221
00:16:15,127 --> 00:16:21,050
And it's not necessarily that it's going to be entirely wrong or any worse than what we
have today.

222
00:16:21,050 --> 00:16:25,822
It's just that it's going to be harder to point the finger and say who to blame.

223
00:16:26,068 --> 00:16:38,395
Because if I go through and I misdiagnose somebody and I give them the wrong drug and it
causes vasodilation and they pass out and they crash their car on the way home from the

224
00:16:38,395 --> 00:16:42,977
doctor's office and kill somebody, who's at fault?

225
00:16:43,038 --> 00:16:44,718
Because is it the AI?

226
00:16:44,718 --> 00:16:45,679
Is it the doctor?

227
00:16:45,679 --> 00:16:47,580
Is it the person driving the car?

228
00:16:47,580 --> 00:16:49,431
Who is really to blame for all these pieces?

229
00:16:49,431 --> 00:16:50,852
And we don't know yet.

230
00:16:50,852 --> 00:16:55,072
And it's the accountability factor and the fact that there really is no accountability
lane in this.

231
00:16:55,072 --> 00:17:00,232
And there's no way to reverse engineer some of the AI pieces that come out of it and some
of the information that it gives you.

232
00:17:00,232 --> 00:17:02,772
And most people wouldn't even know how to do it.

233
00:17:02,772 --> 00:17:08,692
And that's the scary part is that these things are, they're just way outside of our zone
of control.

234
00:17:08,692 --> 00:17:10,592
I, doctors don't move fast.

235
00:17:10,592 --> 00:17:12,912
Like they don't change their opinions quickly.

236
00:17:12,992 --> 00:17:14,572
They tend to move slowly.

237
00:17:14,592 --> 00:17:21,892
They're moving fast on AI and they're moving fast on AI because there is an economic reason
and incentive for them to do that.

238
00:17:21,892 --> 00:17:24,574
But also all the tools out there.

239
00:17:24,574 --> 00:17:31,166
that are in the medical industry are making this shift towards AI capabilities as well,
because they're trying to get rid of their back end developers.

240
00:17:31,166 --> 00:17:33,867
They're trying to get rid of the people that actually do data entry.

241
00:17:33,867 --> 00:17:42,049
So we're just handing more and more of these things over without looking at what the
controlling functions are going to be over time.

242
00:17:42,049 --> 00:17:49,651
As the signals go up and the data gets more polluted, what you're going to wind up with is
more noise.

243
00:17:49,651 --> 00:17:51,792
And the signal to noise ratio

244
00:17:51,792 --> 00:17:53,183
is the really, really big thing.

245
00:17:53,183 --> 00:17:54,915
So we have a ton of noise coming in.

246
00:17:54,915 --> 00:18:00,058
We don't know how to decipher it and turn it into a signal that says, this is what you
should do.

247
00:18:00,219 --> 00:18:03,161
AI is supposed to help us with that.

248
00:18:03,322 --> 00:18:10,177
But the way that we're implementing it and instrumenting it, it's going to make it worse
because we're not putting filters on those signals.

249
00:18:10,177 --> 00:18:17,193
Those signals are rapid firing coming towards us and they're just turning into noise.

250
00:18:17,193 --> 00:18:21,076
And then we're relying on these experts to try to tune these things down.

251
00:18:21,780 --> 00:18:23,280
And I don't think they were equipped for it.

252
00:18:23,280 --> 00:18:24,602
I mean, I'll be honest.

253
00:18:24,842 --> 00:18:34,699
My, my old Air Force physician that worked on me as a kid when I had allergies, or the time that I like blew my ankle out, like if you told that dude, hey, talk to this robot

254
00:18:34,699 --> 00:18:36,060
and have it tell you what's going on.

255
00:18:36,060 --> 00:18:41,113
I'd get a fuck you kid as he's smoking a cigarette and blowing smoke in my face from my
allergy discussion.

256
00:18:41,714 --> 00:18:47,958
But he's also the guy that probably would have done the best job treating me for my
allergies compared to how things are today, because he actually would have taken the time to do

257
00:18:47,958 --> 00:18:49,609
a test and run through these things.

258
00:18:49,609 --> 00:18:50,622
And not just said.

259
00:18:50,622 --> 00:18:57,580
denied because your healthcare doesn't support these pieces based upon some arbitrary
information that we extracted in these layers.

260
00:18:57,901 --> 00:19:01,596
It just makes things harder and people are going to give up and they're going to not try.

261
00:19:01,596 --> 00:19:03,108
And then I'm like, I don't care.

262
00:19:03,108 --> 00:19:07,174
I'm going to go take ginseng and ginger and hope that I live.

263
00:19:07,174 --> 00:19:08,094
So that's, yeah, okay.

264
00:19:08,094 --> 00:19:13,666
That's exactly what I wanted to go next with this is desperately clinging to any glimmer
of hope in this.

265
00:19:14,327 --> 00:19:15,097
What do we do, right?

266
00:19:15,097 --> 00:19:16,018
What's the solution?

267
00:19:16,018 --> 00:19:26,122
And to me, I wonder, you know, are the naturopaths out there chomping at the bit because
this means people are going to want the more, you know, holistic approach to healthcare.

268
00:19:26,122 --> 00:19:31,934
They're going to want the hour with their doctor to talk about their diet and what they're
doing differently and what's going on.

269
00:19:31,934 --> 00:19:37,104
Does, does this shift us more into hippie mode where we suddenly, where we're a little
less reliant on

270
00:19:37,104 --> 00:19:42,976
the traditional medical system and starting to look for alternatives because we don't
trust the robots that they're using.

271
00:19:42,976 --> 00:19:53,676
Yeah, so my ND is an MD and I've had a couple of discussions with him about, you know, how do you feel about AI to generate diagnoses or feeding this information?

272
00:19:53,676 --> 00:20:02,476
And he goes, well, I mean, we've been using it to some degree for a long time anyways,
whether it's, you know, the naturopathic side or the actual medical doctor side.

273
00:20:02,576 --> 00:20:08,496
And looking at these bits of information and understanding these things doesn't
necessarily mean you're not going to get good outcomes.

274
00:20:08,496 --> 00:20:12,676
It all depends on who's probing the system, inputting information.

275
00:20:12,680 --> 00:20:19,043
I think it's a false dichotomy to think that it's one or the other.

276
00:20:19,043 --> 00:20:22,585
I think the reality is it's going to be an amalgamation of these pieces.

277
00:20:22,585 --> 00:20:27,828
And the net result is that it could be good in the end, but it's going to hurt along the
way.

278
00:20:27,828 --> 00:20:36,312
I mean, we all work out and we go and lift heavy things and then we feel like shit after it.

279
00:20:36,756 --> 00:20:39,729
because we're tired and that's part of the process.

280
00:20:39,729 --> 00:20:44,122
Part of the process is breaking things down so you can actually get better, stronger and
faster.

281
00:20:44,803 --> 00:20:52,860
This AI is like all of us being forced to do like the Bataan Death March together on this
particular area.

282
00:20:52,860 --> 00:20:55,192
And you're going to be sore whether you like it or not.

283
00:20:55,192 --> 00:20:58,074
You didn't volunteer for it, but you're on board.

284
00:20:58,256 --> 00:21:01,938
So pack your fucking trail mix and get your boots on because we're going.

285
00:21:01,938 --> 00:21:03,756
And this is what's happening.

286
00:21:03,756 --> 00:21:04,198
And

287
00:21:04,198 --> 00:21:12,888
Along the way, you're either going to figure out a way to adapt and make these pieces work
and get stronger, or you're going to fall off the trail and you're going to become a

288
00:21:12,888 --> 00:21:15,360
casualty in this process.

289
00:21:15,562 --> 00:21:22,269
And I think there will be a lot of casualties along the way and who owns that and who is
accountable for that.

290
00:21:22,470 --> 00:21:24,832
Everyone's just going to go, sorry.

291
00:21:24,976 --> 00:21:26,255
Certainly not the insurance companies.

292
00:21:26,255 --> 00:21:27,550
Has nothing to do with them.

293
00:21:27,872 --> 00:21:29,432
I agree with you a hundred percent.

294
00:21:29,432 --> 00:21:41,597
I have, I have put a few models in place at work that, you know, augmented humans or made,
or made humans super humans to some extent.

295
00:21:41,597 --> 00:21:43,418
And you're not kidding.

296
00:21:43,418 --> 00:21:44,929
Like you are absolutely right.

297
00:21:44,929 --> 00:21:50,381
The people who can't go on that march fall off and die.

298
00:21:50,381 --> 00:21:54,303
Like it's, I've seen it at very, very small scale.

299
00:21:55,103 --> 00:21:58,236
But I totally believe you, like, that's the way it's gonna be.

300
00:21:59,068 --> 00:22:03,728
All right, well, since we've solved that global crisis, let's move on to my duh moment.

301
00:22:03,728 --> 00:22:11,528
You mentioned working out, Jason, and that just reminds me, you know, a few days ago I was
in one of my slumps, you know, when you live with depression, that's kind of part of the

302
00:22:11,528 --> 00:22:12,288
deal.

303
00:22:12,288 --> 00:22:16,428
And so you start, you know, when it comes to everything in your life, you're like, oh,
what's the point?

304
00:22:16,868 --> 00:22:17,468
Everything's pointless.

305
00:22:17,468 --> 00:22:18,748
I'm not doing this anymore.

306
00:22:18,748 --> 00:22:23,900
And so I'm considering the amount of time I'm spending in the gym and, you know,

307
00:22:23,900 --> 00:22:26,020
There's some days I look in the mirror and I'm like, Oh, hey, this is working.

308
00:22:26,020 --> 00:22:26,680
This is all right.

309
00:22:26,680 --> 00:22:29,120
And then there's some days I'm like, what a waste of time.

310
00:22:29,200 --> 00:22:29,720
Yeah.

311
00:22:30,100 --> 00:22:30,700
Disgusting.

312
00:22:30,700 --> 00:22:37,000
And so I got curious and started looking through, you know, the data that is automatically
tracked with my workout tool.

313
00:22:37,000 --> 00:22:37,840
I use Fitbod.

314
00:22:37,840 --> 00:22:38,620
I'm a big fan of the thing.

315
00:22:38,620 --> 00:22:40,840
It's not a sponsor, although they should be.

316
00:22:41,020 --> 00:22:43,700
And I just was curious, like, what, what is it collecting about me?

317
00:22:43,700 --> 00:22:44,460
What does it know?

318
00:22:44,460 --> 00:22:46,320
Is this doing anything for me?

319
00:22:46,320 --> 00:22:53,402
And so not only do I see an incredibly consistent streak over a number of weeks, I'm like,
wow, I've never worked out that much that consistently in my entire life.

320
00:22:53,402 --> 00:22:54,562
That's incredible.

321
00:22:54,562 --> 00:23:02,174
But it also, they have this number and you know, it's, one of these numbers where you take
it with a grain of salt, but they take all of your data and they gather how much stronger

322
00:23:02,174 --> 00:23:05,185
you've gotten over the course of, of using the app.

323
00:23:05,185 --> 00:23:14,033
And they have this arbitrary number and, and they say on their scale of zero to a hundred,
most users of their app should be at about a 50.

324
00:23:14,618 --> 00:23:21,492
And so when I started, according to this calculation, it shows that my numbers were down
in the 30s, you know, weak as a bird.

325
00:23:21,692 --> 00:23:25,588
And so now it's up around 60.

326
00:23:25,935 --> 00:23:27,956
And I just thought like, well, that's incredible.

327
00:23:27,956 --> 00:23:32,969
Like I'm, I'm twice as strong as when I started this little adventure in October.

328
00:23:32,969 --> 00:23:38,842
And it just hit me over the head with something that Zach, you and I have talked about on
the show forever.

329
00:23:38,842 --> 00:23:42,874
And that I've always been incredibly miserably bad at, which is tracking things.

330
00:23:43,504 --> 00:23:45,186
I'm not a details person.

331
00:23:45,186 --> 00:23:46,517
It's the kind of thing that drives me crazy.

332
00:23:46,517 --> 00:23:50,129
I will track something for about four minutes and then I'll get bored and give up on it.

333
00:23:50,129 --> 00:23:53,502
But it reminded me of the value of doing that if there's a way to do it.

334
00:23:53,502 --> 00:24:04,129
Like to have this little robot in my pocket for the last four months calculating how much
better I'm getting made me go, okay, I don't have to question this anymore.

335
00:24:04,129 --> 00:24:07,922
Like the proof is in my hand, I'm doing it.

336
00:24:07,922 --> 00:24:10,544
And so it just made me curious, you know,

337
00:24:10,860 --> 00:24:19,652
as someone who struggles to track things if it's not done automatically, what do you guys
do to track the things in your lives that you're trying to improve?

338
00:24:20,303 --> 00:24:22,347
Zach, I'll let you go first on this one.

339
00:24:24,735 --> 00:24:28,558
Well, I don't think we have enough time to go through all that I do.

340
00:24:28,558 --> 00:24:29,538
have.

341
00:24:30,759 --> 00:24:32,220
I didn't, I am.

342
00:24:32,220 --> 00:24:41,315
I actually took a shower right before this, so like you can't see it, but like I don't have my Apple Watch which tracks things and my Whoop that tracks things and my Oura Ring

343
00:24:41,315 --> 00:24:42,676
and my bed that tracks things.

344
00:24:42,676 --> 00:24:47,138
I mean, these are all just like the general things I wear every day.

345
00:24:47,138 --> 00:24:48,029
Yeah.

346
00:24:48,029 --> 00:24:48,399
I know.

347
00:24:48,399 --> 00:24:50,561
Like I feel like I should be making gang signs with my.

348
00:24:50,561 --> 00:24:52,927
I know, that's what we're doing over here.

349
00:24:55,012 --> 00:24:59,215
but you know, I also, you know, I also track like, you know, it's just a habit.

350
00:24:59,215 --> 00:25:02,497
Like, you know, I use MyFitnessPal just because I like it.

351
00:25:02,497 --> 00:25:07,400
You know, it's got good macros that you can track with it and things like that.

352
00:25:07,400 --> 00:25:12,364
And it's a good catalog of, of things of, of food there.

353
00:25:12,364 --> 00:25:18,648
But I also track a lot of, I mean, I work in data, so like, I'm just, just a super big
fan.

354
00:25:18,648 --> 00:25:22,871
So whenever I see anything that I want to track, I don't know, it's just about discipline.

355
00:25:22,871 --> 00:25:23,295
I.

356
00:25:23,295 --> 00:25:27,268
You know, I look over and I see Jocko's book, like, and you know, the initials good.

357
00:25:27,268 --> 00:25:30,360
And I'm okay, it's going to suck to track all this stuff.

358
00:25:30,360 --> 00:25:32,201
And I just get down with it.

359
00:25:32,201 --> 00:25:39,265
I've got so many spreadsheets and so many things and you know, my better half calls me a
nerd.

360
00:25:39,326 --> 00:25:44,195
And, you know, she says it in a, a negative way.

361
00:25:44,195 --> 00:25:47,331
I think it's a positive thing because I like to do that stuff.

362
00:25:47,331 --> 00:25:48,003
So I don't know.

363
00:25:48,003 --> 00:25:49,022
It's just part of my culture.

364
00:25:49,022 --> 00:25:50,133
It's part of my DNA.

365
00:25:50,133 --> 00:25:51,014
I don't know.

366
00:25:51,014 --> 00:25:53,291
I love looking at that data and looking at.

367
00:25:53,291 --> 00:26:00,731
You know, the changes, but like the things you're talking about, like, you know, you gotta be careful what you're looking at.

368
00:26:00,731 --> 00:26:05,011
Like if you look one day to the next, there's going to be nothing.

369
00:26:05,011 --> 00:26:07,711
Even if you look one week to the next, you'll probably see nothing.

370
00:26:07,711 --> 00:26:17,411
Like you need to, you need to understand that in order to see those differences, you've
got to be consistent and have that months and months and months worth of data in there to

371
00:26:17,411 --> 00:26:18,671
see the changes.

372
00:26:18,671 --> 00:26:21,323
You know, like it's not up and down like.

373
00:26:21,323 --> 00:26:25,243
a volatile stock market or something like that on different days.

374
00:26:25,243 --> 00:26:25,963
But I don't know.

375
00:26:25,963 --> 00:26:27,003
That's just how I do it.

376
00:26:27,003 --> 00:26:31,723
I just have tons of, I automate it wherever I possibly can.

377
00:26:31,863 --> 00:26:33,923
And I've, it is sad.

378
00:26:33,923 --> 00:26:39,763
I have a bunch of code that automates a lot of data tracking, but a lot of it's manual,
right?

379
00:26:39,763 --> 00:26:41,544
You just have to punch the things in.

380
00:26:41,544 --> 00:26:42,500
Yeah.

381
00:26:43,797 --> 00:26:53,379
What gets measured gets changed. If you don't measure a thing, your ability to change it is irrelevant because you have no way to understand it. So if you're not going to record it,

382
00:26:53,379 --> 00:26:58,456
you're not doing the thing. So Jeremy, this app that you had that said you are stronger.

383
00:26:58,456 --> 00:27:02,055
What signals, what measurements was it based off of?

384
00:27:02,055 --> 00:27:08,649
I have to imagine that it's measuring the amount of weight that I'm lifting in any given
session, session after session.

385
00:27:09,230 --> 00:27:13,413
Yeah, yeah, it's like how many reps, how much weight over the course of a period of time.

386
00:27:13,413 --> 00:27:23,232
I'm sure if I looked back, I was lifting probably half of what I'm lifting now, that's,
I'm inferring from their description that that's how they come up with that data.

387
00:27:23,232 --> 00:27:25,912
And is that strength a one rep max?

388
00:27:25,912 --> 00:27:27,912
Is that 10 rep max?

389
00:27:27,912 --> 00:27:29,472
Is it supersets?

390
00:27:29,472 --> 00:27:31,832
Like, these are arbitrary things.

391
00:27:31,832 --> 00:27:36,792
And to have something tell you that and arbitrarily say you're 50 % stronger.

392
00:27:39,784 --> 00:27:43,824
They're all point in time snapshots of things that you've done over a certain period of
time.

393
00:27:43,824 --> 00:27:45,644
So it's nice because it shows progress.

394
00:27:45,644 --> 00:27:51,804
Now, what it also does is it encourages you to keep going with the application so that you
keep paying for it.

395
00:27:51,804 --> 00:27:56,884
And whether you actually pay out of pocket or you are the data itself, it's a different
story.

396
00:27:57,044 --> 00:28:01,024
But at the end of the day, it's about do I feel like I'm improving?

397
00:28:01,024 --> 00:28:07,904
Do I feel like I'm getting better? And having a hard, having something, a hard data asset, that says here's how

398
00:28:07,904 --> 00:28:09,924
makes all of us want to keep going.

399
00:28:10,524 --> 00:28:14,784
So like I've always been fairly strong.

400
00:28:15,124 --> 00:28:19,004
And I ripped my distal bicep tendon off in 2021.

401
00:28:19,004 --> 00:28:24,444
And during that period of time, my deadlift was about 495 to 505.

402
00:28:24,444 --> 00:28:28,844
Ripped my distal bicep tendon off, came back six months later, was working out.

403
00:28:28,844 --> 00:28:32,124
And now my deadlift was like under 400.

404
00:28:32,304 --> 00:28:37,072
So I've had to take time to rebuild those things, put those things back together, get
myself

405
00:28:37,072 --> 00:28:50,751
strong again. But I also went during that time from 295 to 205 and then back to 245. So now my deadlift is about 455, which is about 45 pounds, about 40 to 50 pounds, off of what it

406
00:28:50,751 --> 00:28:54,664
used to be when I was heavier. But am I stronger now?

407
00:28:54,824 --> 00:29:04,080
Like, maybe, if you look at it as a relative ratio of what my weight was before and how those pieces go through. Certainly my cardiovascular improvements have occurred, and I can do

408
00:29:04,080 --> 00:29:06,772
a lot more pull-ups and I can do a lot more other things but

409
00:29:06,824 --> 00:29:10,667
strength is a relative term and it's nebulous because it depends on a definition.

410
00:29:10,667 --> 00:29:14,451
It's defined in the context of the thing that you're trying to look at.

411
00:29:14,451 --> 00:29:19,755
So measuring those things over time is the only way you can figure out if this shit's
worth it or not.

412
00:29:19,755 --> 00:29:27,161
And by the way, if you're looking to not feel bad when you work out, no.

413
00:29:28,124 --> 00:29:29,804
I mean,

414
00:29:30,592 --> 00:29:37,412
Nietzsche basically, from a philosophical perspective, said that the experience of humanity
is suffering.

415
00:29:37,412 --> 00:29:42,332
So in order for you to improve and get better, you're going to have to suffer for a period
of time to make those things better.

416
00:29:42,432 --> 00:29:46,052
Diamonds happen because you put them under, you put coal under extreme pressure.

417
00:29:46,052 --> 00:29:51,872
So you can be happy coal that's not meant to do anything, or you can be a diamond getting
shaped into something better.

418
00:29:52,272 --> 00:29:54,872
But at the end of the day, does that matter to you?

419
00:29:54,872 --> 00:29:55,814
Do you give a fuck?

420
00:29:55,814 --> 00:29:58,355
Like, do you care whether or not these things pan out?

421
00:29:58,355 --> 00:29:59,475
And that's an individual thing.

422
00:29:59,475 --> 00:30:02,586
And you're going to have to figure that every person has to figure that out for
themselves.

423
00:30:02,586 --> 00:30:05,687
What the value is of this and what the real meaning behind it is.

424
00:30:05,687 --> 00:30:07,447
And nobody is going to be able to give that to you.

425
00:30:07,447 --> 00:30:09,538
You're going to have to figure that out for yourself.

426
00:30:09,538 --> 00:30:13,129
And talking about this in the context of depression and going, what am I doing?

427
00:30:13,129 --> 00:30:14,649
What's the meaning behind this?

428
00:30:14,649 --> 00:30:18,721
I find that when I go to work out, if I just go, there is no meaning to any of this.

429
00:30:18,721 --> 00:30:23,902
I'm just putting myself through suffering makes it much easier for me to just go and do
the thing.

430
00:30:23,936 --> 00:30:29,116
And then come back and look at it later on and go, all right, it doesn't matter whether or
not anything's better or not.

431
00:30:29,116 --> 00:30:31,716
I decided I committed myself to go and do this.

432
00:30:31,716 --> 00:30:34,996
I'm going to go and make it happen because I'm a stubborn asshole.

433
00:30:34,996 --> 00:30:42,996
And I think if other people take stubborn asshole approaches, just going and doing these
things and then looking at the results at the end, that's when you actually see the real

434
00:30:42,996 --> 00:30:43,896
value in the benefit.

435
00:30:43,896 --> 00:30:52,862
And if you set your goal as I want to be X amount stronger over this period of time, no,
because there are physical and physiological things that might change.

436
00:30:52,862 --> 00:30:55,053
You could break something, could get wrecked.

437
00:30:55,053 --> 00:30:55,934
But that's not it.

438
00:30:55,934 --> 00:31:00,016
Your real goal should be, I showed up.

439
00:31:00,176 --> 00:31:01,794
I showed up this period of time.

440
00:31:01,794 --> 00:31:03,338
I tried these things.

441
00:31:03,338 --> 00:31:05,499
And part of that is that data tracking.

442
00:31:05,499 --> 00:31:06,559
I tracked my data.

443
00:31:06,559 --> 00:31:07,740
That's a showing up piece.

444
00:31:07,740 --> 00:31:15,504
And showing up consistently, trying to do those things and measuring them at the end will
give you motivation to keep showing up and doing the shitty thing.

445
00:31:15,504 --> 00:31:17,405
Yeah, and in the end, that is it, right?

446
00:31:17,405 --> 00:31:25,171
Like my goal is not to weigh X, it's not to bench X, it is literally to show up every day
and do the thing as much as I possibly can.

447
00:31:25,171 --> 00:31:35,688
But when the darkness comes and everything feels pointless to a point that it is kind of,
I don't like to use the word crippling, I just can't think of a better word, but it just,

448
00:31:35,688 --> 00:31:38,100
it's this very, it shuts me down.

449
00:31:38,100 --> 00:31:44,304
And so it was encouraging for me, and we talked a little bit about this on another episode
about like the...

450
00:31:44,624 --> 00:31:49,195
million points of data that all these things gather to give you this arbitrary number that's supposed to mean something to you.

451
00:31:49,195 --> 00:31:52,216
So like the Oura Ring and how my readiness score, like, what the hell's readiness?

452
00:31:52,216 --> 00:31:54,695
I don't know, but if it says I'm at 90, let's go.

453
00:31:54,695 --> 00:31:57,027
I don't know what that means, but I'm pretty ready.

454
00:31:57,288 --> 00:32:00,376
And there are some days when it says I'm at 90 and I feel like it's at 60.

455
00:32:00,376 --> 00:32:02,729
And I'm like, well, I don't believe you, you're wrong.

456
00:32:02,729 --> 00:32:12,062
But it was just, it was a cool thing that I needed from an external point of view to go
something, something is measuring progress in a positive direction.

457
00:32:12,062 --> 00:32:13,560
whether my, whether I'm

458
00:32:13,560 --> 00:32:17,622
actually, you know, double the strength that I was at in October.

459
00:32:17,622 --> 00:32:18,642
I don't know.

460
00:32:18,642 --> 00:32:22,794
But this arbitrary tracker says that I'm in a better place than I was then.

461
00:32:22,794 --> 00:32:25,945
And it's because of the fact that my goal is to show up every day.

462
00:32:25,945 --> 00:32:33,068
And so that was what I needed from an external point of view, because I know that for a
lot of my life, I've depended on this doesn't feel right.

463
00:32:33,068 --> 00:32:35,229
This doesn't feel like it's doing it.

464
00:32:35,229 --> 00:32:38,390
My feeling about this isn't pushing me in the right direction.

465
00:32:38,731 --> 00:32:41,434
Sometimes I need that number that says, no, you're fine.

466
00:32:41,434 --> 00:32:44,606
Like it might feel like it sucks right now, but it's actually working.

467
00:32:44,606 --> 00:32:46,007
So just keep going.

468
00:32:46,007 --> 00:32:56,574
So that I think in the end, no matter what path you're on, tracking it will help you have
the external data that tells you, yes, this is working.

469
00:32:56,574 --> 00:33:05,090
So that was, that was just sort of the duh aha moment that I had that I thought was worth
sharing in case anyone else struggles like I do to pay attention to those terrible

470
00:33:05,090 --> 00:33:05,992
details.

471
00:33:05,992 --> 00:33:07,853
Well, this brings up a really good point.

472
00:33:07,853 --> 00:33:16,419
So there's this common thought process of, you know, change your thinking from don't work
harder, work smarter.

473
00:33:16,699 --> 00:33:19,641
No, work harder.

474
00:33:20,682 --> 00:33:22,343
Motivate smarter.

475
00:33:22,984 --> 00:33:32,640
Do things smarter that keep you doing hard work, because if your entire goal is, I don't want to work as much, I'm going to do something smart, I'm going to make things

476
00:33:32,640 --> 00:33:33,731
peel off.

477
00:33:34,812 --> 00:33:35,624
No.

478
00:33:35,624 --> 00:33:39,027
Like find other shit to do, put things in that place.

479
00:33:39,027 --> 00:33:41,399
Don't stop working hard, working hard.

480
00:33:41,399 --> 00:33:50,497
I mean, even if it's a mental health thing, like sometimes the working hard part when you're having a rough day is, it is hard for me to get out of bed and go take a shower.

481
00:33:50,497 --> 00:33:52,279
Like that's a thing.

482
00:33:52,279 --> 00:34:02,057
And you should be proud when you're like, I'm going to fight entropy and the universe's
decision to, you know, essentially annihilate me and turn me back into the stardust from

483
00:34:02,057 --> 00:34:03,228
which I came.

484
00:34:04,264 --> 00:34:11,330
If you just make all of these small little tiny things, little tiny incremental
existential crises that you're going to deal with.

485
00:34:11,330 --> 00:34:12,110
Perfect.

486
00:34:12,110 --> 00:34:12,831
Do that.

487
00:34:12,831 --> 00:34:15,934
Make them small, measurable, repeatable, attainable.

488
00:34:15,934 --> 00:34:17,634
All those things are great.

489
00:34:17,975 --> 00:34:20,376
But be smart with how you motivate yourself.

490
00:34:20,376 --> 00:34:21,968
Don't beat the shit out of yourself.

491
00:34:21,968 --> 00:34:25,241
Like, I'm going to do better if I keep beating myself up.

492
00:34:25,241 --> 00:34:26,572
Like, no, you're probably not.

493
00:34:26,572 --> 00:34:28,934
You're probably going to do better if you get proper rest.

494
00:34:28,934 --> 00:34:32,097
And that's one of those things that people don't take into account.

495
00:34:32,097 --> 00:34:34,164
Like they go through, I'm going to work harder.

496
00:34:34,164 --> 00:34:36,465
Great, how are you gonna rest harder?

497
00:34:36,904 --> 00:34:37,964
Well, you're not.

498
00:34:37,964 --> 00:34:40,176
I mean, that's part of the motivation factor.

499
00:34:40,176 --> 00:34:43,688
How do you do those things, prioritize those pieces and put them in?

500
00:34:43,688 --> 00:34:51,391
And that's the whole point is like, if you're gonna have an existential crisis, make it
this big, make it a little tiny, make it a little tiny existential crisis that you can get

501
00:34:51,391 --> 00:34:52,572
to the other side of.

502
00:34:52,572 --> 00:34:59,785
Don't set yourself up with big existential crises of not knowing if you're valuable, if
you have good worth, what am I doing?

503
00:34:59,785 --> 00:35:01,375
No, just make the little tiny thing.

504
00:35:01,375 --> 00:35:03,026
And sometimes you can go, all right,

505
00:35:03,026 --> 00:35:05,785
I lost this existential crisis battle.

506
00:35:05,788 --> 00:35:07,312
I'll get the next one.

507
00:35:07,636 --> 00:35:08,836
Just change the model.

508
00:35:08,836 --> 00:35:15,361
I'll put it into really, I'll put it into a way that I think about it occasionally when
I'm in those dark moments.

509
00:35:15,361 --> 00:35:17,902
So I work out, it's who I am.

510
00:35:17,902 --> 00:35:19,063
It's what I do.

511
00:35:19,063 --> 00:35:27,649
I do hard things on purpose so that when the things that I don't expect that are hard come
along, I know I can do hard things.

512
00:35:27,649 --> 00:35:31,792
But when I'm in that dark place and I'm thinking about, I want to go to the gym.

513
00:35:31,792 --> 00:35:32,835
I have to go to the gym.

514
00:35:32,835 --> 00:35:34,413
I want to go to the gym.

515
00:35:34,474 --> 00:35:36,713
I think about it when I take a dump.

516
00:35:36,713 --> 00:35:38,955
I don't think, I don't want to wipe my ass.

517
00:35:38,955 --> 00:35:44,740
Like you just wipe your ass. And so you just go to the gym, right?

518
00:35:44,740 --> 00:35:48,313
You do the things that you need to do, put it into that small term.

519
00:35:48,313 --> 00:35:51,756
So when you're like, I don't want to go to the gym, just think that in your head.

520
00:35:51,756 --> 00:35:53,257
I got to wipe my ass too.

521
00:35:53,257 --> 00:35:54,420
And then you'll go to the gym.

522
00:35:54,420 --> 00:35:58,865
To defend myself a little bit here, it wasn't a moment of, I don't want to go to the gym.

523
00:35:58,865 --> 00:36:01,828
It was an overall, what's the meaning of it all?

524
00:36:01,828 --> 00:36:04,210
This stupid thing I do sucks.

525
00:36:05,612 --> 00:36:05,998
right.

526
00:36:05,998 --> 00:36:09,461
you are, even in those moments where you're like, what's the purpose of life?

527
00:36:09,461 --> 00:36:11,373
You're still gonna wipe your ass, aren't you?

528
00:36:11,373 --> 00:36:11,653
Right.

529
00:36:11,653 --> 00:36:18,803
Well, and I like, I'm sure this is, I know this has been said millions of times, but Mark
Manson is the one that drilled it into my head in a way that was effective.

530
00:36:18,803 --> 00:36:23,785
He's just like, the whole concept of you can look at it as life is pointless.

531
00:36:23,785 --> 00:36:24,866
So what's the point?

532
00:36:24,866 --> 00:36:26,446
This sucks or life is pointless.

533
00:36:26,446 --> 00:36:26,816
Cool.

534
00:36:26,816 --> 00:36:28,956
That means nothing matters and I can do whatever I want.

535
00:36:28,956 --> 00:36:29,887
This is amazing.

536
00:36:29,887 --> 00:36:38,290
So it's totally just a perspective shift of like, when you're in that place, maybe this meaninglessness is exactly what you need it to be.

537
00:36:38,290 --> 00:36:40,712
Nihilism can be very, very freeing.

538
00:36:40,712 --> 00:36:48,269
And if you just take that approach to things and just go, all right, none of this matters until I give it meaning and purpose.

539
00:36:48,269 --> 00:36:48,990
Great.

540
00:36:48,990 --> 00:36:51,932
Give it your meaning and purpose and say, it is what it is today.

541
00:36:53,232 --> 00:36:53,623
Well, cool.

542
00:36:53,623 --> 00:36:54,485
We solved the problem.

543
00:36:54,485 --> 00:36:55,775
We fixed that one.

544
00:36:55,775 --> 00:36:56,831
I don't know about the whole AI thing.

545
00:36:56,831 --> 00:37:00,574
We'll figure that out down the road as how many of us are left when this is all done.

546
00:37:00,574 --> 00:37:03,169
I suspect Skynet's going to fix that problem for us.

547
00:37:03,430 --> 00:37:04,060
Probably.

548
00:37:04,060 --> 00:37:05,471
But that's gonna do it for this episode.

549
00:37:05,471 --> 00:37:06,782
Thanks so much for sticking around.

550
00:37:06,782 --> 00:37:11,435
If you have gotten any value out of this and you know someone who would also benefit from
hearing it, please share this with them.

551
00:37:11,435 --> 00:37:14,346
There are links to do so at our website, thefitmess.com.

552
00:37:14,346 --> 00:37:15,767
But that's it for this week.

553
00:37:15,767 --> 00:37:17,889
Jason, Zach, thanks so much for being here.

554
00:37:17,889 --> 00:37:19,589
And you, the listener, thanks for being here.

555
00:37:19,589 --> 00:37:23,152
We'll be back in about a week with a brand new episode at thefitmess.com.

556
00:37:23,152 --> 00:37:23,991
Thanks for listening.