Transcript
1
00:00:00,124 --> 00:00:01,606
Hey, this is the Fit Mess podcast.
2
00:00:01,606 --> 00:00:02,447
My name is Jeremy.
3
00:00:02,447 --> 00:00:03,589
Thanks so much for listening.
4
00:00:03,589 --> 00:00:11,521
Here on the show, we talk about all kinds of things regarding your mental health, anxiety,
depression, ways to live with them, ways to try to live with them a little bit less.
5
00:00:11,521 --> 00:00:14,110
Today, we're going to venture out to something a little bit different.
6
00:00:14,110 --> 00:00:21,930
You know, I consider myself something of a geek when it comes to the various technological
ways to track the things we're doing so that we can try to make that little 1%
7
00:00:21,930 --> 00:00:23,923
improvement we're all trying to make every day.
8
00:00:23,923 --> 00:00:29,203
So at any given time, I've got a number of different devices strapped to different parts
of my body to keep track of those things.
9
00:00:29,203 --> 00:00:37,987
So, you know, your smartwatch tracks your steps, your ring tracks your sleep, your apps
all analyze your diet, but who has access to all of that personal health data?
10
00:00:37,987 --> 00:00:44,715
This is gonna be fun today because we're gonna talk with not only one of my oldest
friends, but an incredibly smart cybersecurity expert.
11
00:00:44,715 --> 00:00:49,524
His name is Jason Hayworth, and we're gonna uncover some of those hidden risks of your
favorite health tracking devices.
12
00:00:49,521 --> 00:00:53,672
anything that's free or low cost is because you're part of the product offering
13
00:00:53,729 --> 00:01:01,753
My hope is that you'll learn how insurance companies might use your fitness data against
you, why companies are cozying up to politicians, and most importantly, how to protect
14
00:01:01,753 --> 00:01:05,423
yourself without giving up the convenience of modern technology.
15
00:01:05,423 --> 00:01:13,613
So we'll take a closer look at the truth about data collection in health apps, essential
steps to protect your privacy, and why your fitness tracker might be sharing more than you
16
00:01:13,613 --> 00:01:16,665
think, all to help keep your health data secure.
17
00:01:16,665 --> 00:01:22,952
So stick around, you're not gonna wanna miss this crucial conversation about protecting
your privacy in an increasingly connected world.
18
00:01:22,952 --> 00:01:24,929
My conversation with Jason Hayworth
19
00:01:24,929 --> 00:01:25,829
is coming right up.
20
00:01:27,005 --> 00:01:32,122
What if your mornings didn't start with an annoying jolt, but with a calm focus and
energy?
21
00:01:32,122 --> 00:01:41,274
If improving your sleep routine is a top priority for 2025, meet the Loftie Clock, a
bedside essential engineered by sleep experts to transform both your bedtime and your
22
00:01:41,274 --> 00:01:41,923
mornings.
23
00:01:41,923 --> 00:01:44,786
Here's why I find the Loftie Clock to be a game changer.
24
00:01:44,786 --> 00:01:46,441
It mimics your body's natural rhythm.
25
00:01:46,479 --> 00:01:50,514
It wakes you up gently with a two-phase alarm:
26
00:01:50,514 --> 00:01:56,450
first a soft wake-up sound to ease you into consciousness, followed by a more energizing
get-up sound to help you start the day.
27
00:01:56,450 --> 00:01:57,831
Loftie is not just an alarm clock.
28
00:01:57,831 --> 00:02:00,066
It's your all-in-one bedside sound machine.
29
00:02:00,066 --> 00:02:07,150
With over a hundred options, including white noise, nature sounds, meditations, and even
bedtime stories to help you relax and unwind.
30
00:02:07,150 --> 00:02:08,962
So get that phone out of your bedroom.
31
00:02:08,962 --> 00:02:16,767
Let's face it, grabbing your phone to set an alarm or listen to an audio book often leads
to an hour or more of mindless scrolling.
32
00:02:16,767 --> 00:02:21,629
Join over 100,000 blissful sleepers who have upgraded their rest and mornings with Loftie.
33
00:02:21,796 --> 00:02:27,866
Go to byloftie.com and use the code FITMESS20 for 20% off orders over $100.
34
00:02:28,110 --> 00:02:33,020
That's b-y-l-o-f-t-i-e dot com with the promo code
35
00:02:33,020 --> 00:02:34,333
FITMESS20.
36
00:02:35,557 --> 00:02:35,697
All right.
37
00:02:35,697 --> 00:02:37,798
So Jason, you and I have known each other for a long time.
38
00:02:37,798 --> 00:02:39,159
You're one of the smartest people I know.
39
00:02:39,159 --> 00:02:45,641
So I'm going to do my best to keep up in this conversation, but I am currently wearing an
Oura Ring, an Apple Watch.
40
00:02:45,641 --> 00:02:47,102
I've got my Apollo band.
41
00:02:47,102 --> 00:02:50,013
I use AI all day long for my work.
42
00:02:50,013 --> 00:02:50,613
I love it.
43
00:02:50,613 --> 00:02:51,324
I live by it.
44
00:02:51,324 --> 00:02:52,804
It makes a lot of my decisions for me.
45
00:02:52,804 --> 00:02:54,755
It does a lot of my work for me.
46
00:02:54,987 --> 00:02:58,313
But this may not be the smartest, safest path to be going down.
47
00:02:58,313 --> 00:03:00,430
Is that a fair assumption?
48
00:03:00,430 --> 00:03:02,291
Well, I don't know about that.
49
00:03:02,291 --> 00:03:03,452
It's balancing, right?
50
00:03:03,452 --> 00:03:11,098
So, I mean, let me back up a little bit and say, first of all, if I'm one of the smartest
people you know, you need to expand your sphere a little bit.
51
00:03:12,819 --> 00:03:16,482
I'm just old and I've been doing this stuff for a long time.
52
00:03:16,604 --> 00:03:20,912
We'll add up how many times I say I don't get it at the end of this conversation, then
we'll decide.
53
00:03:20,912 --> 00:03:22,322
I'm glad we're recording it.
54
00:03:22,923 --> 00:03:25,384
Yeah, so all the devices that you're wearing.
55
00:03:25,384 --> 00:03:33,226
So I mean, I think you mentioned four of them, but there's probably more of them that are
around in your environment and your home and everything else you're not even aware of.
56
00:03:33,266 --> 00:03:36,066
So I mean, you're really talking about biometric trackers.
57
00:03:36,066 --> 00:03:45,459
So things that are looking at signals that your body's giving off and taking those and
sending them off to some ephemeral database out on the internet. And whether that ephemeral
58
00:03:45,459 --> 00:03:48,310
database actually is secure,
59
00:03:48,762 --> 00:04:02,352
whether it's sharing your data, who they're sharing your data with, what the capacity for it is,
how they're enriching that with other information, all of that can potentially be used as a way to
60
00:04:03,093 --> 00:04:05,194
target you sometime in the future.
61
00:04:06,035 --> 00:04:07,946
We give up information all the time, right?
62
00:04:07,946 --> 00:04:13,330
So, I mean, we use social media networks, we use streaming media.
63
00:04:13,330 --> 00:04:16,676
I mean, all these things require you to sign an end user license agreement.
64
00:04:16,676 --> 00:04:17,698
Even with something as simple as your email.
65
00:04:17,698 --> 00:04:19,567
You're giving up tons of information,
66
00:04:19,567 --> 00:04:30,635
I mean, anything that's free or low cost is that way because you're part of the product offering, and
they are taking that information and the data and reselling it or using it to re-advertise
67
00:04:30,635 --> 00:04:31,396
back to you.
68
00:04:31,396 --> 00:04:38,471
And based upon the information that's there, they're creating demographic profiles about
you, finding information about you and figuring out the things that you like to do.
69
00:04:38,471 --> 00:04:46,406
Like, for example, if you're wearing an Apple watch or a Fitbit or a Garmin or a Polar or
any of the thousand devices out there that can
70
00:04:46,534 --> 00:04:53,770
look at the number of steps that you take, what your exercise patterns are, you know, do
you actually go into your watch and click the little button that says, I want to walk
71
00:04:53,770 --> 00:04:54,581
right now?
72
00:04:54,581 --> 00:05:02,847
Like that informs those people as to what your commitment is and what your repetitive
cycles are and how you're going to move in those things.
73
00:05:02,847 --> 00:05:08,602
And right around January, all these devices get like spikes because everyone's going into
the new year thinking they're going to create resolutions.
74
00:05:08,602 --> 00:05:09,333
Right.
75
00:05:09,333 --> 00:05:12,625
And there's a million different ways to look at these components.
76
00:05:12,625 --> 00:05:15,950
So Apple did a really good job of going through and kind of
77
00:05:15,950 --> 00:05:19,191
collecting all these different data sources and created Apple Health.
78
00:05:19,252 --> 00:05:20,893
And they have their own watch, right?
79
00:05:20,893 --> 00:05:25,155
The Apple Watch is a really great lifestyle product.
80
00:05:25,235 --> 00:05:30,978
Whether or not the data it collects from a signaling perspective on your health is good,
that's debatable.
81
00:05:30,978 --> 00:05:33,840
I think over the last few years, it's definitely improved.
82
00:05:34,400 --> 00:05:37,582
But what they really become is this data aggregation function.
83
00:05:37,582 --> 00:05:43,565
So they can take all these different signals and put them together and give you a
correlated data output.
84
00:05:43,605 --> 00:05:45,134
And you can actually
85
00:05:45,134 --> 00:05:53,057
share some of this data with your doctors or your medical professionals or your fitness
coaches or your nutritionist.
86
00:05:53,177 --> 00:05:54,798
Those things are fantastic, right?
87
00:05:54,798 --> 00:05:57,869
And there's all kinds of other signal data that can come in down towards you.
88
00:05:57,869 --> 00:06:06,683
So diabetics back in the day used to use continuous glucose monitors that would go through
and track and look at the blood sugar level in your body at any given point in time.
89
00:06:06,683 --> 00:06:10,104
It's basically been a miracle for type 1 diabetics.
90
00:06:10,124 --> 00:06:14,496
One of the main problems for type 1 diabetics is that they take too much insulin
91
00:06:14,828 --> 00:06:17,079
and don't get enough sugar in their body and then they crash.
92
00:06:17,079 --> 00:06:23,323
And when your blood sugar really gets below 50, actually really below 60, most people
start feeling kind of woozy and out of it.
93
00:06:23,323 --> 00:06:29,706
And type 1 diabetics have kind of trained themselves to go, you know, I kind of feel this way,
I might want to take this.
94
00:06:29,706 --> 00:06:36,910
But over time, you don't know if you're feeling that kind of woozy and that kind of tired
because your blood sugar is low or because you're tired or whatever.
95
00:06:36,910 --> 00:06:41,333
So they will sometimes eat when they don't need it or they would sometimes take insulin
when they don't need it.
96
00:06:41,333 --> 00:06:44,184
It's that combination of finding the right values
97
00:06:44,184 --> 00:06:49,717
that have helped those people in that space actually get something of value out of these
types of wearable devices.
98
00:06:49,837 --> 00:07:00,503
If you couple that and take the alerting functions that tell you what your glucose is, so
they know when to take insulin and when to eat food, that's great for type 1 diabetics.
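For the technically curious, here's a toy sketch of the alerting idea Jason describes. The below-60 and below-50 cutoffs come straight from his comments above; the high cutoff of 180 is an assumption added for the example, and none of this resembles a real CGM vendor's validated alerting logic, or medical advice:

```python
# Toy illustration of CGM-style threshold alerting. The 50/60 numbers
# come from the conversation; the 180 high threshold is an assumption.

def glucose_alert(mg_dl: float) -> str:
    """Map a glucose reading (mg/dL) to a coarse alert level."""
    if mg_dl < 50:
        return "URGENT LOW: treat immediately"
    if mg_dl < 60:
        return "LOW: most people start feeling woozy around here"
    if mg_dl > 180:
        return "HIGH: insulin may be needed per your care plan"
    return "in range"

for reading in [48, 55, 95, 210]:
    print(reading, "->", glucose_alert(reading))
```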
99
00:07:00,623 --> 00:07:05,356
Now let's go to type 2, or like where most Americans are, pre-diabetic.
100
00:07:05,356 --> 00:07:13,440
Yeah, I mean, we eat too much, we drink too much, we don't move enough, we don't sleep
enough, we're way stressed out.
101
00:07:13,858 --> 00:07:23,164
So there's a lot of companies out there that created these fitness programs that were
wrapped around the idea of taking these wearable devices and looking at measurable ways to
102
00:07:23,164 --> 00:07:26,446
find output to help people improve their health outcomes.
103
00:07:26,446 --> 00:07:32,870
And what they would do is they'd slap these CGMs by Dexcom or Libre or somebody else on
them and they would resell them back as a service.
104
00:07:32,870 --> 00:07:39,974
So if your insurance company is paying for this, you're probably paying 35, 40 bucks a
month for these little sensors that get dropped in your arm or your hip or your...
105
00:07:40,959 --> 00:07:43,235
Are these like the inside tracker or whatever?
106
00:07:43,235 --> 00:07:49,540
Like it's literally, it's a disc that you stick into your arm and it's like tracking your
blood sugar all day long and reacting to whatever you eat.
107
00:07:49,540 --> 00:07:50,341
Yeah.
108
00:07:50,404 --> 00:07:58,609
Yeah, so it tracks that, and the idea is that you're supposed to put what you're eating and
what your activities are into some of these apps so that you can track it. But for a long
109
00:07:58,609 --> 00:08:00,831
time, these apps didn't have anything in there.
110
00:08:00,831 --> 00:08:04,894
Well, take all these different wearable functions that you have in there.
111
00:08:04,894 --> 00:08:14,541
So I mean you're wearing an Oura, which is actually fantastic at doing passive SpO2
blood oxygen monitoring and it does something that a lot of other wearables don't.
112
00:08:14,541 --> 00:08:16,366
It measures your body temperature.
113
00:08:16,366 --> 00:08:23,686
Which was great during COVID, because it was a really good indication as to whether or not
you were potentially getting sick. And they could look at your heart rate, and if
114
00:08:23,686 --> 00:08:29,046
your temperature went up and your heart rate went up at the same time, it's pretty likely
that you have an infection.
115
00:08:29,046 --> 00:08:41,826
I own one. I don't wear one anymore, because I work out and I had to take it off all the
time. I broke one one time, so I felt pretty bad about breaking this $300 wearable. But I was
116
00:08:41,826 --> 00:08:42,636
taking that
117
00:08:42,636 --> 00:08:56,497
I was wearing my Garmin, I was wearing my Fitbit, and I was wearing another one called the
Weeby-something that supposedly would track hydration levels.
118
00:08:56,497 --> 00:09:05,904
So yeah, so I had one watch on one arm, two on the other, the Oura Ring, and I have sleep
apnea, so I use a CPAP.
119
00:09:06,045 --> 00:09:11,413
So I have that signal coming inbound towards me, and I have no way to collect and
correlate this data.
120
00:09:11,413 --> 00:09:12,449
Yeah, yeah.
121
00:09:12,462 --> 00:09:19,362
And Apple Health does about half of those and pulls those things in, but it doesn't give
you the whole scope of what's there.
122
00:09:19,362 --> 00:09:25,402
It just takes the summary data that the application itself picks up from those devices.
123
00:09:25,802 --> 00:09:33,962
If you connect to the APIs on the back end, which stands for Application Programming
Interface, you can actually gather a ton of information.
124
00:09:33,962 --> 00:09:40,738
So Garmin devices have like 30 or 40 different signals inside of them that you can pull
from the API.
125
00:09:40,738 --> 00:09:43,521
and you can start looking at them over time as individual signals.
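As a sketch of what "connecting to the APIs on the back end" can look like, here's a minimal Python example against a hypothetical wearables endpoint. The base URL, token, and field names are invented placeholders, not Garmin's actual developer API, which has its own authentication and schema:

```python
import requests

# Hypothetical endpoint and signal names, for illustration only; real
# vendor APIs (Garmin, Fitbit, Oura) each have their own auth and schema.
BASE = "https://api.example-wearable.com/v1"
TOKEN = "your-oauth-token"  # placeholder

def fetch_signal(user_id: str, signal: str, start: str, end: str) -> list:
    """Pull one raw signal (e.g. heart_rate, spo2, steps) as a time series."""
    resp = requests.get(
        f"{BASE}/users/{user_id}/signals/{signal}",
        params={"start": start, "end": end},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["samples"]  # assumed shape: [{"ts": ..., "value": ...}]

# The app shows you a summary; the API can hand back every raw sample.
heart_rate = fetch_signal("me", "heart_rate", "2025-01-01", "2025-01-07")
print(f"{len(heart_rate)} raw heart-rate samples this week")
```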
126
00:09:43,521 --> 00:09:46,685
But when you open up the Garmin app, you don't see all those signals.
127
00:09:46,685 --> 00:09:48,417
You have no way to see all those signals.
128
00:09:48,417 --> 00:09:52,651
What you see is a summary of that information and the output that's shown up.
129
00:09:52,692 --> 00:10:01,551
And that's going to be based upon whoever the algorithm experts are at Garmin giving you
that information, who may or may not have medical degrees or training.
130
00:10:01,551 --> 00:10:08,033
That's something I wanted to ask you about, because I mean, you even mentioned that Apple has sort of come a long
way in terms of their tracking and that information.
131
00:10:08,933 --> 00:10:13,685
To some degree, I want to know how much I can trust that what I'm getting back is
accurate.
132
00:10:13,685 --> 00:10:16,586
I want to know. I mean, I'm sure I could use the information better.
133
00:10:16,586 --> 00:10:19,436
I mean, just breaking it down really simply.
134
00:10:19,436 --> 00:10:28,099
Initially, I thought the Oura was lacking in terms of its ability to track steps
because it was always wildly different than whatever device I was using.
135
00:10:28,099 --> 00:10:29,039
But then,
136
00:10:29,239 --> 00:10:31,359
is the other device really any more accurate?
137
00:10:31,359 --> 00:10:32,359
Like I don't know.
138
00:10:32,359 --> 00:10:36,879
I'm not going to literally count every step I take to match up at the end of the day to
how close they are.
139
00:10:36,879 --> 00:10:39,699
So for me, I landed on this is the one I'm going to use.
140
00:10:39,699 --> 00:10:43,059
And it's basically going to be my tracker.
141
00:10:43,059 --> 00:10:50,179
Like if it said I did 10,000 steps today, then tomorrow I'm going to do 10,001 on
this device, not on another device, not on something else.
142
00:10:50,179 --> 00:10:51,799
Like this is my baseline.
143
00:10:51,799 --> 00:10:55,419
This is what I'm going to use, and whether it's accurate or not doesn't matter.
144
00:10:55,419 --> 00:10:59,125
It's going to show me the variation in my activity
145
00:10:59,125 --> 00:11:03,251
based on the current, whether accurate or not, measurement.
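That baseline idea is simple arithmetic. Here's a small sketch with made-up step counts showing the day-over-day variation you'd track against a single device:

```python
# Made-up daily step counts from one device. Absolute accuracy doesn't
# matter; the trend against your own baseline is the useful signal.
steps = {"Mon": 9800, "Tue": 10150, "Wed": 8900, "Thu": 10300}

days = list(steps)
for prev, cur in zip(days, days[1:]):
    change = steps[cur] - steps[prev]
    pct = 100 * change / steps[prev]
    print(f"{cur}: {steps[cur]} steps ({change:+d}, {pct:+.1f}% vs {prev})")
```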
146
00:11:03,251 --> 00:11:05,586
Is that sort of how people should use these things?
147
00:11:05,586 --> 00:11:06,407
Absolutely.
148
00:11:06,407 --> 00:11:13,070
So back in the olden times, the concept of a foot was not 12 inches.
149
00:11:13,070 --> 00:11:14,931
It was someone's foot.
150
00:11:14,931 --> 00:11:22,415
And if that foot was Ted's foot or Bill's foot, and they're different-sized feet, you're
going to wind up with different-sized walls.
151
00:11:22,415 --> 00:11:28,535
And if one person's building one wall on one side and another's building one on the other, you
might wind up with some very uneven architecture.
152
00:11:28,535 --> 00:11:29,688
Right.
153
00:11:29,688 --> 00:11:32,459
The same thing goes for trackables and wearable devices.
154
00:11:32,459 --> 00:11:37,520
So how they measure them is typically based on some type of accelerometer function.
155
00:11:37,780 --> 00:11:46,023
Imagine you've got a little sphere with another little sphere inside of it, and
it kind of rolls around.
156
00:11:46,203 --> 00:11:50,824
When it bounces off the edges, that's when it goes, this person has made a transition.
157
00:11:50,824 --> 00:11:52,115
I think this might be a step.
158
00:11:52,115 --> 00:11:53,385
And it's a guessing game.
159
00:11:53,385 --> 00:11:57,025
Because I can sit there and do this, and it counts as steps.
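To make that guessing game concrete, here's a toy step detector over accelerometer magnitudes. The threshold and the samples are invented for illustration and bear no relation to any vendor's tuned algorithm:

```python
import math

# Fake accelerometer samples (x, y, z) in g. A wrist flick or a clap
# can cross the same threshold a real step does, hence phantom steps.
samples = [(0.0, 0.0, 1.0), (0.3, 0.1, 1.4), (0.0, 0.0, 1.0),
           (0.2, 0.2, 1.5), (0.1, 0.0, 1.1), (0.4, 0.1, 1.6)]

THRESHOLD = 1.3  # made-up "this bounce looks like a step" cutoff

steps = 0
above = False
for x, y, z in samples:
    mag = math.sqrt(x * x + y * y + z * z)
    if mag > THRESHOLD and not above:  # count each threshold crossing once
        steps += 1
        above = True
    elif mag <= THRESHOLD:
        above = False

print(f"Detected {steps} 'steps'")  # prints 3 for these fake samples
```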
160
00:11:57,025 --> 00:12:00,388
Yeah, I was gonna say, a friend of mine went to a performance once.
161
00:12:00,388 --> 00:12:05,525
She hadn't walked all day but clapped all night, and she had like 3,000 steps because she was
just clapping at the performance.
162
00:12:05,525 --> 00:12:05,955
Yeah
163
00:12:05,955 --> 00:12:06,415
right.
164
00:12:06,415 --> 00:12:14,037
I mean, so there are actual devices out there, pedometers, that you can put on your foot.
165
00:12:14,037 --> 00:12:15,027
You can wear on your ankle.
166
00:12:15,027 --> 00:12:18,958
So I've actually done this before, where I had to go and buy a bigger band.
167
00:12:18,958 --> 00:12:21,029
I have ridiculously tiny ankles for a big man.
168
00:12:21,029 --> 00:12:24,960
So an extra-large Garmin or Fitbit band fit just fine.
169
00:12:24,960 --> 00:12:28,521
And I could actually walk one path, check it.
170
00:12:28,521 --> 00:12:31,762
And I'd carry my iPhone with me, which also had a pedometer on it.
171
00:12:31,880 --> 00:12:41,124
And I would look at that as it goes across, and I could compare them side by side, and I
could definitely see a difference when I would wear it on my ankle versus when I'd keep
172
00:12:41,124 --> 00:12:44,275
something in my pocket versus when I'd have something on my wrist.
173
00:12:44,495 --> 00:12:49,817
And I mean, the difference, it wasn't massive, like it was within five, six, seven
percent.
174
00:12:49,817 --> 00:12:50,618
It wasn't huge.
175
00:12:50,618 --> 00:13:01,198
But that's, you know, well worth looking at, because if you're looking at these things
over time, you know, seven percent adds up. Seven percent
176
00:13:01,198 --> 00:13:08,498
body fat is much different than 12% body fat, or 15, or 25.
177
00:13:09,078 --> 00:13:10,339
Yeah.
178
00:13:10,339 --> 00:13:16,819
So these things are tracking all this data. And, you know, on paper, it's to help us
improve, help me know how to be 1% better.
179
00:13:16,819 --> 00:13:19,899
What's the number of steps I need to take tomorrow to beat today?
180
00:13:19,899 --> 00:13:22,919
What do I need to eat tomorrow to beat today, et cetera, et cetera.
181
00:13:23,499 --> 00:13:25,899
On, you know, again, on paper, on the surface, this is great.
182
00:13:25,899 --> 00:13:27,019
It's helping me.
183
00:13:27,019 --> 00:13:31,959
It's either something I can afford, cause I bought it, or, you know, it fits my
lifestyle.
184
00:13:32,139 --> 00:13:36,199
And if they are collecting data, I would hope that it's to give me a better product,
right?
185
00:13:36,199 --> 00:13:37,519
They're going to improve the thing.
186
00:13:37,519 --> 00:13:38,721
They're going to add to it.
187
00:13:38,721 --> 00:13:42,350
But I imagine there's some nefarious things going on as well.
188
00:13:42,350 --> 00:13:46,390
So again, like we said, most of the time, if it's cheap or low cost, you're the product.
189
00:13:46,450 --> 00:13:49,290
And even with these devices, you're still part of the product.
190
00:13:49,290 --> 00:13:58,190
So they're going to use your data to make their product better and to figure out ways to
create things that are better, to market materials to you in a better fashion, to try to
191
00:13:58,190 --> 00:14:00,410
extract more value out of you.
192
00:14:00,930 --> 00:14:06,570
You can look at Fitbit and you can look at Garmin and Polar, for example.
193
00:14:06,570 --> 00:14:12,410
If you want the good version of their app, that actually gives you more information and
gives you exercise programs and everything else.
194
00:14:12,440 --> 00:14:16,131
Great, pay them seven bucks a month, and then you get access to all these other tiers.
195
00:14:16,131 --> 00:14:24,133
And they're going to go through and they're going to send you targeted messages to try to
entice you to actually buy into these service layers.
196
00:14:24,873 --> 00:14:35,476
But there's another angle to this, and it's that your data might be resold somewhere
else, or it might be scraped by somebody else.
197
00:14:35,476 --> 00:14:41,230
Some of these AI inference models, you know, ChatGPT, Claude, DeepSeek, have
198
00:14:41,230 --> 00:14:46,010
people out there going through and gathering this information from publicly available
sources.
199
00:14:46,950 --> 00:14:50,650
And theoretically, your data should be locked in and secured in those sites.
200
00:14:50,650 --> 00:14:55,310
Most of them require multi-factor authentication, which is not easy to get past or break
through.
201
00:14:56,070 --> 00:15:00,150
But part of those EULAs also say that your data sometimes gets anonymized.
202
00:15:00,150 --> 00:15:03,390
But that anonymization of data can change over time.
203
00:15:04,070 --> 00:15:09,550
they're all based upon the idea that they're going to obscure what they call PII, which
is...
204
00:15:09,600 --> 00:15:12,812
your personally identifiable information.
205
00:15:12,812 --> 00:15:22,749
And typically speaking, that comes down to a telephone number or an email address or a
physical address, but even an IP address, like what the IP address is on your computer,
206
00:15:22,749 --> 00:15:28,062
that can be considered PII, depending on whether you're looking at GDPR or California's
version of it.
207
00:15:28,843 --> 00:15:36,558
When you start looking at these things in context and you start taking that information
and kind of overlaying it over the top of each other and then feeding it into an aggregator
208
00:15:36,558 --> 00:15:39,720
like Apple or Google or Samsung, who each have their own health app,
209
00:15:40,140 --> 00:15:48,012
you start exposing yourself to multiple different layers and areas in which your
information can be extracted and pulled in.
210
00:15:48,012 --> 00:16:00,046
And if you look at things like the current administration, who seems to play a little
loose with the rules when it comes to protecting people's information and really
211
00:16:00,046 --> 00:16:08,178
information sovereignty, you're going to see a lot more things tend to pop up and
potentially be used against people.
212
00:16:09,171 --> 00:16:16,515
I'm glad you went there because this is something that I've heard a number of women
talking about particularly because a lot of these things will track their menstrual cycle.
213
00:16:16,515 --> 00:16:19,736
And they're like, fuck this Handmaid's Tale bullshit.
214
00:16:19,736 --> 00:16:28,450
I'm not reporting a goddamn thing to a goddamn person because it just takes the wrong
person with the stroke of a pen deciding that that is now the government's business and
215
00:16:28,450 --> 00:16:29,360
not mine.
216
00:16:29,360 --> 00:16:32,762
Is that realistic or is that okay?
217
00:16:32,762 --> 00:16:34,382
How realistic is that?
218
00:16:34,382 --> 00:16:35,971
How concerned should people be about that?
219
00:16:35,971 --> 00:16:36,892
Very.
220
00:16:37,032 --> 00:16:38,392
I mean, honestly, they should be.
221
00:16:38,392 --> 00:16:45,794
So, I mean, it depends. If you're worried about pre-existing conditions and
you're talking about US medical health care, or the US medical system just in general.
222
00:16:45,794 --> 00:16:48,235
I mean, it's fucked and it's stupid anyways.
223
00:16:48,375 --> 00:16:58,698
And we pay three times the cost for the same or worse health care outcomes than a lot
of other countries, in what you would consider to be the first-world nations.
224
00:16:59,412 --> 00:17:02,744
And we complain about insurance companies, and yes, they have overhead.
225
00:17:02,744 --> 00:17:07,006
They're not making massive profit margins, but they're still adding 25, 30%.
226
00:17:07,006 --> 00:17:10,907
But that doesn't account for a 200% increased cost.
227
00:17:10,907 --> 00:17:19,992
And that comes down to the fact that we've got a for-profit medical system in this
country, and it's not designed to provide care as its primary function.
228
00:17:19,992 --> 00:17:22,212
Its primary function is to make money.
229
00:17:22,333 --> 00:17:25,218
And again, if it's free or low cost,
230
00:17:25,218 --> 00:17:26,178
you're the product.
231
00:17:26,178 --> 00:17:32,080
Hence insurance, and why these things get pumped through, and why you don't wind up paying a lot
unless you don't have insurance.
232
00:17:32,100 --> 00:17:41,113
So there's a lot of pieces that go into this and all these companies are going to try to
use as much information as possible to deny having to pay your claim, whether you're a man
233
00:17:41,113 --> 00:17:42,343
or a woman or anything else.
234
00:17:42,343 --> 00:17:45,544
And we talk about women having their menstrual cycles tracked.
235
00:17:45,544 --> 00:17:52,834
Well, there's other ways to infer whether or not somebody is feeling sick or has other
conditions or the things that they're doing,
236
00:17:52,834 --> 00:17:54,025
based upon their movements.
237
00:17:54,025 --> 00:18:01,471
And if you look at things based upon movements, and you look at things based upon the
stuff that they would normally do, and you can create tracks and trends, you can create an
238
00:18:01,471 --> 00:18:05,225
inference model that shows how these things actually work.
239
00:18:05,225 --> 00:18:18,305
And by correlating all that, it's not enough to just not let it see certain bits of
information, because they can infer things from the other ancillary information that is
240
00:18:18,305 --> 00:18:21,778
available in the regular operation of the device that you're wearing.
241
00:18:21,975 --> 00:18:30,795
So these insurance companies could theoretically use a tool to gather information about
how often you were sick last year, whether you went to a doctor and a card was swiped or
242
00:18:30,795 --> 00:18:30,935
not.
243
00:18:30,935 --> 00:18:35,595
But they can tell based on this biometric data that you got sick X amount of times.
244
00:18:35,595 --> 00:18:36,715
So you are a higher risk.
245
00:18:36,715 --> 00:18:40,398
So your premiums are now increased by 20% or whatever.
246
00:18:40,398 --> 00:18:43,670
And imagine that for things like life insurance and automobile insurance.
247
00:18:43,670 --> 00:18:55,905
I mean, everything that has a cost-center basis, where you may or may not be subject to
interpretation of risk, is going to be affected by all these elements.
248
00:18:55,905 --> 00:18:59,287
Connected cars have the same aspects, right?
249
00:18:59,287 --> 00:19:06,742
Like, I think Allstate has a thing called Drivewise, where if you put it on your
phone, it tracks how quickly you drive
250
00:19:06,742 --> 00:19:09,233
in different conditions, you know, are you breaking the speed limit?
251
00:19:09,233 --> 00:19:11,004
Are you moving these things across?
252
00:19:11,064 --> 00:19:15,666
But the idea is that if Drivewise is on and you're doing well, they're going to give you a
discount.
253
00:19:15,767 --> 00:19:22,530
And if the inverse is true, so if you're a reckless driver, they're going to use that against
you.
254
00:19:22,530 --> 00:19:29,083
And if you get in a car accident because you're driving recklessly and they see this thing
go through, your insurance company might actually go, well, you're responsible for this.
255
00:19:29,083 --> 00:19:31,354
So we're going to use that to deny these claims.
256
00:19:32,137 --> 00:19:41,529
Any bit of information that you hand over and give to folks in this fashion is going to
be litigated in court, and how they pay things out, it's going to be used as evidence against
257
00:19:41,529 --> 00:19:42,320
you.
258
00:19:42,380 --> 00:19:47,081
And don't fool yourself that the companies aren't looking at ways to make this happen.
259
00:19:47,081 --> 00:19:51,223
There's just certain legal reasons today that they can't, but that will change.
260
00:19:51,223 --> 00:19:59,626
Yes, I think this shines a big bright light, not to turn this into politics because you
can get this shit on CNN or whatever, but like when you look at the inauguration pictures
261
00:19:59,626 --> 00:20:10,379
and there's Jeff Bezos, Mark Zuckerberg, Elon Musk, like these tech bros know that if they
align themselves with those in power, it's going to grease the wheels heavily to give them
262
00:20:10,379 --> 00:20:15,731
more access to this information to not only sell us more products, but to actually
manipulate
263
00:20:16,380 --> 00:20:21,146
many aspects of our lives and the things we pay for and the amount we pay for them.
264
00:20:21,378 --> 00:20:26,562
Yeah, I mean, the thing that prevents people from running amok with your data is
legislation.
265
00:20:26,943 --> 00:20:28,824
It's not legal to do it.
266
00:20:29,345 --> 00:20:33,628
Well, if that changes, it goes in their favor.
267
00:20:33,628 --> 00:20:42,776
I mean, you can look at the H1B visa thing that they're pushing for right now, and you've
got Trump supporters getting riled up about this and being angry that H1B visas are being
268
00:20:42,776 --> 00:20:45,728
approved by these tech bros, but these tech bros bought their way
269
00:20:45,728 --> 00:20:47,889
into the Trump administration and they're going to stay there.
270
00:20:47,889 --> 00:20:49,759
Like they're not going to drop out.
271
00:20:49,759 --> 00:20:57,701
I mean, they've made an investment, and I'm not going to debate their politics because you
don't have to.
272
00:20:57,701 --> 00:21:01,242
This is all business motivation and economically motivated.
273
00:21:01,242 --> 00:21:02,733
And you can call it greed.
274
00:21:02,733 --> 00:21:12,946
You can call it whatever you want, but they're not strictly doing it for the social good
because if they were doing it for the social good, we'd probably have turned off social
275
00:21:12,946 --> 00:21:14,326
networks a long time ago.
276
00:21:15,735 --> 00:21:17,416
Yeah, absolutely.
277
00:21:17,836 --> 00:21:19,946
Okay, so what do we do to protect ourselves?
278
00:21:19,946 --> 00:21:27,260
Because I mean, even as I'm sitting here with all these devices, I know that when I set
them up, I set up various privacies to do what I wanted them to do.
279
00:21:27,260 --> 00:21:29,111
But I also know that I have them talking to each other.
280
00:21:29,111 --> 00:21:34,683
My watch talks to my personal trainer app, which talks to Apple Health, which talks to
whatever.
281
00:21:34,683 --> 00:21:43,157
So I mean, while on each level, I might have some safeguard in place of what I don't want
shared, I am sharing it with other things that are maybe working around some of those
282
00:21:43,157 --> 00:21:43,987
rules.
283
00:21:44,203 --> 00:21:46,929
How do I protect my information as much as possible?
284
00:21:47,372 --> 00:21:54,961
Yeah, so, the best way to do it is to completely and totally disconnect and go live in a
cabin in the woods.
285
00:21:54,961 --> 00:22:00,128
Right, throw these things in the river and, yeah, go live in the mountains, right?
286
00:22:00,128 --> 00:22:02,380
and disconnect from society, stop using credit cards.
287
00:22:02,380 --> 00:22:07,934
So the practical reality is that some of this information is going to be collected.
288
00:22:07,934 --> 00:22:15,339
But what you should get in the habit of doing is sanitizing the information feeds and
sanitizing the information that you have from a historical perspective.
289
00:22:15,339 --> 00:22:24,838
So pretty much every company out there has a privacy and security department that can go
through and
290
00:22:24,838 --> 00:22:33,093
eliminate your data from their system. Now, it won't get rid of the anonymous data and how
those things are put in, but the PII can be yanked out of the system and destroyed. So if
291
00:22:33,093 --> 00:22:41,883
you decide, I'm not going to use this device anymore, make sure that you follow the
instructions to ensure that all of your data is destroyed, and email their support team
292
00:22:41,883 --> 00:22:51,583
saying, I would like to make sure that all of my data is taken out. Many states have
legislation around this and they have to kind of push in that direction, but most of these
293
00:22:51,583 --> 00:22:54,054
companies kind of, you know, don't want to get into
294
00:22:54,334 --> 00:22:59,678
a fight that's going to be related to something like GDPR or the California Consumer
Privacy Act.
295
00:23:00,039 --> 00:23:05,263
Those are things that you definitely want to do on a day-to-day basis.
296
00:23:05,263 --> 00:23:09,747
However, make sure you have multi-factor authentication set up on every single one of your
accounts.
297
00:23:09,747 --> 00:23:15,162
So if you don't know what that is, just type in MFA anywhere and you'll see it pop up.
298
00:23:15,162 --> 00:23:21,217
But basically what it does is you log in with your username and password and then they
send you a second authentication method.
299
00:23:21,217 --> 00:23:22,968
So that could be an app on your phone.
300
00:23:22,968 --> 00:23:24,129
They could send you a text message.
301
00:23:24,129 --> 00:23:25,540
They could send you an email.
302
00:23:25,540 --> 00:23:27,202
It doesn't matter which one it is.
303
00:23:27,202 --> 00:23:29,044
Use it and use it in everything.
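For the curious, here's a minimal sketch of the app-based flavor of that second factor (TOTP), using the open-source pyotp library. In a real service the shared secret is stored at enrollment rather than generated inline like this:

```python
import pyotp

# The service generates and stores this shared secret at enrollment;
# your authenticator app holds the same secret (via the QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

code = totp.now()  # the 6-digit code your authenticator app displays
print("Current code:", code)

# At login, the service checks the submitted code against the secret.
print("Accepted:", totp.verify(code))               # True within the time window
print("Stale code accepted:", totp.verify("000000"))  # almost surely False
```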
304
00:23:29,044 --> 00:23:32,326
And if you can, rotate those things through.
305
00:23:32,587 --> 00:23:35,969
Next, set a strong password on everything.
306
00:23:36,370 --> 00:23:40,914
So don't reuse the same password over and over and over again.
307
00:23:41,115 --> 00:23:47,240
You want to try to use different passwords for different accounts because you are your
persona.
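A password manager handles this for you, but as a sketch, generating a unique random password per account takes only a few lines with Python's standard secrets module; the account names and character set here are just examples:

```python
import secrets
import string

def make_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per account, never reused.
for account in ["email", "bank", "fitness-app"]:
    print(account, "->", make_password())
```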
308
00:23:47,240 --> 00:23:49,358
Your identity online is...
309
00:23:49,358 --> 00:23:53,858
going to be based upon either handles that you have on different accounts or email
addresses.
310
00:23:53,918 --> 00:23:57,158
And that's typically how companies will track you, that or telephone numbers.
311
00:23:57,278 --> 00:24:00,818
But the idea is that these collections of things equal you.
312
00:24:00,818 --> 00:24:03,158
And I used to work in ad tech for a long time.
313
00:24:03,158 --> 00:24:09,858
And we would see anonymized folks, people that looked like they were anonymized coming
through with no cookie data or anything like this.
314
00:24:09,858 --> 00:24:16,462
But because we had a record of their IP addresses and we had a record of things that they
did before, we could actually build these anonymized profiles.
315
00:24:16,462 --> 00:24:22,942
And then if we get an email address, we'd start stuffing these email addresses in,
possibly this, possibly this, possibly this.
316
00:24:22,942 --> 00:24:26,222
And this is before AI inference and neural networks existed.
317
00:24:26,222 --> 00:24:30,642
This is all what you would call expert systems in the manual stitching together phase.
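As a toy illustration of that manual stitching, with entirely made-up events: once a single event ties an email to an IP address, every "anonymous" event sharing that IP collapses into the same profile:

```python
from collections import defaultdict

# Made-up event log: "anonymous" traffic plus one identified event.
events = [
    {"ip": "203.0.113.7", "page": "/sleep-tracker"},
    {"ip": "203.0.113.7", "page": "/glucose-monitor"},
    {"ip": "203.0.113.7", "page": "/checkout", "email": "pat@example.com"},
    {"ip": "198.51.100.2", "page": "/home"},
]

profiles = defaultdict(lambda: {"pages": [], "email": None})
for ev in events:
    prof = profiles[ev["ip"]]   # IP address is the stitching key
    prof["pages"].append(ev["page"])
    if "email" in ev:           # one slip de-anonymizes the rest
        prof["email"] = ev["email"]

for ip, prof in profiles.items():
    print(ip, prof["email"], prof["pages"])
```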
318
00:24:31,222 --> 00:24:32,982
They're a lot better now.
319
00:24:33,022 --> 00:24:34,522
You don't have to do all that.
320
00:24:34,522 --> 00:24:36,362
It just happens automatically.
321
00:24:36,902 --> 00:24:42,362
So make sure that you are using good password protection,
322
00:24:42,872 --> 00:24:52,166
good password policies, and that you're enforcing yourself to use MFA, and make sure
that you delete as much of your bad data as possible.
323
00:24:55,271 --> 00:24:59,753
So even if I do all that, I feel like those are the steps that are going to keep the
bad guys out, right?
324
00:24:59,753 --> 00:25:04,475
Like the people that are out there looking for things that they shouldn't have their hands
on.
325
00:25:04,815 --> 00:25:09,197
I mean, I guess the answer is going to be very subjective.
326
00:25:09,197 --> 00:25:14,719
But how do you trust the company, I mean, the people on the other side of this? How do we know
who to trust?
327
00:25:16,056 --> 00:25:17,186
Yeah.
328
00:25:18,388 --> 00:25:20,129
Yeah, you don't trust anybody.
329
00:25:20,790 --> 00:25:22,521
So, I mean, there's a really good case.
330
00:25:22,521 --> 00:25:31,058
There's a couple of good cases where Apple's gone through and said, we're not going to
provide access to iPhones for certain people, for certain terrorist groups.
331
00:25:31,118 --> 00:25:34,581
And they block people out and they push back and push back and push back.
332
00:25:34,581 --> 00:25:39,905
And other security companies came in and figured out how to crack the iPhone's security,
which Apple claimed was uncrackable.
333
00:25:40,346 --> 00:25:45,960
So we know that security is an illusion and bullshit.
334
00:25:46,136 --> 00:25:55,713
So don't expect these companies to continue, over a protracted period of time, to protect
your data and your interests.
335
00:25:55,713 --> 00:25:57,675
Just assume that these things are going to come out.
336
00:25:57,675 --> 00:26:07,812
And honestly, if you really want to do the best thing for you, stay in good shape, delete
your data, practice good password control, make sure that you're not leaving things out
337
00:26:07,812 --> 00:26:13,184
there that are easy to find you, and use the privacy settings in your social media
content.
338
00:26:13,184 --> 00:26:13,728
Right?
339
00:26:13,728 --> 00:26:14,639
That's a big thing.
340
00:26:14,639 --> 00:26:23,254
I mean, people don't realize that all of their posts go out there on the public Internet and
they're searchable via Google and all these AI systems are supposed to respect this thing
341
00:26:23,254 --> 00:26:25,785
called robots.txt on every website out there.
342
00:26:25,785 --> 00:26:29,457
That says you can and can't get this information, but nobody respects anything.
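For reference, robots.txt is just a plain-text file of crawl rules at the root of a site, and a well-behaved crawler checks it before fetching anything. Here's a minimal sketch using Python's standard library, with example.com standing in for any site:

```python
from urllib.robotparser import RobotFileParser

# A polite crawler reads the site's rules before fetching anything.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

url = "https://example.com/private/posts"
if rp.can_fetch("MyCrawler", url):
    print("Allowed to fetch", url)
else:
    print("robots.txt disallows", url)  # compliance is voluntary, though
```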
343
00:26:29,457 --> 00:26:33,870
Facebook's AI crawler flat out tells you it's not going to respect anything.
344
00:26:33,870 --> 00:26:42,304
So, you know, if you have these bits of information or if you have a website or a
blog post or something else, use proper controls.
345
00:26:42,670 --> 00:26:45,141
Put a good web application firewall in front of it.
346
00:26:45,141 --> 00:26:50,495
Do something good as a small business to try to make sure that your users are as protected
as possible.
347
00:26:50,495 --> 00:26:53,266
And I'm not saying that data breaches and data leaks aren't going to occur.
348
00:26:53,266 --> 00:26:54,877
They're definitely going to occur.
349
00:26:55,177 --> 00:26:57,479
But it's something that you're just going to have to be aware of.
350
00:26:57,479 --> 00:27:04,883
I would also highly recommend spending the $50, $60, $70 a month for an identity
protection service.
351
00:27:05,544 --> 00:27:07,765
They all have some type of value around them.
352
00:27:07,765 --> 00:27:09,794
The likelihood that you're
353
00:27:09,794 --> 00:27:16,357
going to have your stuff hacked and someone's going to take your social security number or
your banking information in your lifetime is high.
354
00:27:16,678 --> 00:27:18,649
I am a cybersecurity pro.
355
00:27:18,649 --> 00:27:22,100
Like I was chief product officer for a company that made cybersecurity products.
356
00:27:22,101 --> 00:27:25,102
And I've had my credit card stolen twice.
357
00:27:25,223 --> 00:27:28,284
I've had my social security number bounce around the dark web a dozen times.
358
00:27:28,284 --> 00:27:35,848
And like I have all these control systems and all these things in place, and I use good
password sanitization and multi-factor authentication, and it still happens.
359
00:27:35,848 --> 00:27:39,390
So buy insurance and
360
00:27:39,470 --> 00:27:45,788
put effort forward to ensuring that when this thing does happen, you have recourse and a
way to try to protect yourself.
361
00:27:46,032 --> 00:27:48,415
Is there anything we didn't talk about that you want to talk about?
362
00:27:49,618 --> 00:27:49,968
I'm sure.
363
00:27:49,968 --> 00:27:51,900
Yeah, yeah, yeah.
364
00:27:52,527 --> 00:27:59,167
The next topic, it'd be fun to go into it and actually talk about how I was able to go through
and take some of this wearable data.
365
00:27:59,167 --> 00:28:02,707
And I wasn't good at getting information from Garmin or Fitbit or anybody else.
366
00:28:03,007 --> 00:28:06,547
So a couple of friends of mine, we built an application called UpRove.
367
00:28:06,547 --> 00:28:09,527
And you can find that at uprove.me.
368
00:28:10,567 --> 00:28:14,827
It's basically a data aggregator that takes data from all these different sites and all
these different signal angles.
369
00:28:14,827 --> 00:28:17,847
And it also allows you to share with other folks.
370
00:28:18,235 --> 00:28:22,457
It correlates and collects this data in unique and different ways to show things over time.
371
00:28:22,457 --> 00:28:31,208
And I used it to drop a third of my body weight in four months by going through and
tracking and looking at how these things went through.
372
00:28:31,208 --> 00:28:32,281
And there's a long story around it.
373
00:28:32,281 --> 00:28:34,579
But maybe next time we jump into that.
374
00:28:34,637 --> 00:28:35,098
Well, I was right.
375
00:28:35,098 --> 00:28:35,740
That was fun.
376
00:28:35,740 --> 00:28:38,142
My thanks to Jason Hayworth for joining me here.
377
00:28:38,142 --> 00:28:42,830
If you'd like to learn more about his cybersecurity consulting work, you can check out
HayworthConsulting.com.
378
00:28:42,830 --> 00:28:45,367
The link for that is in the show description for this episode.
379
00:28:45,367 --> 00:28:48,407
You can also check out UpRove, which he mentioned there at the end of the interview.
380
00:28:48,407 --> 00:28:50,527
The link to that is also in the show description.
381
00:28:50,527 --> 00:28:51,443
Stick around for a minute.
382
00:28:51,443 --> 00:28:54,861
Thanks for hanging out with me today and listening to my conversation with my old friend Jason.
383
00:28:54,861 --> 00:28:56,222
That is going to do it for this episode.
384
00:28:56,222 --> 00:28:56,832
I've got to go.
385
00:28:56,832 --> 00:29:01,976
I've got to set up a whole bunch of two factor authentications that I have not set up and
now I'm terrified.
386
00:29:01,976 --> 00:29:03,358
So I'm going to go do that.
387
00:29:03,358 --> 00:29:07,490
You do that as well, but come back next week to thefitmess.com.
388
00:29:08,742 --> 00:29:08,876
That's where we'll have another new episode for you.
389
00:29:08,876 --> 00:29:14,846
And before you go, if you do know of anybody who could benefit from hearing this
conversation that you just listened to, please do share it with them.
390
00:29:14,846 --> 00:29:18,052
You can find the links to do that at our website, thefitmess.com.
391
00:29:18,052 --> 00:29:19,334
We will see you in about a week.
392
00:29:19,334 --> 00:29:20,403
Thanks so much for listening.