Fortifying our Networks and Keeping Kids and Teachers Safe with Zero Trust, SSE, and SASE with John Spiegel

Welcome to the Cyber Traps Podcast.

I'm Jethro Jones, coming to you from Washington, founder of the BE Podcast Network and author of the book SchoolX and How to Be a Transformative Principal.

I'm a former principal at all levels of K-12 education.

Greetings, everyone.

I'm Frederick Lane, an author,
attorney and educational consultant

based in Brooklyn, New York.

I'm the author of 10 books, including most recently Cybertraps for Educators 2.0, Raising Cyberethical Kids, and Cybertraps for Expecting Moms and Dads.

Jethro and I have teamed up to bring
timely, entertaining and useful

information to teachers, parents, and
others about the risks arising from

the use and misuse of digital devices.

Over the coming weeks and months,
we'll be talking to some of the world's

leading experts from the fields of
education, parenting, sociology,

cyber safety, and cybersecurity.

Join us today as we look at what it takes to better navigate our increasingly high-tech world.

For more information or to donate to our work, please visit centerforcyberethics.org.

The Cyber Traps Podcast is a production of the Center for Cyber Ethics, a 501(c)(3) independent, nonpartisan educational institute dedicated to the study and promotion of cyber ethics as a positive social force through research, curriculum development, publishing and media, professional training, and public advocacy.

Greetings there, Jethro.

Fred.

Long time no see, so glad to be
doing the show with you once again.

We had a whole bunch of episodes
where I was solo and I missed

my partner, so welcome back.

It is good to be back.

The clouds are parting and I think things will be a little smoother.

Oh, excellent.

Well, yes, excellent. We've got a great guest on today.

His name is John Spiegel.

He has 25 years of experience running global networks and managing infrastructure.

He's an industry pioneer in
software-defined networking

and software-defined WANs.

John has spoken on the topic of network transformation at industry conferences such as Gartner, Interop, VMworld, and Palo Alto Networks Ignite, as well as at executive round table discussions.

He's also been a customer advisor to companies like VMware, Palo Alto Networks, and Cisco Systems.

Disruptive startups have also leveraged
John's knowledge to bring products to

market resulting in successful exits.

He hosts a podcast called The Edge, where he discusses the role of the CISO, Zero Trust, and the emerging SASE landscape. When not helping companies on their journey to modernize and secure their networks, John can be found cycling on the back roads of Oregon.

And if you're lucky enough to see a video of one of his great answers in one of our audiograms, you'll see he's got like six bikes behind him, which is awesome.

So John, welcome.

So glad to have you.

Thanks for having me on the show.

Yeah, you're inspiring me. I think I'm going to have to hang up my old marathon shoes from now on.

Yeah.

A funny note about that: in Kodiak, where I was a principal for three years, it's an island in the Gulf of Alaska, there was a running coach whose kid would take his shoes, tie 'em together, and then throw 'em around the telephone wire.

And so there was this tree and
telephone pole that were just

covered in shoes from kids who had
finished the season or whatever.

Mostly his son's own shoes.

But you know, those, those traditions
do exist and they are real.

So that's not what we're
here to talk about though.

Even though exercising
is important and healthy.

John, why don't you start by explaining what Zero Trust, SASE, and SSE are, for our mostly education-focused listeners?

Yeah, this is an area that I probably spend most of my time working on now here at HPE.

Put simply, zero trust flips your traditional security strategy on its head. In the past, our applications, the key data that we ran our businesses and our educational groups and even the government with, were hidden behind what I call the four walls of the corporate office or the data center.

So, much like a castle, we had these massive firewalls, security devices, and everything behind them was deemed to be trusted, and everything outside of it on the internet was untrusted.

It's very similar if you look at a firewall or an internet device that you might have at your home. Maybe it's a Comcast device or DSL from another vendor. On one side, it's your home. Everything inside of that home, you trust it. It could be your TV, it could be your kid's laptop, iPads, all of those things. You just trust those items.

Everything outside of that.

That's the world of the barbarians.

It's where all the bad things happen.

Uh, and, and that's how we ran
cybersecurity for a very long time.

Then things changed. The internet came around, and applications started roaming around. They left the data center. They became SaaS applications.

And if you're in the education industry, it's likely Google Docs and all the other Google items they have.

Or Seesaw, which is something my daughter uses, is delivered as a SaaS service.

Many items like that.

They don't exist in the
educational data center.

They're hosted outside, by firms on the internet.

And how you secure those is gonna be very
different than what you did in the past.

On top of that, now people work from home. Hybrid workforce. We saw during COVID the emergence of the school of one, if you want to call it that, or what we call the branch of one in the corporate world, where these devices that, again, were behind the firewall in the schools were now at homes.

How do you secure those items? What do you do with them? That has given rise to this new strategy called Zero Trust, whereby we look at relationships between devices and applications and try to get them down to a point where it's just the things that you actually need to get your job done. In the education space, it may be those applications or enablement tools like Zoom or Teams or things like that. Instead of seeing all of the applications that you might see in a data center or a school network, it's just those five or six applications, nothing else, nothing more than that.

So you're constantly looking at trust relationships.

The simple way I put it is zero trust is much like when you have a ship, call it, I don't know, the Titanic, whereby you have watertight compartments, which the Titanic didn't have. And that's why it hit the iceberg and sank. But in a ship, and if you ever served in the Navy you'll know this, my father served in the Coast Guard and was actually stationed in Kodiak for a little while, ships have watertight compartments.

When you go to an alert status, you close everything up. You sever off spots in the ship from other spots in the ship. So if they do get water in them, that's the only place that gets water in.

And that's one of the main differences with zero trust. It's looking at those trust relationships, and you only have access to the things that you need and nothing more.
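To make that concrete for technically minded listeners, here is a minimal sketch, in Python, of the deny-by-default mindset John is describing. The identities, applications, and policy entries are invented for the example; a real zero trust product evaluates far more signals than this.

```python
# Minimal sketch of deny-by-default access checks (hypothetical policy data).
# In zero trust, nothing is trusted just because it is "inside" the network;
# a request is allowed only if an explicit policy entry says so.

ALLOWED: dict[str, set[str]] = {
    # identity -> the only applications that identity may reach
    "teacher.jones": {"gradebook", "google-docs", "zoom"},
    "student.0042": {"seesaw", "google-docs"},
}

def is_allowed(identity: str, application: str) -> bool:
    """Return True only when an explicit allow rule exists; the default is deny."""
    return application in ALLOWED.get(identity, set())

print(is_allowed("student.0042", "seesaw"))     # True: explicitly granted
print(is_allowed("student.0042", "gradebook"))  # False: never granted, so denied
```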

So if I could, John, I mean, it seems to me like the bulkhead metaphor is really good for intrusion, right? If someone's trying to get into your system and exfiltrate data from places they shouldn't.

I think one of the issues that K-12 institutions run into, though, is the poor handling of data by some of the vendors that you're dealing with.

So when you're talking about providing that kind of security, it doesn't necessarily reach out into the broader world and make sure that the data is protected if it's sitting on a server somewhere else.

Yeah.

And that's a really good point, which leads me to the other two frameworks that we're asked about, and that's SASE and SSE. SASE is this framework, basically, that brings together zero trust with networking, with security. One of its two main pillars is network access in the form of what we call software-defined WAN.

I won't get into it too deeply. The other side is this other framework called SSE, which I deal with a lot. SSE has certain components. Zero trust is at its foundation, but it also has protections from the internet, which we call a secure web gateway. It has another pillar for remote access called ZTNA, Zero Trust Network Access. And then the item that you're talking about, how do we secure these SaaS applications, that is called a CASB, or Cloud Access Security Broker, which actually will go into these SaaS services and then start to look at how we treat that data.

Should that data be secured in a way?

Are the right settings available?

It even has mechanisms whereby we can look at how that data is transferred.

If it's transferred through one of these SSE systems, we can apply data loss prevention techniques in a very simple way to understand, oh my gosh, this data has Social Security numbers. Or maybe we know what the nomenclature is for that school's ID system, so we can say, oh, that has 1,000 school IDs in it, and start to take action on that.

Should that data be
transferred to a Dropbox?

Probably not a good idea.

Or is it, you know, going from one school administrator to another school administrator? That's an okay thing to do.
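For readers who like to see how a check like that might work, here is a minimal, hypothetical sketch of the data loss prevention idea John mentions: scanning outbound content for patterns such as Social Security numbers or a made-up school ID format before it leaves for an unsanctioned destination. Real CASB and DLP engines use far richer detection than a pair of regexes, so treat this only as an illustration.

```python
import re

# Hypothetical DLP patterns: US SSNs and a made-up "SCH-######" school ID format.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
SCHOOL_ID_PATTERN = re.compile(r"\bSCH-\d{6}\b")

# Invented list of destinations the district has sanctioned for sensitive data.
SANCTIONED_DESTINATIONS = {"district-sharepoint", "staff-email"}

def dlp_verdict(content: str, destination: str) -> str:
    """Block sensitive content headed to unsanctioned destinations (sketch only)."""
    sensitive = bool(SSN_PATTERN.search(content) or SCHOOL_ID_PATTERN.search(content))
    if sensitive and destination not in SANCTIONED_DESTINATIONS:
        return "block"
    return "allow"

print(dlp_verdict("Roster: SCH-104233, SCH-104234", "dropbox"))      # block
print(dlp_verdict("Roster: SCH-104233, SCH-104234", "staff-email"))  # allow
```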

At the core of all of these systems is identity, and that is really where the focus of security has to be going forward, because it's not enough to put it behind a firewall in a data center, these massive walls. We have to get down to the point where identity is the mechanism that we're leveraging to understand: should you have access, should you not have access?

I, I'm gonna come at this from a
slightly different angle, but one of

the projects I'm working on is a book
called The Rise of the Digital Mob,

and it really is examining the impact
of technology on our communications,

particularly our political communications.

And so one of the issues that gets
raised by that, I think is relevant

to what you're talking about, which
is the issue of online anonymity.

And so what I would infer from what you're saying is that this is another reason to take a hard look at the concept of anonymity online.

You know, particularly from a
cybersecurity perspective, you

really do need to know who's
coming in and out of your data.

Absolutely.

I, I agree with you.

I mean, there is a balance point between being anonymous and not being anonymous. But we do see it in the school setting. My son, who goes to school, was bullied online, on social media obviously, and, you know, people take a bit different stance in how they talk with you, how they interact with you. If it's online and they're anonymous, they will say things that they would never say to you in person.

And, to your point, I think identity has to play a role in that. We have to get back to more of a civil society where what you say online has to be very similar to saying it in person. Because what I might say to you in a setting when we're in real life could be very different than on a political forum or in a school forum. And identity has to play a role in that.

That's, it's conscious in our human life. I mean, we're just mechanisms of what, 250,000 years of evolution, and you can't just change that in a matter of two decades.

Yeah.

Give or take.

I think you're exactly right.

No, look, and this is a little off topic, so I don't want to go too deeply into it, but the issue of anonymity, you know, I am trying to recognize that it's a power issue, right? So there are times when anonymity is a recourse, you know, to power. But I think now the table has flipped and anonymity has taken on its own power that gets abused.

Absolutely.

I, I agree with you.

I think one of the things that I see missing, and maybe we can touch on this a little bit, at least from my experience in school, is this thought of bringing kids up as digital citizens. What are the responsibilities that you must have in this environment? It's very new, let's just say that.

And what's coming at us is coming at us super fast, but I think that's a topic for education: what are the rules of the road regarding these new mechanisms, and how do we learn them and how do we teach them? How do we teach 'em about privacy? How do we teach 'em about, you know, phishing and some of these other cyber attacks that are going on out there? And, you know, how do we teach 'em to be civil people in a public setting that might be social media?

That's a great question, Jethro. I mean, digital citizenship is a topic we have touched on many times.

Yeah, it really is, and the reason why we're having you on talking about these more advanced topics is because we want people to have a better understanding of what something like Zero Trust is and why you should pay attention to it and know what it is. The way that I was thinking about this as you were talking earlier, John, is that it's like having a badge swipe at each individual classroom door in the school, so you know who's going into each classroom, rather than just saying, once you're in the school, everything's fine.

And we know from bad things that have happened in schools that somebody can get into the school in a way that is not authorized and be in the school when they shouldn't be, or where they shouldn't be. And that can then make it not safe for everybody else.

And I think that having everybody wear name badges is important, and being able to identify who people are, and being able to say something as simple as, hey, what's your name? What are you doing here? Being able to ask those questions is good.

I want to talk a little bit about identity, because we're talking about a virtual space, and how do you ensure that someone is the right person when people can share passwords and things like that? What are the mechanisms in place to prevent that kind of stuff from happening?

Yeah, that's a really good question. I mean, that comes back to how strong your identity program is. If it's based clearly just on identity and passwords, then to your point, sharing passwords, sharing those identities becomes very prevalent. And as well, what is the complexity of the password? With zero trust and SSE and SASE, you need to go beyond that. You need to start to interrogate other items. What's the device that they're using to leverage that application, that system? Is it a school device? Is it a home device? Is it up to a certain standard?

Because that is another area that we see lots of vulnerabilities around: if those devices are not maintained to a certain patch level or OS revision level, that's an item.

As well as, where is it coming from? I'm in the Beaverton school district here. If the device is coming from the Beaverton area, it's probably reasonably certain to assume that that's legitimate. But if it's coming from North Korea, or Texas, maybe not so much; we may want to ask some other questions around that device. So that's a whole other mechanism around it.

The other area to be aware of, and I don't know how well this plays into a school system, is MFA, being able to have multiple factors for that authentication. Certainly for a student it may not be possible, but for an administrator, a principal, somebody who has access to the crown jewels of the school system, MFA is another area that plays into it.

So going beyond just the username and password is critical. Having a good identity system. Being able to then interrogate further depending on the type of access you're granting that person to whatever resource or application.
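As a rough illustration of "going beyond username and password," the sketch below layers the signals John lists: device posture, where the request comes from, and whether MFA was completed, before granting access. The field names, the trusted locations, and the 30-day patch threshold are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    identity: str
    password_ok: bool
    mfa_passed: bool
    device_managed: bool      # is it a school-issued, managed device?
    os_patch_age_days: int    # how stale is the OS patch level?
    geo: str                  # rough location of the request

TRUSTED_GEOS = {"beaverton-or", "portland-or"}   # hypothetical allow-list

def decide(req: AccessRequest, sensitive: bool) -> str:
    """Return 'allow', 'step-up', or 'deny' based on layered signals (sketch)."""
    if not req.password_ok:
        return "deny"
    if not req.device_managed or req.os_patch_age_days > 30:
        return "deny"                       # unhealthy device posture
    if req.geo not in TRUSTED_GEOS:
        return "step-up"                    # unusual location: ask more questions
    if sensitive and not req.mfa_passed:
        return "step-up"                    # admins need MFA for the crown jewels
    return "allow"

req = AccessRequest("principal.smith", True, False, True, 5, "beaverton-or")
print(decide(req, sensitive=True))          # step-up: MFA required for sensitive access
```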

Another point I wanna make: with the card swipe analogy that you put out there, zero trust looks to go a little bit further than that. So instead of having a school where you have the ability to do card swipes on every door, zero trust really looks to eliminate you even seeing the possibility of those doors. So if you have a school of a hundred doors, and that person only needs access to 10 of 'em, what zero trust looks to do is eliminate those other 90. So all they see are the 10 doors, and those are the only ones they get into.

The reason for that is, when an attacker, a bad cyber actor, breaks into a system, they start to leverage what we call lateral movement, which means they start to move around. Or, if we're using the school analogy, they can move around the school to see where the doors are available. And even if you have a card swipe on there, they might be smart enough to break into that card swipe mechanism and start to open those doors, so those other 90 doors become vulnerabilities.
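Here is one more small sketch of the "only see your 10 doors" point: rather than exposing every resource and checking badges at each one, a ZTNA-style broker returns only the resources a user is entitled to, so the rest are simply invisible to them, and to an attacker who steals their session. The entitlement data is hypothetical.

```python
# Hypothetical entitlement data: each identity sees only its own "doors".
# Dozens of other resources (payroll, HVAC controls, backups...) exist on the
# network but are never published to identities that have no entitlement.
ENTITLEMENTS = {
    "student.0042": ["seesaw", "google-docs"],
    "teacher.jones": ["gradebook", "google-docs", "zoom", "sis-portal"],
}

def visible_resources(identity: str) -> list[str]:
    """The broker publishes only entitled resources; everything else stays dark."""
    return ENTITLEMENTS.get(identity, [])

# A compromised student account cannot even enumerate payroll or HVAC,
# which blunts the lateral movement John describes.
print(visible_resources("student.0042"))   # ['seesaw', 'google-docs']
print(visible_resources("attacker"))       # [] -- unknown identities see nothing
```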

Yeah, I appreciate you bringing that up, 'cause I think that's the other piece of this that people who aren't in the know may not even be thinking about. If I could make it so you could only see that there are 10 classrooms in my school, and you can't access anything else because you don't know that it exists, that's a much different approach than, you could only go in the A wing of the high school. And so I'm glad that you brought that up.

I think Fred was gonna ask a question, but Fred, I do want to talk about student privacy and them using VPNs and things like that. So if you wanna stay on this topic, let's stay on it, but if you want to move on, we can move on to that.

I think we can do both.

So let's stick with this topic for a second.

So, yeah, absolutely.

I think, John, Jethro raises a great point, which is that, you know, schools are a perpetual testing ground for cybersecurity because of students. I mean, this is not a static situation. You've got hundreds of people who are constantly testing what the parameters of your system are. They are at the heart of that. So how do you suggest that school districts cope with that? What are the best tools for kind of keeping the little rugrats from running amok in your system?

Yeah.

Uh, that's a technology I personally would like to see go away, 'cause what VPNs essentially do is they put your device on the network. I mean, it is clearly on the network. It's an extension of the network. There's nothing in between it.

And that's one of the challenges that we're dealing with now, that applications have moved outside of the data center. In the past, that was okay because you had a device, most likely a firewall, that was doing inspection of that traffic. But with VPNs, that's no longer the case, because maybe some of that traffic is routed back to the data center, and some of that traffic's inspected, but the majority of it, if it's going to Google, you know, pick your SaaS provider in the education space, is not being inspected.

So that's where tools like SSE, or frameworks like SSE, come into play, primarily with this technology called Zero Trust Network Access, or ZTNA. Essentially what it does is it routes the traffic to a central point that is local to the area, to eliminate or reduce the penalty of latency. And that traffic can be inspected even if it's SSL-based. We can break that traffic open and inspect it. And approximately 80 to 85% of traffic on the internet today is encrypted.

So that's one of the advantages of an SSE system: it can be that sanctioned man in the middle and start to break down that traffic so you can start to see what's really going on and then apply policy. And that policy can be based on, you get access to these applications, or it can even be inspecting the traffic that's destined for the internet. So, what categories are you going to allow people to access? And that's very critical in a school environment.
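A crude way to picture the category-based policy piece of an SSE: once traffic flows through the service and can be decrypted for inspection, each destination is mapped to a category, and the category decides the action. The categories and domains below are invented for illustration; real services maintain huge, constantly updated category databases and much more nuanced policies.

```python
# Hypothetical category database and per-category policy for a school tenant.
CATEGORY_OF_DOMAIN = {
    "classroom.google.com": "education",
    "seesaw.me": "education",
    "example-casino.com": "gambling",
    "example-games.io": "gaming",
}

POLICY = {"education": "allow", "gaming": "block", "gambling": "block"}

def web_policy(domain: str) -> str:
    """Look up the destination's category and apply the tenant's policy (sketch)."""
    category = CATEGORY_OF_DOMAIN.get(domain, "uncategorized")
    return POLICY.get(category, "block")   # unknown categories default to block

print(web_policy("seesaw.me"))            # allow
print(web_policy("example-casino.com"))   # block
```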

And yes, I've got a 16-year-old. I don't know how many security mechanisms I've gone through to make sure that he can only access what he should access. And I'll tell you, it's really hard, because he's gotten down to the point where he figured out that he could change the MAC address on his device and rotate it to get around some of the security protections that I was putting in place to prevent him from getting to certain sites a teenager would want to go to.

You sound like a proud, frustrated father.

I think at the end of the day I've raised a hacker, and I wasn't attempting to do that.

He just is naturally curious, and if there's something out there that he wants to go after, he'll spend a lot of time going after it.

It has been an observation of mine, since I've been writing in this area for a while, that there are few people on Earth more dedicated than a teenage boy who wants to get at something on the other end.

Yeah.

That's really, really impressive.

So, John, if I can beg Jethro's forbearance here, let's pivot a little bit and let me just ask you this. It's clear from the recent invitations I've gotten from school districts and so forth that concerns about AI are exploding.

This has just become such a hot
topic and people are grappling with

understanding what it is to begin with
and then what the implications are.

I would think that someone who's
dealing with cybersecurity would

have some thoughts on that.

So let me ask you from a couple of different angles. Number one: what do you see as the biggest potential threats of AI to a security professional like yourself? And then the flip side, obviously, are there ways in which you might be utilizing it to strengthen the work that you do?

Oh gosh. Let me start with the latter part of your question there.

Sure.

Are there benefits to AI for cybersecurity?

Absolutely.

Cybersecurity is a very complex field, and we have multiple tools that we're dealing with, and we're trying to basically fight the bad guy in real time. That's a hard thing for a human to do. So if there is a mechanism that can bring insights to light, start to comb through the data in real time, and start to understand patterns, that is a benefit.

So how will AI be used in cybersecurity? It's gonna be very much like a chatbot that you encounter on a website when you're trying to buy something, or if you're trying to, you know, change airline tickets. I recently did this with Alaska Airlines, where I was sitting on the runway and knew I wasn't gonna make that next flight, so I opened up the chatbot and we had a conversation. At the end of the day, it changed my flights. I was happy.

But how is it gonna be used in cybersecurity? It's going to essentially take away a lot of the complexity of the past, where I had to know how to work with device A, B, and C and be an expert at all of these. What it's going to do is present the relevant information to me and then give me a list of decisions that I can make, you know, one, two, or three, high priority, low priority, or just ignore it. But those insights are gonna pop up, and it's gonna give me the decision-making process. How that decision is executed, I might be able to leave that up to the AI, or I'll send that off to a human.
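For the curious, here is a toy sketch of the triage workflow John is describing: a scoring step ranks the alerts, low-risk items are handled automatically, and anything above a threshold is routed to a human. The alert types, risk scores, and threshold are invented; a real system would use trained models over much richer telemetry.

```python
# Toy triage: score alerts, auto-handle the noise, escalate the serious ones.
ALERTS = [
    {"id": 1, "kind": "failed-login", "count": 3},
    {"id": 2, "kind": "impossible-travel", "count": 1},
    {"id": 3, "kind": "malware-detected", "count": 1},
]

RISK = {"failed-login": 1, "impossible-travel": 7, "malware-detected": 9}  # made-up scores

def triage(alerts: list[dict]) -> None:
    # Work highest-risk first, mirroring the "one, two, three" priority list.
    for alert in sorted(alerts, key=lambda a: RISK.get(a["kind"], 5), reverse=True):
        score = RISK.get(alert["kind"], 5)
        if score >= 7:
            print(f"alert {alert['id']}: escalate to a human analyst (score {score})")
        else:
            print(f"alert {alert['id']}: auto-handled and logged (score {score})")

triage(ALERTS)
```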

So yeah, AI is gonna definitely make a difference in this battle. But how can it be abused? AI is only as good as the data that it's trained on. So if I'm nefarious, what I'm looking to do is get into that data and train it in a way that benefits me. So security around the data that AI is leveraging is absolutely critical.

The second area is, if I want to create an attack, it's going to be a lot easier using a format like ChatGPT to start to understand, okay, I'm going against a VPN from Vendor X, what are the major vulnerabilities based on its operating system that I can leverage, and how do I bring together, you know, the attack in multiple ways?

In the past that was hard to do. You had to do a lot of research and bring it all together and code it. What's different with tools like ChatGPT, or whatever the follow-on is, is I can just ask it, query it, and it will spit out probably 80% of, you know, the framework there. And then I'm left to do the 20% and customize it. That's a pretty easy thing to do.

So it's gonna go both ways, but we are definitely in interesting times.

Well, I think the thing that was fascinating for me is the implications for things like phishing and spear phishing attacks, where, you know, traditionally we've relied on bad English, bad grammar, inaccuracies in spelling, things like that. And of course, the large language models are going to take that away, and everything will look smooth and perfect. And the background research to make the underlying pitch more credible is easier to do.

Yeah.

Now it'll say "let's dive in," and that'll be our keyword. Anytime I read that, I know it was written by AI.

Well, we need some kind of
linguistic watermark, do we not?

We have to. That's one thing we've never really, in the 14 months that ChatGPT has really been active, nobody's really sat down and talked about that, so.

That's good.

Yeah.

Any thoughts from you on that, John?

Yeah, again, it's early days. It's very early days in our journey to AI, and we're gonna learn a lot. But it is moving fast.

It's everywhere.

It is. Every conference I've been to in the past three months, AI has been top of the charts for anyone talking about it.

To the point even where Forrester, which is an analyst firm that tends to be very conservative about making predictions around new technologies, came out on day one of their security and risk conference in Washington, DC, and it was all about AI. So we're at a watershed moment for sure.

I absolutely agree.

Yeah.

I think, for me, the other piece of the student privacy question that I want to go back to is: students have a right to privacy as well, and employees have a right to privacy as well. And so how do you balance the need for privacy and the need for security in a virtual environment, both from the employee perspective and from the student perspective?

I think it goes back to, somebody is always gonna have access to all of these things, and that's most likely the IT administrator. So they have to be well-trained and understand that thou shalt not look at certain things. And that's a critical piece.
You know, previously I worked in a company where that was violated, and it was the one time that I met with the FBI. I was the victim of it, and I hope never to have that thing ever happen to me again. But that insider threat is always prevalent.

Especially in an environment like an education system, where you're probably going to be leveraging some of the students for IT. It's just the way it is, and that's a good thing in a lot of respects, because it gives them a leg up on their journey, their career, their future. They get access to some systems that hopefully they can go on and have a strong career in. So there's definitely a balance point in how you treat that.

And it's something where I don't think we have all the answers, and there's still a lot of questions about that.

A couple of thoughts from me on that: you know, employees do have a right to privacy, but also they shouldn't be doing stuff that is not pertinent to their work on work devices.

I had a friend reach out to me the other day, and he said, hey, I think I'm gonna get my own personal device, 'cause I've always used my work computer for all my personal stuff.

And I'm like, dude, that's crazy.

Like, I couldn't even fathom it, because I hadn't done that until I started doing my own business. And so now I do, but before that, I always had my own computer, because I didn't want my, you know, shopping history, my personal stuff on the district computers, because I thought there needed to be that separation.

But a lot of people don't
believe that that is the case.

And I, I think more people should
believe that that's the case.

But you know, you shouldn't
be bringing that stuff in.

But then it gets a little more tricky with students, because they may not have access to these things except through their school devices. And so those can be tricky situations. And, you know, if we just dive into something specific: if a student is struggling with a mental health crisis or something like that, and they're trying to find support for that in a way that is working for them, that they're, you know, trying to research and learn what they need to do to deal with the challenges they're facing, that information I think should still be private.

But, you know, if we can track everything they're doing and see everything they're doing and everywhere they're going, then we're gonna be able to see that. And so then does that mean that we intervene, or should we not intervene? Or how do we know what the right thing to do is in those situations? That's kind of where I'm going with that. Any thoughts, John?

Yeah, I mean, that's a hard, hard question. In terms of the devices, I think there's probably a generational issue here, whereas some of us who are more seasoned understand that, you know, thou shalt not mix devices.

So what I do in my home world, I mean, I have two computers: a laptop that I use, you know, on the weekends and after hours, and a laptop for work. And I try not to mix 'em. But my kids, on the other hand, you know, they're leveraging their iPads for schoolwork. It's just the way they do it. And where my challenges come in is the mobile devices, like the smartphone. That definitely is hard to mix. I'm not gonna carry two smartphones everywhere I go and try not to mix and match 'em.

So I think where that needs to be looked at, and this is something still out there in terms of IT, is: how do we deliver an agentless solution? How do we deliver applications with security and with visibility on a device that may not be ours, that may not be controlled by us? And how do we containerize that so it's leveraged in a way that doesn't leave a footprint on that device? There's no data on it, but yet they can get the work done that they need to do.

You know, in terms of looking at data and understanding, is there a crisis within a person, mentally, based on the websites they're looking at, I'm gonna kind of avoid that one.

Well, yeah, look, this is a hugely complicated area, obviously, and it's difficult for schools in particular because they're in loco parentis. They have their own security needs. So you do want to have student privacy, but if that student is then trying to, you know, figure out how to pose a threat to the school, the school obviously wants to know that. A possible solution down the road, and maybe not that far down the road, will be the ability of AI systems to evaluate these things without human intervention, and only report out if certain criteria are met that rise to the level of needing human intervention.

I'll be curious to see if that's the
direction in which things start to move.

Yeah, I mean, certainly I think you're onto something there. You know, given the level of data that's produced, AI systems eventually will be going there, and there is gonna be a moment where we have a coach, right? And that coach is based on what your needs are, what your goals are, what your objectives are. And it's gonna kind of give you, well, I think Microsoft calls it a Copilot; that's kind of the term they're using in marketing for all their applications.

But I think it's a good term, because we're going to have a copilot, and the copilot is probably gonna take it a little bit further. When you're using that device, they're gonna be scraping the data and analyzing it, based on, you know, what websites you're going to, what applications you're leveraging, what social media you're viewing, and get down to the point where it's even on your body. You know, I wear an Apple Watch, and it measures my heart rate. It can measure, you know, my blood oxygen levels. The technology there is gonna get to the point where it can really understand it.

And then, on top of that as well, there's a lot of emerging technology around the brain. And so, to your point, I think we're at a moment of deciding what is personal and what is not personal. It's very possible there's a scenario within 10 to 15 years where, if I'm at work, I'm required or incented to wear a device that kind of measures my brainwaves to know whether or not I'm focused on work. And that may be enhancing me as well, to do a better job and to provide me focus. But again, it goes back to the point: what is really me and personal to me, and what should my employer or even school be able to see or not see? So, interesting times.

Hmm.

Oh yeah, for sure.

Yeah.

Hey, this was a great interview, John. Where would you like people to go to learn more about you and the work that you do?

Yeah, so really the best is to reach out on LinkedIn. I do a lot of posting there. I have some contrarian opinions on technology, so I call them my stirred, not shaken opinions.

I do run a group called the SSE Forum, where we bring together practitioners of these Zero Trust, SASE, and SSE frameworks and have conversations around how to apply that technology. And then finally, my podcast called The Edge, where we interview a lot of CISOs, influencers, and even technology evangelists within the industry.

So those are the three places.

Great.

We'll have links to those in the show notes, so make sure you check those out at cybertraps.com.

And thanks again, John, for being here.

We appreciate it.

Thank you.

It's been a real pleasure, John.

That wraps up this episode
of the Cyber Traps Podcast.

In the coming weeks, we'll continue our
coverage of emerging trends in a variety

of areas, including digital misconduct,
cyber safety, cybersecurity, privacy, the

challenges of high tech parenting, and
now of course artificial intelligence.

Along the way, we'll talk to a growing
collection of international experts who

are helping us to understand the risks
and the rewards of digital technology.

You can find the Cyber Traps Podcast
on all of your favorite podcast apps.

We hope that you will share the show with your friends and colleagues and reach out to us if you have guest recommendations, tech questions, or topic suggestions.

If you'd like to follow us on Twitter or X, I'm @jethrojones and Fred is @cybertraps, and if you're still listening, you must have loved this show.

If that's the case, please
leave us a five star rating and

review in your podcast service.

We appreciate having you with us and look forward to having you join us for our next episode.
