Bob Friday, Chief AI Officer, Juniper Networks

Bob Friday Talks: Bytes of Brilliance, Unveiling the AI Canvas. An AI Primer.

AI & ML | Wireless

Bob Friday Talks: An AI Primer; Unveiling the AI Canvas

Join Juniper Networks’ Chief AI Officer, Bob Friday, as he pulls back the AI veil and reveals the magic behind algorithms and neural networks. He explores AI’s origin story and how it has evolved over the last decade to become a valuable IT partner. You’ll also hear how Juniper is integrating GenAI and LLMs into its network solutions to make it easier for users to explore the data in their databases.


You’ll learn

  • About the evolution of Machine Learning (ML) and Large Language Models (LLMs), which allow for more complex predictive behaviors.

  • How Juniper Networks has integrated LLMs and GenAI into its Marvis Virtual Network Assistant to make your AI journey easier and more effective.

Who is this for?

Network Professionals, Business Leaders

Host

Bob Friday
Chief AI Officer, Juniper Networks
Tarek Radwan
Product Marketing Director, Juniper Networks

Transcript

0:00 Tarek Radwan: Hello, and welcome. My name is Tarek Radwan, and I'm Product Marketing Director for artificial intelligence here at Juniper Networks. We're going to have a quick discussion now about the fundamentals of AI. You know, when it comes to networking, AI has become very, very popular as of late. So Bob, I want to ask you the first question: what exactly is AI, and what are its salient characteristics?

0:23 Bob Friday: Yeah, TK, what I usually tell people is that AI is ultimately the next step in the evolution of automation, with a slight twist on what we used to do in the past. When we automated things, it was very deterministic: we wrote scripts, they basically did the same thing day in and day out, and you really didn't think about it. What we're building with AI is really more about cognitive reasoning. Now we're starting to build AI solutions that act more like a human and actually change behavior over time. It's almost like hiring a new intern, right? You bring on that intern, you bring on that AI assistant, and the first thing you have to do is understand their skills, correct? And after you do that, you expect this person, this intern, this AI assistant, to actually get better over time as it starts learning. So that is the fundamental difference between what I call AI of the past, the very deterministic ML types of algorithms, and the much more deep-learning types of algorithms we're moving to now, which really have the behavior of cognitive reasoning and start to look like a human that's learning as you interact with it and use it more.

1:30 Tarek Radwan: And in our industry, when did AI exactly become important?

1:33 Bob Friday: You know, that's an interesting question, because if you look at my experience, back in the '80s when I did my master's, I was actually doing neural networks. So when did AI really become a big thing? If you look at Google search trends, AI/ML took off around the 2014 time frame. And if you think about what happened back then, that's when we had a convergence of several things coming together: we had open source code, we had many more packages making it easier to develop these algorithms, and we had these cloud models, right, AWS, Google, Azure, where compute and storage got a lot cheaper. So I think we saw something happen in 2014 where AI started to take off through a combination of cloud technology and the open source community bringing tools together, making it easier to build bigger solutions. And probably the other big thing, especially with what we saw from ChatGPT, is that compared to the models I was working on 20 years ago, we now have deep learning models with 100, 200, 500 billion parameter weights in them, right? These models are getting to be on the scale of your brain. That's the other thing that's happened since 2014: the models have gotten very large, and they do much more interesting things.

2:53 Tarek Radwan: Let's double click a little bit, and maybe explain in basic terms: what exactly is ML, machine learning?

2:58 Bob Friday: I would say machine learning has been around for decades. This is things like logistic regression and decision tree algorithms; these are algorithms we've been using for 20, 30, 40 years. That's what I call machine learning. What's really changed is what we call deep learning. This is where we're starting to use much more complicated models with many more layers and many more weights. This is what brought transformers and ChatGPT to life: these really deep learning models that are continuously learning on a lot more data. And similar to what we're doing with Teams and Zoom in Marvis, we're now basically bringing deep learning models and a lot of continuous learning into those models, and that's what brings out this behavior where the models learn and basically get better over time.
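To make that distinction concrete, here is a minimal sketch, assuming scikit-learn, that fits one of the decades-old classical algorithms Bob mentions and a small multi-layer network on the same data; the synthetic dataset and layer sizes are illustrative only, not Juniper code.

```python
# Classical ML vs. deep learning on the same task (illustrative sketch).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Classical" ML: a deterministic, decades-old algorithm.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("logistic regression:", clf.score(X_test, y_test))

# "Deep" learning: the same task, learned through stacked layers of weights.
net = MLPClassifier(hidden_layer_sizes=(64, 64, 32), max_iter=500,
                    random_state=0).fit(X_train, y_train)
print("small deep network:", net.score(X_test, y_test))
```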

3:45 Tarek Radwan: Now, we've heard these terms of unsupervised learning versus supervised learning. Can you kind of explain what that all means?

3:55 Bob Friday: Yeah. Supervised learning is like what they did with ChatGPT and transformers, right? They took all the data off the internet and trained a model to predict the next word. It's similar to what we're doing with Zoom and Teams, where we're taking tons of Zoom data and building models that can actually learn Zoom and Teams performance. That's supervised learning: you have a whole bunch of labeled data, and you're learning to predict something from it. Unsupervised learning is really what we did with location, where I don't have any labeled data, and we're trying to learn the path loss model for every mobile device in an area. I basically have no labeled data, but I'm learning to predict that path loss model for all the devices in your network to make a better location estimate.
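The labeled-versus-unlabeled split Bob describes maps directly onto the two families of algorithms; a minimal sketch, again assuming scikit-learn, with synthetic data standing in for the labeled and unlabeled cases:

```python
# Supervised vs. unsupervised learning (illustrative sketch).
from sklearn.datasets import make_blobs
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans

X, y = make_blobs(n_samples=600, centers=3, random_state=0)

# Supervised: labels (y) are available, and the model learns to predict them.
supervised = RandomForestClassifier(random_state=0).fit(X, y)

# Unsupervised: no labels at all; the model finds structure in X on its own.
unsupervised = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

print("supervised predictions:", supervised.predict(X[:5]))
print("unsupervised clusters:  ", unsupervised.labels_[:5])
```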

4:43 Tarek Radwan: And just to cover a bit more of the AI alphabet soup, can you maybe touch on neural networks, and maybe deep neural networks?

4:52 Bob Friday: Yeah, these are subtly the same. When you look at the original thesis on neural networks, it was patterned after your brain, right? If you look at the neurons inside your brain and how you learn, you're basically analog weights. So that original neural network model was built up from layers of weights that could be trained to predict something. As I mentioned before, what has really changed now is these deep learning models, like the transformers we're using with ChatGPT. These are what we call attention-based transformer models: really very large neural network models, now with 175 billion different weights inside of them. So that's the subtle difference between a neural network and what we call deep learning: deep neural networks are really about the size and the number of weights we have to train.
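The "layers of weights" idea reduces to a few lines of linear algebra. A toy NumPy sketch of a two-layer forward pass follows; the sizes are illustrative, and the 175-billion-weight models Bob mentions stack this same operation at vastly larger scale:

```python
# A toy forward pass through stacked layers of weights.
import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, weights, bias):
    # Each layer is a weighted sum of its inputs passed through a nonlinearity.
    return np.maximum(0.0, inputs @ weights + bias)  # ReLU activation

x = rng.normal(size=16)                            # input features
W1, b1 = rng.normal(size=(16, 32)), np.zeros(32)   # layer 1 weights
W2, b2 = rng.normal(size=(32, 8)), np.zeros(8)     # layer 2 weights

h = layer(x, W1, b1)    # hidden representation
out = layer(h, W2, b2)  # "deeper" networks just stack more of these layers
print(out.shape)        # (8,)
```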

5:47 Tarek Radwan: Since you did mention ChatGPT, can you explain: what is an LLM?

5:52 Bob Friday: Yeah, these large language models are what got everyone's attention about a year or so ago, and they've made our lives a lot easier, because I can tell you there are a lot fewer AI skeptics out there once they've actually used ChatGPT. These large language models are based on a model called transformers, attention-based transformers, and this is where they're taking tons of data, basically everything on the internet, and training these models to predict the next word. You can think of this as a conditional probability, where I take the last 500 words to predict the next word. And this is where these large language models are starting to sound like natural human language, and even more, they're starting to look like reasoning. So large language models are really an amazing thing coming into networking right now, both for how they sound like natural human language and for how they're starting to have reasoning capabilities. This is going to play into helping us troubleshoot networks going forward.
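The conditional-probability framing Bob uses can be shown with a toy bigram model. Real LLMs condition on hundreds of prior tokens through attention; this sketch conditions on just one word, but the principle P(next word | context) is the same:

```python
# Next-word prediction as conditional probability (toy bigram model).
from collections import Counter, defaultdict

corpus = "the network is up the network is slow the link is up".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def p_next(prev, word):
    # P(word | prev) estimated from the corpus.
    total = sum(counts[prev].values())
    return counts[prev][word] / total if total else 0.0

print(p_next("is", "up"))    # P(up | is)   = 2/3
print(p_next("is", "slow"))  # P(slow | is) = 1/3
```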

6:51 Tarek Radwan: Now, is this the same as generative AI, or GenAI?

6:56 Bob Friday: Yeah, I think GenAI and LLM are basically synonyms for the same concept here. Large language models usually apply to some sort of language, but you can technically take these same attention-based transformer models, these GenAI models, and apply them to other use cases. An example would be, instead of using language, using these GenAI models to translate text to search: SQL, Elasticsearch. This will basically enable customers to start exploring their network data and replace some of these BI tools, making it easier for customers and enterprises to explore the data in their databases.
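The text-to-SQL use case Bob mentions boils down to prompting a model with a schema and a question. A hedged sketch follows; `call_llm` is a hypothetical placeholder for whatever LLM endpoint you use, and the schema and prompt are illustrative, not Juniper's implementation:

```python
# Translating a natural-language question into SQL via an LLM (sketch).
def call_llm(prompt: str) -> str:
    raise NotImplementedError("wire this to your LLM provider of choice")

SCHEMA = "clients(mac, ap_name, rssi, ts)"  # hypothetical table

def text_to_sql(question: str) -> str:
    prompt = (
        f"Given the table {SCHEMA}, translate this question into one SQL query.\n"
        f"Question: {question}\nSQL:"
    )
    return call_llm(prompt)

# e.g. text_to_sql("Which APs had clients below -70 dBm yesterday?")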

7:38 Tarek Radwan: I can't help but ask the question: how has Juniper used, or how is it using, GenAI today?

7:46 Bob Friday: Yeah, so what we announced last year at Mobility Field Day was basically integrating GenAI and large language models into our Marvis, and the first phase of this is really around public documents, making it much easier for customers and support teams to get specific answers out of hundreds of thousands of documents. So this is phase one of taking GenAI and LLMs and making it easier to get both the answer and the documents and information you're looking for.
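Document question-answering like this is commonly built as retrieval plus generation. A minimal, hypothetical sketch of that general pattern follows; `embed` and `call_llm` are placeholders, and nothing here reflects Juniper's actual implementation:

```python
# Retrieval-augmented answering over a document set (sketch).
import numpy as np

def embed(text: str) -> np.ndarray:
    raise NotImplementedError("plug in your embedding model")

def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM endpoint")

def answer(question: str, docs: list[str]) -> str:
    q = embed(question)
    # Rank documents by dot-product similarity to the question embedding
    # (cosine similarity, if the embeddings are unit-normalized).
    ranked = sorted(docs, key=lambda d: -float(embed(d) @ q))
    context = "\n---\n".join(ranked[:3])  # keep the top 3 passages
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_llm(prompt)
```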

8:18 Tarek Radwan: Now, just to quickly go back to basics: what exactly is model training?

8:24 Bob Friday: Yeah, so every model is slightly different; it depends on what problem you're trying to solve. But you can think of every model as basically having a set of training data, for the supervised learning that we talked about. So let's take Zoom and Teams, for example. I have a bunch of Zoom or Teams data that I have to join with my network feature data, and then I want to train that model once a day or once a week. So when we look at training a model, it's basically about the problem you're trying to solve. With anomaly detection, we're basically training models every night, every day, on the last three months of data, looking back in time to try to predict anomalies for the next 10 minutes. So that is training. It's slightly different whether you're doing self-supervised or supervised learning, but almost all deep learning models have some sort of training schedule assigned to them.
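The training schedule Bob describes, retrain nightly on a rolling three-month window and then predict ahead, can be sketched in a few lines; `load_features` and the model are hypothetical placeholders, not Juniper's pipeline:

```python
# Nightly retraining on a rolling 90-day window (sketch).
from datetime import date, timedelta

def load_features(start: date, end: date):
    raise NotImplementedError("fetch joined network + application telemetry here")

def nightly_retrain(model):
    end = date.today()
    start = end - timedelta(days=90)  # the last ~3 months of data
    X, y = load_features(start, end)
    model.fit(X, y)                   # refit on the fresh window; the updated
    return model                      # model then flags anomalies expected in
                                      # the next 10 minutes
```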

9:16 Tarek Radwan: Thanks again for your time, Bob. That was really informative. Appreciate it.

9:19 Bob Friday: It's good to be here, TK. I always enjoy talking about AIOps.
