
Google Cloud Next '22— livestream

By Google Cloud

Summary

## Key takeaways

- **Cloud Next '22: A New Era of Transformation**: Google Cloud Next '22 emphasized the transformative power of cloud technology, focusing on how it drives organizational change and helps tackle complex global challenges. [06:10]
- **Customer-Centric Transformation with Google Cloud**: Companies like Renault and SEB are leveraging Google Cloud to achieve their transformation strategies, aiming for data-driven operations, improved scalability, and enhanced security. [07:10], [09:11]
- **Open Data Cloud: Unifying and Unlocking Data**: Google Cloud is building an open data cloud that unifies data from all sources and formats, enabling all styles of analysis and integrating seamlessly with machine learning platforms. [20:49]
- **AI/ML: Driving Innovation and Efficiency**: Vertex AI is accelerating ML model development, reducing coding time, and enabling easier integration of visual data, leading to faster time-to-value and improved business outcomes. [25:04], [24:51]
- **Security: Engineered In, Not Bolted On**: Google Cloud champions invisible security, integrating it into operations to simplify processes and leverage expertise from securing its own business and billions of users. [36:06]
- **Sustainability as a Business Imperative**: Sustainability is presented not just as an environmental goal but as a driver for operational efficiency and innovation, with data insights playing a key role in achieving both. [51:10]

Topics Covered

  • Cloud is now about Value Creation, not just Cost.
  • Unlocking Data from Silos to Drive Actionable Insights.
  • Invisible Security and Digital Sovereignty are Essential.
  • Sustainability Drives Both Efficiency and Innovation.
  • Hybrid Workplaces Demand Seamless Collaboration and Security.

Full Transcript

[MUSIC].

Today.

It's uncertain.

It's going to be a crazy day.

It's also unwritten.

I've got this.

Today is the day we can start to

change things.

Make things better.

And make better things.

Let's take on problems.

Big or small.

Not yet, I'm coding.

Because they're all worth

solving.

Let's make tech more helpful.

More open and accessible to

everyone.

Let's keep data safe and people

safe.

Look after the environment and

each other.

Today may surprise us, push us,

even scare us.

That's why we're here.

Let's take those challenges and

make something even better for

tomorrow.

[MUSIC]

>> Please welcome the President of Google Cloud International and Head of Google Ireland, Adaire Fox-Martin.

For those of you in Munich, it's been three years. Three whole years since we've had the chance to connect at a tech event.

I'd like to thank all of our

partners who helped make this

event possible.

Special thanks to Accenture, C3.ai, and Deloitte.

Indeed Google Cloud could not do

what we do with our customers

without the ongoing support of

our partners.

To bring us all back together, I had the pleasure of... and it's not called the technology museum; technology is already implied.

And today we'll be looking at

the transformative power of

Cloud technology.

And how it can help drive your

organization's transformation

forward.

This is what today is all about.

Your transformation.

And how Google helps.

And from the main stage and in our breakout sessions, you'll hear directly from our customers about both the value and the experiences they are driving through their transformation via the Cloud.

Taking on some of the world's

most formidable challenges.

Today, tomorrow and long into

the future.

Let's get started.

Let's connect with our first customers. The Renault Group and SEB have been aggressively pursuing transformation strategies, and Cloud technology sits at the very core of their transformations. From Renault's Alliance IT services, Stephan; and from SEB, Petra. Please join me in welcoming them to the stage.

>> Take a seat.

Great to have you.

Thank you so much for being with

us.

Maybe we start by helping the audience here to understand a little bit about the vision for transformation that you have in your company. Stephan, let's start with you and Renault.

>> Renault is going from a company with 124 years of history to a tech company. We want to become a provider of services.

When we say transformation, we mean transformation across all areas of the business. We are moving to offering mobility services; our vision is that 20% of our revenue will come from new services by 2030. Second, we are changing the company's business operating model: all operations will be cloud-based, data-driven, and AI-enabled. We need to gain scalability, improve security, and continuously provide new services.

>> I definitely agree about Cloud being the best platform for that.

>> What about you, Petra? Tell us about SEB's vision.

>> SEB was formed 165 years ago, so this will definitely not be our first transformation. It's how we meet our customers' expectations with new, innovative services, to be the best bank of tomorrow. Digital transformation is critical to succeed, and we have set a target to be cloud-native by 2030. Having said that, it's important for us to keep our feet on the ground and our head in the Cloud. In areas like data analytics, technology partners like Google Cloud enable us to manage data security and cyber defense with confidence. And in our view, doing this in the Cloud is the only way forward.

>> Yeah. It seems that business and digital transformation are synonymous at SEB. Stephan, how does Google support Renault in that regard?

>> We are working end to end, from the car's design to its manufacturing, supply chain management, and delivery to the dealers. It will then be monitored and connected, across the manufacturing plants and supply chain, to the Google Cloud platform in order to collect data. We are switching supply chain models and improving efficiencies, which is a huge part of our transformation objective. We are addressing the B2C channel using the unique reliability of the platform. Any company undergoing a shift of this magnitude must trust the security of the platform. We trust Google to be secure, and we are confident you will also have a great solution for us to comply with French digital sovereignty requirements, as an example. Beyond trust, we chose Google as a partner for multiple reasons. Google Cloud offers flexibility and the ability to respond to short- and long-term challenges. Google Cloud's data offering is the best value on today's market. And of course, the Google teams are very skilled.

Thank you so much, Stephan, for some of your kind remarks there. Petra, how are you at SEB working with Google on the other digital transformation efforts you have in the company?

>> When we started our journey to become cloud-native, Google Cloud supplied all of the functionality we needed, so far.

More than that, they are a true partner in solving challenges through close collaboration on technology.

And with our new data platform, we'll be able to use current and historical data, automate more, and spend less time developing infrastructure for embedded banking and self-service digital banking.

SEB is actively supporting the sustainable transition, in line with the Paris Agreement on climate change. So our role is very much to enable companies to make choices that contribute to a sustainable society.

So with our updated sustainability strategy, we raise our ambition levels and take the next step. Our experience of working with Google provides the reliability and responsibility of a major enterprise, and also the speed, innovation, and flexibility of a startup.

>> Thank you for that, and thank you to both of you for taking the time to share your insights with us. I think the stories you've shared present a great backdrop for everything to come today. Ladies and gentlemen, please join me in a big thank you for Stephan and Petra.

Now, as Stephan and Petra described, working with Google is about more than achieving consistency and convenience. It's about helping organizations right across EMEA on their journey of sustainable change. I'd like to hand it over to the CEO of Google Cloud, Thomas Kurian.

>> Thank you, Adaire.

I'd like to extend a very warm

welcome to our customers and

partners.

We're really delighted to have

you all with us.

To echo Adaire: across Europe, the Middle East, and Africa, now more than ever, Cloud is essential for digital transformation.

A lot of Cloud work to date has been focused solely on cost optimization. But organizations also want value creation. Cloud has to deliver more value and more innovation to organizations.

From understanding your customers better, to helping you make your supply chain more resilient, to bringing people together to improve not just their productivity but their creativity, and creating seamless interactions across your entire value chain.

And we continue to invest in Cloud on Europe's terms, to support sovereignty, security, and sustainability.

It's inspiring to see how

organizations across the region

are leading the way in this new

era.

Vodafone for instance has

migrated to Google Cloud to

drive breakthroughs in

artificial intelligence and

machine learning that have

helped it improve customer

loyalty through personalized

offers.

Swiss International Air Lines is using Google Cloud to better align its operations with booking demand, helping them better accommodate customers and save millions. HSBC has launched more than 250 live services across its organization to support the experiences of over 40 million customers around the world.

What sets these organizations apart, and what you will hear from many of our customers today, is that they have systematically embraced Cloud as the foundation for their digital transformation.

To share more about this, I'll hand things back to you, Adaire.

Thank you again for having me

and I hope you all enjoy the

rest of Google Cloud Next.

>> Thanks so much, Thomas. The transformation era of Cloud is marked by a completely different kind of conversation.

The questions from our customers

are no longer just about

convenience and efficiency.

They are about the very core of

their businesses.

And organizations are asking five key questions.

How do we become the best at

understanding and using data?

How do we ensure we have the

best technology infrastructure?

How do we know that our data,

our systems and our users are

secure?

How do we create the best workplace for our people? And how do we collectively create a more sustainable future?

Last year we introduced those

questions on behalf of our

customers.

Today our customers are going to

share how they've answered with

the help of Google Cloud.

So for the first two questions, please welcome Gerrit Kazmaier, right here from Google Cloud. Gerrit!

[MUSIC].

>> All right. Thank you, Adaire. So how does an organization actually become the best at understanding and using their data?

The challenge, though, is that today data is generated at far greater rates, and it's trapped in new data silos: different formats, point solutions, and closed clouds.

When data is unlocked, it can improve everything. It can make supply chains smarter. It can help you build contextualized solutions.

Google is a leader in the analysis of structured and unstructured data, and we are unifying the ecosystem for you, creating the most open data cloud. This includes all of your data, in all formats, from all sources, from any cloud. Collectively, it's about enabling all styles of analysis.

Today we are announcing support for unstructured data in BigQuery, alongside the structured data from your operational and SaaS applications. All of that you can connect with our machine learning platform, through the simple and familiar interface of BigQuery.

Very interestingly, 90% of our customers are analyzing data from other clouds. With BigQuery Omni you can analyze data in Azure and AWS without moving it, which keeps things simple and saves you egress fees.

Now, we put all of this data together. What comes next? You want to query it in a simple, unified way. Today we're announcing the integration of Spark into BigQuery.

And I can imagine what most of you are thinking if you're into data like I am: let's talk about data lakes. BigQuery supports key open file formats with which you can build data lakes, through BigLake: Apache Iceberg, and, coming soon, the popular Hudi and Delta.
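For readers who want to try the kind of unified querying Gerrit describes, here is a minimal, hypothetical sketch using the google-cloud-bigquery Python client. The project, dataset, and table names are placeholders, and it assumes a BigLake or external table has already been defined over open-format files in object storage.

```python
# Minimal sketch (not from the talk): querying a BigLake/external table with the
# google-cloud-bigquery client. Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project ID

sql = """
    SELECT region, COUNT(*) AS orders
    FROM `my-analytics-project.lake.orders_iceberg`   -- hypothetical BigLake table
    GROUP BY region
    ORDER BY orders DESC
"""

# Run the query and print one row per region with its order count.
for row in client.query(sql).result():
    print(row.region, row.orders)
```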

Now that we have connected all of the data, let's connect all of this data to people.

The first stop is business intelligence. In the business intelligence space, let's be honest, we have two sets of data. On one side there is centralized and managed data. And on the other side, well, we have the not-so-official data: it's often distributed, and it's many times used for self-service dashboarding.

First, we have Looker. This is the leading platform for governed access, and it's really about the data. Then we have Google Data Studio, one of the most popular tools for self-service discovery. If you combine them, today they have more than 10 million users a month. Today we're unifying these two products as Looker and Looker Studio.

We're going to unify self-service BI with Looker Studio, with Looker Studio Pro adding additional support and management capabilities. And, well, you would obviously think this is a Google-only story, but I told you about being committed to an open Google Cloud. We are already working with Tableau, and today we're announcing Looker integration for Microsoft Power BI.

So let's move on from BI to AI. Last year we announced Vertex AI, our machine learning platform. It requires 85% less code than other machine learning platforms.
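As a rough illustration of the low-code workflow described here (not the speaker's own example), the sketch below trains an AutoML model with the google-cloud-aiplatform SDK; the project, region, bucket, and column names are hypothetical placeholders.

```python
# Minimal sketch (hypothetical names): train a tabular AutoML model on Vertex AI.
from google.cloud import aiplatform

aiplatform.init(project="my-ml-project", location="europe-west4")  # hypothetical

# Create a managed dataset from a CSV in Cloud Storage (hypothetical bucket/file).
dataset = aiplatform.TabularDataset.create(
    display_name="churn-training-data",
    gcs_source=["gs://my-ml-bucket/churn.csv"],
)

# Configure and run an AutoML classification job; Vertex AI handles the training code.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)
model = job.run(
    dataset=dataset,
    target_column="churned",          # hypothetical label column
    model_display_name="churn-model",
)
print(model.resource_name)
```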

It has been really, really hard to get meaningful insights out of video streams. So today I'm very excited to announce Vertex AI Vision. It is an end-to-end, fully managed platform for computer vision applications. It allows you to easily ingest and analyze visual stream data, and it takes development times down from days to minutes. Now that is time to value.

An open data cloud must also connect to the SaaS applications that matter to many in the audience. Our data cloud makes it much, much simpler to access data from your SaaS applications, SAP among them. You have models in BigQuery, and it takes just seven minutes to get them deployed.

One customer who's taking advantage of this is Carrefour. They have reduced their operating expenses and their energy consumption by moving their data centers to the Cloud, and now, with our data technology, they are getting insights and taking action in real time on inventory management and logistics.

We have 800 technology partners who are building their products on top of Google Cloud today and have joined us in the Data Cloud Alliance program. It is a program with a commitment to open standards and interoperability. We also welcome Collibra, MongoDB, ServiceNow, and many more.

Because the bottom line of all of this is that you need an open data cloud. And with all of these announcements, we are taking a major step forward in making this a reality for all of us.

For me, being an engineer, great infrastructure means you can innovate and build really, really quickly. The challenge is that yesterday's infrastructure slows down development. It makes innovating really, really hard, and the reason is quite simple: it's complexity.

It's tough to build quickly when you have to select from thousands of options, when you have to stitch together infrastructure all the time. Think about the pressure on developers of managing costs and keeping applications safe.

The good news, though, is that there is a better way. Tomorrow's cloud creates simplicity through golden paths, and the point is that they reduce complexity right from the IDE through to production.

Here's a great example of a golden path. Today we are announcing Software Delivery Shield, which protects you right from your source code into deployment. It has four key components. First, Cloud Workstations to develop in. Second, an Assured Open Source Software service, with packages that we validate and secure. Third, Cloud Build, our continuous integration service. And fourth, GKE and our service management stack, all coming together in Software Delivery Shield. Next, our workload-optimized infrastructure, right from the silicon up.

We start, obviously, with the most AI-optimized infrastructure. Google has TPUs, which we built to massively accelerate AI. Today we are announcing general availability of TPU version 4; they're even faster. And we are super excited about our partnership with NVIDIA for AI workloads on their latest technology, which ranges from GPUs to managed learning services in combination with Google AI models.

We are also committed to open source AI, not only to bring the ecosystem together, but to prevent platform and model lock-in.

I was talking about developers: 50% of developers are actually using containers today, and Google contributed Kubernetes. Google Kubernetes Engine is a fully managed engine, and it scales to unprecedented levels. PGS, for example, have tripled their deployment size with us and scaled up to 21 petaflops. That puts PGS on the top-25 list of the world's supercomputers, just through the power of Kubernetes.

And today we are announcing a set of enhancements that make multi-cluster management much simpler: just deploy a single file to hundreds of clusters and hundreds of environments.

We also want to stay ahead of emerging technologies, and we are working with pioneers in the space who are using the cloud for workloads such as blockchain, for instance. Today we're announcing that Coinbase has selected Google Cloud as their premier cloud provider to enable that community.

To talk about these golden paths, I would like to turn to our great partners and customers from the Deutsche Bank team. Let's roll the video.

>> If you don't have financial stability, and you don't know if you can take care of your dear ones, your happiness is impacted.

Deutsche Bank is a leading

German and European bank that is

transforming the industry. We

are on a journey with Google

Cloud to redefine banking.

We are building the foundation and infrastructure for a compliant, secure experience, and also enabling all our clients to really leverage that to better their lives.

>> What really excites me is the huge transformation we're driving at the bank, creating modern solutions and new opportunities.

Deutsche Bank's frontier app is a cash flow and invoice management platform for our clients. It allows them to, in real time, share their invoices with their customers, collect payments from them automatically, and send them reminders. The time to capture a mandate and accept payment we brought down from weeks to days, with 80% gains in sales efficiency. One of the biggest benefits of developing this app on Google Cloud is that most of the services we needed were available out of the box, so my team could really focus on building the business features. We are able to connect to our clients' ecosystems, becoming a part of their entire business value chain, beyond the traditional boundaries of banking.

>> Some of the key benefits of partnering with Google Cloud are speed, security, and scalability.

By having data in one place, you

can apply analytics and build

machine learning solutions.

The partnership with Google Cloud allows us, from an innovation point of view, to work directly with Google engineers on products and services not yet available on the market.

>> Google Cloud is really at the heart of the transformation that our great team has been able to implement efficiently, securely, and at the right velocity. The sky's the limit.

[APPLAUSE]

>> So I want to end with this: when companies are empowered with an intelligent, simple, and open data cloud, the sky is indeed the limit. With that, I would like to hand it back to Adaire.

>> Thank you so much Gerrit.

So, having the power of an open data cloud and an open infrastructure cloud to drive your transformation isn't meaningful unless you can operate in a trusted environment.

And this takes us to our third

customer question: How do we

know our data, our systems, and

our users are secure?

Cybersecurity is naturally a top-of-mind concern for CISOs, and increasingly top of mind in the C-suite and in the boardroom.

And on one hand, CISOs need to

defend against increasingly

sophisticated threats and

actors, but on the other hand,

they're faced with an

unprecedented shortage of

cybersecurity professionals.

So how do we reconcile this?

With our solution we are

championing a future of

invisible security.

In this approach, security is engineered in, operations are simplified, and we pursue a shared fate together.

Our commitment to you is

two-fold.

First we work to keep you secure

from cyber attacks.

And to do this we utilize the

expertise we developed from

securing our own Google business

and our own billions of users.

Second, we help you to quickly

and effectively identify and

resolve cyber threats, so what

does this look like in action?

Using our new Chronicle Security Operations suite, Morgan Sindall, a UK construction and regeneration company, ingests, analyzes, and retains all of its security information, eliminating what was the typical trade-off between cost and security blind spots.

A leading provider of equipment and services for data centers is now analyzing more than 20 times more security data and responding to three times more security events with exactly the same resources. Security queries that used to take hours now take seconds.

Our investment in security is only growing. We've recently completed the acquisition of Mandiant, a leader in dynamic cyber defense. Mandiant is known for being the best in threat intelligence and incident response. Google Cloud is the best at data and analytics. Now that we're together, we have the latest threat intelligence from the front lines.

And we can deliver this

automatically through SaaS

products, backed by the leading

managed offerings and consulting

services of Mandiant.

Jointly we can deliver on our

shared mission of a more secure

world.

We understand that to fully

embrace transformation in the

cloud you not only need

security, you also need trust.

You need confidence and peace of mind that when you deploy the latest innovations, you can meet your unique requirements for control, transparency, and sovereignty, whether that's driven by your regulator, by geopolitical considerations, or indeed by government policy. At Google Cloud, we take digital sovereignty seriously. Customers, partners, policymakers, and governments have surfaced requirements in three very specific areas.

First is data sovereignty, which is keeping control over encryption and access to your data. The second is operational sovereignty, which is keeping visibility and control over the provider's operations. And third is software sovereignty, which is running workloads without a dependence on the provider's software. To address these requirements, we launched our initiative, Cloud on Europe's Terms, in September of 2021.

To achieve data sovereignty, we offer unique external encryption capabilities, allowing you to store and manage your encryption keys outside of Google Cloud infrastructure and to deny Google access for any reason.
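As one hypothetical illustration of what "keys held outside Google Cloud" can look like for a developer, the sketch below uses the google-cloud-kms Python client to create a key whose versions use the EXTERNAL protection level. The project, location, and key ring names are placeholders, and it assumes an external key manager has already been set up to hold the actual key material.

```python
# Minimal sketch (hypothetical names): a Cloud KMS key whose material lives in an
# external key manager, so Google Cloud never holds the raw key.
from google.cloud import kms

client = kms.KeyManagementServiceClient()
parent = client.key_ring_path("my-project", "europe-west3", "sovereign-ring")  # hypothetical

key = client.create_crypto_key(
    request={
        "parent": parent,
        "crypto_key_id": "external-cmek",
        "crypto_key": {
            "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
            "version_template": {
                "protection_level": kms.ProtectionLevel.EXTERNAL,
                "algorithm": kms.CryptoKeyVersion.CryptoKeyVersionAlgorithm.EXTERNAL_SYMMETRIC_ENCRYPTION,
            },
        },
        # Key versions are added later, pointing at the key URI in your external key manager.
        "skip_initial_version_creation": True,
    }
)
print(key.name)
```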

To support data residency requirements, we continue to launch cloud regions in EMEA. Just recently we launched new regions in Qatar, the Kingdom of Saudi Arabia, and Greece, adding to our 16 cloud regions here in EMEA.

Today I'm really excited to announce new cloud regions for four more countries: Austria, Norway, Sweden, and South Africa. [APPLAUSE]

To support operational sovereignty, we offer an exceptionally strong set of trusted partnerships for stringent local supervision. And to support software sovereignty needs, we offer hosted cloud solutions. Now, these solutions are built on an open source foundation, embracing open APIs and services to enable interoperability and, most importantly, survivability. They can run on customer premises or partner premises, and meet the strict needs for disconnected operations.

And our unique relationship with T-Systems here in Germany demonstrates how we deliver both the full benefit of the public cloud and confidence in compliance with German regulations. To discuss this, I'd really like to welcome onstage Adel Al-Saleh of T-Systems. Please join me in welcoming Adel.

Thank you so much. So, T-Systems was Google Cloud's first digital sovereignty partner in Europe, and as you know, since then we've partnered in France, in Spain, and in Italy, with probably more to come. But it started with you. Why do you think the partnership model is so important, not just for Germany, but for Europe and beyond?

>> Well, first of all, thank you for having me. It's great to be in a room full of people again. Sovereignty has been a topic for a period of time; this is not a new topic. It's driven by multiple factors. Geopolitical tension is driving it, and the bifurcation of the world between East and West. The fear of being dependent on somebody and not being able to move is another. The war in Europe, which we didn't expect, is fueling the sovereignty sentiment even more. There is fear of dependency, but also fear of not complying with regulatory environments. And you talked about data; data is a big deal here. How do I control my data? How do I make sure I know exactly who is touching it, where it is, et cetera?


We have seen this develop over time, and we have worked jointly with you to develop an engineered solution. I'm super excited about that, because it is a unique way of providing a hyperscaler solution with sovereign controls, which allows companies to use the best of the cloud without losing their sovereignty or worrying about controls.

>> I think there is an element of trust here, and both of us appreciate how easy it is to lose trust and how hard it is to win. When we look at organizations that want to start or accelerate their journey of digital sovereignty as part of a transformation narrative, there are a lot of questions and a lot of options to navigate. What would be your advice to the audience about getting started?

>> The first thing is that the solution we're talking about is real. It's available. We're launching it in phases, of course, introducing sovereign controls every quarter. We now have multiple customers going live using it. So it is no longer a theoretical debate or a theoretical discussion. So my advice: go for it. Take advantage of it.

I believe regulators in Europe are going to put more and more focus on this area. They are already asking questions and looking at legislation requiring, of course, mission-critical infrastructure to comply with certain security requirements. That's going to be put into law and will extend beyond mission-critical infrastructures. They're encouraging companies to assess their environments and to move forward and implement these solutions to address the vulnerabilities, if you will.

My advice is: give us a call, and Google or T-Systems would be delighted to work through it with you.

>> So, as a company, perform a risk assessment, often looking at what the regulators have suggested. I think there's a very strong sense that the time to act is now; as you said, this solution is available. And that's because the upside is significant. So if I asked you about that upside, what would excite you most about it?

>> First of all, I'm excited about this engineered solution that addresses European concerns. This is a unique solution. It doesn't compromise in terms of access to the hyperscaler stack and all of the exciting technology that you've just learned about. It's exciting to bring it to market, to showcase it, and to bring customers onto it. I'm excited about onboarding our clients; we're learning as we bring every client on board, and that makes us stronger. It makes us revisit our roadmaps and decide what to prioritize. And I'm excited about our co-innovation lab, which is a big investment in Munich, where we are building our team and bringing them together with customers, who can actually spend weeks working with the teams and moving their applications into the cloud.

>> All right, thank you for highlighting some very important issues for us. Ladies and gentlemen, please join me in thanking Adel Al-Saleh from T-Systems. [APPLAUSE]

>> We are committed to making partnerships like the one we've just discussed available wherever they're needed by local legislation, as part of our commitment to deliver the most secure, most trusted cloud to our customers here in this region. We want to help you to perform in full confidence that your data, your systems, and your users are secure and compliant.

Now, when we think about transformation, it's impossible without people, and we have to ensure our people are able to work together and are empowered to drive the change. Over the past three years, we have collectively experienced nothing less than an upheaval in our workplace. Remote and office work have blended and combined into a hybrid workplace, and accommodating this with the right collaboration and productivity tools has become absolutely paramount.

Let's look at this question:

How do we create the best hybrid

workplace for our people?

Google Workspace was built to

answer that question.

Workspace helps people communicate and collaborate to get things done, regardless of where they work and how they actually want to work.

It is the world's most popular set of productivity tools, with over 3 billion users across 8 million companies, and we absolutely continue to deliver innovation, with more than 300 new features delivered in the past year alone.

Workspace is secure by design, leveraging Google's industry-leading zero-trust architecture. But how does that show up as value for our customers?

First, let's look at Just Eat. They run Google Workspace to keep staff connected and communicating wherever they are. And when the lockdown hit, they were able to launch a campaign within weeks to provide workers in the UK's National Health Service discounted meals for themselves and for their families.

Revolut, a fintech also out of the UK, uses Google Workspace for teams to collaborate across regions and deliver new banking products at speed. And right here in Germany, Zalando uses Google Workspace for communication across its rapidly expanding operations. You can see the momentum very clearly in new, rapidly growing businesses.

According to Forbes, 96% of companies in their 2021 Next Billion-Dollar Startups list are Google Workspace customers. I think it's really important for established, traditional businesses to take note.

In a study of university students, 75% felt Google Workspace offered a more advanced, seamless way for teams to work together. What's more, this preference actually informed their choice of employer: 47% said Google Workspace would make a job offer at a future workplace much more appealing.

So Google Workspace ends up being more than the sum of its parts and capabilities. It's been designed from the ground up to help organizations thrive in a world of hybrid work. More than any other set of tools, Google Workspace helps create the ultimate workplace for today, and I also think for the future.

Now, in our final segment

today, we move from the future

of the workplace, to the future

of our planet.

Many of you here will recall the intensity of the heat this summer: breaking records, triggering droughts, and even threatening food security for many.

There is no doubt that climate

change is upon us.

And today's final question

should be one always top of

mind.

How do we collectively create a

more sustainable future?

Sustainability isn't a nice to

have, it's an imperative.

In good economic times and bad

it has to stand at the top of an

organization's priorities.

Interestingly, we found that you can have your cake and eat it too. What I mean by this is that you can become more operationally efficient and more innovative by becoming more sustainable. The secret is smarter data.

To discuss this further, let's welcome on stage Stephanie Neumann, VP of IT Sourcing and Infrastructure at Lufthansa Group, and the CEO of Google Cloud partner Geotab, Neil. Welcome, Neil and Stephanie. Thank you so much.

So thank you both for joining us. Different companies, but both in the travel and transport sector. Let's begin our discussion by asking: to what extent, in your industry, is sustainability a data problem? Neil, let's start with you.

>> Thank you, Adaire, and good morning to this really impressive audience here today. Let me start by saying that sustainability is at the core of Geotab's purpose. We are global leaders in IoT and connected vehicles, supporting over 40,000 customers, many of them Fortune 500 companies, in over 150 countries. The data insights we gather from 3 million connected vehicles, and the around 100,000 data points we process per second, ensure that we have the data intelligence to support customers on their sustainability journeys. But the real truth about commercial fleets is that a large percentage of commercial vehicles cannot yet be electrified. So with that being the case, what can organizations do to be more sustainable? The answer to that challenge is, as you said, data. First, use the data insights to make non-electric vehicles more efficient, reducing their carbon footprint. You do this by adding sensors to the vehicles, optimizing the routes and the type of vehicle for a certain job, and reducing fuel consumption, idling, and inefficient driver behavior. Second, we're seeing that electrification projects often fail. By applying data insights, you can make the best decisions for electrification: using the data to understand usage patterns, ensuring charging infrastructure is readily available where and when vehicles can be charged, and, using the insights from data, deciding how much and which parts of your fleet can be electrified for maximum benefit. So, to summarize: yes, data intelligence must be at the core of your sustainability transition in our industry if you are going to move quickly enough.

>> All right. And Stephanie, how does IT contribute to sustainability from a data perspective in the aviation industry?

>> Creating sustainability can be a huge challenge, as you can imagine. However, at Lufthansa Group we have set ourselves very ambitious climate goals: we will become a carbon-neutral company by 2050, and on the way towards that target we will halve net emissions by 2030. As the first airline group in Europe, we are proud that our emission reduction targets are officially in line with the Paris Climate Agreement. The ambition for the aviation industry, truly, is to decarbonize, and this challenge is met mainly by three levers: new aircraft, deployment of sustainable aviation fuel, and operational efficiency on top. And operational efficiency, of course, is a data-driven exercise.

So at Lufthansa we have used Google Cloud to create a new cloud-based operational decision support suite. The Lufthansa subsidiary using it pulled in multiple sources of data regarding their planes and applied AI tools on top, so they are now able to optimize operational decisions with just sub-second response times. Let's look at one feature of the platform, about rotating planes. In a short time, that feature alone reduced fuel consumption and CO2 emissions for us.

>> It's very clear to see the impact on sustainability objectives. For organizations transforming, does sustainability go hand in hand with other desired business outcomes, be that cost cutting or reimagining the customer experience? Stephanie, seeing as probably a number of us used your services to arrive in Munich today, let's start with you on this one.

>> We have clearly seen this with the project I just described, the operational decision support suite. Yes, we're able to reduce emissions, and at the same time, within the very same project, the optimization has saved money as well. During the first three months we have already seen savings of 1.5 million euros, just from optimizing 50% of the routes of just one of the companies, meaning Swiss, and we expect the overall savings of the project to be very significant for Lufthansa. Also, in the more or less non-digital world, examples which show the coexistence of cost cutting, sustainability gains, and more customer comfort are easy to find. Look at new aircraft: they save up to 3% of fuel and thereby emissions, they lower costs, and they increase customer comfort at the very same time, while reducing emissions and also noise pollution. Call it what you want: win, win, win.

>> Across multiple variables: win, win, win. I like that. Neil, given your customer base, is the data that drives sustainability also clearly driving other business outcomes?

>> Yes, absolutely; customers are telling us that's exactly what is happening. The largest bus company in Germany has over 5,000 buses connected to the Geotab platform. By harnessing data insights, they make sure that the vehicles and drivers are on time, ready to go, and at the right state of charge; they evaluate and report on consumption; and they give immediate feedback to drivers, feedback that helps correct behavior like heavy braking and provides higher driver satisfaction. The results: a 40% reduction in idling time, 1,400 tonnes of CO2 saved, a substantial reduction in fuel costs as well, and high customer satisfaction. So absolutely, by leveraging data insights, customers have been benefiting from this transformative change.

>> First, thank you both for being here, and for these wonderful cases of sustainability in both organizations going hand in hand with operational cost savings and wonderful customer experiences. Thank you, Stephanie and Neil, for joining today. Thank you so much. [APPLAUSE]

>> That conversation leaves me

truly optimistic about what's to

come.

And it's all part of our

overall goal: To help you make

the sustainable choice, the easy

choice for everyone.

In life, in work, and in the

cloud.

So I would like to close by thanking all our guests for illustrating what true transformation looks like today in the cloud, and what's possible for tomorrow. You are very valued customers. You bring the possible to life.

And you inspire us each and

every day to deliver the

technology, the tools, and the

solutions that drive value

creation in the cloud.

No matter what your

circumstances are, or where you

are on your journey, we will

help you to better understand

your data.

Help you to apply technologies

so you can lead in your

industry.

Help you to ensure that your

systems, your people, your users

are safe and secure.

We'll help you to create the

best workplace for your

employees.

And we will help you to drive

sustainability in your

operations.

So let me thank you once again

for placing your trust in Google

Cloud.

Google is investing for the

future.

And we are here to help whatever

comes next.

Thank you.

[APPLAUSE]

>> Hi, I'm Shaquille O'Neal.

>> When you're trying to build a national chain, communication is so critical.

>> Google Calendar is my girlfriend. I don't know anything I'm doing unless I talk to my woman.

>> Google Workspace: productivity and collaboration tools for all the ways we work.


>> Hey, everybody. Standing backstage at Google Cloud HQ, we're literally seconds away from kicking off the developer keynote.

>> Hello, everyone. Please welcome Google Developers Vice President, Jeanine Banks.

>> Hello, welcome to Next '22. I'm Jeanine Banks.

My team and I love empowering

developers to build innovations

for the future.

That leads me to the theme of

Next this year.

Today, meet tomorrow.

To tell you a little bit more about how we think about tomorrow, my friends and I at Google Cloud will share top technology predictions for where we believe Cloud is headed over the next three years. Each one of us will share one prediction we believe will be true by the end of 2025.

And we'd love to hear what

predictions you all come up with

too.

You can do that by responding to

our original video or creating

your own video on YouTube Shorts

or any other social video

platform.

Just use the hashtag #GoogleCloudPredictions and tell us all about it.

But before we get into that, I

wanted to talk about our

developer community for a

minute.

I get most excited about the

incredible ideas and new

products coming from our Google

developer community as well as

the opportunity we have to help

developers learn, grow, and

build powerful systems and

engaging experiences.

Google's developer community is

inclusive.

One where cloud developers at

every level of expertise are

welcomed while being challenged

at the same time.

And being part of Google's

Developer community creates true

economic impact because Google

Cloud Certified professionals

are some of the highest paid in

the industry.

This same cloud developer

community fosters the creators.

By that I'm talking about The

students, the career switchers,

and anyone else hoping to become

A developer who is encouraged

and

supported to bring their

creations to life.

Just like we saw with our

partner in Brazil, Soul Code

Academy, and one of our newest

Professional Data Engineers,

Patricia.

Take a look.

[APPLAUSE]

>> Don't you just love her

story?

This is why I'm excited to come

To work every single day, and

the potential that we can build

together with all of you.

Speaking of exciting...

We recently announced a new

partnership with The Drone

Racing League.

We've built new, immersive

learning experiences with DRL

that will blow your mind!

You can participate in the

Google Cloud Fly Cup Challenge

where you get hands on with

DRL's race data and Google Cloud

services.

You can predict race outcomes,

give performance tips to DRL

pilots to help them smoke their

competitors, and learn all at

the same time.

And you even get to compete for

a chance to win a trip to the

season finale of the league's

World Championship.

You can get started with the

challenge right now.

We look forward to seeing what

you'll build next as we fly into

the future of cloud together.

Do you see what I just did

there?

Fly Drones.

[Laughter]

Okay!

Are you ready?

Let's go!

[APPLAUSE]

>> I'll go first. My prediction is: by the end of 2025, developers who start with "neuroinclusive design" will see a 5x growth in user adoption in their first two years in production.

According to the National

Institutes of Health, up to 20%

of the world's population is

neurodistinct with the other 80%

being neurotypical.

These two groups make up what is

called neurodiversity.

And neurodiversity describes the

ways people experience,

interpret, and process the world

around them, whether in school,

at work, or through social

relationships.

And here at Google, we believe

the world and our workplace

needs all types of thinkers.

What do you think about that, Jim?

>> That's right. One in five of us here is neurodistinct. What is neuroinclusive design? It is design for cognitive and sensory accessibility.

One of the things that makes

participation in meetings

accessible to me is the raise

hand function.

It allows me to pause, to

provide an opportunity to share

my thoughts.

Then we realize that this made

meetings better for everyone.

It created more structure for

everybody, which enabled visibility and a broader range of ideas.

So it starts with raising

accessibility for neurodistinct

people like myself, but everyone

benefits.

Good design is already neuroinclusive when implemented properly. As an example, when developing interactive and visual features, we need to consider how noise, vibrations, or pop-ups show up in the design, because they create sensory stimulation that leads to distraction.

Design principles can be made neuroinclusive when you plan thoughtfully for balance, proportion, unity, light, color, space, and patterns.

And here are some tips for

developers: No. 1 design

with simplicity and clarity.

No. 2, remove distractions or

extra visualizations like pop up

windows.

No. 3 avoid really bright

colors or too much of a single

color.

Four, stick with a predictable and intuitive user flow. Five, be thoughtful about the vibe you are setting: do you need music or sounds to set the tone, or is that just an extra element that can create distraction? And finally, six: stay away from pressure points requiring a quick reaction from the users; this can add unnecessary pressure. Back to you, Jeanine.

>> Thanks Jim.

This upfront design

thoughtfulness is just so

important.

How often have any of you been

under pressure to ship

something, but you knew you

could have improved the user

experience with a little more

time?

At Google, we feel ideation, user research, and testing in market have helped teams launch more inclusive products faster.

While these are standard

practices in software

development, when we made a

conscious decision to have all

types of thinkers included

across these phases, our eyes

opened to so many new ways to be

more inclusive.

For example, closed captioning

in Google Meet helps all of us

process information better

visually.

It also helps people

participating in meetings where

a different language is used.

When you build simple, clear

experiences with fewer

distractions, your products will

inherently drive greater user

adoption.

In fact, what did I say? I predict 5x more user adoption in your first two years in production.

This is because developers like

us will have built belonging

directly into our products.

So, that's my prediction.

Now I'm going to pass the mic to

some of my friends here at

Google Cloud to share their

predictions.

Thanks!

[APPLAUSE]

>> My name is Eric Brewer, and my prediction is: by the end of 2025, 4 out of 5 enterprise developers will use some form of "curated open source." Now, you are probably wondering: what is curated open source?

Curated open source is just open

source as you know it with a

layer of accountability.

What I mean by that is curated

open source comes with support

for developers.

The "curator in this case, will

focus on not just finding

vulnerabilities, but helping to

fix them too.

They'll update old dependencies

and track new ones.

With curated open source, the curators will build in automation for testing and may even offer response-based SLAs.

Here's why this is so important

For the community.

Open source is everywhere. It

helps power our electrical

grids, water supplies and oil

pipelines.

It's fundamental to all clouds

and most nations, and even

widely used in proprietary

software.

Open source is public

infrastructure and it's an

essential part of our everyday

life.

So now what?

Open source is here to stay, but

everything it powers is

vulnerable.

The incidents are real and

costly.

This is why governments are stepping in with regulations like FedRAMP and executive orders to combat cybersecurity threats.

Regulations like these are super

important and show us just how

deeply security vulnerabilities

can impact our lives.

But in fact these regulations

are exactly why we need curated

open source.

Curated open source enables you to depend on open source beyond the "as is" approach we use today.

We believe in this philosophy so

much that we are already working

on it.

To help you build secure apps

faster, we're releasing Software

Delivery Shield.

This is a fully managed security

solution that protects your

software supply chain from

source to deployment.

And as part of SDS, we have our

initial curated open source

example called Assured OSS. This

service curates OSS packages

used by Google and makes them

available to you, our cloud

developers.

Google will scan, analyze, and fuzz-test more than 250 Java and Python packages for security vulnerabilities on your behalf, and update them as needed.

Still don't believe me that

developers will use some form of

curation?

How about this?

Let's show you what we're doing

at Google to make this a

reality.

>> Hi, Aja.

>> Hi, Eric. Let's see what the development life cycle looks like using Software Delivery Shield to enforce responsible use of open source through policy.

We'll start with Cloud

Workstations.

A complete development

environment in the cloud.

Cloud Workstations is highly

customizable.

The version I'm showing today

has all the tools and compilers

and everything I need including

the brand new source protect

extension.

Here, Source Protect has flagged

a dependency with a known

vulnerability.

I can fix this right now from my

workstation before everything is

checked in.

Cloud Workstations detects changes I make automatically and redeploys the app on my workstation as needed. This shortens my dev loop and makes me more productive.

When I'm happy with my changes and push them, Cloud Build runs continuous integration on our code base.

Here you can see the Cloud Build

report.

You can see that Cloud Build

provides SLSA Level 3 compliant

build provenance.

Cloud Build also scans for vulnerabilities. Here we can see a list of vulnerabilities, including details on many of them, and in some cases even a fix we need to make to address them.

And it happens that the image Eric and I are working with today has several external open source dependencies, for example spring-boot-starter. In general, this could pose a significant risk if we relied on something pulled from the internet.

Fortunately, those dependencies

are in the Assured OSS portfolio

so I can use a version of that

dependency that's been vetted by

Google.

Which is what you have been

telling us about.

>> This is exactly the point. You don't have to worry about the dependencies, because they are vetted for you.

>> So now that I know I don't have to worry so much about the dependencies, it's time to push the code to prod.

So let's look at

my delivery pipeline and push it

to GKE.

When I'm happy with my code, we

can deploy to GKE.

So, while that's deploying,

let's look at one last thing.

Here is the security posture page. Here I see security concerns at the cluster and workload level, and I can dig into any concerns if I need to.

Including seeing recommended

action to take to mitigate any

issues identified.

So that's Software Delivery Shield from dev to prod, from when you start writing code to when it's released into production.

>> Thank you.

>> [APPLAUSE].

>> We want to make sure there is an added layer of accountability to better support you and the apps you build. This is why we believe that 4 out of 5 enterprise developers will use some form of curated open source. Thank you.

>> First off it is so nice to

see everyone in the room give it

up for being back in person,

right?

[Cheering] [APPLAUSE]

>> Hello everyone, my name is Iman Ghanizada, and my prediction is: by the end of 2025, 90% of security operations workflows will be automated and managed as code. SecOps teams are struggling to keep up with attackers; we all know this.

Too much data, complex technology environments, and more adversaries now than ever. I mean, every day on the news we hear about a new 18-year-old that has breached a company.

[Laughter] Let's pair this with the detection and response workflow, which is notoriously centered around toil and requires a linear growth in people to keep up with the volume of threats. We all know we can just hire more people, right? Every CISO has a billion dollars in their bank account, just waiting to hire 700,000 people.

[Laughter]

This inefficient workflow has basically created the cybersecurity talent shortage. There are over 700,000 unfilled cyber jobs. And these jobs are high-stress and overloaded with toil-based work, and a lot of folks are frankly burnt out.

There is no way this issue will

get solved if we just keep doing

things the way we do them today.

So to scale security across

Cloud, we're going to make

security more agile and

accessible to everyone through

code.

Here's how.

So, our Autonomic Security Operations framework is designed to help you take advantage of our API-first approach with Chronicle Security Operations and its other tools, including tools within our new Mandiant portfolio.

The shift in our new framework

takes traditional assembly line

security operations workflows

into a codified, continuous

feedback model we call

Continuous Detection, Continuous

Response.

Or CDCR for short.

It's like CI/CD for threat

management.

We've seen customers like BBVA

and NCR, as well as our MSSP

partners, like CYDERES, use our

tools to build continuous

detection and response workflows

that scale across billions of

alerts.

So, earlier this year, we partnered with MITRE Engenuity, CYDERES, and others to launch Community Security Analytics. CSA is an open-source repo we created to "foster community collaboration" on security analytics for cloud workloads.

These analytics can be deployed

as code to complement the native

detection capabilities in

Chronicle and other Google Cloud

tools.

So it's kind of like having a team of devs collaborating on detection rules.

I'll show you how to deploy

these rules.

First, let's use an example.

Let's just say a user outside of

your

DevOps team gets access to

impersonate a highly privileged

prod service account - probably

wouldn't be good, right?

Well, rule 2.20 analyzes your

admin activity logs for

permission grants on service

accounts.

For the next part, we went ahead

and prerecorded this, and wanted

to save time, and also didn't

want you to watch me fumble over

my keys.

First, we're starting in the terminal, and we have already cloned a private repo. Now we're going to open up the YARA-L rule. We can add parameters to fine-tune the rule to our needs. And in case you haven't noticed, Chronicle's YARA-L syntax is very simple compared to other detection languages.

Now let's commit and push our changes. We already have a GitHub Action to auto-deploy these rules into our Chronicle instance.

By the way, Chronicle can be deployed as code, and it can scale across petabytes of data without needing infrastructure or human involvement. It's as cloud native as they come!
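The demo itself uses a GitHub Action and Chronicle's own APIs; as a generic, hypothetical sketch of the same detection-as-code idea, the script below reads YARA-L rule files from a repo and pushes them to a detection API endpoint. The endpoint URL, token, and payload field name are placeholder assumptions, not Chronicle's documented contract.

```python
# Minimal sketch (hypothetical endpoint and payload): push YARA-L rules from a repo
# to a detection API as part of a CI job. Consult your Chronicle instance's API
# documentation for the real contract.
import os
import pathlib
import requests

API_URL = os.environ["DETECTION_API_URL"]      # hypothetical, supplied by the CI job
API_TOKEN = os.environ["DETECTION_API_TOKEN"]  # hypothetical bearer token

def deploy_rule(rule_text: str) -> None:
    # POST one rule; raise if the API rejects it so the CI run fails visibly.
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"rule_text": rule_text},  # assumed field name
        timeout=30,
    )
    resp.raise_for_status()

for path in sorted(pathlib.Path("rules").glob("*.yaral")):
    deploy_rule(path.read_text())
    print(f"deployed {path.name}")
```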

Here we're going to pivot into the Chronicle instance. Refresh and... let's see... Voila! Boom. Here's our new rule, and from this point on it will start alerting when this malicious activity is identified.

We can also run a "retro hunt

which essentially runs this rule

against all our historical data

in Chronicle to see if there's

an alert we missed.

We've done retro hunts with

customers in seconds or minutes,

that have taken them hours or

days with their existing tools -

It

it's uber fast.

We can also use Chronicle SOAR to create an automation playbook that figures out how to respond to the alert. And there are APIs to automate the entire thing, from ingest to analytics to response.

So the moral of the story: none of us want to end up on the news.

In order to make this 90%

prediction a reality, security

analysts are going to have to

work a lot more like devs, so

they can free up time to focus

on the most important threats to

their organizations.

So you'll need to implement

modern, developer-friendly

workflows like CD/CR across your

detection and response practice.

What I've shown you is how we're

working to make it possible for

you to do so.

Thank you.

[APPLAUSE]

>> Hi, my name is Kamelia

Aryafar, and my prediction is: by

the end of 2025, AI is going to

be the primary driver for moving

to a four-day workweek.

[APPLAUSE]

So what does this mean to you

and me?

A three day weekend!

What it actually means is being

able to comfortably complete

five days worth of work in four

days, or even less, with

efficiencies gained through AI.

Enterprise use of AI alone

exploded over the past few

years, touching all aspects of

business.

One of the greatest reasons

behind this is AI's huge

potential to increase employee

productivity.

You've told us how excited you

are to work with Google Cloud

because we make all of the AI

research, AI models, and ML

toolkits from Google, available

to you as enterprise-grade

products and solutions, like

Vertex AI.

Since its launch, Vertex AI has

helped data scientists ship ML

models faster into production,

by automating routine tasks like

model management, monitoring,

and versioning.

With Vertex AI, data

scientists can now build and

train ML models 5x faster,

meaning increased time for

experimentation, reduced custom

coding, and the ability to move

more ML models into production.

Today, with the announcement of

Vertex AI Vision, we are taking

this a step further and

providing you with a fully

managed, development environment

for creating computer vision

applications.

In the general keynote, with a

smart city use case, my

colleague June Yang discussed

how you can use Vertex AI Vision

to reduce the time required to

build and deploy computer vision

applications from weeks to

hours.

Now, let's dive into three areas

that I, as a developer, am most

excited about when it comes to

Vertex AI Vision: the ability to

use your own custom models,

integration with BigQuery, and

developing external applications

with SDKs. Let's see how.

First, in the smart city example

we used a prebuilt occupancy

analytics model to detect and

count vehicles.

Now, if I want to do the same

thing for bicycles, then I can

use a custom bicycle detection

model in Vertex AI and easily

import it into my computer

vision application.

Basically, if the model works in

Vertex AI, then it will also

work in Vertex AI Vision.
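
The import flow itself is driven from the Vertex AI Vision console, but registering the custom model in Vertex AI might look roughly like this with the google-cloud-aiplatform SDK; the project, bucket path, display name, and serving container below are placeholder assumptions, not the exact setup from the keynote demo.

```python
# Hypothetical sketch: register a custom bicycle-detection model in Vertex AI.
# Project, region, bucket path, and serving image are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

model = aiplatform.Model.upload(
    display_name="bicycle-detector",
    # Directory holding the exported model artifacts (e.g., a TF SavedModel).
    artifact_uri="gs://my-bucket/models/bicycle-detector/",
    # A prebuilt serving container matching the model's framework.
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-8:latest"
    ),
)
print("Registered model:", model.resource_name)
# From here, the registered model can be selected as a custom model node
# when assembling a Vertex AI Vision application.
```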

Second, I want to use the power

of BigQuery to combine video

annotations with other

information in my data

warehouse.

By the way, I can also store

annotations in the included

Vision Warehouse feature to

easily search for insights

across all of my videos.


By using Vertex AI Vision

together with BigQuery, I can

correlate traffic patterns with

weather patterns, or even make a

forecast with BigQuery ML to

predict future traffic patterns.
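
A rough sketch of what that BigQuery ML forecast could look like, issued through the Python client; the dataset, table, and column names are placeholder assumptions standing in for the vehicle-count annotations exported from Vertex AI Vision.

```python
# Hypothetical sketch: train a time-series model on vehicle counts stored in
# BigQuery, then forecast the next 24 hours. Dataset, table, and column
# names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

client.query(
    """
    CREATE OR REPLACE MODEL `traffic.vehicle_count_forecast`
    OPTIONS (
      model_type = 'ARIMA_PLUS',
      time_series_timestamp_col = 'event_time',
      time_series_data_col = 'vehicle_count'
    ) AS
    SELECT event_time, vehicle_count
    FROM `traffic.vehicle_counts`
    """
).result()  # Block until training completes.

forecast = client.query(
    "SELECT forecast_timestamp, forecast_value "
    "FROM ML.FORECAST(MODEL `traffic.vehicle_count_forecast`, "
    "STRUCT(24 AS horizon))"
).result()

for row in forecast:
    print(row.forecast_timestamp, row.forecast_value)
```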

Finally, I can use the SDK to

access the processed video data

and annotations, and hook into a

live stream of vehicle counts

to power other

applications or dashboards.

It's that simple, and it can be

applied to one video stream or

even hundreds of video streams.

This level of flexibility and

scalability is unique to Google

Cloud, and that's how we help

you reduce the development time

for computer vision applications

from weeks to hours.

Not just with Vertex AI Vision,

but all of Google Cloud's AI

products are built to help you

be more productive and delight

your customers.

For example, using Contact

Center AI, call center teams can

manage up to 28% more

conversations concurrently.

That means a lot more

productivity.

With Translation Hub,

localization teams can translate

documents into 135 languages in

a matter of seconds, which means

time saved for other efforts and

a more inclusive workplace.

Similarly, with Google Cloud's

Recommendations AI,

merchandising and ecommerce

teams can now drive 40% more

customer conversions.

That means a lot more happy

customers and a happy sales team

too!

When you put together all of

these productivity gains,

powered by AI, across the

organization, a 4-day workweek is

a very distinct possibility!

Thank you so much! [APPLAUSE]

>> Hi, I'm Irina Farooq, and my

prediction is that by the end of

2025, 90% of data will be

actionable in realtime using ML.

I'm sure many of you are pretty

skeptical of this prediction.

And that's understandable.

A recent survey uncovered that

only one-third of all companies

are able to realize tangible

value from their data.

And we continue to hear that

many of you are trying to fix

that by taking on the

operational burden of managing

data infrastructure, moving data

around, duplicating data, to

make it available to the right

users in the right tools.

So then, how do we begin to

overcome these barriers before

data can be actionable in

realtime using ML?

Since our inception Google has

been focused on delivering

highly personalized information

that is highly trusted by

billions of people around the

world.

Data is in our DNA. The same

data infrastructure that's

allowed us to innovate is

available to you.

That is why we believe we can

make this prediction a reality.

Take, for example, our customer,

Vodafone.

As one of the world's largest

telecommunication companies,

Vodafone unified all their data

so that thousands of their

employees can innovate across

the 700 different use cases and

5,000 different data feeds.

They now run AI development 80%

faster and more cost-effectively,

all without compromising

governance and reliability.

So, how can we help you achieve

your own data infrastructure

vision?

The short answer is in three

parts.

First, you can't act on data

unless you can SEE it and TRUST

it.

Today, we are announcing

automatic cataloging of all your

GCP data with business context

in DataPlex and you can

integrate 3rd party sources too.

This means you no longer need to

spend days looking for the right

data and instead can spend time

working with it.

But once you find your data, how

do you know you can trust it?

Have you ever been in a meeting

where someone questions the

validity of a data point, and

then nobody can trust anything

that's being presented from that

point onward?

That's why I'm excited about the

new Data Quality and Data

Lineage capabilities in

Dataplex, bringing intelligence

and automation to help you trust

your data.

Second, you can't act on data

unless you can work with it.

To innovate, you've got to be able

to use the

best tools for the job across

all your data.

Speaking of the best tools,

I'm excited about

BigQuery's new support for

unstructured data.

Now, you can be sure that your

BigQuery skills will pay off

across all your data, from

structured, to semi-structured,

to unstructured.

It is also important to be able

to use the best of open source

tools.

Last year, we introduced our

serverless Spark offering, and

today, we are announcing that

you can run Spark directly

from BigQuery, with a fully

integrated experience and

billing.

But this is just the beginning.

We have a bold vision for our

Spark offering: to leverage

Google infrastructure magic

without forking the open source.

Take, for example, Mindmeld,

the shuffle service powering

BigQuery and Dataflow that helps

deliver the scale, reliability, and

performance that you know and

love in those services.

That's coming to

Spark jobs soon!

Lastly, you can't act on today's

data tomorrow. We've heard many

of you struggle with making

realtime, in-context experiences

a reality for your own

customers.

Dataflow, our streaming

analytics service, powers

critical Google services and we

believe it can do the same for

you.

With Dataflow, you can use

Apache Beam to build unified

batch and realtime pipelines.

You can start small, while

having the assurance that you

can process realtime events at

extreme scale if your

application needs it.
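
As a rough illustration of that unified model, here is a minimal Beam sketch in Python; the Pub/Sub topic, BigQuery table, and field names are placeholder assumptions, and swapping the Pub/Sub source for a bounded source such as a file read turns the same transforms into a batch pipeline.

```python
# Hypothetical sketch: one Apache Beam pipeline whose transforms work for
# batch or streaming, depending on the source you plug in. The project,
# topic, and table names are placeholders, and the BigQuery table is
# assumed to already exist.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

options = PipelineOptions(streaming=True)  # add DataflowRunner flags to scale out

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Window" >> beam.WindowInto(FixedWindows(60))  # one-minute windows
        | "KeyByType" >> beam.Map(lambda event: (event["event_type"], 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.Map(lambda kv: {"event_type": kv[0], "count": kv[1]})
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.event_counts",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```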

To summarize, when you can see

the data, trust the data, and

work with data as it's

collected, we can see how 90% of

data will become actionable in

realtime using ML, and the

incredible innovation that can

unleash.

Thank you very much.

[APPLAUSE] ♪

>> Hi, I'm Andi Gutmans and I

predict

that, by the end of 2025, the

barriers between transactional

and analytical workloads will

disappear.

Traditionally, data

architectures have separated

these mixed workloads, and for

good reason.

Fundamentally, their underlying

databases are built differently.

Transactional databases are

optimized for fast reads and

writes, while analytical

databases are optimized for

aggregating large data sets.

Because these systems are

largely decoupled, many of you

are struggling to piece together

different solutions to build

intelligent, data-driven apps.

For instance, to provide

personalized recommendations for

e-commerce, apps need to support

both transactional and

analytical workloads on the same

data set and without negatively

impacting performance.

At Google Cloud, we are uniquely

positioned to solve this problem

because of how we've architected

our data platform.

Our transactional and analytical

databases are built on highly

scalable, disaggregated compute

and storage systems and Google's

high performance global network,

allowing us to provide tightly

integrated data services.

And to help you unify your data

across your apps, today, I'm

excited to tell you more about

new capabilities we recently

announced:

First, Datastream for BigQuery

which allows you to easily

replicate data from

transactional databases into

BigQuery in realtime.

Next, Database Migration Service

which provides one-click migration

from Postgres into AlloyDB for

operational analytics.

And lastly, we support query

federation with Spanner, Cloud

SQL, and Bigtable, right from

the BigQuery console, to analyze

data in transactional databases.
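
To make that concrete, here is a minimal sketch of a federated query issued through the BigQuery Python client; the connection ID, dataset, and table names are placeholder assumptions, and the Cloud SQL connection would need to be created in BigQuery beforehand.

```python
# Hypothetical sketch: join live transactional data, reached through a
# Cloud SQL federated connection, with warehouse data in BigQuery. The
# connection ID, dataset, and table names are placeholders, and the
# connection must already exist in BigQuery.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT w.customer_id, w.lifetime_value, t.open_orders
FROM `analytics.customer_ltv` AS w
JOIN EXTERNAL_QUERY(
  'my-project.us.cloudsql-orders',  -- federated connection ID
  "SELECT customer_id, COUNT(*) AS open_orders
     FROM orders WHERE status = 'OPEN' GROUP BY customer_id"
) AS t
USING (customer_id)
"""

for row in client.query(sql).result():
    print(row.customer_id, row.lifetime_value, row.open_orders)
```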

But don't take my word for it,

let's see how some of these

technologies remove barriers for

our fictitious company, Cymbal

Bank.

Cymbal wanted to integrate their

core banking features in their

app with market data to provide

personalized, real time

investment dashboards.

The problem was their app's

back-end was optimized for

transactional workloads.

So how do they maintain the

responsiveness of their app

while adding analytical

goodness?

Cymbal Bank chose Google Cloud's

new fully managed,

Postgres-compatible database,

AlloyDB, offering the capability

to analyze transactional data in

real time.

Migrating all their existing

data with minimal downtime felt

like a big lift for the Cymbal

engineers, but it turns out that

Database Migration Service makes

this simple.

I'll walk you through just how

easy it is.

Database Migration Service lets

you migrate from Postgres to

AlloyDB with continuous

replication, minimizing

downtime.

Once we define where we're

migrating from, and what we're

moving our data into, we can see

the prerequisites for the

migration directly in our UI.

Sources are defined using

profiles which contain host,

username, and password.

You can predefine them like I've

done here for Cymbal's Postgres

instance.

Here we define our destination

and some basic configuration

options, and then we get to hit

create.

This part will take a few

minutes, so I've sped up time a

little bit.

Once it finishes, a quick test

to ensure that it will all work,

and then hit Create and Start.

Once the initial dump is

finished, we're now in a state

where we have both the old and

new databases populated with our

live data continuously

replicating from old to new.

That means we can do cool stuff

like testing the performance of

our new investment features

against both the existing

production Postgres database,

and the new AlloyDB database

side by side.

Since AlloyDB is fully

compatible with Postgres, you

don't have to make any

application changes.
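
A small, hypothetical illustration of that point, assuming a Python app on psycopg2: only the connection target changes, while the queries, driver, and ORM stay the same (hosts, database names, and credentials below are placeholders).

```python
# Hypothetical illustration: the application code is identical for the old
# Postgres instance and the new AlloyDB instance; only the connection
# target changes. Hosts, database names, and credentials are placeholders.
import psycopg2

# Before: the existing Postgres backend.
# dsn = "host=10.0.0.5 dbname=cymbal user=app password=secret"

# After: the AlloyDB primary instance's private IP.
dsn = "host=10.0.1.7 dbname=cymbal user=app password=secret"

with psycopg2.connect(dsn) as conn:
    with conn.cursor() as cur:
        # Same queries, same driver, same ORM -- no application changes.
        cur.execute(
            "SELECT account_id, SUM(amount) FROM transactions "
            "WHERE booked_at >= now() - interval '1 day' GROUP BY account_id"
        )
        for account_id, total in cur.fetchall():
            print(account_id, total)
```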

As you can see, we're getting

much better performance out of

the new AlloyDB back-end.

AlloyDB is 4 times faster for

transactional workloads and up

to 100 times faster for

analytical queries compared to

standard Postgres, making it the

perfect database for these kinds

of hybrid workloads.

We all want to act on data in

real time, without the toil of

infrastructure assembly and

operations.

We've given you a taste of how

Google Cloud makes it easier for

you to build data-driven apps on

a unified platform.

And this is why I predict that

by the end of 2025, the barriers

between transactional and

analytical workloads will

disappear.

Thanks!

[APPLAUSE]

>> My name is Amin Vahdat, and my

prediction is: by the end of

2025, over half of cloud

infrastructure decisions will be

automated based on an

organization's usage patterns

to meet performance and

reliability needs.

At Google, we believe the work

we do today with our partners

will define the next generation

of infrastructure for the world.

While some people look at

infrastructure as a commodity,

we see it as a source of

inspiration.

This inspiration comes from

delivering capabilities not

available anywhere else and

pulling in the future by

operating at a level of

reliability and scale that might

otherwise seem unimaginable.

Our infrastructure is designed

with the "scale-out" capability

needed to support billions of

users who use services like

Search, YouTube, Gmail, and our

Cloud services each and

every day.

We pioneered the model of entire

buildings, operating as a single

computing and storage system.

And with Spanner, we showed how

services could run reliably, at

scale, across the planet.

And we've delivered network

innovations, like Google Global

Cache, B4, and Jupiter,

shortening distances across the

planet.

This gave us the opportunity to

reimagine what was possible from

infrastructure in terms of scale

and capability.

Look at the world around us.

The time for disruptive

innovation has never been more

profound.

We're seeing incredible demand

on the industry's cloud

infrastructure, yet simultaneous

plateaus in efficiency.

You and your companies continue

to push the boundaries of what

infrastructure can provide, yet

the burden of picking the just

right combination of components

continues to fall on you.

To address this, we've

engineered golden paths from

silicon to the console.

These paths combine

purpose-built infrastructure,

prescriptive architectures, and

an ecosystem to deliver workload

optimized, ultra reliable

infrastructure.

So let's talk about the

investments we're making in

infrastructure at Google in

power and performance to make

all of this possible.

We partnered with Intel to

codesign and build custom

silicon like this little thing.

This is called an Infrastructure

Processing Unit and gives you

"massive performance and

scalability" to power high

performance, data intensive

apps.

These IPUs are at the heart of

our new C3 VMs.

C3s include the latest

generation Intel Sapphire Rapids

processor and custom designed

offload based on the IPU that

delivers 200Gbps, low latency

networking.

And coupled with our new block

storage, Hyperdisk, they can

provide incredible storage

performance.

Now let me show you something

else.

From the small to the big,

meet the hardware behind the new

Tensor Processing Unit, TPU v4

platform, likely the world's

fastest, largest, and most

efficient machine learning

supercomputer.

This liquid cooled board is a

beast in both power and

performance density.

You see the pipes running across

it, running chilled water over

the board, over four

of the TPUs.

It allows secure, isolated

access and is at the cutting

edge of services like natural

language understanding,

recommender systems, and image

processing.

The TPU makes large scale

training workloads up to 80%

faster and up to 50% cheaper

compared to alternatives.

When you talk about nearly

doubling performance for half

the cost, you unlock your

imagination in terms of what

just might be possible.

The same IPUs and TPUs that

power your services are the

foundation that will enable us

to automate over half of cloud

infrastructure decisions in the

next couple of years.

They will support the telemetry

data and ML based analytics to

proactively recommend the best

infrastructure.

It will be based on an

understanding of how

infrastructure balance points

correspond to performance and

reliability for your individual

workloads.

We don't think that you should

have to think about hardware

Specifications.

That is last

generation cloud thinking.

You will specify a workload and

we'll quickly recommend,

configure, and place the best

option for you based on your

price, performance, and scale

needs.

We know that these automated,

adaptive decisions deliver lower

cost, more performance, and

higher reliability than any

handcrafted solution.

So is my prediction that over

half of cloud infrastructure

decisions will be automated

based on an organization's usage

patterns correct?

Honestly, I think it's going to

be much higher than that.

It has to be in order to keep up

with all of the advancements

you're making in technology.

The burden and complexity of

infrastructure decision making

you have today will disappear

through the power of AI and ML

automation.

And when you have freedom to

focus on your solution delivery,

the rate of innovation and

customer benefits will

only accelerate.

While Cloud has been

transformative, we are still at

the early stages.

We're excited to continue to

Make the unimaginable possible,

and the possible easy, thank

you.

[APPLAUSE]

Hi, my name is Steren Giannini,

and

my prediction is that, By the

end of 2025, three out of four

developers will lead with

sustainability as their primary

development principle.

For the longest time, the focus

was elsewhere: We needed to

build it fast, build it

securely, build it at the lowest

cost, build it as simply as

possible, and build it reliably.

Now it's also time to build it

sustainably.

We can't ignore the urgency

required from all of us to meet

climate targets.

And while organizations are

moving in the right direction,

they struggle to take action.

65% of IT executives said they

want to improve their

sustainability efforts, but

don't know how to do it.

36% of them said they didn't

even have measurement tools in

place to track sustainability

progress.

So how can we help them out?

We can give them better data

about the environmental

footprint of their business.

So today I'm excited to announce

that Google Cloud Carbon

Footprint, which helps you

measure, report, and reduce your

cloud carbon emissions, is now

Generally Available.

[APPLAUSE]

Let's take a look.

Right from the Cloud Console,

you can access the carbon

footprint

dashboard of your account.

The underlying methodology is

quite unique and is based on

actual measurements of the

energy used by machines in our

data centers.

It also is the complete picture

of your emissions:

not just emissions from

electricity production, but also

on-site

fossil fuel emissions and other

embodied emissions from data

center hardware.

This is also known as Scopes 1,

2, and 3.

And of course, you can break

down this data and take concrete

actions.

You can explore emissions by

Google Cloud project, service,

and region.

With a simple click, you can

export your cloud emissions data

to BigQuery for further analysis

and provide sustainability teams

with the data they need to

report on your company's

emissions.
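
As a sketch of that further analysis, the exported table can be queried like any other BigQuery table; the dataset, table, and column names below only approximate the export's shape, so verify them against the schema of your own export before running anything like this.

```python
# Hypothetical sketch: summarize exported Carbon Footprint data by month
# and service. The dataset, table, and column names approximate the
# export's shape; check the actual export schema before relying on them.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  usage_month,
  service.description AS service,
  SUM(carbon_footprint_total_kgCO2e.location_based) AS kg_co2e
FROM `my-project.carbon_export.carbon_footprint`
GROUP BY usage_month, service
ORDER BY usage_month, kg_co2e DESC
"""

for row in client.query(sql).result():
    print(row.usage_month, row.service, round(row.kg_co2e, 2))
```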

We also want to help you build

new applications that emit less

carbon by making the right

choice at the right time.

Let's say you want to deploy

a new application to the US West

Coast.

From a latency perspective, all

these options are very similar.

Without more info, you might

have picked Las Vegas, which

happens to have a relatively

carbon intense electricity grid.

But up in Oregon you'd find a

very clean grid, full of

hydropower and therefore low

carbon intensity.

That is why it is indicated as a

low carbon region.

In fact, a simple choice of

Oregon over Las Vegas can reduce

the gross electricity emissions

of running that app by about

80%.

And with this move, you not only

saved carbon, but you also saved

money since a Compute Engine VM

is cheaper in Oregon.

A move that can help you save

money and carbon is easy to

make.

When we tested this feature, we

noticed new users were 50% more

likely to choose a low carbon

region when they saw the icon.

And that can make a big

difference.

So how do we make it easy to

identify more opportunities to

lower your carbon emissions, and

deliver those options at scale?

Well, Active Assist now shares

recommendations to remove idle

resources and their associated

emissions.

Actually, all of the

sustainability features I just

showed you are embedded into

Google Cloud "console and

documentation." They are

available out of the box at no

charge for all developers.

Sustainability is too important

to be complicated.

Before I go, here are two things

to remember.

One: moving to

Google Cloud gives you the

efficiency gains and energy

benefits to reduce your

emissions.

Two: When you build on the

cloud,

pick the region with the lowest

carbon impact for your

application.

Because these tools are

available, I believe that by

2025, 3 out of 4 developers will

lead with sustainability as

their primary development

principle.

Thank you!

[APPLAUSE]

>> Hey, everybody. I don't

know why I did jazz hands, but

we'll play with that weird energy.

[APPLAUSE]

My name is Richard Seroter, and

my prediction is: by the end of

2025, over half of all

organizations using public cloud

will freely switch their primary

cloud as a result of

the multicloud capabilities

available.

Quick story.

I recently moved from the

Seattle area to San Diego; after

a vacation this year with my

family, we realized we wanted

something different.

I bought a house in San Diego

before I sold my home in

Washington.

For a while there, I was

multihouse!

That was not good.

That wasn't my desired end

state, but sometimes you're in

these "multi situations while

you transfer from one stage to

another.

And that's related to my

prediction here.

I think in the years ahead,

we'll see companies use a multi

cloud strategy not just as a way

to hedge their bets, but as a

way to switch from their first

cloud to their next one.

Research data shows that a

majority of companies are

already multi-cloud, meaning

they use more than one hyper

scale cloud.

Sound pretty familiar to most of

you?

I'm personally talking to more

companies that are using

multi-cloud technologies as a

way to not just switch their

workloads, but their mind share,

to a different cloud.

Let's see what this journey

might look like.

Let's look at three steps.

We'll do live demos.

What can go wrong?

So first part of the journey

How can we first meet you where

you are today?

You probably use another cloud.

That's cool.

Hey, nobody's perfect.

Here at Google Cloud, we've made

unique investments in a

multi-cloud management plane

that works with your compute and

Data even on other clouds.

In this first step, you're

starting to use Google Cloud,

but want to incorporate existing

investments in other clouds.

Consistency matters a lot here.

Anthos plays a big part.

Let's see how.

First off, as you can see here,

I have attached

my EKS cluster to an existing

management plane with Anthos.

I can view workloads, deploy

services, and apply common

security policies.

And with our new partnership

using Crossplane, I can create

Anthos-managed GKE, AKS, and EKS

clusters the same way

from Google Cloud, which is

pretty cool.

What do you do with all your

distributed data?

How do I analyze all this?

We all know that data transfer

costs exist, so consolidation

isn't always the right option.

And for other financial,

strategic and policy reasons,

your organization might need its

data residing in multiple clouds.

That's okay.

With BigQuery Omni, I can query

a data lake in Amazon S3 or

Azure storage accounts without

moving the data into BigQuery

to run queries against it.

I can actually analyze it where

it resides without moving the

data.
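
As a minimal sketch, a query against an Omni-backed external table is just standard BigQuery SQL issued from the usual client; the table, columns, and the aws-us-east-1 location below are placeholder assumptions.

```python
# Hypothetical sketch: analyze data that stays in Amazon S3 by querying an
# external table defined through BigQuery Omni. Table, columns, and the
# Omni location are placeholder assumptions; the data itself never moves.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT order_region, COUNT(*) AS orders, SUM(order_total) AS revenue
FROM `my-project.aws_dataset.orders_s3`
GROUP BY order_region
ORDER BY revenue DESC
"""

# Omni queries run in the AWS region that hosts the dataset.
for row in client.query(sql, location="aws-us-east-1").result():
    print(row.order_region, row.orders, row.revenue)
```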

This is one of many Google Cloud

services that work wherever you

are.

This is your starter phase.

You're building skills and

comfort in your new home while

not moving anything.

Some of you might think this is

where you stop with multi-cloud.

Just use a bunch of clouds!

I don't think so, many are going

a step further.

In the second stage, you start

upgrading tech where it is and

growing

your adoption of the secondary

cloud.

Step 2: as I start using Google

Cloud

services like GKE more often,

now I may introduce Anthos

clusters on Azure and AWS, so

the GKE software is running across

clouds, which is amazing.

And just like that, I can shift

capabilities in place

onto another cloud.

I don't have to jump all over

the place to run all my

infrastructure.

Here you might also start

creating a multicloud mesh, and

you'll want to maintain services

that run across clusters.

In this mesh, I've got my web

app with services spread across

GKE and EKS clusters.

Across all these different places,

I'm able to manage the services

and see and connect them wherever

they are.

Let's talk about data.

At this point in the game, you

might start

moving your core data into

Google Cloud.

You can use Datastream to

redirect your Amazon RDS MySQL

cluster from Redshift to

BigQuery, or switch from a

remote PostgreSQL instance to

one running in Google Cloud.

I built a stream to feed it in

realtime as it moves.

For some, multicloud is a phase,

not a permanent state.

It's not the end state you're

trying to get to.

Your final stage could be a full

on migration to a new primary

cloud.

Once you're invested in Google

Cloud, you might want to start

taking full advantage of the

unique GKE Autopilot mode for

fully managed Kubernetes.

You just stop managing clusters

and focus on your workloads.

It's exclusive to Google Cloud.

And what's really cool is that

you may continue to use Anthos on

Google

Cloud to help manage fleets of

GKE clusters across our dozens

of regions.

I'm showing you the dashboards

here.

This lets me manage not just

GKE clusters but on-prem, edge,

it doesn't matter.

I'm getting a view of my fleet

and managing it.

Now you get BigQuery for

analytics on Google Cloud, and

Spanner for

distributed data, better

insights with our incredible

AI/ML services, unique developer

tools, and so much more.

If you forget everything else,

remember that Google Cloud

offers a unique management plane

that meets you where you are.

But honestly, it takes you further.

This is why I believe over half

of all organizations using

public cloud will freely switch

their primary cloud provider as

a result of multicloud

capabilities.

Thanks a lot [APPLAUSE]

My name is Jana Mandic and my

prediction is that by the end of

2025, over half of all business

applications will be built by

users who do not identify as

professional developers today.

One of the most interesting

opportunities as organizations

evolve, is that we will continue

to see more development work in

the enterprise taken up by teams

and individuals outside of

central IT.

The adoption of no code and low

code tools will unlock this

potential by making the

development process easier for

more users.

How many times have you been

asked to do something, but you

had to say no because your

roadmap or feature request list

was already way too long?

Well, with these tools, those

business users you had to say no

to can instead create apps and

workflow automations themselves,

with no programming skills

required.

These no code and low code apps

will be built collaboratively

with developers like you, who

will provide the guardrails to

keep the business secure, while

enabling business users to

deliver their own solutions.

And I'm not alone in thinking

this prediction will come true.

Leading tech analyst Gartner,

forecasts that by 2025, 70% of

new applications developed by

organizations will use lowcode

or nocode technologies, up from

less than 25% in 2020.

Organizations are getting ready

for this change and our

customers are already moving in

this direction.

Globe Telecom, a major telco out

of the Philippines, reduced

targeted business process

turnaround time by 80% from

experiences built by their

citizen developers. And now it's

demo time!

Let me show you how Google

Workspace's AppSheet is making

all this possible today and in

the future.

Now, I'm going to be Ann Gray, a

business analyst, trying to help

my team save time managing

request approvals.

Currently, the process is manual

and disorganized, spread across

ad hoc emails and chat messages.

So to fix that I'm going to

build a

no code request approval app.

Then, I'll show you how my team

and I can use this app to

efficiently manage our approval

workflow.

With AppSheet, I have a single

place to store my data and build

my apps.


I can also connect to other easy

to use data sources like Sheets,

or with the help of IT, I could

connect to cloud DBs.

Let me show you the solution I

could build as a business user!

Here's my database.

AppSheet helps me structure my

data and prep it for app

building.

When I'm ready, I can create a

New application with a single

click.

This creates a prototype app

that will be usable on

any desktop or mobile device.

After some customization, here's

what I've built!

I have a New Request view for

users to make requests, and an

Approver view for approvers to

do their thing.

Each of these views is available

to me in mobile apps and desktop

apps by default, and I can also

configure them to show up in

Gmail and Chat, meeting my users

where they are.

Here's my Google Chat app.

With this, my team can make

requests directly from their

team Chat space.

It's reusing the same Request

view I configured earlier.

Here's my Automation that'll

send the Approval view to the

approver in email.

When I'm happy with my app, I

can share it with users and add

it to my chat spaces.

Throughout this whole process,

everything I created is

controlled by governing policies

set by the company's Workspace

admin.

For example, here's what happens

if I try to share the app

"outside my domain." AppSheet

blocks me and keeps company data

secure.

This allows IT to do two things.

One: keep a clear line of sight

of

all apps, being able to

deprecate, update, and retire

them when necessary. And two:

restrict access to only the

users who need it.

Now let me show you how my

colleague

Jeffrey and I can use this app

to manage our request and

approval workflow.

I'll be Jeffrey now.

I'm on Ann's team and I have a

request to make.

Let me ask Ann how to use the

new app.

She's added the bot to the space

and now I can quickly submit my

reimbursement request.

Ok, now I'll be Ann again.

I got this email for Jeffrey's

request.

I can review and approve

directly from here.

And if I have a pile to review,

I don't have to click through

individual emails, I can pop

into the app and see everything

In one place.

This app is accessible to all

the users, requesters and

approvers.

With simple Sheets-like

expressions, it's configured so

that requesters see only their

own requests and approvers see

all the information they need.

You saw just how easy we are

making it for nontechnical users

to create business applications

that meet their immediate needs.

With no code and low code, you

and your business users now have

more tools to work with.

This is why I believe that over

half of all business apps will

be built by users who don't

identify as professional

developers today.

Thank you!

[APPLAUSE]

>> Well, folks that's a wrap on

our predictions.

We're excited about the

conversations this will start

and continue with all of you.

Remember, if you want to share

your own prediction, use the

hashtag GoogleCloudPredictions

to tell us all about it.

We can't wait to hear from you!

Thank you for your continued

inspiration and for partnering

with us to build what's next.

So, are you ready?

Enjoy the rest of the show!

Thank you everyone!

>> Hello, everyone.

Please welcome Google Cloud's

Head of Developer Communities,

Ashley Willis.

[APPLAUSE]

>> Hello, everyone.

Thank you for being here.

Thank you for being here for as

long as you have.

I'm sure you were like, I'm

ready to go.

My name is Ashley Willis.

I'm Head of Developer

Communities at Google Cloud.

I'm also a wife and a mother to

three beautiful kids.

Some of them are watching today.

Thanks for watching.

And today I'd like to talk about

something a little bit different

than my colleagues have on

stage.

It's something that affects us

all one way or another, and that

is burnout.

Needless to say, I am NOT a

doctor, so I cannot diagnose you

with burnout.

But I have experienced this

numerous times over my career --

careers.

We're going to talk more about

that later.

And today's talk will focus on

why I think burnout exists, some

ways you can identify it.

And for the managers in the

room, I'm going to give you some

tips to keep you from burning

your people out.

But first, I think it's helpful

to start with the definition of

burnout.

Burnout can be difficult to

describe because it's not

necessarily a medical condition

in itself, but according to the

APA dictionary of Psychology,

burnout is defined as,

"physical, emotional, or mental

exhaustion, accompanied by

decreased motivation, lowered

performance, and negative

attitudes towards oneself and

others."

Is any of this starting to sound

a little too familiar?

Yes?

Good.

So, please be honest, this is a

safe space.

This is a small room.

How many of you have experienced

burnout?

Yes.

The good and bad news here is

that you are not alone.

Employee wellbeing is the new

workplace imperative.

We are hearing our managers talk

a lot about employee morale,

what can we do for people?

And the term quiet quitting is

kind of setting the Internet on

fire.

A recent Gallup study cited that

74% of workers have experienced

burnout on the job, and 40% said

that they experienced it

specifically during the

pandemic.

And I don't know about you, but

I found those numbers to be

shocking.

And because it's so hard to

recover from burnout, people

will sometimes leave entire

industries trying to escape it.

For example, I used to be a

commercial photographer, and a

software engineer, and as you

may have already noticed from my

slides, I was also a graphic

designer, or still am.

An Ashley of all trades, if you

will.

So I ran a consulting business

for about ten years.

My business was actually doing

well, it was thriving, but I had

burnt out so bad that I decided

to do something completely new,

learn a completely new skill in

my 30s, and I entered the

corporate world again, and here

I am.

Why are we like this?

I think there are a few things

that factor into burnout, but

the one I would like to talk

about today is status.

A status symbol used to be a

nice car or a fancy watch.

Maybe you lived in a nice house.

The point is that status was

visible to the world.

But now, status is about being

busy and keeping up with the

Joneses, along with a fancy

car or a nice watch -- I see you

all preordering your Teslas.

And so, imagine for a moment

that someone asked you how you

were doing?

The normal answer is "I'm tired"

or "I'm super busy."

We are over scheduled.

We even overschedule our kids.

Our vacations are over

scheduled.

All of it.

If your calendar isn't filled

with back to back things, are

you even important?

If you're not sharing a selfie

at every tourist trap did you

even vacation?

BTW, this is Hootie the Owl.

And if you ever want to know a

good owl cafe in Tokyo, hit me

up!

Which brings me to Social Media.

Social Media is fully involved

in every aspect of our daily

lives.

We are more online than ever,

and we are constantly sharing

the highlight reels.

But it's not enough to just be

on social media anymore, we want

to be influencers.

We want all of the likes and all

of the engagement.

We really need that attention.

So we're trying to keep up with

our peers.

We've prioritized this public

perception over our own mental

health.

And through social psychology,

we also know this negatively

increases the frequency of our

own self-evaluation, of our

appearance, our health, and even

our jobs.

I like to say that every like

equals one serotonin, and that

might not be too far from the

truth because my little heart

flutters every time I see a

notification.

Yes, more likes.

It's kind of like that 1985

Oscar speech from Sally Field

where she's like, you like me,

you really like me.

That's normal, though, because

we require connection.

Connection is known to reduce

anxiety, stress, and depression.

Socializing helps us learn to

navigate and cope with life's

challenges and can boost

self-esteem, it can also help us

avoid loneliness.

So in a lot of ways, social

media helps us create those

connections, but the life that

we lead on social media is an

idealized version of ourselves

so we're not connecting

authentically.

So it's no wonder we're all

stressed out!

We're constantly trying to keep

up, we're constantly doom

scrolling, and we're constantly

available.

If I don't answer that call or

that text will they think I'm

lazy or unreliable?

What if they reach out to

someone else instead?

How can I be the hero in this

situation?

Which leads to what I like to

call the Hero Syndrome.

The addiction to being a hero is

no different than any other

addiction: as a hero, you

spend most of your time saving

the day, and not nearly enough

time sharing knowledge with other

people.

And that lack of knowledge

transfer causes many single

points of failure across your

business, also known as the

lotto factor.

Take Jane, for example, who

works as a software engineer

at a fast-growing startup.

If Jane wins the lotto tomorrow,

she is not coming to work.

So who can pick up Jane's

workload when she doesn't show

up?

Because no matter how hard

working or heroic Jane might be,

she will ultimately burn herself

out.

Nobody can keep up that pace

forever.

So do not be a hero is a lesson

you should learn from that.

So how long do you think it

takes to recover from burnout?

A month?

Six weeks?

All of you are wrong.

The answer is going to surprise

you because it surprised me.

On average it takes two years to

recover from burnout.

That's right.

Which makes sense because it

probably took you just as long

to get there in the first place.

[laughter]

Burnout doesn't happen over a

single project, it's caused by

repetitive stress, and lack of

work / life balance over several

years.

The trouble is that by the time

you feel like you're burning

out, you're probably already

there, and that's why taking

your vacation does not always

help.

Vacations should be proactive

instead of reactive.

Now that we've talked about some

of the reasons why I think

burnout exists, let's talk about

what the burnout cycle looks

like.

There are seven phases here, and

I'm going to take you through

those now.

So phase 1 is the honeymoon

phase.

I'm really excited about this.

I'm happy.

I'm committed.

I'm energetic.

And this is especially true when

we've started a new job.

We're solving hard problems, and

we have this compulsion to prove

ourselves.

That's normal.

Your productivity levels are at

an all time high.

That's great.

Phase 2 is the onset of stress.

This phase begins with an

awareness that some days

are more difficult than others.

You find that you're just not as

optimistic as you once were.

And you're more tired, like

sleepy all day and up all night

thinking about work.

Does that sound familiar to

anyone?

Yes?

Phase 3 is neglecting yourself.

By this point some people are

postponing self-care just to get

the job done.

And nobody likes pepperoni on

their keyboard.

Maybe you're ordering takeout

because you don't have time to

Cook a healthy meal.

I spend way too much money on

Door Dash.

Or maybe you're just dunking

your kid's dino nuggets in

ketchup and sadness.

Another example may be canceling

plans because you just can't

enjoy a night out with your

looming workload.

It's terrible.

Maybe you're engaging in

unhealthy coping mechanisms like

drinking too much just to get

your mind to stop racing.

Phase 4 is interpersonal

problems at home and at work.

That's my kid who I'm completely

ignoring.

Is your family checking in a

lot?

Is the vibe just generally off?

There's tension and you can't

really put your finger on it,

but work is stressful and that

stress is starting to bleed into

your home life.

You're snapping at your kids,

you're canceling plans, and the

things you used to love just

aren't as much fun anymore.

Does it sound a little like

depression?

That's probably because burnout

is often diagnosed as

depression.

When we reach this phase, we're

just not present anymore, and I

don't know about you, but I feel

like a bad partner, a bad

co-worker, a bad parent.

It feels like I'm failing

everyone, so why bother trying?

Phase 5 is reduced performance

and cognitive problems.

That's right.

That's me just scrolling through

TikTok.

Burnout and chronic stress is

known to interfere with your

ability to pay attention or

concentrate.

Distracted breaks become much

longer and you just can't seem

to concentrate no matter how

hard you try.

When we're stressed, our

attention narrows,

and even small tasks become a

huge burden.

There I am again, spending my

afternoon scrolling. I

know I have a lot of work to do,

I just don't know how to do it.

Phase 6: I cannot get excited

about work anymore.

By this time I'm three cups of

coffee in.

Now I'm feeling a bit cynical

about my working conditions and

even my coworkers.

I can't seem to get ahead no

matter how hard I try, and the

workload is so big I don't even

know how to ask for help at this

point.

Phase 7 is physical symptoms.

Experts know that chronic stress

can create real health problems

like digestive issues and heart

disease.

And according to an ITA Group

study, employees who say they

are burned out are 23% more

likely to visit the emergency

room.

When you're tired all day and

thinking about work all night,

of course that's going to

trigger something physical.

And phase 8 is total burnout.

Here you can see my soul

departing to the great

workstream.

It's at this point that I don't

trust myself anymore and I doubt

whether or not I'm even

qualified.

By now I feel like the only way

to recover is to find a new job,

and spoiler alert, that's not

going to help you either.

That's merely a Band-Aid for a

much larger problem.

This isn't necessarily a phase,

but as a developer, I thought

this was a really cool visual of

burnout.

You can see the really strong

commit history for the first two

years.

This is Jonnie, he's great, and

he said what got him here was

working on his

own app.

He's done.

There have been some studies in

recent years that evaluate open

source and burnout.

For many contributing to open

source that isn't part of their

job,

developers say it gives them

energy, which makes a lot of

sense. And if

you're trying to get paid

through open source, it's hard

to say no, and that can lead to

burnout.

Now if you're feeling any of

that, here's the good news:

there really are things you can

do about it.

I found that some of the things

I'm about to share really helped me.

Here's seven tips to avoid

burnout.

The first one is to get clarity.

A Gallup study cited that only

half of workers actually know

what's expected of them.

Think about that for a moment.

Do you?

Raise your hands if you do.

I see you there, yes.

And all workers, regardless of

age or stage in their career,

want to know what's expected of

them.

And the lack of clear

expectations can create anxiety

and frustration.

Even if employees feel energized,

those who lack

clear expectations and spend too

much time working on the wrong

things can't create value for

their organization, and they will

exhaust themselves just trying

to figure out what they are

supposed to be doing.

Stop multi-tasking.

You're not good at it.

I promise.

You're really not.

Multi-tasking actually reduces

your productivity because your

brain can only focus on one

thing at a time - the more we

multitask, the less we achieve

and learn.

That's because we divide our

attention and don't allow time

and space to do deeper

processing and thinking.

The continued context switching

eventually leads to burnout.

So, if you're at capacity and

your manager asks you to take on

one more thing, then ask your

manager to help prioritize your

to-do list.

What is something that you can

drop to take on this new thing?

It cannot be the baby. I

cannot drop the baby.

[laughter]

Set healthy boundaries.

We can be mad at our managers

for asking us what we can

handle, but

we are all adults, and your

manager expects you to know and

communicate your limits.

Saying no helps you AND your

coworkers.

Your entire organization will

actually benefit from you

setting healthy boundaries,

because when you burn out from

being a hero,

likely so will your peers.

Burnout is contagious and has

downstream implications for

everyone involved.

Take control of your

notifications.

Every time your phone buzzes and

you get that shot of dopamine,

your body is also tensing and

reacting.

It's spending energy needlessly.

Unless you are on pager duty,

there's nothing that can't wait

until tomorrow.

Find something outside of

work that you are passionate

about.

This is me nicely asking you

to get a hobby.

Please, all of you, get a hobby.

Something that's challenging and

engaging, because research shows

that people with hobbies are

less likely to suffer from

stress and depression.

When you stake your entire

self-worth on your job, you

become less resilient to things

like layoffs or negative

performance reviews.

It also makes it really hard to

switch off.

Have a clear separation

between work and home.

Has the roomba cat loaded yet?

Okay good.

I understand that having a

dedicated office space is not a

privilege that everyone has, but

where you work at home plays a

huge role in your productivity.

If you work in front of the TV,

you're basically setting up an

unplanned Netflix marathon.

Speaking of --

has everyone here seen season 4

of Stranger Things?

It's great.

I highly recommend it.

The point is that

having a dedicated space for

work that's a healthy distance

from your relaxing and eating

spaces will create that

separation between home and work,

which is very important for your

health.

Get enough sleep.

Sleep deprivation and burnout go

hand-in-hand.

In fact, the National Sleep

Foundation says, "sleeping less

than six hours each night is one

of the best predictors of

on-the-job burnout."

Most of us have awareness

that sleeping better helps us

perform better at work, but a

lot of us have a hard time

putting it into practice.

You're not alone;

a poll in 2018 showed

only 10% of American adults

prioritized their sleep.

And I have also found that ZERO

percent of children prioritize

their parent's sleep.

So you might as well get it

while you can.

Now that I've given you some

tips to identify and avoid

burnout as an individual, I'd

like to dedicate some time to

helping out the managers in the

room.

This is arguably the most

important section of this

presentation because I

believe that poor management is

directly responsible for

burnout.

A Harvard Business Review survey

revealed that 58% of people say

they trust strangers more than

their own boss.

And employees who feel strongly

supported by their manager are

70% less likely to burn out.

So here's some tips for you

managers.

Be clear about your

expectations.

I made this comic as a joke.

You're in a one-on-one with your

boss, and they're asking for the

status of a project you didn't

even know you were on.

These are the "unspoken

objectives."

And I am totally nailing them,

always.

How many of you in here have

told your

manager, I have a resource

problem I need to solve?

4: Try not to send email or

messages after working hours.

Use that schedule send button.

Do not underestimate the

influence you have on your team.

People will feel obligated to

respond to you even if you tell

them they don't have to.

I see your signature line, only

respond on your own time,

whatever.

It doesn't matter what you say.

It matters what you do.

5: Encourage people to take all

of their vacation time and you

take yours as well.

Model the behavior you want to

see.

Many people don't take vacation

because they have anxiety about

the workload that they will have

when they return.

As a manager, meet with your

people and see what you can

remove from their plate so they

can relax while they're out.

Number 6, have regular

one-on-ones and actively listen.

Do they sound stressed or

over-worked?

Are you taking something off

their plates?

If you're taking notes during

the meeting then you're not

actively listening.

Remember what I said about

multi-tasking?

You're not good at it.

I suggest you recap at the end

by saying what I heard was, and

repeat back what you said and

send an email recap after the

meeting.

7: Don't only recognize the big

wins.

We do this a lot.

Our jobs are mainly a

combination of small wins, and

we managers have a tendency

to recognize only the big ones.

People crave recognition, so

show them that you appreciate

them early and often.

Otherwise, they're going to kill

themselves trying to figure out

how they're supposed to gain

your approval, and you will

unintentionally create a bunch

of competing heroes.

So this next section is Q&A.

I've done this talk a couple of

times, and there's three

questions I get asked the most.

The first one is I heard you say

it can take two years to recover

from burnout, but I don't have

two years.

So I did consult a professional

about this talk in general, but

specifically this question.

My friend Lindsey Paoili is a

licensed therapist, and she says

we often believe this is an all

or nothing thing, and it's not.

If you start incorporating

certain things like movement,

attention, deep connections,

fresh air, touch some grass,

you'll start to see results.

So she suggests you start

layering these things into your

day every single day while

mindfully knowing why you're

doing it.

Hey, I'm feeling burnt out,

these are the things I'm doing

very intentionally, and then you

will start to see results

towards recovery.

Question 2 is I have a friend

who's burnt out.

How do I help them?

We have a tendency to want to

solve our friends' problems.

Don't do that.

Just listen.

If they've never heard of

burnout before, maybe you can

forward them some articles and

then suggest they talk to a

professional.

Or maybe send them this talk.

Number 3 is how do I restore

work/life balance?

Remember what I said, you need

to hear something seven times,

this is number two.

Back to question number 1, go

outside, touch grass, move your

body, talk to your friends, stop

cancelling your plans.

Figure out who on your team can

help you with your workload.

Have an honest conversation with

your manager.

Your manager really does want

you to succeed.

Something that I did, though,

was I turned my entire garage

into a maker space.

I like to build things, so I put

a 3D printer out there.

I have a glow forge, all of my

soldering stuff.

These are the things I've done

recently.

It's a lot of fun.

Highly recommend hacking on

things to add some life to

something here.

But in closing, get a hobby, but

also, if your family is telling

you like, hey, something's off,

you need to slow down, those are

the people that know you best

and want the best for you.

Listen to them.

Take a moment.

Take a breath.

We are all in this together, and

I hope that everything that I've

said here has helped you in some

way.

I will be mingling around here,

and I'm happy to talk about this

at length.

Thank you.

[APPLAUSE]

>> Hi, I'm Shaquille O'Neal, and

I'm the founder of Big Chicken.

You've got to do that at the end

when you say Big Chicken.

>> Big Chicken is Shaquille

O'Neal's emerging food chain

that focuses on big fun and big

food.

When you're trying to build a

national chain, communication is

so critical.

To do that, you need a great

partner, and we're really lucky

that partner is Google

Workspace.

>> Josh, every time he does a

presentation, he just loves

Google Slides.

>> As the person responsible for

our marketing, he's probably the

best at Google Slides.

>> His presentation is great.

>> I've got some great new

chicken sandwiches for you to

try, brand new recipes.

Isn't there something important

we're supposed to be talking

about?

Good recipe development comes

with collaboration.

Using Docs in Google Workspace, we

can do it together.

Shaquille's life gets crazy

busy, as does our entire board.

>> You want to talk to me, make

sure you put it on Google

Calendar.

Google Calendar is my

girlfriend.

I don't know anything I'm doing

unless I talk to my woman.

Google Workspace, productivity

and collaboration tools for all

the ways you work.

>> Google products provide the

information you need when you

need it.

But why can't you get the same

kind of answers for your

business?

Google Cloud's intelligent

business solution is here to

solve the problem, enabling you

to go beyond the traditional and

make your company innovative.

Looker is Google for your

business data.

Here's what we mean.

What if Google AI were built

into the tools you use to store

and analyze data at work?

It takes data like video,

images, and audio, and in

realtime turns it into

structured data ready for

business intelligence.

Going beyond the dashboard means

using Google-class enterprise AI to

see insights and recommendations

based on your data in realtime.

More access, more transparency.

Now that's Google for your

business.

With Google Maps, you know if a

restaurant is busy before you

go, or you can get rerouted or

out of a traffic jam.

Looker will help you connect

similar dots in a predictive

way.

>> A concert will increase foot

traffic by 65%.

Would you like to adjust

staffing and inventory?

>> Yes.

>> Looker and AI allow you to

turn insights into action.

>> Foot traffic continues to be

busy.

Encourage customers to visit an

alternate shop with a rewards

card?

>> Yes.

>> Smarter insights mean better

experiences and happy customers.

Go beyond the dashboard and

transform the way you do

business with Looker, powered by

Google Cloud.

>> We started with the

Exponential Roadmap goal of

zero carbon emissions by 2050.

>> Where do emissions primarily

stem from, device, networking

and cloud.

>> Our goal is to get to zero

emissions by 2030.

>> Backstage was built

internally at Spotify, and it

unifies your services, docs,

and apps under a consistent UI.

We donated it to the Cloud

Native Computing Foundation.

>> Amazing how many people at

Spotify care about this topic.

>> Cloud carbon footprint is a

thought resource tool developed

by Thought Works.

>> The only thing limiting us

now is people hearing about it.

>> It leverages cloud APIs to

provide visualizations of

estimated carbon emissions.

>> We leverage GKE.

It starts not just from the

cloud, but it goes all the way

out to our end user devices.

>> We want to empower not just

Spotify internally, but the

broader developer community to

reduce their carbon footprint.

>> Google's infrastructure

powers services for billions of

people.

>> And then Google Cloud takes

those lessons from running these

services in order to deliver an

innovative and easy to use cloud

infrastructure.

>> Today Google Cloud helps users

automate the lifecycle of their

workloads.

>> In the future, we'll use AI

to understand workload patterns

and do this automatically.

>> Intelligently optimizing for

higher performance with lower

latency, cost, and power

consumption.

>> Today we optimize our

infrastructure for AI, email, and

your data, ensuring that it is

accessible anywhere.

>> But we're not stopping there.

Chiplets are a new design and

manufacturing process that

brings Open Source agility to

the world of silicon by using a

building block approach.

>> Chips for a vast range of


configurations.

>> Sustainability is important

to us because our planet depends

on it, and we will operate on 24/7 carbon free energy by 2030. Every email you send through Gmail, every question you ask Search, and every virtual machine you spin up across our cloud will be supplied by carbon free energy every hour of every day.

>> We are reinventing

infrastructure where AI-based

automation will recommend the

most efficient design for your

workloads based on your usage.

>> And we'll run them on

Google's unique infrastructure

optimized specifically for you.

>> All of this delivered on the

cleanest public cloud in the

world.

>> This is our Next.

We can't wait to see what you do

with it.

>> Welcome to the customer

innovation series where you're

about to see six unique stories

of transformation. Your peers

from around the globe will talk

about their challenges, share

how they solved them, and offer

lessons learned. Let's kick off

the series with a Google Cloud

partner customer story from Atos

Maven Wave featuring Jason

Sharples, Chief Technology

Officer of Global Payments.

Jason shares how they improved

employee collaboration with

Google Workspace and began their

journey of migrating core

applications from on premise

data centers to the cloud.

>> Today I want to talk to you

about something that's probably

dear to many of you, how to get

your increasingly distributed

teams to work better together.

How to create order from the

disparate systems that come with

mergers, acquisitions, and

changing procurement policies

over generations of tech. How

to start working faster, more

flexibly, and more securely.

These are all things we have

accomplished at Global Payments

over the past few years.

Working with Google Cloud and

their consulting partner, Atos

Maven Wave. Even if you haven't

heard of Global Payments, you've

encountered us. Among other

things, we are one of the

world's largest payments

technology companies serving 4

million merchants and thousands

of financial institutions.

Our merchant solutions segment

provides customized software and

services to help merchants run

their businesses from front of

house to the back of the house.

Our issuer solutions segment provides

technology products and

processing services to large

financial institutions,

fintechs, banks, start ups, and

retailers who issue credit

cards. It also offers B2B

payment solutions. Today we are

far ahead of our competitors

from a technology perspective.

As just one example, we are

already a top quartile SaaS

company with even more ambitious

goals. As you can imagine, we

have acquired a lot of companies

over the years as we've grown.

Each with their own technology

stacks, communications, and

productivity systems. We have

seen and adopted several

generations of communications

and collaboration tech

ourselves. So in 2016, we

decided we would develop a

single, reliable, cutting edge

productivity platform that could

empower and strengthen every

employee and team, speeding our

customer responsiveness and

boosting innovation. We chose

Google Workspace and Atos Maven

Wave to make this a reality. We

chose Google for their products

like anywhere, anytime

Chromebooks, realtime

collaboration in Docs in Sheets,

realtime on collaboration in Jam

Boards, and tools like Chat and

Meet that enable

us to work together to deliver

for customers in realtime. To

give you just one example,

before Google, a business leader

spilled a can of soda on their

laptop, losing three years of

data, and requiring a month to

get back up to speed. With a

Chromebook, they would have just

shrugged, picked up another

machine, and been back at work

in a minute. That's

incalculable savings. And it

wasn't just the excellent Google

products. What convinced us was

their responsiveness. They

walked the walk, improving their

products, and responding to our

requests for new features and

changes very quickly. That all

sounds great, I know. But

here's the thing. When you are

using technology to change your

corporate DNA and if you're

serious that this is what you

are after, you need to think

long and hard about the human

factor. So when we thought

about minimizing the stress we

know people go through when they

have to learn new ways of

working, we and Atos Maven Wave

focused on the human element.

Leadership at our company is

technically adept, so it wasn't

hard to make ourselves the first

part of the organization to go

to Workspace. Besides, everyone

everywhere knows how to use

Gmail and Calendars, and

learning Chromebooks just means

learning how a browser works.

But the corporate commitment

signal it sends, having these

leaders adopt the platform

first, is invaluable.

Elsewhere, we activated our

culture of peer comment and

collaboration to prepare, teach,

and evangelize the benefits of

Workspace and show directly the

ease of use of added features.

Our Google Guides were team

members who volunteered to

assist and answer questions as

the program rolled out across

the organization. The group has

continued to evolve by itself,

adopting best practice where

they find them to ensure that

the full employee base gets the

very best service. Recently the

group created efficiency

workshops, such as how to be

really good at Sheets, that

spread through positive word of

mouth recommendations across the

company. This demonstrates how

we are achieving our objectives

through decentralized,

democratic, and highly proficient

ways to improve performance.

The proof is always in the

doing. The original phase

brought 12,000 users onto

Workspace in a year.

Subsequently smaller

acquisitions took a month or so.

The last larger task was

bringing on 12,000 additional

users in a year without

generating noise or distracting

them from their ability to do

business. We have replaced

expensive hard to manage laptops

with Chromebooks, tablets, Macs,

phones, really anything people

want to use. Whereas before

meetings would start with people

spending a few minutes starting

their computers, looking for

documents, and complaining, now

we just get to work. We did away with independent intranets in favor

of a single intranet that plugs

directly into Google Workspace,

leveraging docs, contacts, and

analytics.

We are opening

offices that are Wi Fi only with

no desk phones since we've got

stable and secure communications

and video, chat, and, of course,

good old email. So there's lots

of networking and telecom hassle

out of the picture as well.

When COVID hit, the 24,000

employee base at Global Payments

was perfectly placed to pivot to

remote working without skipping

a beat. This allowed us to

continue to execute our

projects, answer calls, and

deliver to our customers. One

of the things I'm most excited

about is the way these products

are constantly growing and

improving. I've mentioned how

efficiently they blended our

feedback into the products with

seamless updates. We are now a

comment-driven organization. We

have the ability to collaborate

effectively to drive progress

in an asynchronous and synchronous manner, with meetings

focused on resolving comments.

We are opinionated, we are

engaged, and we are highly

efficient. And I've got to call

out to our friends at Atos Maven

Wave here who provided high-contact training for AppSheet. Our teams have started using AppSheet to build internal and

desktop apps.

And this is being done by the

people in departments that will

use them such as people from

accounting, HR, and procurement.

Because one thing we know is

that everyone is now a digital

native capable of leveraging

great tools to solve their

business problems. The cultural

shift engendered by Atos Maven

Wave and Google Workspace

changed the way we as a company

perceive the capabilities of the

cloud.

If Google can run everything we

rely upon, can we run our

workloads in the cloud to

support our customers? And if

we can run workloads, we can

modernize from monolithic

mainframes to services. Atos

Maven Wave has been instrumental

in creating a pragmatic

modernization platform for one

of our most important workloads

that supports a million

merchants across 15 countries.

They are helping convert 5 million lines of COBOL code into

Java on GCP, taking advantage of

the best advances in cloud

development, testing and

deployment, all the way through

to scalability up and down to

remove bottlenecks and move at

the speed that business demands.

The job of migrating merchant

acquiring technology to Google

Cloud is not only focused on

bridging the application

technology gap, but also

enabling our existing

developers to embrace cloud

technologies. Sometimes these

developers perceive cloud as an

enemy, but they are a critical

facet of the cloud journey

because of their business and

application knowledge. Atos

Maven Wave helped us upskill our

team members in cloud practices

so they could continue to be a

critical part of our innovation

efforts. I can't wait to see

what the future holds for this

partnership, which has already

saved us time, money, and

headaches, and helped take

Global Payments into a more

dynamic and collaborative

future. Thank you very much.

>> What a great story from

Jason, right? Showcasing great

work between the teams at Global Payments, Atos Maven Wave, and Google Cloud. We'll now hear from Laura Merling, Chief Transformation and Operations Officer at Arvest Bank.

Laura dives into how they are

building a new data platform to

accelerate their journey from a

community bank to providing

services nationwide.

>> This week is my first year

anniversary at Arvest.

It might be interesting to share

with you

why I left a Silicon Valley

company to join a community bank

in Bentonville, Arkansas. Well,

the answer is the financial

services industry is in the

middle of a disruption. And I

like the opportunity that

disruption brings. It's an

important opportunity and for us

it's an opportunity to rethink

what it means to be a community

bank in a digital world. I find

it exciting, and I hope you do,

too. So who is Arvest Bank?

Arvest Bank is a leading

community-based financial

institution with more than $26

billion in assets.

We are also serving more than

110 communities across Arkansas,

Missouri, Oklahoma, and Kansas.

It's a high priority for us to

continually invest in providing

the digital tools and services

that our customers expect. Both

our retail customers and our

growing commercial customer

base. So where are we on this

transformation journey one year

in? Well, we know that

transformation impacts every

aspect of our business. We are

reimagining what it means to be

a community bank in a digital

world, and so in order to do

that, we spent the last year

identifying our path forward to

align our business strategy with

our technology strategy. The

technology stack is critical in

order to allow us to be flexible

in meeting our customers' needs.

It all starts for us on the path

with cloud computing as well as

a new data platform, and we have

decided to take on building a

new banking core as the

foundation. So where are we

headed from here? Our journey

to defining what it means to be

a community bank in a digital

world means we are taking a look

at each aspect of the bank, and

we are looking to create a

consistent experience across all

channels. We need to be

hyperfocused on the customer and

what they need. That means we

actually have to think about

data at the center of everything

that we do, whether it's front

office or back office, and

especially when it comes to

facing the customer. So let me

tell you a little bit about a

customer story and data. One of

the things that we learned was

we had done some research and

understood that our customers

preferred or told us they

preferred more ATMs and longer

branch hours. Well, that tells

you one story, but then when you

look at that same data from a

different perspective, those

exact same customers actually

told us that they preferred a

digital channel. Over 95% of

them preferred digitally. And

so you have to take a step back

and say, well, what does that

really mean? And if we hadn't

looked at the data from both

angles and thought about it, we

wouldn't have gotten the right answer. Data is at the center of everything we do, at least from this point forward. So that was

the foundation. Now we have

other foundational pillars that

support things like our back

office and our contact center.

We have to provide a level of

simplification and automation,

removing manual processes and

creating operational

efficiencies. So around that,

we had to think about what does

it mean? How do we get these

efficiencies, create new

customer experiences, and so at

the center of our transformation

is a shift as a business to

having a data mindset. We

needed to actually redefine our

customer interaction models and

our back office operations. And

so at the heart of it, we

decided to build a data platform

based on GCP. So this data

platform, we also wanted to give

it a vision, a vision that

aligned with our business

vision. So the data platform is

to be a living architecture that

will be built as a foundation to

support Arvest's future. It

will be using realtime data for

experiences and decisions. And

it's really important to keep

that as part of your mindset and

create a vision for where you're

taking your data platform to

know what you need to create and

how you think about that future

state. So in defining the data

platform, we also said, well,

how are we going to do this? We

can't just lift and shift all

the data. So we identified six

use cases. And each one of

these use cases had a set of

criteria. The first criteria

was it had to be solving an

immediate pain point for the

business, and it had to be able

to be solved within 90 days.

Each one of the criteria or each

one of the business cases I

should say or use cases needed

to actually test an end to end,

you know, aspect of the data

platform. So whether it was

access to realtime data for AI

decisioning or whether it was to

do reporting or even creating

new digital experiences for

customers. And, of course, we

all wanted to be able to ingest

third party data. And so what

does that look like? Well, we

did all of this, and we've been

doing all of this while standing

up our GCP foundations over the

last year. We had a desire to

move fast, and we have been

moving fast. But, of course,

there's lots of lessons to

learn. We identified data

sources. We set up the

infrastructure. And we enabled

access and permissions. Then we

ingested the data, and we did a

bunch of transformations, and we

did those once it was all in

GCP. And then you'd think we'd

be ready to go. So we partnered

with the Google PSO team, the

Google Professional Services

Organization, to jointly pursue

an

automation of underwriting. So

think about this, how do you

decide who you give a loan, and

so think this is small business

customers and with what we call

Our Arvest Opportunity Fund.

The Arvest Opportunity Fund is

where we extend loans to small

business customers that might

not normally be eligible.

And so being able to automate

that process is really key to

our business. Now, we've got

all those steps done, and we're

on our path forward. But we did

take a lot of lessons learned

along that way. The three

primary lessons that we think we

have gotten out of this

transformation so far around our

data platform are around --

we'll start with the first one, which is really around who

needs to know what? Who needs

to learn the data platform, and

what do they need to learn?

First and foremost, we didn't

have enough people trained on

the platform, and we needed to

make sure that -- so it was

about who did we anticipate

needed to be trained versus who

needed to be trained? And then

did we have learning journeys

identified for them? What did

they need to learn and over what

period of time? And then, of

course, making the time

available for them to learn

while they're trying to build

the data platform. It's all

kind of tricky but really

important to do. Second, we

learned that we had not

established a framework that met

the needs of our internal teams,

nor our partners, as they looked

to get access to the different

data sources. We ultimately

needed to create a set of

persona templates for the

access. Now, it seems like you

would have thought of that up

front, and we thought we had.

But once we started getting

people access to the data and

started thinking about the use

cases, we learned what the real

needs were and the access

requirements. Lastly, we

realized that while we had set

up some of the foundational

aspects of GCP and we had set up

the data platform, we had not

actually set up the environment

to begin using the prebuilt AI models that come with Google. So think the Vertex AI platform.

Were we really ready to consume

it? We weren't. So those were

kind of our three major lessons.

They're all good learnings.

Lots of progress since then.

And we're off and running. Back

to moving fast. Look forward to

seeing you on the other side.

>> Thank you so much, Laura. To

hear more about Arvest Bank's

digital transformation, be sure

to catch her panel discussion

with Google Cloud and Thought

Machine.

Look for session ID INV 108 in

the Next session catalog.

Let's dig into another dimension

of the financial industry with

Ari Studnitzer, Managing Director of Architecture and Product Management at CME Group, the

world's leading derivatives

marketplace. This Deloitte

customer will share how their

Google Cloud platform experience

team worked with application and

platform teams to better manage

costs, improve efficiencies, and

accelerate migration.

>> Just under a year ago we

announced that CME Group would

move our technology portfolio to

Google Cloud. We have an

aggressive time line for our

transition, and along with

Deloitte and Google Cloud, we

have been working hard to

deliver on our commitments. We

are working not just to improve

our technology but change how we

work, creating a more outcome

focused process that helps CME

Group customers use our products

more efficiently, more securely,

and in ways that better meet

their needs. What I want to

talk about today is how we

created a new outcome oriented

system of architecting,

migrating, and training our

teams on Google Cloud. We did

it to serve our customers

better. And I believe our

method has important lessons for

many of you watching today.

First, let me tell you a little

bit about CME Group. As one of

the largest financial exchanges

in the world, CME Group is the

only exchange where every major

asset class can be traded on a

single platform. Our futures

and options products help

customers manage risk across

commodities, interest rates,

currencies, energy, metals, and

many other asset classes. And

our markets create data and

information that is then used by

traders, financial institutions,

farmers, governments, news

organizations, and anyone

interested in critical aspects

of the global economy. In some

cases our customers use our

applications directly, or they

might use our data and

information from other

providers. In fact, one of the

many reasons we partnered with

Google Cloud is its strong

capabilities in data, with a

goal of increasing developer

productivity, increasing

software flexibility, and

improving operational excellence

while maintaining the highest

levels of security. Many of you

know the adage, security,

resiliency, and velocity, pick

two. Well, this is an effort to

be able to get all three. To

make it even more interesting,

and I don't think I'm alone, we

wanted to move from development

to production in just a few

months with limited resources

and internal staff who were

gaining GCP experience on the

fly with still an on prem system

to support. And, of course, all

of that using a mixture of GCP

native services, legacy systems

on prem, and a number of third

party tools. It's sort of like

maintaining a jet engine in

flight while transforming the

fuel system and training the

ground crew all at the same

time. So here's what we did.

We started with principles.

What are the outcomes we are

trying to achieve? What kind of

experience do we want our

customers to have?

When you're passionate about

customers like we are, the

opportunity to enhance our

technology and improve our

customer experience is quite

motivating.

The experience we want them to

have is software they can easily

access at scale to make better

decisions. So working with

Deloitte, we built a cloud

experience team designed to

bridge our application and

platform teams.

The goal was not only to improve

the productivity of our platform

team, allowing them to focus on

their delivery, but to increase

the velocity of our application

migration teams as well. The

CloudEx team was the first thing we set up, ensuring there was

proficiency in software tools

and a common set of goals to be

able to communicate to other

teams.

They also act as a central point of contact for questions, building a repository for documentation and self-service training. The CloudEx team is made up of CME Group and Deloitte team members with operational and GCP knowledge. We support

application migrations in an

outcome oriented, sustainable

way. Each team member has

specific expertise which

underlines the importance of

initial team selection. Working

with the application teams, the

CloudEx team embeds and helps

delivery of software and Google

Cloud services.

If questions arise, the team not

only answers them but documents

what was done so others can

learn and deliver faster. The

mantra is experiment, learn,

share quickly. With the CloudEx

team supporting the application

teams, our platform team can

focus on their delivery. This

approach increases confidence in

delivery while overcoming many

of the ramp up challenges common

in a cloud migration. So how do

we judge the success of this

approach? We wanted better

productivity and track that by

ticket resolution. Issues have

cleared faster than we

projected. Applications are

growing in depth and capability,

and people are spending more

time on their core delivery

rather than context switching.

We wanted an effective support

and training process to quickly

bring team members up to speed

on Google Cloud. Centralizing

support with the CloudEx team

allowed us to enhance training

while providing a sustainable,

outcome oriented delivery model.

We wanted fast adoption. I

mentioned that we had an

aggressive time line for

bringing CME Group to Google

Cloud, and I'm pleased to say

that we're pretty much on

schedule with increasing

velocity. Maybe even more

important, I think we'll arrive

at our goal to deliver more and

better customer products and

experiences thanks to our

adoption framework. Based on

our success, here is what I can

recommend you remember.

One, there's already a lot of

knowledge in your organization,

and there's a hunger to deliver

and learn more by your teams.

Leverage that by connecting with

all of your teams through a

central group that is focused on

the key outcome of customer

experience. Share information

and practices to ensure

consistency of performance.

Two, everyone learns a little

bit differently in their roles.

So you want to allow healthy

experimentation. That means

asking questions, trying things,

and above all, sharing learnings

and outcomes. That is the start

of building your team's best

practices. Lastly, success in technology always comes from a

healthy blend of understanding

your customers' problems and

knowing how technology can solve

them. Outcome oriented adoption

is a consistent blend of those

things with a common framework

of delivery. Thank you very

much for your time, and good

luck in your efforts. See you

in the cloud.

>> What I loved about this story

is the teamwork and focus on the customer experience to allow for healthy experimentation and

continuous learning.

Let's move on to Brazil and hear

Priscilla Miehe, the Chief Technology Officer of Hygia Saude, inspire us with how they are

reimagining ways of maintaining

health and delivering health

care to 11 million people living

in the south of Brazil.

>> Hello from Brazil. I am

Priscilla Miehe, and I am the

Chief Technology Officer of Hygia Saude.

We are finding new ways of

maintaining health and

delivering healthcare to people

living in the south of Brazil,

which makes up about 11 million

people.

Today I want to talk about how

we are doing that affordably with data-driven services on Google Cloud, and how our diversity-based approach has helped the organization grow, thanks in part to Google Cloud's efforts.

Healthcare for the

people of Brazil is an enormous

opportunity. Our research shows

only 20% of Brazilians have

health insurance. Not only

that, only 19% of people with

health insurance have regular

medical exams. Almost two

thirds don't know what a routine

checkup is. The reason for this

is that Brazilians see medicine

as a financial worry, not a health benefit.

They fear they won't have money

to take care of their illness.

They need better health

insurance delivered to them in a

personalized way they can

understand and afford.

By bringing together experts in

medicine, pharmacy, and data

science, we developed a solution

that provides individuals and

companies different types of

health services, including

preventive screenings,

follow-ups, health scoring.

We can offer products with

discounts and benefits, and schedule appointments and exams. On the

financial side, we provide

financial and credit solution

for both clients and business

partners. Working alongside HR

departments, we can understand

the health of all employees

safely and securely. Our

affiliated network has an

extensive portfolio of

insurance investment

telemedicine, and digital

prescription. As you can

imagine, building this company

to scale across Brazil's large population is going to be an

enormously costly proposition.

Our technology must be secure.

It must be easy to operate and

reliable for the customer.

It must handle enormous data loads. That's why it has to be reliable. It must be able to grow without overloading our operating costs with charges for storage, compute, and network.

We had a vision and we had built

much of the technology but not

in a way that could meet all of

our needs or take us to where we

needed to grow. After some

success, we realized we needed

to improve our operations and

user experience by restructuring

our entire architecture. That's

when we came to Google Cloud.

One thing I didn't mention

before, over 50% of our top

executives are women. And we

have designed Hygia Saude to be

a diversity first organization,

so we can better react to

Brazilians at every level of

society.

Half of our company has been

hired through our diversity

program.

Attending Google Cloud

Accelerator for start ups

allowed us to think not just

about the architecture in a

stronger and more robust way,

but we began to look at ways we

could accelerate the performance

of our algorithms while also

working on our internal

learning. Google Cloud believed their programs and technical mentorships could help facilitate our learning and growth, and it really helped us better analyze the health of employee populations. Through the mentorship we were able to learn and work together on a technological challenge, with direction that was almost hands-on to help us as much as possible.

The first challenge was the cloud migration. We migrated from our previous cloud infrastructure and rearchitected our offering to function as a series of microservices. This

decreased our cost and built up

scalability. Next, we also

added key products like Google

Vision to read medical

prescriptions and reports, apply

ML predictions, and help people

change their routines.

Cloudera managed instances for location hosting, Dataproc, and Cloud Storage run our algorithms faster and more reliably than before. Here

are some results from operating

on Google Cloud platform after

just six months. Our data

requests have 40% less latency

when compared to our experience

on another major cloud provider.

Ramp optimization has cut

storage costs by a third. Our

optical character recognition,

of course, is 15% better on

Google Vision, ensuring better

results for our physicians and

associated customers. Using

DataProc, our clustering time

went from five minutes to 90

seconds.

Faster, cheaper, and better --

nice right?

But the real benefit is in the

business.

Like I said, these are the early

days. We are excited by the

results, and we are confident

that Hygia can play a positive

role in giving people more

control over better health.

Today 70% of deaths around the

world are due to preventable chronic and noncommunicable diseases. With better education and better

means of control, we want people

to accumulate the financial

resources to take care of their

health better, especially in a

preventive way. Our two biggest

priorities for the future are to

be a reference in the digital

transformation and to generate a

greater positive impact for

efforts like ours in the

business community. I'd like to

thank you, Google, for their

assistance in helping us

building our dream better. And

encourage you to contact me if

you want to discuss more about

our company, your company, or

ways to build better in the

cloud. Thank you.

>> We're so excited to see how

Hygia Saude continues to make

progress in supporting people

with the financial resources to

address preventable diseases.

Now let's travel to Australia

and meet Duncan, the General

Manager of retail sales and

marketing at Origin Energy.

Working with Accenture and

Google Cloud, the energy company

is helping homeowners better

manage their own power solutions

with a new consumer solar

application.

>> Origin is an integrated

energy company in Australia with

4.5 million customer accounts

across electricity, natural gas,

LPG, and broadband.

We retail through our key three

segments. Consumer where I'll

focus today but also small to

medium enterprises and the

commercial and industrial

segments. In our consumer

customer base, we also provide

other services such as solar and

storage where we will design and

install solar and solar

solutions for our customers. At

Origin our purpose is getting

energy right for our customers,

community, and planet.

And our vision and strategy is

to lead the transition to net

zero through cleaner energy and

customer solutions.

One way we'll do this and a key

pillar of our strategy is by

offering unrivaled customer

solutions. And part of this is

making it simple and easy for

customers to access clean and

smart energy solutions. Origin

partnered with Accenture and

Google Cloud to launch the new

Solar Growth platform app. The

tool used 3D data, visual AI,

and advanced analytics to show

customers how solar panels can

help save energy. It can

measure things like roof pitch,

area, and energy consumption to

calculate the most suitable

solar product for any household.

This innovation is a great

example of how we can equip

homes with solutions to make a

difference.

Origin has been a leading

retailer and installer of

rooftop solar in Australia for

over 15 years.

In that time we've helped more

than 110,000 Australians go

solar. And that is more than

any other installer over that

time period. However, the

industry itself is a very

desegregated and highly

competitive industry with

relatively low barriers to

entry. In today's market, we

are the second largest provider

with about 3% market share and

the largest provider has about

4% market share based on

installed capacity. If you

combine that with around 3,000

competitors, it's a very

interesting industry to compete

in. And for us on a national

scale. It's also worth noting

that historically a sale was made by

physically visiting a site,

reviewing the customer's site,

and then going away to consider

the right system and form up a

quote. We would then wait while

the customer considered that

quote, and subject to the sale

being made, the installation

would then be scheduled again on

another day. It was often the

same person or business doing

the sale and then coming back to

do the install later. But at

Origin as a national retailer,

we do most of our work over the

phone. Probably around 80%.

And because that's historically

the nature of our relationship

with most of our customers. So

we have for many years operated

over the phone and used

satellite imagery to do our

designs and try and avoid a

presale and a preinstall site

assessment.

In some cases, we would need to

do a preinstall site assessment

if there was any uncertainty

about the premise and the

requirements for a successful

install. In providing solar and

storage options to consumers,

it's always worth considering

why a customer chooses to invest

in solar and why they would come

to Origin. You can see in this

chart that while there is an

environmental benefit of the

investment, the main driver is

still whether or not the

consumer will save money. Due

to the nature of the up front

payment and the benefits over

time, the decision for consumers

is more like a total cost of

ownership decision that has

payback periods of typically

between three to seven years if

the system is sized correctly.

You'll also see tariffs and

payment terms are key

considerations. But they are

just inputs into that total cost

of ownership decision. A

consumer's decision is based on

three key drivers. Firstly, the

up front cost of the system.

Noting any financial or payment

terms. The second is the

displacement of grid energy by

using the energy from the solar

system directly in the home.

And the third is the earned

revenue from energy exported to

the grid with what we call feed

in tariffs.

As the example in the attached

image shows, a household that consumes, say, 5 megawatt hours per year from the grid pre-solar may move to consuming only 3.5 megawatt hours from the grid: it produces 2 megawatt hours from its solar install, offsets 1.5 megawatt hours of usage in the home, and exports the other 0.5 megawatt hours to the grid.

This example is oversimplified

and can be impacted by a couple

of other variables. One

variable is grid tariffs. Many

consumers pay time of use

tariffs which means that the

grid price is normally cheapest

during the day when the sun is

shining and the load on the

network and the generation is at

its lowest. The price then

becomes higher in the later

afternoon to evening when there

is more demand on the network on

households. Another impact is

feed-in tariff rates.

In a competitive market, these

rates vary from retailer to

retailer, depending on how that

retailer values the energy that

is exported or returned to the

grid.

Another impact is usage

profiles.

How much is generated from the

solar? And then there are the differential grid rates.

Depending on the time of use

tariffs, grid rates can

sometimes be up around 25 to 30

cents during peak times. They

can also be down as low as 6 to

10 cents a kilowatt hour in off

peak times. And your feed in

tariffs sit at around 6 to 10

cents. So all of this means there is much more value for the household in the energy that is used directly in the home.
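To make that arithmetic concrete, here is a minimal sketch in Python of the worked example above. The tariffs and the installed system cost are illustrative assumptions for this sketch only, not Origin's actual rates or prices.

```python
# Minimal sketch of the household solar economics described above.
# Tariffs and system cost are illustrative assumptions, not Origin's figures.

GRID_TARIFF = 0.28      # $/kWh paid for energy drawn from the grid (assumed)
FEED_IN_TARIFF = 0.08   # $/kWh earned for energy exported to the grid (assumed)

solar_generated_kwh = 2_000   # 2 MWh produced by the rooftop system per year
self_consumed_kwh = 1_500     # 1.5 MWh used directly in the home
exported_kwh = solar_generated_kwh - self_consumed_kwh  # 0.5 MWh sent to the grid

# Value of displacing grid energy plus revenue from the feed-in tariff.
annual_benefit = (self_consumed_kwh * GRID_TARIFF
                  + exported_kwh * FEED_IN_TARIFF)

system_cost = 2_500           # assumed up-front installed cost ($)
payback_years = system_cost / annual_benefit

print(f"Annual benefit: ${annual_benefit:,.0f}")      # $460
print(f"Simple payback: {payback_years:.1f} years")   # ~5.4 years
```

Under these assumed numbers, self-consumption accounts for over 90% of the benefit, which is why the energy used directly in the home matters so much more than the feed-in revenue.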

So the new system allows users to

provide a simple address to look

up and see what their solar

opportunity is. It has an

intuitive interface that allows

customers to see in near

realtime what the solar setup

will look like on the property.

The inputs utilized include the

home, its orientation, the roof

size, the roof pitch, and roof

material. It also matches up

the technical options around the

panel types, sizes, the inverter

capacity, and even the battery

and storage options if that's a

viable investment for the

customer. Given our internal

information around the

customer's usage, we can also

orchestrate an outbound campaign

that matches the customer's

existing tariffs and usage

profiles to the insight about

the house as described above.

This will allow us to show a

customer what the best option

for them is from a payback

perspective. As I mentioned

earlier, the old way of selling

solar was to invest a lot of

time into each system with a

large amount of human

intervention and gathering of

information and data from

numerous sources, and sometimes

this was onerous on the

customer, too. With the new

system that we developed

together with Accenture and

Google Cloud, we can digitally

capture the required

information. We then use the

artificial intelligence and

machine learning to optimize

system and options. This then

allows us an immediate playback

to the customer of a quote. The

machine learning and AI models

used means the customer has a

tailored solution, and they can

be confident that the system

hasn't been oversized or

undersized for their usage

profile. Historically assessing

the suitability of a roof for

solar and determining where to

place panels has been performed

by knowledge workers who piece

together many components to

arrive at a recommendation.

Whilst this is a prudent

approach, it does take time, and

it can't really scale to provide

realtime advice to many

customers simultaneously.

The innovative Origin model uses

visual AI and geometry to enable

almost unlimited parallel

solution assessments within a

very easy to understand customer

experience. There are five key

steps in the new process. The

first step looks to understand

the type of roof being

considered. Is it the right

material, or is it tiles, which

can be problematic for the

installation? The second and

third steps break down the

overall roof area into segments,

provide insights on the roof

slope and orientation to the sun

and provide the gross available

area that potentially could be

used for solar. The final

steps, 4 and 5, place panels on

the roof in line with the

recommendations of what a

customer needs in terms of usage and

what is possible from a solution

perspective. Collectively,

these steps are performed in under 30 seconds.
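As a purely hypothetical illustration of how those five steps could be strung together, here is a minimal Python skeleton. The function names, data structures, and numbers are invented for this sketch; Origin's actual pipeline backs each step with visual AI and geometry models.

```python
# Hypothetical skeleton of the five-step roof assessment described above.
# Names, structures, and numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class RoofSegment:
    area_m2: float       # gross area of the segment
    slope_deg: float     # roof pitch
    azimuth_deg: float   # orientation relative to north

def classify_roof_material(address: str) -> str:
    # Step 1: is the roof a workable material, or tiles that complicate install?
    return "metal"  # placeholder result

def segment_roof(address: str) -> list[RoofSegment]:
    # Steps 2-3: split the roof into segments with slope, orientation, and area.
    return [RoofSegment(area_m2=40.0, slope_deg=22.0, azimuth_deg=10.0)]

def place_panels(segments: list[RoofSegment], annual_usage_kwh: float) -> int:
    # Steps 4-5: place panels in line with what the roof allows and the
    # customer's usage profile; here, a crude capacity heuristic.
    panel_area_m2, usable_fraction = 1.7, 0.8
    fits = int(sum(s.area_m2 for s in segments) * usable_fraction / panel_area_m2)
    wanted = int(annual_usage_kwh / 400)  # assume ~400 kWh/year per panel
    return min(fits, wanted)

def assess(address: str, annual_usage_kwh: float) -> dict:
    if classify_roof_material(address) not in ("metal", "concrete_tile"):
        return {"suitable": False}
    segments = segment_roof(address)
    return {"suitable": True, "panels": place_panels(segments, annual_usage_kwh)}

print(assess("1 Example St", annual_usage_kwh=5000))  # {'suitable': True, 'panels': 12}
```

The point of the sketch is only the shape of the pipeline: a fast, fully automated sequence that can be run for many addresses in parallel.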

To date, we are seeing customers' journey

time shortened, while still

being able to configure a system

and outcome that is personalized

with a customer's property and

usage profile. The quote is

accurate and allows confidence

that it is a robust assessment

of the data and information to

build the recommendation quote.

We are getting higher

interaction net promoter scores

through the new platform than

previously, and the customers

can choose what time of day they

choose to interact with us and

consider the purchase options.

We are also seeing lower

cancellation rates on orders due

to the increased confidence in

the system design and roof

structure information. This

lower cancellation rate leads to

more efficient operations. The machine learning models and the team progressing the platform are constantly improving the proposition, building continuous improvement cycles that give us a wealth of ideas and options to keep refining the performance of the platform and the customer experience. I

would also note that the

platform allows some of our

partners to consider white

labeling the competency to allow

them to introduce their

customers to renewable and

distributed generation and

storage in a way that they can

trust the experience, that it's

going to be customer experience

accretive and that the customer

won't be delayed or held up in

any transfers or movements

between the organizations. An

important philosophy that worked

well with this was to try and

get an MVP to market as quickly

as possible so that we could

test and refine the customer's

experience with real insight and

data. Our MVP focused on

limited geographies and house

types in the first instance and

then expanded beyond there. We

have seen customer preferences

change rapidly in this

environment. So our ability to

adapt the proposition is very

important. We knew the outcomes

we had in mind, and with that it

was fundamental to bring the

partners together and ensure the

cross functional teams saw the

vision and shared the optimism.

This was important because they

work across many businesses,

teams and skills. After

building the MVP, it was just a

matter of iterating through

until we had most of the

geography and building types

covered. And now we are

circling back on other ideas and

opportunities we saw along the

way as well as thinking about

how we continue this experience

all the way through the sales

cycle and included all the way

through the installation cycle.

Thank you.

>> That was a great example of

how to take advantage of market

dynamics to quickly develop a

new valuable application to

surprise and delight consumers.

Last and certainly not least, we

wanted to leave you with a

special story discussing Web 3

and the world of cryptocurrency

with Hedera. Joshua Cindy is a

staff engineer with Swirlds Labs, and was previously a principal architect and DevOps manager for Hedera.

>> What excites me about a

conference like this is that we

get to talk about the future.

All the wonderful possibilities

in front of us and the

opportunity to turn them into

reality. It's literally in the

title: what's next?

But innovation doesn't really

happen in a vacuum. So in order

to understand what's next, we need to look into the past, at what was, to contextualize and understand what can be. And I

want to go back to ancient

history. So, like, 11 years ago

in crypto terms. That's the

beginning of time, roughly.

About 11 years ago I built my

very own bitcoin exchange. You

could buy a bitcoin through it.

And I was working with this

payment provider. This was so

early that there weren't any

real regulations around the

industry. And that provider

ended up suspending me, and that

was the end of that service.

Why? They saw some risk here.

What was the risk exactly?

Well, to give you an example, my

solution itself bought bitcoin

from Mt. Gox. Mt. Gox if you

didn't know was a bit infamous

eight years ago for losing

150,000 or so bitcoins through

some theft, fraud,

mismanagement, or a mix of all

three, it was never really

clear. So the payment provider

saw some risk here and

de-platformed me, and rightfully

so. That payment provider was,

well, Google Checkout. So I may

be the only customer success

segment here where I've actually

been de-platformed by Google at

one point. But that's water

under the bridge. So why am I

here? Well, fast forward to

present day. We have built

something amazing at Hedera that

I would like to share with you.

How Google Cloud helped us build

it and why I'm excited to share

about what's next. Again, in

the interest of looking to the

past to understand what's next,

12 years ago Dr. Baird invented the Hashgraph algorithm. This algorithm underpins our network. He set out to solve a decades-old math problem, not just

build a better blockchain. The

concept of blockchain has been

around for over 20 years. There

are 2,000 plus projects which

all depend on blockchain. But

if blockchain didn't exist,

Hedera would still exist.

Hashgraph solves the problem of a network of computers coming to consensus on an event where no individual is necessarily trusted, but does so with extreme speed, high throughput, and the highest security possible. It also has

low costs which are predictable

and which an enterprise can plan

around. I got into Google Cloud

in 2016 for their Kubernetes

Service.

This solved 90% of the problems

I didn't want to focus on.

Interestingly enough, Kubernetes

is antithetical to

decentralization.

It is a control plane for

managing resources on large

compute clusters.

We were in a position where we

needed each node in the Hedera

network to be completely

independent of each other and

maintained by a separate

organization.

These organizations could be

adversaries or competitors.

Kubernetes, the tool of choice,

wasn't going to cut it. We

continued to use containers, but

we developed our own node

management tool. This tool uses

threshold signatures from our

council to orchestrate updates

in a decentralized way.
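As a toy illustration of that kind of threshold-gated orchestration, here is a minimal Python sketch. It is not Hedera's actual tool, and it uses HMACs as a stand-in for real threshold or multi-party signatures purely to keep the example self-contained.

```python
# Toy sketch of threshold-gated update orchestration, loosely inspired by the
# idea described above. This is NOT Hedera's actual tool or signature scheme;
# real deployments would use proper threshold/multi-party signatures.
import hmac
import hashlib

# Hypothetical council members and their demo signing keys.
COUNCIL_KEYS = {
    "member_a": b"key-a",
    "member_b": b"key-b",
    "member_c": b"key-c",
    "member_d": b"key-d",
}
THRESHOLD = 3  # e.g. require at least 3 of 4 council signatures

def sign(member: str, manifest: bytes) -> bytes:
    """Each member independently signs the update manifest."""
    return hmac.new(COUNCIL_KEYS[member], manifest, hashlib.sha256).digest()

def approved(manifest: bytes, signatures: dict[str, bytes]) -> bool:
    """An update may roll out only once enough valid signatures are collected."""
    valid = sum(
        1
        for member, sig in signatures.items()
        if member in COUNCIL_KEYS
        and hmac.compare_digest(sig, sign(member, manifest))
    )
    return valid >= THRESHOLD

manifest = b"node-software v0.42"
sigs = {m: sign(m, manifest) for m in ("member_a", "member_b", "member_c")}
print(approved(manifest, sigs))  # True: three valid signatures meet the threshold
```

Because no single operator can push an update on its own, each node stays independently maintained while still converging on the same approved release.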

While this is itself its own

interesting problem, it's a

pretty straightforward

engineering problem. And what

it unlocks for us is to focus on

our business case. How do you

establish a network governed by

large corporate and noncorporate

entities across industry, across

business segment, and across

continents and have them govern

our network? What do these

enterprises, banks, and

universities all have in common?

Well, try to get them to decide

on something about their own

business. How to make an

investment, how to structure

themselves, or plan for their

future. Imagine now how the

largest aerospace manufacturer

makes a decision, or the largest

bank in Africa.

How about the largest search

engine?

Now, what about all of them

trying to make a decision

together. When we talk about

what's next, we talk about

Hedera. That aerospace company

is Boeing. The bank is Standard

Bank. And the search engine is

Google. Google joined the

governing council in December

2019, accepting council member responsibilities such as running a single consensus node on the Hedera mainnet. Google is an equal part owner of Hedera.

Google also sits on the

technical and regulatory

committees that govern the

network. They are bought into

the idea of Web 3 and what the

Hashgraph algorithm can do and

what the Hedera public network

built on top of it offers. For

Google Cloud, this led to them

understanding and investing, not

just in Hedera but the broader

Web 3 ecosystem in general.

Google Cloud has formed a team dedicated to Web 3, called Digital Assets and Web 3, with approximately 50 people on that core team and an additional 150 individuals working across Google in YouTube, Search, engineering, ventures, and payments. And that describes

Google and Google is one of 26

members. Hedera intends to grow

that to 39 governing council

members across different

geographies and industries. So

what can I share? When I think

about taking your idea and

making a business about it, I

think about those poor baby

turtles. If you've watched

Planet Earth, you know what I'm

talking about. Most of them

fail to make it, getting eaten

by birds, crabs, various sea

life, whatever. So do a vast

majority of ideas fail. Maybe

the idea was bad or worse the

idea was good, but it simply

wasn't its time.

For me, my timing just wasn't

right.

But for you, that baby turtle,

that fragile unproven idea could

make it.

And maybe that time is exactly

right, and it starts something

amazing. So what do these Web 3

turtles have in front of them?

What makes the time right for

them? Well, a lot of things I

didn't have. We have partners

like Google who are eager to

have you. Who will grant you

credits, even, to use their

platform like we do. You have

demand. And I'm not just

talking about the demand of

something like pictures of rocks

or monkeys or coins with dogs on

them. But demand from

established companies. Look no

further than our governing

council. They are bought in on

this idea. You have proof of

economic value, of the

technology underlying your idea,

whether it be large banks

establishing remittance POCs

using digital tokens, industry

managed nonprofits building a

platform for redeeming coupons,

law firms tokenizing real world

assets, the largest packaging

and distribution company in the

world proving the carbon

footprint of your product as a

service. These are the Hedera

use cases. And as a Web 3

entrepreneur, what you have in front of you is a network like Hedera. We offer a number of

things. Fast finality, meaning

your transaction is final in

seconds, fairness that keeps the

leaders from adjusting the

transaction order, energy

efficiency, per the University

of London Economic Study, we are

the greenest solution on the

planet. We are even carbon

negative. Fixed transaction fees, meaning the cost of your solution isn't wildly

unpredictable. Decentralized

governance by the big names we

mentioned before. And so what

does it mean? What's next?

Well, with the Hashgraph

algorithm and platforms such as

Google Cloud, maybe that idea

you had might not fade into

obscurity because of the

ecosystem, the demand, and the

underlying technologies here.

Maybe it has the potential now

to grow up to be a massive sea

turtle. Thank you.

>> Thank you so much for tuning

in to these valuable stories of

innovation and transformation.

If you are interested in sharing

any one of these stories, you

can find these sessions in the

featured section of the Next

catalog. Thank you and enjoy

the rest of Next.

[ MUSIC ]

>> Google products provide the

information you need when you

need it. But why can't you get

the same kind of answers for

your business?

Looker, Google Cloud's

intelligence solution is here to

solve that problem, enabling you

to go beyond traditional

dashboards and make your

organization's information

accessible and useful.

Bringing this innovation to the business will be revolutionary, just like navigating a city after Google Maps. Looker is Google for your business data. Here's

what we mean. What if Google AI

were built into the tools you

use to store and analyze data at

work? Google's Vertex AI vision

takes data like video, images

and audio and in realtime turns

it into structured data, ready

for business intelligence.

Going beyond the dashboard means

using Google-class enterprise tools to

see insights and recommendations

based on your data in realtime.

More access, more transparency.

Now, that's Google for your

business. With Google Maps, you

know if a restaurant is busy

before you go or you can get

rerouted to avoid a traffic jam.

Looker will help you connect

similar dots in a predictive

way.

>> A concert in five days will

increase foot traffic by 65%.

Would you like to adjust

staffing and inventory?

>> Yes.

>> Looker and AI lets you

respond to changes in demand and

turn insights into action.

>> Foot traffic continues to be

busy. Encourage customers to

visit an alternate shop with a

reward card?

>> Yes.

>> Smarter insights mean better

experiences and happy customers.

So go beyond the dashboard and

transform the way you do

business, with Looker, powered

by Google Cloud.

>> Whenever you're ready.

>> Google's infrastructure

powers services for billions of

people.

>> And then Google Cloud takes

those lessons from running these

services in order to deliver an

innovative and easy to use cloud

infrastructure.

>> Today Google Cloud helps

users automate the life cycle of

their workloads.

>> In the future we'll use AI to

understand workload patterns and

do this automatically.

>> Intelligently optimizing for

higher performance with lower

latency, cost, and power

consumption.

>> Today we optimize our

infrastructure for AI email and

your data, ensuring that it is

accessible anywhere.

>> But we're not stopping there.

Chiplets are a new design and

manufacturing process that

brings open source agility to

the world of silicon by using a

building block approach.

>> Chips for a vast range of

configurations.

>> Sustainability is important

to us because our planet depends

on it. And we will operate on

24/7 carbon free energy by 2030.

Every email you send through

Gmail or every question you ask

Search and every virtual machine

you spin up across our cloud

will be supplied by carbon free

energy every hour of every day.

>> We are reinventing

infrastructure where AI-based

automation will recommend the

most efficient design for your

workloads based on your usage.

>> And we'll run them on

Google's unique infrastructure

optimized specifically for you.

>> All of this delivered on the

cleanest public cloud in the

world.

>> This is our Next. We can't

wait to see what you do with it.

>> Let's serve up another round

of high speed drama!

[ MUSIC ]

[Music]

>> Hello, and welcome to What's New for application developers.

I'm Thomas De Meo, and we're

thrilled to have you here.

We're so excited about what the

future holds, and we have invested a lot to make you successful with Google Cloud so you can build applications faster and more securely than

ever.

You may have heard Thomas talk

about how we offer prescriptive

guidance with an opinionated

approach to solve developer

velocity challenges.

In today's session, I'm going

to show you how exactly we do

this by highlighting how Google

Cloud lets you deliver secure

applications in an open manner.

Driving developer velocity is especially important given there is a global shortage of developers. In fact, the global

shortage of full time developers

will increase from today's 1.4

million to over 4 million in

2025.

So we want to make every moment

count to be as fast and

productive as possible.

To help alleviate this developer

shortage, last year we

announced the goal to equip 40

million people with Cloud skills

over the next five years.

Today, we're taking that to the

next level with our enhanced

Google Cloud Skills Boost with Innovators Plus.

This annual subscription

provides access to training,

special events, Google Cloud

credits and expanded developer

benefit, all for $299 U.S. per

year.

Even though working remotely has

become the norm, working faster

remotely and across distributed

teams continues to be a

challenge.

Typical friction points include

onboarding remote employees,

setting up dev and test

environments and long build

times just to name a few.

When it comes to security, code and data exfiltration risks are key concerns.

This is especially true in

sensitive and regulated

environments.

Customers tell us that current

solutions to secure dev

environments can add developer

friction to the experience.

For example, streaming-based

solutions may introduce unwanted

latency.

And running your specific

containers may not be fully

supported.

Solving developer velocity

challenges requires a

two-pronged approach.

How do we not only make it

easier for developers to build

applications but also easy for

IT admins to securely scale dev environments?

To address this, I'm excited to

announce the availability of

Cloud Workstations.

Think of Cloud Workstations as

providing preconfigured but

customizable developer

environments in the Cloud, with

your favorite dev tools

preinstalled, and up to date.

Cloud Workstations come with

multiple IDE support such as IntelliJ, PyCharm, Rider, CLion, and others.

You don't need to emulate

services or databases which can

save hours by developing and

running code in your staging

environment.

Cloud Workstations enables consistent developer environments across developers, with all environments defined via containers.

Fixing the, well, it works on my

machine problem.

You can also access your

favorite DevOps tooling

including third party tools such

as GitLab and TeamCity as part of your end-to-end workflow.

From an admin's perspective,

cloud workstations can

dramatically simplify the on

boarding of new developers at

scale.

You can create a workstation

configuration, which defines a

shared template of the tools

developers need, including VM

type, IDE extensions, libraries,

code samples and environment

settings.

Increasing developer velocity is

incomplete without having the

right security controls in place

for remote developers.

This is why Cloud Workstations has enterprise-grade security

requirements built right in.

This starts with VPC service

controls, where you can limit

developer access to sensitive

areas.

You can update or patch your

environments so that developers

get the most up-to-date version

and you can also use a

fully private gateway so that only trusted users within your network have dev access.

Next up, Christian Gorka, head of the Cyber Center of Excellence at Commerzbank, tells us how Cloud Workstations is driving remote developer velocity while meeting the bank's security needs.

>> Thank you, Thomas.

Commerzbank has been a strong partner for 28,000 corporate client groups and around 11 million private and small-business customers in Germany.

We are bringing new products and

experiences faster to the

market, optimizing system

performance as well as running

costs and sustainability.

For many organizations, developer productivity and, in particular for us, security are top of mind and a first-class priority.

Because of the nature of our

business, that is being part of

a strictly regulated industry,

our developers handle a lot of

very sensitive applications.

To enable development in a safe

environment, we have an

extensive list of security and

compliance controls that need to

be checked off before we can

adopt a solution.

I am excited about Cloud Workstations because it helps us take care of many items on that list.

For example, it lets us integrate our development environment into our virtual private cloud, so we can make sure intellectual property does not leave the premises and that data confidentiality and location stay under our control. And while Cloud Workstations comes with preinstalled software, many things are still configurable, and we can easily update our images across various teams, which not only saves us time but also improves our overall security posture.

Finally, our developers like the speed and responsiveness Cloud Workstations provides.

Onboarding can be done within minutes, and the service is easily accessible from anywhere.

In the end, we need something that works out of the box, is fully managed, and integrates into the Cloud ecosystem and its services.

Hence, I'm glad we can leverage Cloud Workstations.

Back to you, Thomas.

>> Thanks, Christian.

It was great to hear how

developers of Commerzbank can

securely collaborate across the

world.

Cloud Workstations provides security capabilities for your code, which we extend to other parts of the supply chain, from build and dependency management to deployment.

Recent attacks such as SolarWinds and Mimecast have

showcased the importance of

ensuring security across your

software supply chain, beyond the coding environment.

In fact, the White House is now

requiring all federal suppliers

to conform to software security

standards across their supply

chain.

The agency for cybersecurity

has recognized this threat as

well.

To further complicate things,

the extensive use of open source

and their dependencies makes

this a challenging problem to

solve.

To help, Google pledged $100 million last year to support third-party foundations like OpenSSF that manage open source security priorities and help fix vulnerabilities.

We also pledged to help

100,000 Americans earn Google

career certificates, to learn

skills including data privacy

and security.

On the product side, we are

taking those efforts to the next

level, with the launch of Software Delivery Shield.

Software Delivery Shield provides fully managed, end-to-end software supply chain security.

This starts with the IDE and includes CI/CD pipelines. While you're focused on writing code, Software Delivery Shield, or SDS, is helping to make sure that your policy is enforced across the software delivery process, enabling you to develop faster. Specifically, SDS adds new security capabilities in four major areas.

First, to shift left into the IDE, we are launching Cloud Code Source Protect, a private preview IDE extension which helps you better understand open source vulnerabilities and licenses so you can go faster by avoiding costly and frustrating rework down the road.

Second, when it comes to

securing dependencies, we are

excited to announce our Assured

Open Source Software service.

It scans for known vulnerabilities, analyzes, and fuzz-tests over 250 packages across Java and Python. These packages are built using Google-secured pipelines to help reduce external dependency risks and come with remediation SLAs.

Third, SDS helps secure CI/CD

with Cloud Build, which now

supports SLSA Level 3.

For those unfamiliar, SLSA is an

emerging standard that

incorporates best practices for

software supply chain integrity.

Cloud Build provides verifiable

build provenance to help you

trace a binary to its source

code and build process to

prevent tampering, and prove the

artifact you're using is

legit.

Fun fact: provenance is not just available for containers, but for Java Maven packages as well.

Lastly, SDS can extend security

protection into Cloud Run and

GKE.

Cloud Run's security panel now

includes software supply chain

security insights.

This provides information such

as SLSA level on the running

container images, build

provenance, and service

vulnerabilities.

To help protect Kubernetes

workloads, GKE security posture

management provides opinionated

foundational guidance into your

GKE clusters.

We do this by providing detailed

security assessments, actionable

remediation advice, and scanning

for OS vulnerabilities in your

running images.

We're integrating the

platform to further help drive

speed and efficiency.

For instance, Cloud Deploy can

make continuous delivery for

Cloud Run much simpler.

Promote from pre-prod to prod,

conduct rollbacks, and manage

gated promotion approvals in a

cohesive and intuitive

interface.

With the new Cloud Run

Integrations, you can connect

Cloud Run with Google Cloud

services fast.

For example, configuring domains

with a Load Balancer or

connecting to a Redis Cache is

now as easy as a single click,

and by the way, we have more

scenarios on the way!

With these enhancements you no

longer need to be an expert in

scaling, securing or managing

your pipelines or connecting to

other Google Cloud services.

Everything we discussed, from Cloud Workstations to Software Delivery Shield and the Cloud Run enhancements, is designed to help drive developer velocity, and we do this in an open fashion.

In fact, we continually evaluate

the developer experience across

Google Cloud and have a

dedicated team making ongoing

improvements to reduce developer

friction and make things faster.

At Google, we're big proponents

of open source and open

standards.

As the number one contributor to

the CNCF Open Source projects,

our goal has been openness

wherever possible.

With that, I am excited to

announce that Google has joined

the Eclipse Adoptium Working

Group, a consortium of leaders

in the Java community working to

establish a higher quality,

developer-centric standard for

Java distributions.

As a strategic member of

Adoptium, Google will promote

Open standards that benefit all

developers everywhere,

regardless of where they run

their workloads.

And that starts by making

Eclipse Temurin available across

Google Cloud products and

services.

Eclipse Temurin provides Java

developers a higher quality

developer experience and more

opportunities to create

integrated, enterprise-focused

solutions, with the openness

they deserve.

Next up, Victor Salvay from the

product team discusses a

simplified and secure Java

developer experience.

>> Thanks, Thomas, and hey,

everyone.

I'm a product manager here at

Google Cloud, focused on supply

chain security.

We're going to be taking a look

at all the components Thomas

just outlined, collectively

known as Software Delivery

Shield so let's get started with

the demo.

Cloud Workstations provides my

team with project-specific

profiles, configured with the

memory, CPU, and the tools

required, such as the IDE.

Today we're going to be looking at a Java Maven project, so we'll launch a Java dev environment and get started with our Maven work.

To shift left on security, Cloud

code now provides dependency

insights right in the IDE,

including vulnerabilities in

direct and transitive

dependencies.

It also has trust-based policy

gating.

In this case, I have a policy in place requiring that deployed images be built by Cloud Build and conform to a threshold, so random images like this one are blocked.

So let's use cloud code to find

and fix all of our vulnerable

dependencies so our policy

thresholds are satisfied.

I can trigger a build using Cloud Build; my build succeeded, and my image was successfully deployed.

Cloud Build now provides security insights into the built artifacts, like this container image, including information about the SLSA level of the build, any vulnerabilities found in the build, as well as a list of dependencies and the provenance of the build.

I hope that demo gave you a

sense for software delivery

shield.

Back to you, Thomas.

>> Thanks, Victor.

We're allowing developers to

build data-driven applications

faster.

After all, data often powers the

most impactful experiences.

For example, with Google services such as Cloud Firestore, we're making it easy to infuse AI and ML across data-driven workflows to help build rich, end-to-end data-centric applications.

In fact, over 4M databases have

been created in Firestore to

power mobile and web

applications.

We're providing integration with

Vertex AI, our AI/ML platform.

This enables model inferencing

directly within the database

transaction.
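As a rough illustration of the kind of data-driven application described here (not the announced in-database integration itself), the following Python sketch reads a document from Firestore and sends it to an already deployed Vertex AI endpoint for an online prediction. The project ID, collection, document, and endpoint ID are hypothetical placeholders.

```python
# A minimal sketch: read features from Firestore, then request a prediction
# from a deployed Vertex AI endpoint. Names and IDs below are placeholders.
from google.cloud import firestore
from google.cloud import aiplatform

PROJECT = "my-project"        # hypothetical project ID
ENDPOINT_ID = "1234567890"    # hypothetical Vertex AI endpoint ID

# Fetch a user's features from Firestore.
db = firestore.Client(project=PROJECT)
doc = db.collection("users").document("user-123").get()
features = doc.to_dict() or {}

# Call the deployed model for an online prediction.
aiplatform.init(project=PROJECT, location="us-central1")
endpoint = aiplatform.Endpoint(ENDPOINT_ID)
prediction = endpoint.predict(instances=[features])
print(prediction.predictions)
```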

In addition, we're announcing in

preview, Spanner integration

with Vertex AI.

This integration allows data

scientists to build their models

easily in Vertex AI and

developers to access those

models using the SQL query

language.

We also understand that data can

reside in third party databases,

too, and developers still want

those experiences to be fast and

easy to deploy.

So, when working with data-centric third-party stacks, such as the MEAN stack, JavaScript developers can quickly deploy MongoDB Atlas and Cloud Run, our serverless compute solution, with Google-provided Terraform scripts.

Read our related blog post to find the repo where you can grab this, with other tech stack combinations soon to follow.

And you can do this with the peace of mind that security is embedded every step of the way.

And everything I just described

was designed to help you focus

on what you love - writing code,

fast!

We are excited to create a

future together.

Thanks for joining us!

>> When you're ready.

>> Google's infrastructure

powers services for billions of

people.

>> And Google Cloud takes those

lessons from running these

services in order to deliver an

innovative and easy-to-use cloud

infrastructure.

>> Today Google Cloud helps you automate the lifecycle of your workloads.

>> In the future we'll use AI to

understand workload patterns.

>> Intelligently optimizing for

higher performance with lower

latency, cost and power

consumption.

>> Today we optimize our

infrastructure for AI/ML and

your data, ensuring that it is

accessible anywhere.

>> But we're not stopping there.

A new design manufacturing

process brings open source

agility to the world of silicon

by using a building block

approach.

>> Chips for a vast range of applications.

>> Sustainability is important

to us because our planet depends

on it and we will operate on

24/7 carbon-free energy by 2030.

Every email you send through

Gmail or every question you ask

to search and every virtual

machine you spin up across our

cloud will be supplied by

carbon-free energy every hour of

every day.

>> We are reinventing

infrastructure, where AI-based

automation will recommend the

most efficient design for your

workloads, based on your usage.

>> And we'll run them on Google's unique infrastructure, optimized specifically for you.

>> All of this delivered on the

cleanest public cloud in the

world.

>> This is our Next.

We can't wait to see what you do

with it.

[Music]

>> Welcome to What's Next for data scientists and analysts. I'm June Yang.
>> And I'm Sudhir Hasbe, Senior Director for data.

>> AI is here and is growing

fast.

Every day I have so many

interesting conversations with

our customers about the

opportunity of innovation with

data and AI.

>> Absolutely.

>> Hello, everyone. I'm Jason Sharples. And we have amazing announcements to talk to you about today. Let's get started.

>> The promise of data and AI is undeniable and a reality today for many organizations.

It is a ground-breaking era for

AI.

The last few years have led us

to a tipping point with AI

adoption, where we have seen the

impact of AI across more and

more organizations and more and

more use cases.

Organizations from many industries, with varying levels of ML expertise, are solving business-critical problems with AI, from creating compelling customer experiences, to optimizing operations, to automating routine tasks.

These organizations learn to

innovate faster and ultimately

get ahead in the marketplace,

but how did they get there?

AI is hard, fundamentally hard.

It is a challenge to manage data growth and complexity, and it is a challenge to prepare data for AI and ML usage.

How many experts do you need to

get your first model into

production?

What about the next set of use cases?

Finally, even if you have all

the pieces in place, the road

from prototype to production can

take months, if not years.

How can you scale faster

with confidence?

We've learned from Google's years of experience in AI development how to make the data-to-AI journey as seamless as possible, and we have poured this experience into our products and services.

In today's session, we will walk

through how our data cloud

simplifies the way teams work

with data.

Built-in AI and ML expertise and

capabilities are

designed to meet users where

they are, with their current

skills.

And finally, our infrastructure, governance, and MLOps capabilities help organizations leverage AI at scale.

And now, let me hand it over to

Sudhi to share more about data

cloud and how to overcome these

challenges.

>> Thank you, June.

Let's talk about the first

challenge you brought up.

Data complexity.

The problem is the unpredictable nature of data.

Data comes in all shapes and

forms, speeds and sources.

It's structured, semistructured

and unstructured.

Mostly it's ingested in batch today, but it is increasingly required to be transformed and utilized for real-time decision-making.

If you're not already there,

soon your company will find

itself in the center of

multicloud multiformat

multisource data ecosystem.

An ecosystem that, frankly, the monolithic and expensive architectures of the past were not built for.

We have designed our data cloud

to meet your needs of today and

tomorrow.

With our data cloud, you gain a

complete and unified data and AI

solution so you can manage each

stage of the data lifecycle,

from operational data to

analytical and intelligent

applications.

With our Data Cloud, AI and

Machine learning comes built-in,

helping you make full use of

your data, to improve insights

and automate core business

processes.

That's why today I'm proud to

announce the general

availability of BigLake, to help

you break the data silos and

unify your data lakes and

warehouses.

BigLake will support Apache

Iceberg, which is becoming the

standard for open source table

format for data lakes.

And soon, we'll add support for

formats including Delta Lake and

Hudi.

Our built-in support for Apache

Spark in BigQuery will allow

data practitioners to create

BigQuery stored procedures

unifying their work in Spark

with their SQL pipelines.

This open data ecosystem is at

the heart of our strategy.

As of today, over 800 software

partners power their

applications using our data

cloud.

And 40 data platform partners,

like DBT, Dataiku and Tableau,

support BigQuery through

certified integrations.

And, customer adoption continues

to grow fueled by ecosystem

initiatives like Data Cloud

Alliance.

Now, we know that 80 to 90% of

the data we have today is

unstructured.

This includes images, video, and

audio files.

Today, we're announcing support

for unstructured data in

BigQuery through object tables.

Object tables enable you to take

advantage of common security and

governance across your data.

You can now build data products

that unify structured and

unstructured data in BigQuery.

This makes BigQuery a one-stop solution to store, manage, and process all types of data at global scale.
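To make the object table idea concrete, here is a minimal, hedged sketch using the BigQuery Python client; the project, dataset, connection, and bucket names are placeholders, and the DDL follows the documented object table syntax at the time of writing.

```python
# A minimal sketch: create a BigQuery object table over unstructured files in
# Cloud Storage, then list the objects with SQL. Dataset, connection, and
# bucket names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

ddl = """
CREATE OR REPLACE EXTERNAL TABLE `my_dataset.product_images`
WITH CONNECTION `us.my-gcs-connection`
OPTIONS (
  object_metadata = 'SIMPLE',
  uris = ['gs://my-bucket/images/*']
);
"""
client.query(ddl).result()

# Object tables expose file metadata (uri, size, updated time, etc.) via SQL.
rows = client.query(
    "SELECT uri, size FROM `my_dataset.product_images` LIMIT 10"
).result()
for row in rows:
    print(row.uri, row.size)
```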

One example of a company making

their data work for them is ANZ

Bank, the second largest bank in

Australia.

ANZ uses Google Cloud to help

customers make better decisions

by analyzing aggregated data

sets and delivering powerful

insights to create personalized

customer experiences.

Manual operations that used to

take days now only take seconds.

So in a nutshell, to overcome data complexity challenges, we're bringing lakes and warehouses together with BigLake and additional data format support, and we're unifying structured, semistructured, and unstructured data workloads in BigQuery. Openness, unification, and trust are the foundation of an intelligent, data-driven organization.


And with that, I'll turn it over to June to share what we are doing about the AI skills gap in the industry.

>> Thank you, Sudhir.

It's amazing to see the progress

our teams have made here.

Now, let's talk about harnessing

the power of AI despite the

skill gap challenge.

According to the U.S. Bureau of

Labor Statistics, Data Scientist

is the number six fastest

growing job in the U.S. and the

race is on to hire.

This is a challenge for

organizations that are looking

to apply AI across their

business.

Google Cloud addresses this

challenge head-on by offering a

wide range of capabilities that

can increase the reach of AI/ML

to more users and help your data

scientists to achieve greater

productivity.

Organizations can start with our out-of-the-box APIs, like translation, transcription and

many more, where developers can

directly apply Google's

state-of-the-art AI to quickly

solve real-world problems

without the need to build AI

models themselves.

When you want to work with your

own data to build custom models,

Vertex AI offers a range of ML

tools to build, deploy, and

manage ML models.

This includes the ability to start from scratch to create fully custom models, or build on top of our existing models and fine-tune them for your particular needs.

Starting with BigQuery ML, a SQL interface that unlocks ML capabilities for data analysts with simple ways to build and train their models.
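As a hedged illustration of that SQL-only workflow, the sketch below trains and queries a simple BigQuery ML model from Python; the project, dataset, tables, and columns are hypothetical.

```python
# A minimal sketch of the SQL-first workflow BigQuery ML enables: train a
# logistic regression model and run predictions without leaving SQL.
# Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

# Train a simple classification model directly over a table.
client.query("""
CREATE OR REPLACE MODEL `my_dataset.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_months, monthly_spend, support_tickets, churned
FROM `my_dataset.customers`
""").result()

# Score new rows with the trained model.
rows = client.query("""
SELECT *
FROM ML.PREDICT(MODEL `my_dataset.churn_model`,
                (SELECT tenure_months, monthly_spend, support_tickets
                 FROM `my_dataset.new_customers`))
""").result()
for row in rows:
    print(dict(row))
```

Both the model and the predictions stay in BigQuery, so analysts can keep working entirely in SQL if they prefer.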

As you heard from Sudhir,

BigQuery will support

unstructured data, which means

BigQuery ML will also work with

unstructured data.

You can bring your own model or

use a model pretrained by Google

directly on BigQuery object

tables.

The results can be stored and

managed in BigQuery for further

analysis or deployed to Vertex

AI for realtime predictions.

The next training capability is

AutoML.

People can create custom models

quickly with their own data and

minimal data science expertise.

More and more, we are seeing that even expert data scientists want to start with a base AutoML-generated model and then fine-tune it to achieve greater accuracy with their own data. AutoML provides advanced models, trained by Google, as a jumping-off point for customization.
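For readers who want to see what that jumping-off point looks like in code, here is a minimal sketch of AutoML tabular training with the Vertex AI Python SDK; the project, bucket, CSV source, and column names are assumptions made for illustration.

```python
# A minimal sketch of training an AutoML tabular model with the Vertex AI SDK.
# Project, bucket, and column names are placeholders, and the dataset source
# is a hypothetical CSV in Cloud Storage.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

dataset = aiplatform.TabularDataset.create(
    display_name="churn-training-data",
    gcs_source=["gs://my-bucket/churn/train.csv"],
)

job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)

# The training budget is expressed in milli node hours (1000 = 1 node hour).
model = job.run(
    dataset=dataset,
    target_column="churned",
    budget_milli_node_hours=1000,
)
print(model.resource_name)
```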

Seagate, a partner for Google's

data centers, used Machine

Learning to predict recurring

disk drive failures.

With AutoML, they were able to achieve a precision of 98%, far more than the 70% to 80% achieved with their custom model.

For users who want more control over AutoML, we recently announced an exciting enhancement, AutoML Workflows, which lets you selectively modify each step in the model building and deployment process, offering even more control when working with AutoML.

We will start with structured

data and expand support to

unstructured data in the coming

months.

TabNet is a powerful algorithm

developed by Google researchers

that leverages neural nets for

tabular data.

Today we are adding TabNet to AutoML Workflows to make it easy for organizations to work at massive scale without

sacrificing explainability or

accuracy.

And finally, Custom Training

which provides the most

flexibility.

Data scientists and ML experts can work with their familiar open source frameworks to build, train, deploy, and monitor ML models five times faster with Vertex AI.

For some workloads, faster is

what it's all about.

I'm excited to introduce Vertex

AI Vision, a revolutionary

end-to-end computer vision

application development service,

that helps reduce the time to build computer vision applications from days to minutes, at a fraction of the cost.

With an easy-to-use

drag-and-drop interface and

pretrained models for common

tasks, Vertex AI

Vision is the one stop shop to

build, deploy, and manage

Computer Vision Applications.

From ingestion to analysis to storage, you can now easily create computer vision applications for any business need, from inventory management in retail, to improving the safety of plant workers in manufacturing, and even traffic management in large cities.

Plainsight is a leading provider of computer vision applications and solutions and an early adopter of Vertex AI Vision.

With the speed and cost

efficiency benefits, Plainsight

is already developing new

applications and revenue streams

that were previously not viable.

At Google, our goal is to make

products that are truly helpful

to everyone, whether it's

solving big problems or

providing assistance in everyday

life.

AI has already had a profound impact on our lives, and there's even greater potential to come.

This opportunity comes with a deep sense of responsibility to build AI for the common good, AI that benefits all people in society.

We prioritize the responsible

development of AI.

This includes testing to

mitigate bias, prioritizing

responsible use in product

design, providing transparency

and developer education.

We apply responsible AI to all

of our products.

We are committed to iterating

and improving, and will continue

to incorporate best practices

and lessons learned into the

products we build.

You have just seen a variety

of capabilities in Vertex AI to

develop AI/ML models.

We realize one size doesn't fit all, so we want to offer

organizations the choice to pick

the best tool for the job at

hand.

Putting powerful ML capabilities

in the hands of more people and

helping Data Scientists build

models faster means you can do

more with your data, fueling the

data driven transformation.

>> June, it's amazing what kinds of results users can get when you make technology, and especially AI and ML, accessible to more of them.

But the next challenge for

organizations with data and AI

in the hands of more users is

how do you scale with

confidence?

Vodafone, one of the largest

telecommunications companies in

the world, made a huge leap

forward with its AI

capabilities.

First, they unified their data

into a single data ocean in

BigQuery, establishing a single

source of truth and making data

accessible across their

organization.

This unleashed a huge number of

use cases and increased demand

for AI/ML capabilities.

So next they built AI Booster,

an internal ML platform powered

by Vertex AI.

Now, their AI development is 80%

faster, and more cost-effective,

all without compromising

governance and reliability.

So how did they get there?

Scaling data and AI across an organization first requires strong data governance and management; and secondly, in order to move from data to AI efficiently across use cases, organizations also need to streamline end-to-end processes, from preparing data to building, deploying, and maintaining machine learning models.

Let's start with unified

governance.

Our data cloud provides

customers with an end-to-end

data management and governance

layer, with built-in

intelligence to help enable

trust in data and accelerate

time to insights.

To further these capabilities,

we are announcing various

innovations to Dataplex, our

intelligent data fabric.

Dataplex helps organizations

centrally manage and govern

distributed data.

Today, we're introducing Data

Lineage so you can get complete

end-to-end lineage from

ingestion of data to analysis to

machine learning models.

Data Quality will enable you to gain confidence in your data, which is critical for accurate predictions. And more importantly, Dataplex is now fully integrated with BigLake, so you can now manage fine-grained access across the organization at scale.

Vertex AI integrations across

our Data Cloud streamline access

to data, all the way from

prototyping to production.

This brings me to the

centralized

MLOps capabilities in Vertex AI.

No matter how you train your

model, our platform can

register, deploy, and manage it

throughout its entire lifecycle.

Vertex AI Model Registry is now

GA, providing a

unified repository for all

models to help with version

control, labeling metadata, and

easily deploying for batch or

online predictions.
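A minimal, hedged sketch of that register-and-deploy flow with the Vertex AI Python SDK follows; the artifact path and serving container are placeholders, not a prescription.

```python
# A minimal sketch of registering a trained model and deploying it for online
# predictions with the Vertex AI SDK. The artifact path and serving container
# image are hypothetical placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Upload (register) a model artifact into the Vertex AI Model Registry.
model = aiplatform.Model.upload(
    display_name="churn-model",
    artifact_uri="gs://my-bucket/models/churn/",  # hypothetical artifact path
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

# Deploy the registered model to an endpoint for online predictions.
endpoint = model.deploy(machine_type="n1-standard-4")
print(endpoint.resource_name)
```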

Vertex AI Pipelines takes MLOps

to the next level with

serverless orchestration and

automation of your ML workflows.

Prebuilt pipeline components are available across data sources, like Dataproc and Dataflow, and training capabilities, including AutoML and BigQuery ML.
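To show the shape of such a workflow, here is a small, hedged sketch that compiles a trivial KFP v2 pipeline and submits it to Vertex AI Pipelines; the project, region, and staging bucket are hypothetical.

```python
# A minimal sketch of a Kubeflow Pipelines (KFP v2) pipeline submitted to
# Vertex AI Pipelines. The component does trivial work just to show the
# structure; project, bucket, and names are placeholders.
from kfp import dsl, compiler
from google.cloud import aiplatform

@dsl.component
def greet(name: str) -> str:
    return f"hello, {name}"

@dsl.pipeline(name="sketch-pipeline")
def sketch_pipeline(name: str = "vertex"):
    greet(name=name)

# Compile the pipeline to a spec file, then run it serverlessly on Vertex AI.
compiler.Compiler().compile(sketch_pipeline, "sketch_pipeline.json")

aiplatform.init(
    project="my-project",
    location="us-central1",
    staging_bucket="gs://my-bucket/pipelines",  # hypothetical staging bucket
)
job = aiplatform.PipelineJob(
    display_name="sketch-pipeline-run",
    template_path="sketch_pipeline.json",
)
job.run()
```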

And to maintain model

performance, Model Monitoring

and Explainable AI capabilities

help you detect skew and

interpret predictions.

We recently announced “Example

Based Explanations,” which help

you mitigate data challenges

such as mislabeled examples so

you can quickly identify

problems and improve model

performance.

Walmart's incredible digital transformation journey

illustrates what's possible for

organizations that choose our

data cloud.

Walmart's adoption of BigQuery has enabled them to unleash the potential of AI across the entire business, from predicting demand, to managing and stocking inventories, to optimizing the supply chain, to freeing up associates to focus more on serving customers.

In one case, they were able to

optimize processes and save $10

million of food waste every

week.

I love Walmart's journey.

>> Thank you, Sudhir.

It's great to see so many

exciting product announcements.

To learn more, we have many

amazing sessions for you and

invite you to watch all of them.

Here are our top sessions.

And for those of you looking to

get hands on, we are partnering

with The Drone Racing League to

bring you new immersive learning

experiences.

Visit drl.io/GoogleCloud to learn how you can work with the data to predict race outcomes and provide tips to enhance pilot performance.

We hope you enjoyed this

session.

Thank you so much for attending,

and we hope to see you soon.

Bye-bye.

>> Thank you.

[Music]

>> Hi, everyone and welcome to

Next.

I'm Andy Gutmans from Google Cloud, and I'm excited to share with you what's next for builders. Later, Scott Wong, VP of infrastructure at Credit Karma, will talk to me about their journey on Google Cloud.

Today, every organization on the planet is going through some form of digital transformation. At the heart of this are mission-critical, data-driven applications, and powering each of these applications are operational databases that must be reliable, resilient, available, performant, and safe for users.

At Google Cloud, our mission is

to accelerate every

organization's ability to

digitally transform.

A large part of that is helping customers innovate with a

unified, open and intelligent

data platform.

We focus on four key areas.

First a unified and integrated

data cloud for all your data.

Second a commitment to openness,

leveraging open source and open

standards.

Third, infusing AI and ML across

data-driven work flows and

lastly empowering builders to be

more productive and impactful.

Let's start with the first focus

area, creating a unified and

integrated data cloud for your

operational and analytical data.

At Google, our mission is to

make information universally

accessible and useful.

As evidenced by our most popular

globally available products like

YouTube, Google Search, Maps and

Gmail.

These products leverage a

uniquely integrated and scalable

data architecture.

We've taken these learnings and built them into Google Cloud, making it

the best place for all your data

workloads.

The way we've built our core

services such as Cloud Spanner,

Bigtable, AlloyDB and BigQuery

is truly differentiated.

The services leverage Google's

common infrastructure which is

unique in the industry.

Our highly scalable distributed storage system and disaggregated compute and storage allow us to provide industry-leading, tightly integrated operational and analytical data services.

Today, for example, Spanner, our globally distributed relational database service, processes over

2 billion requests per second at

peak.

It has more than six exabytes of

data under management and offers

up to five nines of

availability, which is

remarkable.

Bigtable, our fully managed NoSQL database service, processes over 5 billion requests per second at peak, has more than 10 exabytes of data under management, and offers up to five nines of availability.

These services offer

industry-leading availability,

scale and global reach.

Building on this, customers need to have easy movement of data within Google Cloud. We heard you, and that's why we announced, in preview, Datastream for BigQuery.

Datastream for BigQuery provides

easy replication of data from

operational database sources

such as AlloyDB, Postgres, and

Oracle directly into BigQuery.

This is special because we worked closely with the BigQuery team to develop an optimized integration that replicates data at low latency. Setup takes just a few simple clicks.

Datastream for BigQuery is Google's next big step towards realizing our vision for the unified data cloud, combining databases, analytics, and machine learning into one single platform.

But don't take my word for it.

Let my colleague, Gabe, show you

how easy it is to get started

with Datastream for BigQuery.

>> Creating a stream just

requires a name, unique ID,

region, source and destination.

Today we're capturing Postgres

to BigQuery.

One of the great things about Datastream is that it shows you the prerequisites in the UI, so you know what to prepare before streaming.

Connection profiles are used to

define your source and

destinations.

They represent the information required to connect to the database instance, like the host IP, username, and password.

We've got one ready to go for

the postgres source.

In the next step, I can customize which schemas and tables we want to bring over into BigQuery.

I'll grab two tables from our

employee's schema.

Destination profiles similarly

to the source can be created

beforehand.

We can define a prefix here so

it's easy to see which data is

coming from our postgres source.

One last validation check to be

sure we haven't missed anything.

Create and start and right away

I can start using the explorer

to see my data that's come

across into BigQuery.

>> Wasn't that simple?

And there's more.

To continue on this theme of

easy replication from

operational databases for use

cases like analytics,

event-based architectures,

compliance or archival, we're

announcing in preview Bigtable

change streams.

This capability joins our recently launched Spanner change streams.

With change streams, you can capture writes, updates, and deletes so that they can be replicated to downstream systems in real time.

You will see us help make your

journey on our data cloud

simpler as we continue to

provide out-of-the-box data

movement.

The second focus area is our

continued commitment to open

source and open standards for

increased flexibility and

portability without vendor lock-in.

We offer managed services that

are fully compatible with the

most popular open source engines

such as MySQL and Postgres.

We help manage the complexity of running databases to increase your team's agility and reduce risk, and we don't stop there.

We want to help you break free

from legacy proprietary

databases with expensive and

restrictive licensing.

And in the process help you

modernize to open standards and

open APIs in the cloud.

Postgres, an open source database, has emerged as the leading alternative to legacy proprietary databases because of its rich functionality, ecosystem of extensions, and enterprise readiness.

It's not surprising that

millions of users across the

industries have adopted

postgres.

We're focused on making Google

Cloud the best place to run your

postgres workloads.

We offer not one, not two but

three fully-managed services

that support the postgres

interface.

First, Cloud SQL for Postgres, an enterprise-ready, fully managed relational database service. You get the same experience as open source Postgres, with the strong manageability, availability, and security capabilities of Cloud SQL, and you can use the same service APIs to also manage your MySQL and SQL Server databases.

Cloud SQL is used by more than

90% of the top 100 Google Cloud

customers.
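As a hedged example of how an application typically reaches a Cloud SQL for Postgres instance from Python, the sketch below uses the Cloud SQL Python Connector with SQLAlchemy; the instance connection name, credentials, and database are placeholders.

```python
# A minimal sketch of connecting to a Cloud SQL for PostgreSQL instance from
# Python using the Cloud SQL Python Connector and SQLAlchemy. The instance
# connection name and credentials are hypothetical placeholders.
import sqlalchemy
from google.cloud.sql.connector import Connector

connector = Connector()

def getconn():
    # "project:region:instance" is the instance connection name (hypothetical).
    return connector.connect(
        "my-project:us-central1:my-postgres-instance",
        "pg8000",
        user="app_user",
        password="change-me",
        db="app_db",
    )

pool = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)

with pool.connect() as conn:
    version = conn.execute(sqlalchemy.text("SELECT version()")).scalar()
    print(version)

connector.close()
```

The connector handles authorization and encryption for the connection, so the application does not need to manage IP allowlists or SSL certificates itself.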

Second, AlloyDB, a fully managed Postgres-compatible database service ready for top-tier workloads.

In our performance tests, AlloyDB is more than four times faster than open source Postgres and two times faster than Amazon's comparable Postgres-compatible service for transactional workloads. It also delivers up to 100 times faster analytical queries than standard Postgres.

Open isn't just about our

technology.

It's also about developing an

open ecosystem of partners.

AlloyDB integrates with many leading technology solutions and has a fast-growing ecosystem of partners with expertise ready to support your deployments. And finally, we've also added a Postgres interface for Spanner, our transformative relational database with unlimited global scale, strong external consistency, and up to five nines of availability.

With the Postgres interface for Spanner, developers can take advantage of familiar tools and skills from the Postgres ecosystem. And to further democratize access to Spanner, we recently introduced a free trial to give builders an easy way to try out Spanner at no cost.

Get started building with

Spanner today.

With these capabilities, we've

made Google Cloud the best home

for all your postgres workloads.

And to make it easy for you to

take advantage of our open data

cloud, we have simplified our

migration approach with the

right methodology, tooling and

support to help accelerate your

journey.

Take advantage of the program

today.

The third focus area is around

how we are infusing AI and ML

across data-driven workloads.

We use AI and ML across data

technologies to make our

services more intelligent.

Capabilities such as Cloud SQL cost recommenders and AlloyDB autopilot help optimize performance and capacity for your databases.

In addition to infusing AI and ML into our databases, we're providing integration with Vertex AI, our AI and ML platform, to enable model inference directly within the database transaction. And I'm excited to announce today, in preview, the integration of Vertex AI with Spanner.

You can now use a SQL query in Spanner to call a model in Vertex AI.

With this integration, both AlloyDB and Spanner can call Vertex AI models using SQL within transactions, allowing data scientists to build their models in Vertex AI and developers to access those models using the SQL query language.
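Here is a hedged sketch of what that could look like from the Spanner Python client; the instance, database, registered model, table, and the exact ML.PREDICT column shape are assumptions based on the preview, not confirmed syntax.

```python
# A minimal sketch of calling a Vertex AI model from Spanner with SQL via the
# Python client. The instance, database, model registration, and table are
# placeholders, and the ML.PREDICT shape is an assumption based on the preview.
from google.cloud import spanner

client = spanner.Client(project="my-project")          # hypothetical project
database = client.instance("my-instance").database("my-db")

# Assumes a model named ChurnModel has already been registered in the database
# against a Vertex AI endpoint (done once via DDL by an administrator).
sql = """
SELECT customer_id, predicted_churn
FROM ML.PREDICT(MODEL ChurnModel,
                (SELECT customer_id, tenure_months, monthly_spend
                 FROM Customers))
"""

with database.snapshot() as snapshot:
    for row in snapshot.execute_sql(sql):
        print(row)
```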

All these AI and ML capabilities allow you to simplify management of your databases and enable builders to deliver intelligent applications.

Our final focus area is around

empowering builders to be more

productive with innovative,

one-of-a-kind developer

experiences.

Industry-leading services such as Cloud Firestore are loved by developers because of how fast one can build an application end to end.

More than 4 million databases

have been created in Firestore

and the applications power more

than 1 billion active end-users.

We've also been pushing the

envelope on database operations

with observability features across our key services.

We've introduced Cloud SQL query insights, made Cloud SQL cost recommenders generally available, and introduced Postgres system insights in preview.

Today we're excited to announce

the preview of security and

performance recommenders for

Cloud SQL.

These capabilities help builders

optimize their database

configuration.

Let's see our UX leader

demonstrate Cloud SQL insights

in action.

>> Insights helps you

investigate and detect

problematic queries and find the

root cause of the problem from a

single pane of glass.

System insights helps me

understand the overall health of

my databases.

I can immediately see that the

P99 of CPU utilization is at

100%.

In looking at the query latency

and CPU utilization graphs, I

can see regular latency spikes

indicating that there are

problematic queries causing high

CPU utilization.

To understand this further, I

can navigate to query insights.

Looking at the top level, query

insights dashboard, I am

immediately drawn to the

database's load graph.

It confirms that there are

several problems, including one

that started around 9:15.

The colors in the graph help me

see that there's an increase in

IO wait and even larger increase

in lock wait.

Traditional monitoring tools

only provide a query centric

view of performance.

Insights finds which application

code caused the problem.

For example, the tags table is

especially helpful to me as a

developer, since this

application was built using

Django's ORM rather than by

writing the SQL queries

directly.

Insights uses Sqlcommenter, an OpenTelemetry-aligned open standard that provides instrumentation to augment SQL from frameworks.

This tag for the controller and the route looks like it's the problem.

With this context, I can go look

at the source code now to

investigate further.

>> I hope you enjoyed the demo

and can now see how we aim to

make our database services easy

to use to help every builder

focus their energy on innovation

and differentiation.

We've talked through a lot

today, so now, let's hear from

Scott Wong, VP of infrastructure

at Credit Karma, to learn how they reduced operational burden and cost with Google Cloud.

We're glad to have you here

today.

>> Thanks for having me, excited

to be here.

>> Tell me, Scott, who is Credit Karma?

>> In 2007, Credit Karma was founded with the mission to be the technology

platform helping our members

achieve financial progress.

Over the last 15 years we've

built this around free credit

scores providing over 4 billion

credit scores to our consumers

across the U.S., UK and Canada.

Today nearly 130 million members

use our product.

Financial progress is much more than credit reports.

We're the go-to destination for everything related to financial goals, and to provide this, we deliver personalized, data-driven insights that help our members feel more confident about their major money decisions. At the center of these insights are our data models and data systems, all powered by Google Cloud services today.

Our cloud migration journey

started almost six years ago.

On the left-hand side is our

infrastructure and traditional

data centers before we moved to

cloud and right-hand side is our

current infrastructure in Google

Cloud services.

We moved to the cloud with our data warehouse, moving to BigQuery, and started methodically moving down the stack; our whole recommendations pipeline, Dataflow, Bigtable, and AI Platform were all part of that migration.

You can also tell in the middle

of the screen we have a reverse

ETL process.

Our user feature store in BigQuery serves those features into Bigtable, where they get scored by our model scoring service, all on GKE.

Those recommendations get to our members through our products, with 63 billion predictions in a day.

And today, we think there's even

more opportunity for future

innovation, with the possibility

of using Spanner and Vertex AI in our recommendation system.

>> So, how are you using Google's Data Cloud for key use cases, and what are some business benefits you achieved by moving to Google Cloud?

>> One, we focused on making our

data scientists as efficient as

possible and that meant

simplifying access to our user's

feature store as well as

deploying models as quickly as

possible.

To give you a sense of scale,

today we deploy over 700 models

a week, as opposed to pre-cloud

it was almost ten a quarter.

Additionally in experimentation,

we do over 7X more experiments

today than before Cloud and with

the help of Bigtable and

BigQuery, we have 10X more

features deployed daily through

our batch data.

These gains can be attributed to

our unified model training

powered by BigQuery, Bigtable,

data flow and Google's AI

platform.

>> Nice.

What makes Spanner appealing to

you for your operational

workloads?

>> We're considering Spanner for

our primary relational database

to reduce engineering toil,

drive higher efficiency and

reliability to our core mission

critical production databases

here.

We're a fast-growing company that needs to scale, and Spanner provides interesting advantages with its multiregion, global consistency, and five nines availability, offered natively through the product.

This would be one of our most complex migration processes, moving live production data into a new database tier, so it will take a lot of engineering effort.

We look forward to working with

the Spanner product team on this.

>> Thank you so much for that

conversation Scott.

I really appreciate you joining

us today.

>> Thank you for having me,

Andy.

I enjoyed being a part of Google

Next.

>> As spoken about today, the

future of data has endless

possibilities.

Tune into all our sessions at Next for more details on the announcements you heard today.

Thank you.

Enjoy the rest of Next.

[Music]

>> Hello.

I'm Sachin Gupta, VP for

infrastructure at Google Cloud.

The roles of enterprise architects and developers are

evolving.

Not only do you have to keep the

lights on, you're expected to

stay on top of the ever-changing

trends and technologies that create business value.

You have to do all of this while lowering costs and improving

performance so IT infrastructure

runs quickly and smoothly.

Your role is critical to a

successful transformation.

Adopting AI/ML and containers

is becoming vital to businesses

with more than 76%

of surveyed enterprises saying

that AI projects are their top

priority.

Meanwhile, security breaches are

so common these days that they don't even make the top news.

They are disruptive and costly

and can be avoided with the

right preventative methods.

After all, as Gartner says, 99%

of cloud breaches are due to

human error.

With Google Cloud, you can

innovate faster and more easily

While optimizing costs.

We know organizations like yours

still have a lot of

infrastructure to migrate, and

we are committed to helping you

migrate more securely and

efficiently.

Global customers and local

partners like Palo Alto

Networks, H&M, Major League

Baseball rely on

us to deliver scalable, high

performing, highly available

cloud infrastructure and

services.

A big area we're investing in is

the expansion of our global

footprint to meet the

unprecedented global customer

demand.

Today, I'll be discussing how we

are partnering with you to help

drive business value in three

key ways.

First, we're driving business transformation and achieving new outcomes with industry-leading AI/ML, unparalleled security, and modern infrastructure and solutions designed for your industry.

Second, we're helping you

optimize your workload

performance while reducing

costs.

And third, from migration to management,

our mission is to help you

unlock this value simply and

easily.

Customers come to Google Cloud

to transform and innovate.

Let me share a little more about

how we are driving this change

through AI leadership, invisible

security and cutting-edge

industry solutions.

AI is in our DNA, from

AI-powered search to YouTube

recommendations and Google

Assistant.

We have decades of experience

running scaled, diverse ML

workloads and industry-leading

AI infrastructure products and

solutions.

Wayfair is using Vertex AI to

forecast global customer demand

ensuring customers can quickly

access what they need, and to

automate and personalize

AI-powered customer support.

Salesforce is using performance

optimized cloud TPU v4 for

conversational AI.

These outcomes are made possible

because of the innovation across

our AI stack.

And it starts with hardware

choices and performance that

help you keep pushing the limits

of AI in large models.

Cloud TPU v4 delivers

industry-leading ML training

performance and scale.

With 6 TBps interconnect, you

can run large-scale training

workloads up to 80% faster and

up to 50% cheaper compared to

alternatives.

That's how companies like Cohere

deliver cutting-edge natural

language processing faster and

with a lower carbon footprint.

We're also announcing new

A2ultra GPUs, built on Nvidia's

A100 80GB GPUs with highspeed

memory.

AI Singapore has reduced the

loading time of large-scale

language models by 40% and

increased throughput by over 50%

with A2+, resulting in increased

productivity.

Customers are also using Google

Batch to orchestrate and

schedule AI jobs of any scale.

With Batch, our customer

Locomation was able to unlock AI

insights from their autonomous

trucks 80% faster.

Google is committed to making AI

and machine learning more open

and accessible.

To further this, in partnership

with Meta, we recently cofounded

the PyTorch Foundation.

And for over a decade, we've

contributed to critical AI

projects like TensorFlow and

JAX.

Today, we are announcing a new

industry consortium, the OpenXLA

Project, that will unite an

ecosystem of leading machine

learning compiler technologies,

and accelerate and simplify

machine learning innovation.

These open source AI

contributions enable you to take

your AI idea and turn it into

reality, easily, and at low

cost.

Next, I want to share how we're

transforming security.

At Google Cloud, we are

championing a future of

Invisible Security, where

security is engineered in, and

operations are simplified.

We package the expertise that we

use to protect our own business

and our billions of users and

make it available to you.

You can easily deploy a wide

range of tools depending on your

own risk profile from

prevention, to detection, to

remediation.

Today, I want to highlight the

next step in our cybersecurity

journey as we welcome Mandiant

to Google Cloud.

By taking advantage of Google

Cloud's existing security

portfolio, our Google

Cybersecurity Action Team, and

Mandiant's leading cyber threat

intelligence, you can stay

protected at every stage of the

security lifecycle.

Cloud Armor is another security

innovation that provides

advanced ML-powered DDoS and WAF

protection for web apps,

services, and APIs.

It has prevented some of the

largest DDoS attacks on the

planet with zero impact to

customers.

Recently, the largest HTTPS

attack was staged against a

Cloud Armor customer.

It was 76% larger than anything

previously reported the

equivalent of Wikipedia's daily

requests in 10 seconds, and the

customer experienced no impact.

And for regulated industries,

with stringent and

country-specific requirements,

we offer controls to meet your

digital sovereignty objective.

Sovereign Controls allows you to

define the location of your core

data, set access permissions,

and control your cryptographic

keys.

Supervised Cloud, which is

coming soon, is a fully

partner-managed and operated

solution that supports data,

operational sovereignty needs,

and country or region-specific

regulatory requirements.

For highly sensitive workloads

that require the most stringent

security requirements, Hosted

Cloud offers air-gapped hardware

and software with managed

infrastructure, AI/ML, and

database services.

Since transformation

takes different forms for

different industries, we partner

with customers to build

industry-leading, innovative

solutions.

Together with the CME Group, we

plan to transform the

derivatives market through

technology, expanding access,

and creating efficiencies for

market participants.

In Telecom, Communication

Service Providers like Bell

Canada rely on Google's network

to expand globally and deploy 5G

networks with Google Distributed

Cloud Edge.

GDC Edge GPU-optimized configurations bring the power of machine learning to the edge to enable the future of retail.

Customers and partners such as 66degrees, AWM Smart Shelf, and Ipsotek are using GPU optimization to deliver innovative retail solutions at the edge, including AR in the store, shelf stock-out notifications for quicker restocking, and cashierless checkout to reduce lines.

In media and entertainment, we provide solutions to customers like U-NEXT, with streaming built on the same Google infrastructure we've tested and tuned to serve YouTube's 2 billion users globally.

To get a better picture of how our media and entertainment industry customers innovate with Google Cloud, I'm proud to introduce the Senior Vice President of Technical Infrastructure at Major League Baseball, Truman Boyes.

>> He's got it!

>> Major League Baseball's technology mission is to connect with our fans. Part of the infrastructure team has historically maintained applications on prem, and now we have unlimited compute from the public cloud. This allowed us to shut down four data centers, modernize all of our infrastructure, and spin things up rapidly in the offseason. Google Cloud helps us to understand the entire fan journey, and artificial intelligence allows us to derive a better connection to them. We're able to have personalized content with that fan, and it gets richer over time as we learn more about them. Working with Google, we're preserving historical footage and able to make the highlights available to our fans. Google Cloud hosts all of the video clips for us, and now we have an opportunity to enrich this every day. We're looking to modernize the entire platform that we have and move video delivery through Media CDN. Major League Baseball and Google Cloud are connecting with our fans; the experience is happening in the venue as well as in the digital experience, and we're knocking it out of the park.

>> Thank you so much, Truman. It is great to see such innovation for the fan experience, leveraging Google technologies like AI, Media CDN, and the reliability and elasticity of our global infrastructure. We've shared a number of ways in which we've built our infrastructure to enable transformation. But we also continue to build solutions and products tuned to support your top workloads and data applications. And we've optimized these for both performance and cost. One example of this is Google Cloud VMware Engine. VMware Engine is a fully managed, native Google service that helps you lift and shift your VMware applications to Google Cloud faster and easier. We are the first external provider to support VMware's Cloud Universal program, which makes it easier for you to migrate to the cloud. And with built-in point-and-click migration tools and our instant provisioning feature, you can get workloads running in your Private Cloud in less than an hour.

Our HPC solutions offer blueprints and broad support for

third party components such as

the Slurm scheduler, Intel DAOS,

and DDN Lustre storage.

Next, I'm really excited to

announce C3 VMs, the first VMs on the market to feature the latest generation of Intel Sapphire Rapids processors, built on new Intel-Google co-designed Infrastructure Processing Units, or IPUs.

All of this together means

differentiated performance,

security, isolation and

flexibility.

C3 is the first VM in our fleet

with 200 Gbps low latency

networking to support a variety

of workloads such as data

processing, web serving, and

high throughput HPC workloads.

Because clusters can be scaled

and parallelized more densely,

we're seeing customers and

partners like Ansys, and

Snapchat completing jobs faster.

And Parallel Works is seeing 10x

faster performance with C3

compared to the prior

generation.

Contact your sales rep to join

our private preview.

Moving on to another product

built to leverage the IPU,

Google Cloud Hyperdisk is the

next generation of block

storage, which will be available

on both Compute Engine and GKE.

We are decoupling block storage

performance from the VM,

allowing you to tune your

storage performance to your

workload needs.

We estimate you'll see around 50% lower total cost of ownership compared to Persistent Disk, and 80% higher IOPS per vCPU compared to any other hyperscaler.

We have built cost optimization

into many of our core products,

and we have new exciting

capabilities to announce.

Our new Flexible Committed Use

Discounts or Flex CUDs, can make

it easier to save and manage

costs across teams by giving you

region and VM family

flexibility.

With Autoclass, customers like

Redivis are reducing storage

costs and achieving better price

predictability in a simple and

automated way.

It automatically transitions

Objects to cooler storage based

on the last time they were

accessed, and transitions to

standard storage upon access.

That brings us to the third way

we drive business value -- ease

of use.

As cloud platforms have become

more versatile, they often have

Also become more complex to

adopt and operate.

That's why Google Cloud strives

for radical simplicity, from

migration through management.

Speaking of migration, our new

Migration Center can reduce

complexity, time, and cost by

providing key capabilities

In migrating and modernizing to

virtual machines, containers or

serverless computing.

With Migration Center, Viant, a

large media company, in

partnership with Slalom,

successfully migrated an entire

datacenter to Google Cloud in

less than six months.

We also have a new offering in

our Mainframe Modernization

solution called Dual Run.

Dual Run lets you replicate your

mainframe workload in Google

Cloud and run the two

environments in parallel.

This allows you to confirm

successful operations in Google

Cloud before your cutover, which

can massively reduce risk.

That's why customers and

partners across industries, like

financial services company

Santander, are seeing success

with Dual Run.

We also want to simplify the way

you manage and scale.

Managed Instance Groups or MIGs

with autoscaling use application

metrics to radically simplify

and improve operational

efficiency, allowing you to

scale in and out without manual

intervention.

And with the power of Google's

ML, MIGs can predictively scale

in and out based on historical

data.

These three defining pillars for

Google Cloud Infrastructure

transformative, optimized, and

easy are the tenets behind our

intentional engineering efforts.

This is why so many customers

trust Google Cloud, and what

helps to power such innovation

across the industry.

I invite you to try Google Cloud

and our innovative new releases.

We look forward to delighting

you.

Thank you.

[Music]

[Music]

>> My name is Amar Gandhi, Senior Director of Product Management.

I'm joined by my colleague

Jerome Simms and we're sharing

new products and capabilities

we're introducing at Next.

>> I'm Jerome Simms focused on

Google's DevOps portfolio.

It's great to share exciting

updates.

Amar will kick off the session.

>> At Google Cloud, our mission

is to accelerate every

organization's journey to

digitally transform the

business.

When it comes to DevOps we serve

customers of all sizes and

types.

While some of you are early on

your journey, some are way ahead

of their peers.

Regardless of where you are on your DevOps journey, we want to help.

First, Gordon Food Service, the

largely family operated food

distribution can. In North

America.

With CD they've increased 4

times a year to 2,900 times a

year, that's a huge jump and

Lowe's, America's leading

retailer in home improvement.

I go there every other weekend.

They went from doing just one

release every two weeks to over

20 releases every single day.

Or Vodafone, which I'm sure most of you are familiar with, one of the world's leading telecommunications companies.

They used Vertex AI and DevOps

services to build a cutting edge

AI/ML platform to enable next

generation AI use cases for

their customers.

We partner with companies of all

sizes to enable such

transformations.

Our goal is to make DevOps on Google Cloud easy for your organization as well, and to fulfill this mission we focus on four key areas, which address the key challenges we hear from these customers every day.

First is security.

Security in DevOps has become

increasingly critical and we are

seeing more and more hackers

preying on the security

vulnerabilities of your software

supply chain today.

Software supply chain simply put

is a jumpy that your software co

takes from development all the

way to production.

Software supply chain attacks

are on a sharp rise in recent

years.

Gartner predicts that 45% of

organizations worldwide will

have experienced a software

supply chain attack by 2025.

Number two, multicloud.

More and more organizations are adopting multicloud today for many reasons, including the need for distributed applications, data sovereignty, security, compliance, et cetera.

This is not easy.

How do we ensure efficiency,

security and consistency as we

develop, deliver and deploy

across multiple clouds?

This becomes a critical

challenge.

Number three, sustainability.

Given global climate change, all organizations are rising to the challenge.

Virtually every team in every

enterprise today is looking at

how it can help their

organization reach their carbon

emission targets.

Now Google has long been a

pioneer in achieving

sustainability in our internal

operations and now we want to

provide tools to support

sustainable development and

operations for all

organizations.

And finally, integrating and

scaling your DevOps toolchain.

This is still a challenge for

many organizations.

A typical DevOps toolchain can consist of many open source and commercial products spanning multiple areas.

This can lead to large, complex

and fragmented toolchains that

are very difficult to integrate

at scale.

At Google Cloud, we're working

diligently on integrating our

DevOps toolchain within the

broader ecosystem, and we also

want to support you to run your

DevOps tools on Google Cloud

with ease and scale.

This year, we are launching many

new products and capabilities

across all of these four areas,

and now, I'm going to hand over

to my colleague, Jerome, to tell

you more.

>> Thanks, Amar.

Let's dive into the details of our announcements today.

Software supply chain security is becoming an increasingly critical concern for many DevOps teams, and to help you better protect your software supply chains, we're very excited to bring you Software Delivery Shield, a comprehensive yet modular set of capabilities that spans a set of Google Cloud products, delivering a fully managed, end-to-end solution to help protect your software supply chain.

It starts by helping to protect your applications in the local developer environment, enhances the security posture of your software supply chain, builds a more secure CI/CD pipeline, and finally protects your application once it is deployed to production. On top of that, we let you establish, maintain, and verify a chain of trust along your supply chain through policy enforcement.

At Next this year, we're

introducing new capabilities

across many of these areas.

First, shifting all the way left

to help you better secure your

applications during development,

we are launching Cloud Workstations, a new service which provides a fully managed development environment on Google Cloud, with built-in security measures.

If you're worried about source code exfiltration or privacy risks, Cloud Workstations allows you to limit access to sensitive resources or the public internet, or even use a fully private gateway.

If tomorrow you catch a vulnerability in your base image, with Cloud Workstations' forced image update your developers will automatically have the update reflected in their own local environments the next day. With Cloud Workstations, you can be much better off in terms of securing your local development environments.

More than that we are also

giving your developers tools to

help them code faster with

greater security.

With Cloud Code source protect, developers get real-time security feedback as they work in their IDE, such as identification of vulnerable dependencies and licensing information.

This quick and actionable

feedback can allow developers to

promptly make corrections to

their code at the beginning of

the software development

process, thereby saving hours of

time that would otherwise be

spent in costly future fixes.

When your developers are coding in Cloud Workstations, Artifact Registry and Container Analysis give them a secure place to store and manage their images and language packages, and also scan them for vulnerabilities.

We are adding more language

support for vulnerability

scanning.

You can now do on-push scanning for Maven and Go packages in containers, and for non-containerized Maven packages as well.

To help you improve the security of your open source dependencies, our Assured Open Source Software service provides a trusted source for you to access open source packages. It provides over 250 packages across Java and Python. These packages are built in our own secured pipelines, are regularly scanned, analyzed, and fuzz-tested for vulnerabilities, and also include verifiable SLSA build provenance.

SLSA stands for Supply-chain Levels for Software Artifacts. It is a framework that brings industry-recognized best practices to software supply chain integrity.

Continuing to help secure your pipelines, I'm really excited to announce that Cloud Build, our fully managed continuous integration platform, now supports SLSA Level 3 builds. In addition to providing an ephemeral and isolated build environment, Cloud Build generates authenticated and non-falsifiable build provenance for containerized applications and non-containerized Java packages, and displays security insights for built applications.
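As a hedged illustration of how such provenance can be consumed downstream (a minimal sketch; the field names follow the public SLSA provenance layout, while the expected builder ID and repository are placeholders you would substitute for your own policy):

```python
import json

EXPECTED_BUILDER = "https://example.com/trusted-builder"        # placeholder
EXPECTED_REPO = "https://github.com/example-org/example-app"    # placeholder

def provenance_is_trusted(provenance_json: str) -> bool:
    """Check a SLSA-style provenance statement against an expected builder and source repo."""
    statement = json.loads(provenance_json)
    predicate = statement.get("predicate", {})
    builder_id = predicate.get("builder", {}).get("id", "")
    materials = predicate.get("materials", [])
    from_expected_repo = any(m.get("uri", "").startswith(EXPECTED_REPO) for m in materials)
    return builder_id == EXPECTED_BUILDER and from_expected_repo

# Example: reject an artifact whose provenance names an unknown builder.
sample = json.dumps({"predicate": {"builder": {"id": "https://example.com/other"},
                                   "materials": [{"uri": EXPECTED_REPO}]}})
print(provenance_is_trusted(sample))  # False
```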

Finally, to help secure the runtime environment, we're introducing a new set of security features in GKE. GKE can now help assess your container security posture and give you active security guidance. It also includes many out-of-the-box security capabilities. How about we look at a demo?

>> I'm Victor Vzalvay, a product manager at Google Cloud.

GKE's new security posture

management capability provides

foundational Kubernetes security

for your clusters by analyzing

your workloads, and this

includes things like

configuration concerns, so like

your pod spec security settings.

It also looks at your images,

your running images and scans

them for vulnerabilities on a

daily basis.

So if I drill into this report,

I have all sorts of ways I can

slice and dice.

So for example, I can look at it from a workload perspective, by namespace, and so forth.

I can filter by severity, things

like my critical and highs and

get a report of just the things

that I want to prioritize and

address most immediately.
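The same triage can be done programmatically once findings are exported somewhere queryable; a minimal sketch (the finding records below are made up purely for illustration):

```python
from typing import Iterable, List

def urgent_findings(findings: Iterable[dict],
                    levels=("CRITICAL", "HIGH")) -> List[dict]:
    """Keep only the vulnerabilities worth addressing first."""
    return [f for f in findings if f.get("severity") in levels]

sample = [
    {"workload": "checkout", "package": "zlib", "severity": "CRITICAL"},
    {"workload": "frontend", "package": "busybox", "severity": "LOW"},
]
for finding in urgent_findings(sample):
    print(finding["workload"], finding["package"], finding["severity"])
```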

In this case, I have a vulnerability in zlib, and it gives me specific remediation guidance. One of the things that stands out with posture management for GKE is that it gives you direct insights into where these things are happening in your system: the affected workloads, how to remediate them, and so forth.

If I slice this by workload, I can see what's affecting each one. In this case there are a number of vulnerabilities, and in my service workload I have a configuration concern, so I can go in and get specific remediation instructions directed right at this particular issue for this workload, so I know how to address it. It's not just a vague concern.

And of course, I can then just

go in and update it in my pod

spec, and make sure that I'm

running with the best security

possible for my containers and

for my applications.

>> With these many features in

GKE, we're helping to make

security easy for every customer

who is using our fully-managed

Kubernetes services.

For customers on Cloud Run, our serverless platform, we're introducing new enhancements to the Cloud Run security panel. It now displays software supply chain security insights such as SLSA build level compliance information, build provenance, and vulnerabilities found in running services.

When your developers are building applications, oftentimes they will need databases.

I'm happy to introduce Cloud SQL Security Recommender, powered by Active Assist. Cloud SQL is the fully managed relational database from Google Cloud; with Security Recommender it can automatically monitor the security posture of your databases, alert you to potential security vulnerabilities, and also provide guidance to help mitigate the risks.
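If you prefer to pull findings like these programmatically, the Recommender client library can list recommendations for a project. A sketch under assumptions: the recommender ID string below is a placeholder you would replace with the Cloud SQL recommender documented for your project, and the project ID is hypothetical.

```python
from google.cloud import recommender_v1

def list_security_recommendations(project_id: str, location: str = "global"):
    """List recommendations surfaced by a (placeholder) security recommender."""
    client = recommender_v1.RecommenderClient()
    recommender_id = "REPLACE_WITH_RECOMMENDER_ID"  # placeholder, see product docs
    parent = (f"projects/{project_id}/locations/{location}"
              f"/recommenders/{recommender_id}")
    for rec in client.list_recommendations(parent=parent):
        print(rec.name, "-", rec.description)

# list_security_recommendations("my-project")  # hypothetical project ID
```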

Today, more and more DevOps

teams are being asked to support

multicloud deployment.

To make multicloud easier for

you, we're introducing a set of

new features to our Anthos

platform.

Anthos is a cloud-centric

container platform to run modern

apps anywhere consistently at

scale.

With the newly introduced features, Anthos customers can now enjoy a unified management experience from a single Google Cloud console, and can drive consistent security, governance, and observability across a fleet of clusters spanning all environments, whether on-prem, hybrid, or multicloud.

In addition, Anthos now supports VM deployments for your edge environments, so customers can run at the edge using a common platform that supports both containers and VMs.

Sustainability has been a core value of Google from the very beginning. I'm pleased to announce that Carbon Footprint is now generally available. Carbon Footprint introduces a new level of transparency to support you in meeting your climate goals.

Let me invite my colleague

Cynthia to show you a quick demo

of carbon footprint.

>> Thanks, hi, everyone.

My name is Cynthia, product

manager of Google Cloud carbon

footprint.

You can access Carbon Footprint from the console navigation under Tools. There you can see the carbon footprint associated with your GCP usage. We use granular, machine-level energy consumption data coupled with hourly emissions factors, which is then apportioned to each customer based on usage. For more detail, you can see the measurements broken down into scopes 1, 2, and 3, all of which follow the Greenhouse Gas Protocol carbon reporting and accounting standards. You can also see a monthly view, together with breakdowns by project, product, and region.
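To make the usage-based apportionment concrete, here is a toy sketch of how one machine's emissions might be split across customers by their share of usage (illustrative only; this is not Google's actual methodology or data):

```python
from typing import Dict

def apportion_emissions(machine_kgco2e: float,
                        usage_by_customer: Dict[str, float]) -> Dict[str, float]:
    """Split one machine-hour's emissions across customers in proportion to usage."""
    total_usage = sum(usage_by_customer.values())
    if total_usage == 0:
        return {customer: 0.0 for customer in usage_by_customer}
    return {customer: machine_kgco2e * usage / total_usage
            for customer, usage in usage_by_customer.items()}

# One hour on a shared machine emitting 0.8 kgCO2e, split by vCPU-hours used.
print(apportion_emissions(0.8, {"customer-a": 6.0, "customer-b": 2.0}))
```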

Google invests in enough renewable energy and carbon credits to neutralize all of our operational greenhouse gas emissions, so the net operational emissions associated with your Google Cloud usage are also zero. Beyond this dashboard, you can drill down into the data by scheduling an export into BigQuery, which you can then save to Google Sheets or use to build your own dashboards in Looker or Data Studio. If you have it enabled, from either the Carbon Footprint UI or the Recommendations Hub you can review projects identified as idle, which will not only reduce your carbon footprint but also help you save costs.

>> With Carbon Footprint, we believe organizations can achieve much greener operations on Google Cloud. As previously discussed, integrating and scaling your DevOps toolchain is a challenge for many DevOps teams. To help with that, I'm happy to announce that Managed Service for Prometheus is now generally available. It offers a fully managed, easy-to-use monitoring service based on open source Prometheus, with the speed and scale brought to you by Google Cloud. With this service you no longer need to federate or add resources manually.

You can focus on scaling your

business and not Prometheus.

In addition, to make continuous deployment easier for you, we've added an integration between Cloud Deploy and Cloud Run, our leading serverless runtime environment. With this integration in place, you will be able to do continuous deployment through Cloud Deploy directly to Cloud Run, with one-click approvals and rollbacks, enterprise security, and built-in delivery metrics.

Next, log information is very useful for DevOps teams, and to make the use of logs easier on Google Cloud, I'm excited to announce Log Analytics, a new feature of Cloud Logging. Through an innovative integration with BigQuery, Cloud Logging now allows your DevOps teams to get more value out of logs through the power of SQL queries. With your logs readily accessible in BigQuery, you can also leverage BigQuery's machine learning for more advanced use cases.
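As a hedged example of the kind of SQL this enables once a log bucket is queryable in BigQuery (the project, dataset, and view names below are placeholders for your own linked dataset; severity and timestamp are standard LogEntry fields):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder project/dataset/view for a linked Log Analytics dataset.
QUERY = """
SELECT severity, COUNT(*) AS entries
FROM `my-project.my_log_dataset._AllLogs`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
GROUP BY severity
ORDER BY entries DESC
"""

# Print a one-hour breakdown of log volume by severity.
for row in client.query(QUERY).result():
    print(row.severity, row.entries)
```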

Well, everyone, that's everything I have today. Our teams worked super hard to bring you all these new products and capabilities, and I hope you are as excited about them as I am. Now let me pass it back to Amar.

>> Thanks, Jerome, for sharing these exciting announcements with us. With these capabilities your organization can adopt more secure, intelligent, and sustainable DevOps practices.

If you want to learn more, I'm

sure you do.

We invite you to check out the

sessions in the track.

Our subject matter experts will

take you through them in more

detail.

And last but not least, we have just released the 2022 edition of the State of DevOps report. You can download it by scanning this QR code, or from the additional resources section down below.

Thank you, and on behalf of Jerome and myself, we hope you have a great Next 2022. ♪

[Music]

[Music]

>> Hello, everyone. Thank you for joining me and the entire Google Cloud team today, and a special welcome to what's next for security professionals. My name is Sunil Potti, and I'm VP and general manager for Google Cloud Security.

As many of you know, organizations large and small are realizing that digital transformation and the changing threat landscape require a ground-up security transformation. Attackers' techniques and procedures have evolved and shifted, and their desired outcomes have changed. Long gone are the days of a limited number of malicious nation-state actors targeting only specific governments or critical infrastructure. These days the new normal is persistent attacks and off-the-shelf attack tooling leveraged by sophisticated threat actor gangs and nation states, primarily focused on financial gain and business disruption across the mainstream enterprise, from the midmarket credit union to the very largest Fortune 500 enterprises.

To tell us more directly from the front lines, I'd like to welcome Sandra Joyce, VP of Intelligence and Government Affairs. It's my great honor to introduce her and to welcome the entire Mandiant family to Google since the recent acquisition. Sandra?

>> Thanks, Sunil. Greetings to our audience at Google Next. As a new member of the Google Cloud family, Mandiant brings expertise in threat intelligence and consulting to double down on Google's commitment to security. In threat intelligence, we're always vigilant, tracking threat actors across the cyber domain as they seek to spy on, steal from, and sabotage the networks of organizations around the world.

While cyber attacks used to play out completely behind closed doors, the threat has changed, and we're seeing an enormous amount of activity in full public view. State and criminal adversaries aren't just quietly hacking victims; they're creating public spectacles designed to undermine the credibility of institutions and companies. Despite rumors to the contrary, ransomware is not dead. Those actors are still going strong, but the nature of their activity is always changing.

Criminals simply need to find some way, any way, to compel victims to pay. They're undermining the businesses they target, and they won't stop at leaks: we've seen these criminals reach out to partners, customers, or even the media to garner interest and create public pressure on the victim. Unfortunately, many businesses find themselves in the impossible position of having to decide between preserving their data and acquiescing to threat actors. Nation-state actors are playing a very similar game; a recent major attack on Albania included network disruption and leaked information, similar to what you might see in many criminal cases.

These governments are taking a page out of the cybercriminal playbook. But not all cyber activity is so straightforward: information operations seek to target the hearts and minds of their audience, and threat actors use the cyber domain to carry out these types of campaigns. The information operations we see are designed to attack institutions like governments, alliances, or even democracy itself.

We've also seen nation states use information operations to target competing companies. For instance, an information operations campaign we call Dragon Bridge has been posting on social media as residents living near a mineral processing facility; these fabricated online personas complain about the facility in order to stifle competition, using influence operations to bolster their country's market share while attacking competitors.

A lot of things drive threat actors to their targets. Some victims are targets of opportunity, compromised simply because access was available. The supply chain has already proven to be an effective means of gaining access to downstream victims, and aggregated access has been abused by criminals and nation states to great effect, targeting companies to gain access to their customers. In Ukraine, broad access has been abused to great effect in destructive attacks. What kind of big data might interest an adversary? Data that might be used to track people, for instance: we've seen threat actors compromise hospitality, airline, and other travel resources to track people of interest.

One threat actor we track has a history of targeting people directly with spear phishing attempts, but they also target organizations that hold data on their victims, a potentially more fruitful and efficient means of doing business. Another threat actor targets dissidents, activists, journalists, and academics who are critical of that country's activities. It is our mission to ensure that these activities are called out, and to provide defenders the tools and the intelligence they need to detect, block, prioritize, and respond to threats.

Mandiant and Google Cloud share a strong commitment to security and will work together to keep customers, defenders, and the entire global community safe. Back to you, Sunil, thanks so much.

>> Thanks, Sandra. We're so excited that you and the Mandiant team, alongside the threat intel team, are joining Google Cloud and share our mission to reinvent how enterprises detect and respond to threats and incidents. Mandiant's products, services, and expertise will all combine to enhance our Google Cloud security portfolio and amplify our joint mission to keep customers safe. Here at Google Cloud, we continue to champion invisible security to help you move from today's reality, where security is bolted on as an afterthought, to your future, where cloud security is engineered in, operations are simplified, and shared responsibility evolves to a model of shared fate, where the cloud provider has skin in the game. You might ask, why invisible security now?

As you heard from Sandra, mitigating advanced and persistent threats can be difficult for enterprises if they don't have the resources, the talent, or the security engineering capabilities of a Google or a handful of other cutting-edge organizations, which is ultimately what keeps some of these actors at bay. So it begs the question: can mainstream enterprises ever be protected unless they can be like Google?

Imagine if enterprises of all sizes could run on the same cloud, use the same tools, and apply the same best practices that protect Google. That's essentially what we're doing with Google Cloud. So first, we're helping enterprises become like Google by providing the industry's most trusted cloud. At the same time, knowing that most enterprises will take a while before they fully adopt cloud, we're bringing the best of Google to the enterprise with security solutions for on-prem, private, and multicloud environments.

And we are helping organizations address their top-of-mind security initiatives across a variety of dimensions, starting with cloud governance and digital sovereignty. As most of you know, digital sovereignty has become top of mind. Many of you operate internationally and face global regulations and many unique compliance requirements across a wide set of regions. The goal of Google Cloud's approach is putting control in your hands, above and beyond data location: protection from external access, predefined residency controls, and Assured Workloads. And as most of you know, we've been focused on sovereign cloud through trusted partnerships; in Germany we embarked on a strategic alliance, with many more to come.

Sovereignty is key, and managing and understanding cloud posture and risk is essential to a wholesome experience in the cloud. We help teams understand their Google Cloud security posture and risk profile by incorporating world-class innovations, starting with groundbreaking technology from an acquisition into Security Command Center. With this new addition you can now fully understand your attack posture: you can prioritize and contextualize vulnerabilities, and it allows us to provide attack path simulations, so you can apply targeted actions before attackers take advantage of high-risk vulnerabilities.

Another area that needs to be reimagined by the entire security team is security operations, across all environments, cloud and on-premises. On this journey, our new Google Cloud security operations solution converges security operations capabilities, so security teams can now pivot faster and manage alerts more effectively with our best-in-class tooling.

In addition, Mandiant's leading incident response services, the threat intelligence gained from the front lines, and the Mandiant Advantage platform will all collectively help us accelerate security operations transformation. This combined approach will help organizations move beyond just modernizing security operations to a state of proactive cyber defense, which we believe is ultimately the future of security operations.

And now, to tell us more about how leading organizations are transforming security, it's my great pleasure to welcome a friend, great partner, and customer: Bashar from Schwab. Over to you, Bashar.

>> Thanks, Sunil. Transforming security, for me, is all about how security can be a business enabler, while making sure the team embraces change and leverages all the cloud-native security controls available to us. We're really focused on three key areas: security transformation to support business growth, zero trust by default, and threat detection and response going cloud native.

Now let's dig a little deeper into securing our cloud transformation.

We aren't taking more risk just

because we're embracing the

cloud.

Our risk appetite has largely

stayed the same, what has

changed is the how.

Now what does that mean to us?

It means that just because we

used to do things certain way

mostly in legacy data centers

doesn't mean we should do them

the same way in the cloud.

Yes, we need to stay true to our

risk appetite, and also need to

use this as an opportunity to

innovate, to champion and

embrace change, to do things

differently, if and where it

makes sense.

To use the power of hyperscale

cloud infrastructure, cloud

native controls and AI and

machine learning to achieve

greater automation.

I want to use machines to reduce my team's toil and enable faster decision making based on data sets we weren't able to analyze previously due to scale and various constraints. Now, as mentioned, our second key focus area is embracing zero trust architecture at scale, which for us means putting identity at the center of every decision and removing implicit trust from every relationship, focusing instead on establishing explicit trust for each transaction.
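A toy sketch of what "explicit trust for each transaction" can mean in practice: every request is scored from identity, device, and context signals rather than from network location (all names and thresholds below are illustrative, not any particular product's policy):

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool
    mfa_passed: bool
    device_compliant: bool
    location_expected: bool

def allow(request: AccessRequest) -> bool:
    """Grant access only when explicit signals establish trust for this one transaction."""
    signals = [request.user_authenticated, request.mfa_passed,
               request.device_compliant, request.location_expected]
    # No implicit trust: identity and MFA are mandatory, and we require a
    # strong overall signal before this single transaction is allowed.
    return request.user_authenticated and request.mfa_passed and sum(signals) >= 3

print(allow(AccessRequest(True, True, True, False)))   # allowed
print(allow(AccessRequest(True, False, True, True)))   # denied: no MFA
```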

Context and visibility are the other dimensions that are crucial to a successful implementation of zero trust, in my opinion. How do we make sure we have visibility and context from as many sources and signals as possible to dynamically and continually assess policies on the fly? How do we make sure that works wherever our teams are these days, which is not likely in the office? Ultimately, zero trust is all about linking identity and access to a prevention strategy, and it realistically dovetails into our third key focus area: rethinking threat and incident detection and response.

To me, scalability and visibility are foundational to modernizing threat detection and response, especially in a world where our data sources continue to grow exponentially. The only way to process all that security and contextual data and actually make it useful is to embrace cloud-native technologies and ensure scale and speed.

Scale is super important to us; it ultimately enables us to use advanced analytics to make better and faster decisions. There's also the harsh reality of security talent availability in our industry today. My belief is that we need to leverage machines to do more, help us see more, and make decisions on our behalf where it makes sense. That being said, people and expertise will continue to be important, even if they sit outside your direct organization.

That's why choosing the right

security partner is key.

Especially in the context of

security transformation.

Make sure you choose partners

based on a shared vision of the

outcomes you want to achieve

together.

The last thing I want to leave

you with is that make sure your

team doesn't get hung up on

previous security patterns, push

them to embrace change and

innovate using all the latest

tools at your disposal and with

that Sunil thank you for having

me and back to you.

>> Thanks, Bashar! Bringing the best zero trust access for apps is top of mind for many of you, and we've made significant investments alongside partners such as Palo Alto Networks and a variety of others in the ecosystem. This now gives you comprehensive zero trust options to secure private and SaaS app access while mitigating internet threats across managed or unmanaged devices.

However, successfully adopting a zero trust security architecture isn't always easy. So to help, we've packaged up proven experience and best practices with our cybersecurity teams and select partners, available to support anything from exploratory zero trust conversations to architecture reviews to implementation support. Now, we know the implicit trust we've covered so far can create an opportunity for insider threats and other significant security risks, not only in the context of access but also in the software supply chain.

To my mind, that's the last green-field space of opportunity to be reimagined within an enterprise security posture. To further help enterprises secure their software supply chains, we're introducing an all-new offering called Software Delivery Shield, which takes on this complex challenge with a tested approach based on the best practices we use internally to secure our own software supply chains for more than 100,000 developers here.

One specific area where we have made significant progress is Assured Open Source Software. We're very excited to announce the preview, which now provides access to the same open source software packages that Google depends on, allowing you to benefit directly from Google's own in-depth, end-to-end best practices.

So in addition to everything just covered, we're releasing a wide variety of innovations across our entire security portfolio. To hear more about them, or to learn from others, join the breakout sessions to go deeper into these topics and engage with us beyond Cloud Next.

We are so excited to help you become like Google with our most trusted cloud, and to bring security magic to you wherever you are, as the two fundamental pillars of our cloud security approach. One thing I wanted to highlight is that in this journey of modernizing security, whether on GCP or wherever you are, unlike some of our peers we've chosen to offer best-in-class partner capabilities in conjunction with first-party Google solutions in a cohesive experience, versus an all-or-nothing capability. In closing, you may recognize that the pace of innovation on the path to invisible security has not slowed down; in fact, if anything, it has continued to accelerate, especially with Mandiant now in the mix.

Hope you join us and partners on

the journey as we reinvent

security to meet the

requirements of tomorrow.

Thanks again, stay secure and

have a great rest of Next.

[Music]

>> Hello everyone, it is an honor to kick off this session as the new leader of the Google Workspace business. Some of you are current customers who use it every day at work, and some of you are considering it for the first time; regardless of which camp you are in, every time you use Gmail, Google Chat, Drive, Docs, or Meet in your everyday life, you are using Workspace. Bridging communication and collaboration across these products is the magic of Workspace. Having worked on the development of these products for the past few years, it has been remarkable to see Workspace become the world's most popular productivity tool, relied on by more than 3 billion users across the planet.

Serving billions of users every day gives us incredible insight into the human experience, and insight is the heartbeat of innovation. Our mission is to meaningfully connect people so they can create, build, and grow together. We're here for you as you grow and build your business, whether you're a mom-and-pop shop, a small business, or a global corporation. More than 8 million customers entrust Workspace today, and we could not be more enthusiastic to fulfill that responsibility.

Hybrid work challenges have dominated recent headlines. I have the good fortune to talk to customers every week, and what I hear from them is that flexibility is the key. Achieving flexibility relies on solving two core issues: first, overcoming the gaps between people working in the office and people working remotely; and second, securing data and preventing cyber attacks everywhere that work happens. For many businesses the physical office is no longer the center of gravity for work, but as they look to establish a hybrid workplace that is energized with ideas and a drive for getting things done, many are finding that their legacy tools just aren't meeting their needs.

Why are organizations struggling with hybrid work? Well, apart from deciding how and when to come together in the office, there's also the question of how they can harness the effectiveness and the fun of creating together in the same room. They want that power without giving up the well-being that comes with the best of the hybrid workplace: fewer commutes and more focused time.

Good news is that we're right

here with you.

Our Teams live and breathe

hybrid.

And here's what we're building

into workspace to help make

hybrid, the very best experience

for everyone.

Now first, let's acknowledge that meetings are not going away, but they can be dramatically improved. To make the most of meetings we've introduced features like automatic light adjustment and noise cancellation, so that you can look and sound your best. Companion mode gives everyone a front-row seat in hybrid meetings, whether they're joining from their phone or from a conference room. And to reduce digital fatigue, we've added features like co-presenting and the ability to unpin your own video tile.

Now, beyond meetings, bringing community, spaces, and documents directly into the conversation is something we pioneered in Workspace, and it makes docs come alive; a simple mention in a doc helps create flow. You can even hold a meeting directly within a doc with one click, bringing the voices and faces of the team into a discussion without leaving the document. Now, once you've set yourself up for success in hybrid work, how can you be sure your data and people are working in a secure environment, no matter the location or device? There is a tendency to think that securing people and data in a hybrid environment is more challenging than before, but that's only true if you are coming from systems that were built in the legacy pre-cloud era. Those systems have security bolted on as an afterthought and frankly simply cannot scale to the threats we face today.

Workspace has always been cloud

only.

This means that you are

benefitting from decades of

Google expertise in threat

protection, AI and global scale.

This is deep computer science that you simply cannot develop overnight. The safety of our customers and their data is not just baked into the solution; it's a fundamental part of how we develop software here at Google. Our cloud-native zero trust security model protects your data against both external and internal risks. Today Gmail blocks more than 99% of spam, phishing attempts, and malware before they even reach our users, and that same protection is extended to every document in Google Drive.

Millions of customers have already made the move to Workspace, and I'm so inspired by their stories. Let me share just three of them. Korean Air carries more than 27 million passengers in a given year. They adopted Workspace in July 2019 to improve internal collaboration and communication, such as more fluid exchanges between teams and leadership, and they overhauled everything from mail to document collaboration to internal communication. Workspace helped shape a new mindset for their workforce and a new way of working. Today, secure collaboration happens end to end with Workspace: documents created for flight safety and management are stored securely in Drive and shared between corporate teams and their flight crews, whether on the ground or in the air.

Wayfair is another example. They are all about everything home, giving their 29 million customers the power to create spaces that are just right for them. They started by adopting Drive and Google Docs and eventually replaced all their legacy tools. Every month they host more than 150,000 meetings in Google Meet, ranging from just a few participants to hundreds, and they capitalize on the seamless flow from Google Docs to Chat to Meet in Workspace, allowing their teams to get work done no matter where they are.

Finally, a very new customer: Banco Macro, one of the largest domestically owned private banks in Argentina. The bank was founded almost 50 years ago, and their mission is to build relationships of trust and foster a unique culture of customer care. Their goal of becoming a leader in the digital market meant they needed to switch to modern tools for fast-paced collaboration and communication, and they chose Workspace. They want to attract emerging talent with tools people already know how to use in their personal lives, and to shift the organizational mindset to seamless co-creation. With those goals, Workspace was the natural solution for them to turn to.

Attracting and retaining talent is top of mind for many of us as we prepare for tomorrow's workforce.

A new study found that 75% of recent college graduates prefer working in Google Workspace.

There are thousands of engineers across Google working to deliver on the Workspace mission. Every year we double the pace of innovation; this year alone we delivered over 300 powerful new features to help teams get things done, with so much more to come.

And today excited to announce

new ways we're making Workspace

even more powerful.

Capabilities like adaptive framing give everyone in a crowded conference room a chance to be seen, and companion mode bridges the gap between those at home and those in the office. We've also introduced transcription, not just in English but in French, German, Portuguese, and Spanish, to reduce the chore of note taking and make it much easier for people who couldn't be there to stay in the loop.

We're also building on our investments in smart canvas. Earlier this year we introduced auto-summaries in Google Docs, pageless format, and new smart chips. Today I'm excited to bring smart canvas to more places in Workspace, starting with Google Slides. Let's talk about presentations: often it's the delivery, not just the content, that makes a presentation impactful. However, in hybrid presentations, audiences often have to split their attention between the slides and the speaker.

So we're bringing the storyteller and the story together in a far more integrated, engaging view, creating focus and engagement. We're extending smart canvas to Google Sheets too: smart data extraction and a new timeline view let you focus on content rather than data entry. Here at Workspace we have always believed in an open ecosystem and extensibility, and we're thrilled to announce that we're opening up smart canvas to third-party applications. New smart chips from Salesforce, Zendesk, Figma, and other partners will allow people to view and engage with rich third-party data in the flow of their work, rather than switching tabs or context.

To help secure your environment and data, we've introduced two major capabilities: trust rules in Drive, and the ability to set global DLP rules that apply across your Workspace deployment, now including Google Chat. Unlike other solutions, these rules are applied in real time: we scan the content, detect sensitive data, and apply the action instantaneously, without the delay that is standard across the industry.

With Workspace you don't have to

trade off security with speed.
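Workspace DLP rules are configured by admins rather than in code, but as a loose analogy for the kind of content inspection described above, Google Cloud's DLP API can scan a piece of text for sensitive info types (a sketch under assumptions; the project ID is a placeholder and the info types are just two common examples):

```python
from google.cloud import dlp_v2

def find_sensitive_data(project_id: str, text: str):
    """Inspect text for a couple of common info types, similar in spirit to a DLP rule."""
    client = dlp_v2.DlpServiceClient()
    response = client.inspect_content(
        request={
            "parent": f"projects/{project_id}",
            "inspect_config": {
                "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
                "include_quote": True,
            },
            "item": {"value": text},
        }
    )
    return [(f.info_type.name, f.quote) for f in response.result.findings]

# print(find_sensitive_data("my-project", "Reach me at tester@example.com"))
```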

Our groundbreaking client-side encryption feature gives customers complete control over access to their data. Today, we're excited to announce that we are extending client-side encryption to Gmail and Google Calendar.

On its own, Workspace is a comprehensive productivity solution, and it's even more powerful when you connect it to other tools in your environment. To help with this, we're introducing APIs for Meet and Chat that allow you to bring the power of these tools directly into the third-party apps you use every day. Our partners Figma and Asana are using them to embed their apps directly into Meet: the add-on will enable teams to collaborate on Figma design files and FigJam digital whiteboards directly in Google Meet.

Finally, we're bringing AppSheet and Google Chat together so that users can interact with custom AppSheet apps right within the chats they're already using. All these capabilities are designed to help organizations thrive in hybrid work. They're also an expression of our mission to meaningfully connect people so they can create, build, and grow together. And now it's time to see it in action.

I'd like to introduce you to Ilya Brown, our vice president of Product Management.

>> Thanks Aparna.

Hi, everyone, great to be here. Like many of you, I manage a team that works across time zones and locations. It can be tricky, so I'm excited to get specific and show you how Workspace can help teams adapt and thrive from wherever they work. For this demonstration, imagine a company that relies on Workspace for team collaboration. This is Megan; she's working on a big website launch with her remote and in-office colleagues. We join Megan as she starts her day. The website project is mission critical, and Megan's top priority is making sure everything is running smoothly. Let's take a look at how the project is going.

Cymbal's website hasn't been updated in five years, and everyone is excited to launch the new design. Megan opens Google Workspace to find her tools in one place. She's thrilled to see the final website messaging and design are in from the agency and can't wait to share them with her team. She greets the team in their project space and lets them know the good news. The web developer asks to take a look at the final website design, so Megan shares the file. The file is also stored in the files section of the project space, preserving the website design for everyone in the space to access, saving the team time and helping them stay productive.

Working with different schedules and time zones can make it harder to assign responsibilities and keep everyone on the same page. Megan uses a project planning doc to keep track of her team members' responsibilities and to tag people for input; it shows a project tracker with some team members already tagged. The doc is long and detailed, but the team can quickly get up to speed with auto-generated summaries of the main points. Megan can easily track tasks and assign responsibilities with people chips and the table she created automatically from a dropdown template. She lets the team know she plans to organize a meeting where they can review the new website together, and shares a custom emoji to express excitement.

These quick, impromptu chats help keep everyone connected when working from different locations, and custom emojis allow individuals to express themselves in a more personal way. Megan creates a calendar invite from within her team's project space. She can see her team members' availability, which shows their working hours, work locations, and time zones by specific day, and she finds an available time that works for everyone. She books a meeting room for the people at the office and sends the invite.

The next day, Megan receives a reminder a few minutes before the meeting starts and joins Google Meet directly from the project space. She notices that her room is quite dark and there's construction work happening on the street outside; Meet automatically applies noise cancellation, optimized lighting, and image enhancement to ensure that Megan can be seen and heard. The team has joined from mobile devices, home laptops, and a meeting room in the office. In the room, Google Meet hardware supports adaptive framing with multiple intelligent cameras from Huddly. These use active speaker tracking to automatically frame people as they speak and capture attendee reactions, so everyone can be seen clearly and feel included. Attendees also use companion mode from their laptops and check into the meeting room, which displays their names in the video tile and in the people pane, so everyone knows who is in the room.

Megan brings her team meeting into the doc with one click. This way, she can see the team while they review and discuss the final version of the website, and they collaborate in real time, adding comments and feedback as they go.

>> Being able to see people

while looking at the same

document can make collaboration

so much easier, and frankly so

much more fun.

>> That's right, and I personally love the AI magic in noise cancellation and lighting. I don't know about you, but my home can be a little cramped, a little crazy, a little noisy; these features help me focus on the meeting rather than how I look or how I sound.

Megan's ready to share the website prototype to gather feedback. Let's see that in action. Megan is going to have a group of internal stakeholders and trusted external customers test the website prototype and provide feedback before launch. Her colleague Amani, a program manager, is in charge of setting up the process to gather feedback. She has created an app in AppSheet that uses Chat to automatically request feedback from everyone, with a central place to track the responses. She messages Megan over Chat to ask for the list of testers. When Megan tries to share the sheet of invited testers, the automatic data loss prevention feature, known as DLP, lets her know that the file includes personal user data that's been flagged as sensitive by her admin. Megan removes the sensitive information and shares the sheet. DLP is great for companies that want to help protect sensitive data, and it makes it easier for users to do the right thing when working with confidential information.

announcing that final testing is

underway. She sends the message

in Gmail, with client side

encryption enabled. Since the

launch is still highly

confidential. This means the

email can remain private and

encrypted, every step of the

way. No one outside Megan's

company will see the contents of

the message. In minutes, Megan

start seeing feedback come in,

and she jumps into action.

Regional marketing manager

reported translation issue.

Megan knows the developer team

will need to handle this. So she

files a ticket in the JIRA bot

directly in the project space.

It's great to see the smooth

workflow happening. And with

client side encryption coming to

Gmail, all emails, including

attachments can remain private

At last, the website is ready for launch. In the final stretch of the project, Megan is excited about presenting the work to the global team. She opens the project planning doc, noting all the completed tasks; the only task left is to present to the team. Megan launches the draft presentation from the document, in the notes column of the review tracker. She uses the slide library to add a cover page, which will allow her colleagues to see Megan's video directly on the slide as she presents, helping to foster a more engaging presentation. Now that she's put the final touches on the presentation, she starts the meeting right on time.

With controls in Meet she can

navigate through the

presentation allowing her to

easily see speaker notes and

meeting participants, without

leaving the Meet window.

Later in the day, Megan's director sends a leadership announcement to 8,000 employees, congratulating the team and sharing the full marketing plan to amplify the new website. Well done, team. That was awesome; congrats to Megan and the Cymbal team. And that's how Workspace helps teams thrive in the face of hybrid challenges.

>> That was such a great demo. Thank you, Ilya, for showing how we can help users and customers today. I'm so excited about where we're headed. Thank you, everyone, for joining us. We hope you check out the on-demand content and the breakout sessions for much more on Workspace. Bye for now.

[Music]

>> Google products provide the information you need, when you need it. Why can't you get the same kind of answers for your business? Looker, Google Cloud's business intelligence solution, is here to solve that problem, enabling you to go beyond traditional dashboards and make your organization's information accessible and useful. Bringing this innovation to business will be revolutionary. Just like navigating a city after Google Maps, Looker is Google for your business data. Here's what we mean: what if Google AI were built into the tools you use to store and analyze data at work? Google's Vertex AI Vision takes data like video, images, and audio and, in real time, turns it into structured data ready for business intelligence. Going beyond the dashboard means using Google Glass Enterprise to see insights and recommendations based on your data in real time. More access, more transparency: that's Google for your business.

With Google Maps, you know if a restaurant is busy before you go, or you can get rerouted around a traffic jam. Looker will help you connect similar dots in a predictive way: a concert in five days will increase foot traffic by 65%; would you like to adjust staffing and inventory?

>> Yes.

>> Looker lets you respond to changes in demand and turn insights into action: foot traffic continues to be busy; encourage customers to visit an alternate shop with a reward card?

>> Yes.

>> Smarter insights mean better experiences and happy customers. Go beyond the dashboard and transform the way you do business with Looker, powered by Google Cloud.

[Music]

>> Hi, everyone, welcome to the Vertex AI Vision session. I'm Fabien, a product manager at Google Cloud.

>> Hello everybody, this is Nelson Gonzalez. I'm a product manager on the Google Cloud team.

>> Today we're very pleased to introduce Vertex AI Vision, so let's get started. We're glad to be joined by Carlos, co-founder and CEO at Plainsight, and finally Nelson and Botond will present some examples of how Vertex AI Vision is applied to the retail sector. So let's dive right in.

Today, there are about 1 billion cameras worldwide in buildings, cities, and highways, and the future holds rapid growth of cameras and sensors streaming data from factories, retail stores, cars, satellites, drones, and just about everywhere. In addition to this expected growth of sensors and data, we see new developments in AI, the migration to higher sensor resolutions, and communication happening at 5G speeds. All these factors are enabling the rapid expansion of novel computer vision applications to serve every industry and every use case. This is a huge opportunity, but not an easy journey.

Classically, if you wanted to build a video AI analytics application, you would have to use multiple tools and a lot of engineering to form the streaming pipeline, the AI analytics, and the data warehousing, which takes time and is expensive. And importantly, you need to trust the insights delivered by the application. With Vertex AI Vision all of this is simplified: we designed everything from the ground up to provide an efficient and easy-to-use pipeline, reducing the time to build vision AI applications from days to minutes, at a tenth of the cost.

You can also trust that we developed Vertex AI Vision responsibly and according to Google's AI Principles, with constant evaluation including testing for bias and performance, incorporating features that protect privacy and security like person blur, and choosing not to offer any kind of personal identification features such as facial recognition or multi-camera tracking.

So what is Vertex AI Vision? It is a one-stop shop that provides all the functionality to easily build and deploy scalable video analytics pipelines at low cost and low latency. Today I'm pleased to announce we're launching Vertex AI Vision in public preview. With Vertex AI Vision, you can ingest media from live cameras and existing data, process media with pretrained AI models and custom-built models, store data in a vision warehouse that comes with powerful search capabilities, and finally analyze the data and serve meaningful, actionable business insights to your customers. You can do all this in a single user interface, Vertex AI Vision Studio, or via the SDK for expanded capabilities.

Now we're going to show how to

build and deploy your first

application with a quick demo

>> Hi, everyone. In this video I'm going to walk you through the process of creating and deploying a Vertex AI Vision application. Today we're going to build a smart city traffic analytics application, using live video from cameras and AI to help city planners better understand traffic patterns in order to reduce congestion and increase citizen safety. So let's get started.

First, we create a new application with Vertex AI Vision Studio. The next step is to select the video streams as the input of the pipeline; for this demo we select some streams associated with live video cameras, and you can see the live video fed directly into the user interface. Now we process the data by adding AI models to our application. The first thing we want to do is detect and count all the vehicles crossing the intersection, so we add the occupancy analytics prebuilt model. Then we configure this model by selecting the video feed we are interested in and using the line-crossing tool to associate a smart event with each leg of the intersection. Also, to ensure privacy, we add a person blur model, so if a person comes into the field of view of the camera, blurring will be automatically applied to that person.

With Vertex AI Vision these models are available right out of the box, but you can also bring in a custom model. As an example, we import a bicycle detector trained in Vertex AI; all you need to do is bring a previously trained model and import it. Next is the storage of the application output. Connecting our pipeline to the warehouse is really easy: you just need to specify a name, and the warehouse is set up for you. And finally, we add a BigQuery connector to store structured data in a table; here we browse the available tables and select one. Now that we have the inputs, the analytics, and the outputs defined, we can deploy our application with just a few clicks.

Let's look at the data stored in the warehouse. Warehouse assets are accessible directly in Vertex AI Vision Studio, and we can perform powerful searches based on time and metadata; for example, I can search for all events that happened today, in the afternoon, that contain five to ten vehicles. Structured data is also being streamed into the BigQuery table, and we can run queries for analytics and connect the output of a query to Data Studio to easily visualize results and create live dashboards. Finally, we can use the SDK to consume the output in other ways; in this case we're serving city planners via a web-based dashboard so they can better understand traffic patterns and make more informed decisions.
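For instance, once the connector is writing rows, a query like the following could feed that dashboard (a sketch only; the project, table, and column names are hypothetical and depend on the schema your connector actually writes):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table written by the demo pipeline's BigQuery connector.
QUERY = """
SELECT TIMESTAMP_TRUNC(event_time, HOUR) AS hour, SUM(vehicle_count) AS vehicles
FROM `my-project.traffic_analytics.intersection_events`
WHERE DATE(event_time) = CURRENT_DATE()
GROUP BY hour
ORDER BY hour
"""

# Print today's hourly vehicle counts for the monitored intersection.
for row in client.query(QUERY).result():
    print(row.hour, row.vehicles)
```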

To conclude: Vertex AI Vision is your one-stop shop where you can easily ingest, analyze, and store video streams on Google Cloud and build applications to power use cases across retail, manufacturing, and much more. Thank you.

>> I hope you liked the demo. We're proud to work with independent solution vendors that are helping their customers build, integrate, and execute vision AI workflows. One fantastic example is our collaboration with Plainsight, who has been pioneering the use of Vertex AI Vision. I'm glad to introduce Carlos Anchia, CEO and co-founder of Plainsight AI, to tell us more about them and their journey with Vertex AI Vision.

>> Plainsight unlocks successful computer vision solutions, and it does that by providing a unique combination of AI strategy, a visual data science toolset, and deep learning expertise to develop, implement, and oversee transformative computer vision solutions for enterprises across industries. We do that by addressing speed, standardization, productionalization, and oversight. Speed delivers value in days, not months. Standardization is a way to maintain and repeat the solution over time. Productionalization is the codified automation of the workflows. And oversight: any computer vision solution requires oversight and responsible vision monitoring and management over its lifecycle. When we talk about repeatable enterprise computer vision, we're talking about hardened, end-to-end, proven vision AI solutions. And because the technology spans horizontals, we work in a lot of different sectors: smart agriculture is one of them, along with manufacturing, quick-service restaurants, and energy for oil and gas.

We're really an AI-driven data discovery company, responsibly applying AI. We're available on the Google Cloud Marketplace for deployment, and we provide maintenance and oversight over the entire solution, enabling computer vision with Vertex AI Vision.

A couple of quotes here from Plainsight's leadership. One is from Elisabeth S., co-founder and chief product officer at Plainsight: "Vertex AI Vision is changing the game for use cases that have previously been economically unviable at scale; the ability to run computer vision models on streaming video at up to 100x cost reduction is creating entirely new business opportunities for our customers." And additionally, a little quote from my side: with prebuilt components, easy configuration, and instant deployments, we're accelerating delivery from weeks to minutes for our customers' diverse use cases.

It's really driving low-cost adoption of computer vision, with accelerated uploads to Google Cloud and streamlined processing. It's speed to solution: streaming video connection handling is done in minutes. And it's ease of use: point-and-click model selection with insights in minutes. I want to spend a little time talking about one of our customers we built a solution for. DRW is a diversified trading firm innovating across both traditional and cutting-edge markets.

The problem we were addressing with them was really around office heating and cooling needs fluctuating wildly as employees balance in-office and work-from-home schedules; and not just the HVAC system, but also the facilities and services provided to employees. This flexibility gave DRW an opportunity to dynamically adjust HVAC output based on occupancy and conserve energy where practical. Part of the requirements here were highly accurate, real-time IP camera streaming applications where privacy and security standards are maintained throughout the entire application. I also wanted to highlight the ease of the interface in the Google Cloud console.

On the left you see the application graph for this specific application. It's one stream being processed in real time, with person blur (full occlusion) implemented here, plus an occupancy count. The data is stored in the Vision AI data warehouse and also in BigQuery so we can analyze it later. Here's the resulting video from that graph. As you can see, PII is anonymized throughout the process in real time; no data lands in Google Cloud that is not anonymized. Thank you.
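As a rough illustration of how such anonymized occupancy counts could feed HVAC decisions, here is a small Python sketch. The zones, thresholds, and modes are invented for the example and are not part of the DRW or Plainsight implementation.

```python
# Illustrative sketch only: map anonymized occupancy counts (as produced by a
# person-blur + occupancy-count application) to HVAC operating modes.
# The thresholds, zones, and mode names are hypothetical.
from dataclasses import dataclass

@dataclass
class HvacPlan:
    zone: str
    mode: str  # "eco", "comfort", or "boost"

def plan_for_zone(zone: str, occupancy: int) -> HvacPlan:
    """Pick an HVAC mode for a zone from its latest occupancy count."""
    if occupancy == 0:
        return HvacPlan(zone, "eco")       # nobody present: conserve energy
    if occupancy < 20:
        return HvacPlan(zone, "comfort")
    return HvacPlan(zone, "boost")         # densely occupied: extra airflow

latest_counts = {"floor_3_east": 0, "floor_3_west": 14, "lobby": 37}
for zone, count in latest_counts.items():
    print(plan_for_zone(zone, count))
```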

>> Thank you, Carlos. I'm eager to see the many innovative solutions that Plainsight will bring to your customers leveraging Vertex AI Vision. I'd now like to bring up Nelson, who will share some exciting announcements about AI models in Vertex AI Vision.

>> Thank you. Hello, everybody, this is Nelson Gonzalez. Customers expect Google Cloud to deliver strong AI capabilities, so we offer a broad portfolio of AI models within Vertex AI Vision, ranging from models that you can build in Vertex AI to pre-built models offered by Google. For example, the occupancy analytics models include active-zone counting and dwell-time features. These models, combined with the person blur model, are relevant across industries while protecting the privacy of your end customers.
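For readers wondering what a dwell-time feature boils down to, here is a tiny illustrative computation over hypothetical zone entry and exit events; the real model emits its own schema, so treat this only as a conceptual sketch.

```python
# Rough sketch of "dwell time": pair each tracked person's zone-entry event
# with the matching exit event and measure the gap. Event tuples are made up.
from datetime import datetime

events = [  # (track_id, zone, action, timestamp)
    ("p1", "checkout", "enter", datetime(2022, 10, 11, 9, 0, 5)),
    ("p1", "checkout", "exit",  datetime(2022, 10, 11, 9, 3, 45)),
    ("p2", "checkout", "enter", datetime(2022, 10, 11, 9, 1, 0)),
    ("p2", "checkout", "exit",  datetime(2022, 10, 11, 9, 1, 40)),
]

open_entries = {}
dwell_seconds = []
for track_id, zone, action, ts in sorted(events, key=lambda e: e[3]):
    key = (track_id, zone)
    if action == "enter":
        open_entries[key] = ts
    elif key in open_entries:
        dwell_seconds.append((ts - open_entries.pop(key)).total_seconds())

print(f"average dwell time: {sum(dwell_seconds) / len(dwell_seconds):.0f}s")
```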

At Google, we take responsible development of AI seriously and believe all AI technology requires a responsible approach. We are committed to upholding the highest standards for ethical use of AI. Our approach combines sociotechnical assessments with action plans that amplify opportunity and mitigate potential risks. For Vertex AI Vision, we have taken steps to incorporate mitigations for risk concerns that arise in the development process, made informed choices based on experience, and evaluated models for fairness, introducing responsible-AI-focused features such as person blur. In addition, customer education and transparency are key: we provide customers with educational materials and best practices as part of the Vertex AI Vision collateral. We're continuously iterating, improving, and incorporating lessons learned from Cloud and across Google.

Today, we're pleased to introduce the first of several models we plan to bring to retail customers through Vertex AI Vision. The product recognizer solves a very difficult challenge: how to recognize products at scale based solely on the product image. The product recognizer does exactly this. It recognizes the product based on visual and text features of the product package, and it recognizes the product at the UPC/GTIN level. To do this at scale, Google brings distinctive capabilities in terms of breadth, maintenance, and depth: we leverage state-of-the-art AI and the Google product graph, which includes a billion products, grows daily, and is continually maintained by Google. As the product recognizer grows, we plan to deliver additional attributes about each product beyond the UPC number, such as whether a grocery product is gluten free or not.
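One hypothetical way downstream code might use the UPC/GTIN output, for instance to surface a gluten-free attribute before such attributes are delivered natively, is a simple catalog lookup. The catalog and the detection payload below are invented for illustration and are not part of the product recognizer API.

```python
# Hypothetical sketch: enrich recognized UPC/GTIN codes from a catalog you
# maintain yourself. Neither the catalog nor the detections come from the API.
catalog = {
    "00012345678905": {"name": "Oat cereal", "gluten_free": True},
    "00098765432109": {"name": "Wheat crackers", "gluten_free": False},
}

detections = ["00012345678905", "00098765432109", "00011111111111"]

for gtin in detections:
    product = catalog.get(gtin)
    if product is None:
        print(f"{gtin}: not in catalog yet")
    else:
        flag = "gluten-free" if product["gluten_free"] else "contains gluten"
        print(f"{gtin}: {product['name']} ({flag})")
```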

Retail analytics models found within Vertex AI Vision, including the product recognizer, will enable customers to realize a broad set of use cases. We also know our customers need flexible, scalable, and financially viable solutions that are built on trust. In terms of flexibility, we want to enable customers to use multiple sensor modalities, ranging from fixed cameras to robots and mobile devices that capture the images. Scalability requires partners that bring ISV solutions that integrate with retail vision, as well as SIs that are able to integrate into store systems. Financial viability is key to ensure the first steps your company takes in retail innovation deliver positive ROI; strong value creation is critical to go beyond the pilot store and prove a solution that applies to the whole network. And lastly, and importantly, the solution needs to be built on trust for the retail customers who will interact with it in stores, which very importantly includes protecting your end customers' privacy.

We're pleased to introduce today a set of key customers and partners that have been able to bring the innovation of Vertex AI Vision and the product recognizer to market. Today, I'd like to introduce one of these partners. Whether you are a child or an adult, you have probably been intrigued by robots: AI-powered robots hold the promise of solving problems that are challenging for humans to address. These problems include delivering the analytics needed at each store. It is my pleasure today to invite Botond Szatmary, VP at Brain Corp.

>> Pleasure to be here. I've been with Brain pretty much since day one, and it's been amazing to be part of the journey and see the company grow to where it is today. Today, Brain Corp is an automation company. We are optimizing and automating workflows with robots, and we are number one when it comes to mobile robots in public spaces. We have more than 20,000 units out there operating safely on a daily basis, covering more than 88 million miles and being super helpful to our customers.

In addition to automating basic functions like floor scrubbing, we have introduced additional functionality to our robots, with payloads like a mechanism to collect images of shelf content while operating in retail spaces. We've partnered with companies to analyze those images, bring new functionality, and help our customers do shelf analytics. We first rolled this function out at some of the largest retailers in the US, on our existing fleet. And today I would also like to introduce a new dedicated inventory-scanning robot that is cost-optimized for data collection, is more flexible in adapting to a variety of store form factors, offers zero-touch autonomy, and has a long battery life.

What this means is that it will allow our customers to manage these units remotely. To give you an example: imagine you are a market manager and want to see whether all the seasonal displays are properly set out in your market. You could check this with a click of a button, at your own will.

[Music]

In addition to this dedicated inventory-scanning robot, I'm super happy to announce a partnership with the Google retail store analytics team, with Nelson, to bring a scalable, flexible, and financially viable inventory-scanning solution to the market.

The workflow is very simple and elegant. We start with a robot that collects raw images and always has an updated map of the environment. We use the product recognizer and tag recognizer from the retail store analytics offering, put this together, and store the aggregated insights in a BigQuery warehouse. We will know what products are on display, what is out of stock, what is low on stock, what is missing and needs to be replenished, and where the products are on display, all in a centralized data warehouse. Having everything in one centralized place will enable us to deliver accurate and actionable insights, optimize e-commerce and retail operations, and increase sales. At the store level, it enables shelf stock alerting, task management, automated price compliance alerting, and product location compliance alerting.
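A minimal sketch of what a store-level out-of-stock alert over that centralized BigQuery warehouse could look like, assuming a hypothetical scans table and column names (the real pipeline's schema may differ):

```python
# Sketch of an "out of stock" alert over an aggregated shelf-scan table.
# Project, dataset, table, and column names are hypothetical stand-ins for
# whatever the robot pipeline actually writes.
from google.cloud import bigquery

SQL = """
WITH latest AS (
  SELECT
    store_id, aisle, gtin, facing_count, scanned_at,
    ROW_NUMBER() OVER (PARTITION BY store_id, aisle, gtin
                       ORDER BY scanned_at DESC) AS rn
  FROM `my-project.shelf_analytics.scans`
)
SELECT store_id, aisle, gtin, scanned_at
FROM latest
WHERE rn = 1 AND facing_count = 0
ORDER BY store_id, aisle
"""

for row in bigquery.Client().query(SQL):
    print(f"OUT OF STOCK  store={row.store_id} aisle={row.aisle} gtin={row.gtin}")
```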

On the e-commerce side, we will link the physical with the online. What this really means is the following: retail stores, the stores you go to on a daily basis, are increasingly becoming multifunctional environments, and also warehouses for your online orders. With that, having accurate inventory is increasingly important. The shelf analytics service will give you full, accurate, and up-to-date visibility of what is available in the store, and with that you are able to link the online to the physical. At all times you will have an accurate representation of your online presence relative to what is available in the store, and with that you can optimize your operation. We will also give you an updated map of the store; with that updated map and knowledge of what is available on the shelf, you will be able to optimize your picking routes, save labor, and serve customers better.

Thank you, Nelson.

>> Thank you, Botond. So to summarize: Vertex AI Vision is your one-stop shop that allows you to easily and quickly build and deploy cost-efficient vision applications using state-of-the-art AI models. Partners complement Vertex AI Vision by, number one, bringing capabilities and domain expertise to Vertex AI Vision, and number two, integrating the Vertex AI Vision platform and models into their solutions.

With that, I'd like to turn it over to Fabien for additional thoughts.

>> Thank you, Nelson. Before we go, I want to spend a moment on the exciting things to expect in 2023 and beyond. Edge: we're going to enable the deployment of vision AI applications on devices at the edge. Custom models: you can expect more capabilities to help you build your own models and better serve your use case. While we don't have time to go through all the exciting details today, we look forward to sharing more with you as we continue our journey. With that, here are a few ways you can learn more about Vertex AI Vision and get hands-on with the platform, and you can get in touch with us via your Google Cloud contact. Thank you, Nelson, Botond, and Carlos, for joining me in person today for the Vertex AI Vision session.

[Music]

>> The Exponential Roadmap's goal is zero carbon emissions by 2050.

>> Our emissions primarily stem from devices, networking, and cloud.

>> Our goal is actually to get to zero emissions by 2030. Backstage was built internally at Spotify; it unifies your tooling, your services, docs, and apps under a unified, consistent UI, and we donated it to the Cloud Native Computing Foundation.

>> It's amazing to see how many people actually care deeply about this topic.

>> Cloud Carbon Footprint is an open source tool developed by ThoughtWorks; the only thing limiting it is people hearing about it.

>> It leverages cloud APIs to provide visualizations of estimated carbon emissions.

>> We leverage Bigtable and GKE. It starts not just from the cloud; it goes all the way out to user devices.

>> We want to empower not just Spotify internally, but the broader developer community, to reduce their carbon footprint.

[Music]

>> Hi, everyone, welcome to the Next '22 session on Document AI. I'm Sudheera, joined today by Andreas Vollmer, Managing Director and Head of Document Lifecycle at Commerzbank. So let's dive right in. Economies, businesses, and livelihoods depend on digital documents. Digital manual labor comprises much of the work that many of us do: it is essential to the vast majority of today's business workflows, and it is what captures the value of the data in those digital documents. AI and ML for documents improve the employee experience through significantly greater automation of manual work, more speed, and error reduction. These are the reasons our customers have told us they try to automate tedious manual processes with AI, since legacy technologies require supporting labor. Understanding a document is much more than reading words, and that's where Document AI comes in. Let's talk about how we want to help our customers with the toil of digital manual labor. In a nutshell, it's about turning unstructured content into business-ready data, and with Document AI we have made it our goal to turn documents into business-ready structured data. As we worked closely with customers, we realized part of the problem is legacy technologies solving a problem that's too simple: yes, you can get some structured data with a table parser, but what we've heard is that customers are really looking for business-ready structured data. In other words, they're looking for technology that reads documents in ways similar to humans. They're looking for confidence that, when they start to automate processes, they're not signing themselves up for different kinds of digital manual labor.

Google's Document AI presents a simple and cost-effective path to build document AI processors, plus a complementary system of record to manage documents and data. Today, we're announcing Document AI Workbench in public preview. It allows you to automate document processing by using your own data to build models, backed by state-of-the-art computer vision, natural language processing, and neural networks. With Doc AI Workbench, you can use your data to create ML models for many document types, such as printed, scanned, and handwritten documents. You can train for free at the click of a button, reduce your time to market, and keep your data within your own GCP project. You can import already labeled documents, and you have two training options: train from scratch to create a model for any document type, or uptrain to get accurate results faster for document types that already have a relevant processor, using that processor, such as the invoice processor, as a starting point.
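As a hedged illustration of what calling such a trained processor looks like with the Document AI Python client, here is a short sketch. The project, location, processor ID, and file name are placeholders, and regional endpoints other than US require `client_options` to be set accordingly.

```python
# Sketch: send a document to a (hypothetical) trained Workbench processor and
# read back the extracted fields. IDs and the file name are placeholders.
from google.cloud import documentai_v1 as documentai

client = documentai.DocumentProcessorServiceClient()
name = client.processor_path("my-project", "us", "my-custom-extractor-id")

with open("invoice.pdf", "rb") as f:
    raw_document = documentai.RawDocument(content=f.read(),
                                          mime_type="application/pdf")

result = client.process_document(
    request=documentai.ProcessRequest(name=name, raw_document=raw_document)
)

# Custom extractors return their fields as entities on the parsed document.
for entity in result.document.entities:
    print(f"{entity.type_}: {entity.mention_text} "
          f"(confidence {entity.confidence:.2f})")
```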

BBVA had this to say about the Workbench, and other customers are echoing the same sentiment. Another customer, Libeo, has uptrained an invoice processor with 1,600 documents and increased their score from 75 to 80; thanks to uptraining, the results now match those of a competitor and help Libeo save 20% of overall cost in the long run. Time-to-market estimates are reduced by 80% with Workbench versus building custom models.

Google Workbench offers a flexible, easy-to-use interface with end-to-end functionality, and custom document extractors and classifiers not only reduce prototyping from months to weeks, but also offer clients added cost reductions compared to their current technologies. It extracts data from documents more accurately and with less training data, across flexible document types like W9 variants, invoices, receipts, bank statements, and pay stubs. A third-party agency used Doc AI Workbench alongside major competitors' products to automate this kind of document processing.

Here are some of the notable features launching with the Doc AI Workbench announcement today: an end-to-end workflow to develop custom document extraction processors, covering data import, schema creation and annotation, training, evaluation and troubleshooting, model deployment and version management, and human-in-the-loop integration for quality assurance. We also have various partners available to help customers with Document AI.

In addition to Workbench, Doc AI has pretrained processors for the documents that matter to you, delivering highly accurate data and lowering processing costs dramatically. Over the past year we improved the existing procurement offering and launched new preview offerings spanning the processing of receipts and purchase orders, added support for six new languages, and expanded to new regions. Today we're announcing the following for Document AI.

A refresh of the invoice and expense pretrained processors, with improvements to normalization and line-item entity detection. We're also launching uptraining in public preview for the invoice, expense, and purchase order pretrained parsers, which unlocks new possibilities for improving accuracy, adding new language support, and customizing schemas. We're adding support for five new languages and expanding procurement availability to Canada and Australia. Uptraining for the invoice, expense, and purchase order pretrained processors unlocks new possibilities, as I said, for accuracy, language support, and customization. Here is an example of an invoice uptraining use case where we used uptraining to improve results. Here is another example, where uptraining was used to train the expense pretrained processor for new language support for Japanese. Today we're also excited to announce ID document proofing as part of the identity product suite. You can also uptrain a wide variety of pretrained processors: invoice, expense, purchase order, contract, and so on.

In addition to the pretrained processor product suite and Workbench, Doc AI also has state-of-the-art OCR, form parser, and document splitter features that enable customers to convert unstructured documents into text and structured data. Doc AI OCR is fine for extracting text, but as a developer it can be difficult to integrate raw text into an application or storage system. With the form parser you get back a set of key-value pairs and layout structure, which effectively gives a question-and-answer dynamic for the content in the document; with the parser, it's easier for developers to integrate into another system.
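Here is a short, hedged sketch of reading those key-value pairs with the Document AI Python client; the processor and file names are placeholders, and the anchor-slicing helper follows the pattern from the client library documentation.

```python
# Sketch: extract form parser key/value pairs. Text for each field is recovered
# by slicing the full document text with the field's text anchor.
from google.cloud import documentai_v1 as documentai

def anchor_text(layout: documentai.Document.Page.Layout, full_text: str) -> str:
    """Concatenate the text segments a layout's anchor points at."""
    return "".join(
        full_text[int(seg.start_index):int(seg.end_index)]
        for seg in layout.text_anchor.text_segments
    ).strip()

client = documentai.DocumentProcessorServiceClient()
name = client.processor_path("my-project", "us", "my-form-parser-id")  # placeholder

with open("application_form.pdf", "rb") as f:  # placeholder file
    request = documentai.ProcessRequest(
        name=name,
        raw_document=documentai.RawDocument(content=f.read(),
                                            mime_type="application/pdf"),
    )

document = client.process_document(request=request).document
for page in document.pages:
    for field in page.form_fields:
        key = anchor_text(field.field_name, document.text)
        value = anchor_text(field.field_value, document.text)
        print(f"{key} -> {value}")
```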

With that, I'm excited to introduce Andreas Vollmer, Managing Director of Document Lifecycle at Commerzbank, which is using Doc AI to transform its document process automation systems. I'm excited to hear more from Andreas. Over to you.

>> Commerzbank is the second largest private bank in Germany. It serves the majority of corporate clients in Germany and is a strong partner for approximately 11 million private and small-business customers, offering clients a full portfolio of financial services in two segments. We serve our customers in Germany with approximately 41,000 employees. I head the cluster Document Lifecycle, responsible for document-related services, from the creation of documents to the digital communication with customers, driving the paperless bank in parallel and supporting the transformation to an advisory bank. The Doc AI use case is a key initiative to enable the core strategy: optimization of business models toward an advisory bank, in combination with a significant reduction in the number of branches.

At the core of the turnaround strategy, new advisory centers are being created, which directly depend on the document ingestion pipeline. As a service bank in the middle of a transformation, we face a lot of paper-based communication with clients; by introducing, for example, the advisory center, we redesigned the handling of incoming documents. The new pipeline is relevant for both segments going forward, private clients and corporate clients. The aim is to automate the recognition of incoming documents, traditionally done by staff in the branches and the back office. Automation via Google Doc AI allows us to operate with significantly less staff taking care of assigning documents to business processes.

In summary, we're switching from late scan, which meant central processing and sorting followed by logistics to the back office for scanning and filing, to early scan, which means central delivery, direct scanning, and an automated sorting and assigning pipeline, followed by transfer to manual processing. We're talking about approximately 50 million scanned pages per year and approximately 30,000 incoming letters per day, adding up to a very high number of document types to be brought onto the pipeline. We will run structured as well as unstructured documents through the pipeline.
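Purely as an illustration (not Commerzbank's actual implementation), an automated sorting-and-assigning step could classify each scanned letter and route it to a downstream queue, roughly like the sketch below; the classifier processor ID and the routing table are hypothetical.

```python
# Illustrative routing sketch: classify a scanned letter with a Doc AI
# classifier processor and route it to a back-office queue. All IDs and the
# label-to-queue mapping are made up for the example.
from google.cloud import documentai_v1 as documentai

ROUTES = {  # classifier label -> back-office queue (hypothetical mapping)
    "address_change": "customer-data-queue",
    "loan_request": "credit-queue",
}

client = documentai.DocumentProcessorServiceClient()
name = client.processor_path("my-project", "us", "letter-classifier-id")

with open("incoming_letter.pdf", "rb") as f:  # placeholder file
    request = documentai.ProcessRequest(
        name=name,
        raw_document=documentai.RawDocument(content=f.read(),
                                            mime_type="application/pdf"),
    )

doc = client.process_document(request=request).document
if not doc.entities:
    raise ValueError("classifier returned no labels")
best = max(doc.entities, key=lambda e: e.confidence)  # labels come back as entities
queue = ROUTES.get(best.type_, "manual-review-queue")
print(f"classified as '{best.type_}' ({best.confidence:.2f}) -> {queue}")
```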

As a benefit, it will free us from our own infrastructure by using a fully cloud-based model. At the same time, we will improve the customer experience by reducing the turnaround time for incoming documents, which means much faster fulfillment of requested services for customers, overall helping to digitize processes for significantly higher efficiency. In addition, the new pipeline will allow step-by-step migration of approximately 20 historical processes with a high degree of manual action onto the pipeline within the next few years. It is not only an innovation case, but a renovation case.

Going forward, we will use the same technology to fully automate high-volume processes end to end, which will be done by partnering... and we will continuously ramp up and retrain to increase the degree of automation. Commerzbank established a cloud partnership with Google, ranging from infrastructure to services, and decided to partner on Workbench to innovate a future-proof approach with a strong partner covering a broad spectrum of documents across different industries, with a comprehensive approach for custom documents and a proof of concept already running in 2021.

The first step has been achieved and is adding significant value. Going forward, documents will continuously be ramped up onto the pipeline, using a customized human-in-the-loop in that case to comply with all our requirements and data sources, and making the handling as easy as possible for office staff. Nevertheless, it's just the beginning. Besides ramping up the new pipeline, we have to address process and document complexity, and based on this we are trying to simplify those dimensions in parallel, to be able to scale as well as possible. We understand it as a multiyear journey, creating added value with every step for our customers as well as for the efficiency of document handling and processing. We're looking forward to realizing the full potential together with our partners. Thank you for having me today. Back to you, Sudheera.

>> Thank you, Andreas. Very excited to hear from customers such as Commerzbank how Doc AI has been useful in furthering their mission and strategy. Now to the next exciting announcement in the Doc AI product suite: Document AI Warehouse. Once customers extract data from documents, the next challenge they face is managing and using this data. There are three challenges today. There's no cloud-native service to store documents along with their structured and unstructured data, so customers stitch together multiple cloud components. Search on unstructured documents is complex to assemble. And building workflows on documents requires complex integration of the extracted data with various applications and tools.

As you see on the left, we have a mature portfolio to process structured data. However, unstructured data constitutes 80% of enterprise data and is as yet underserved on the cloud. This product was developed to address that gap and unlock value from documents. The warehouse addresses the three problems: it provides the best of Google semantic search technology on documents; it integrates with best-in-class Doc AI, with scalable and accurate classification and extraction; and as a cloud-native, elastic, managed service, we manage and scale the compute, storage, and databases.

The warehouse offers features across search, organization, governance and compliance, workflows and integrations, and AI and data processing. As you see, Doc AI Warehouse is applicable to a broad range of use cases, document types, and workflows; we have seen initial traction in the financial services, healthcare, and supply chain industries, and it is applicable to a broader set of operational, document-based applications.

What we're announcing today is a UI to administer, search, browse, organize into folders, and govern documents; APIs and client libraries to manage and search documents, folders, and schemas; self-serve provisioning with catalog documentation; and connectors and batch Doc AI pipeline management workflows. That wraps up all the announcements in the Doc AI product suite today. Here are a few ways to learn more and get hands-on with Doc AI; please get in touch with us via your Google contacts. Thanks again for joining us today, and we look forward to working with you all on document-related challenges.

[Music]
