﻿WEBVTT

00:00:03.520 --> 00:00:04.240
I'm Farrah Bostic.

00:00:04.320 --> 00:00:06.960
I'm just here to facilitate
this conversation.

00:00:07.040 --> 00:00:09.520
We have a wonderful panel of expert

00:00:09.600 --> 00:00:14.680
panelists who are going to talk us through
these topics related to emotive AI.

00:00:14.760 --> 00:00:18.480
But the way that I wanted to just ground
the conversation is

00:00:18.560 --> 00:00:23.080
a little bit about getting underneath
the hype, because I think you may have all

00:00:23.160 --> 00:00:28.800
seen these headlines about by next year,
we'll see something like $200 billion

00:00:28.880 --> 00:00:31.960
in investment, according
to Goldman Sachs, in AI.

00:00:32.040 --> 00:00:35.600
We're seeing more and more of these
generative AI tools like ChatGPT

00:00:35.680 --> 00:00:40.440
and Claude and Midjourney, which is
one of my favorite things to play with.

00:00:40.520 --> 00:00:43.720
Actually, I just like to watch other
people make things in Midjourney.

00:00:43.800 --> 00:00:45.600
Get yourself a Midjourney account just

00:00:45.600 --> 00:00:47.320
to watch what everyone else
is making in the Discord.

00:00:47.400 --> 00:00:50.240
It is a good time.
It's a really good time.

00:00:50.320 --> 00:00:55.120
But you also have a sense that things
are really being disrupted.

00:00:55.200 --> 00:00:59.960
Hi, Vinnie.
One of the ways that it's being disrupted,

00:01:00.040 --> 00:01:04.720
I think, really was brought home to me
in a headline recently about Tyler Perry,

00:01:04.800 --> 00:01:09.680
the filmmaker, deciding not to build
a movie studio because he'd seen a demo

00:01:09.760 --> 00:01:13.640
of Sora from OpenAI
and was just like, That's it.

00:01:13.720 --> 00:01:15.200
That's what we're going to be doing now.

00:01:15.280 --> 00:01:16.920
I think one of the questions that we're

00:01:17.000 --> 00:01:21.000
going to talk about tonight is the role
of generative AI in creativity more

00:01:21.080 --> 00:01:26.080
generally, and also to really dig
underneath what emotive AI means and why

00:01:26.080 --> 00:01:28.440
it's a topic that we should all
be concerned about right now.

00:01:28.520 --> 00:01:30.120
I think it starts to bring things

00:01:30.200 --> 00:01:36.240
into a more concrete and relatable space
as opposed to some of the big,

00:01:36.240 --> 00:01:37.920
broad topics that so
frequently get talked about.

00:01:38.000 --> 00:01:42.600
What I want to do first
is introduce our panelists.

00:01:42.680 --> 00:01:45.720
We have a wonderful group of people here,

00:01:45.800 --> 00:01:51.640
and each one of them comes from a slightly
different expertise and point of view.

00:01:51.720 --> 00:01:55.600
To begin with, I want to introduce...

00:01:55.680 --> 00:01:57.520
I'm going to start actually
in the middle and work my way out.

00:01:57.600 --> 00:02:00.040
I'm going to start with Carly,
Carly Burton.

00:02:00.120 --> 00:02:03.800
Who is Director
of Gen AI Product Design at Meta.

00:02:03.800 --> 00:02:05.800
We're going to have a lot
to talk about there.

00:02:05.880 --> 00:02:08.640
We have Saira Jesani on my left,

00:02:08.720 --> 00:02:12.560
who is Deputy Executive Director
at the Data and Trust Alliance,

00:02:12.640 --> 00:02:16.280
which is incredibly important for us to
get our heads around in this conversation.

00:02:16.360 --> 00:02:19.160
Next to her is Alejandro Matamala-Ortiz,

00:02:19.240 --> 00:02:22.800
who is co-founder
and Chief Design Officer at Runway,

00:02:22.880 --> 00:02:27.240
which is doing really cool stuff with ML
and has been at this for a while.

00:02:27.320 --> 00:02:29.520
We'll be talking about
storytelling with him.

00:02:29.600 --> 00:02:34.840
At the very end, Joe Preston,
who is VP of Product and Design at Intuit.

00:02:34.920 --> 00:02:36.600
Last, but certainly not least,

00:02:36.680 --> 00:02:40.560
is Maz Karimian, who is Head
of Strategy at our hosts, ustwo.

00:02:40.640 --> 00:02:42.360
We are going to actually...

00:02:42.440 --> 00:02:45.680
Why don't we just welcome
the panelists one more time?

00:02:45.760 --> 00:02:50.000
All right.
So seal broken.

00:02:50.080 --> 00:02:51.400
We can get on with it.

00:02:51.480 --> 00:02:52.760
Maz, this is all your fault.

00:02:52.840 --> 00:02:56.520
So why don't you talk about why
you wanted to talk about emotive AI?

00:02:56.600 --> 00:02:57.920
Why does it matter now?

00:02:58.000 --> 00:03:01.040
And what do you mean?
Yeah, totally.

00:03:01.040 --> 00:03:04.640
Well, I'm going to begin by talking
about my favorite topic, which is me.

00:03:04.720 --> 00:03:05.880
So I'm Maz.

00:03:05.960 --> 00:03:08.840
I head up US strategy for ustwo.

00:03:08.920 --> 00:03:12.840
What that means is a hybrid
of product and business strategy.

00:03:12.920 --> 00:03:17.000
And a lot of what I do is focused
on humanizing emerging technologies,

00:03:17.080 --> 00:03:21.200
taking them from capabilities
to useful applications.

00:03:21.280 --> 00:03:23.440
And a big lens that we apply to that,

00:03:23.480 --> 00:03:26.880
an emotive lens that we apply to that,
is what we call play thinking,

00:03:26.960 --> 00:03:32.720
which is essentially a design ethos
informed by our game design DNA.

00:03:32.720 --> 00:03:33.840
We're a product design studio,

00:03:33.900 --> 00:03:36.400
but we have a lot of game design
experience and DNA,

00:03:36.480 --> 00:03:40.520
and we seek to bring those very
emotion-centric,

00:03:40.600 --> 00:03:45.040
game design-informed principles
and practices to product design.

00:03:45.120 --> 00:03:47.320
Where that links into emotive AI,

00:03:47.400 --> 00:03:50.400
of course,
is the way in which we have this

00:03:50.480 --> 00:03:56.160
focus on AI, this discourse around AI
as being very functionally maximizing.

00:03:56.240 --> 00:03:59.280
It automates processes.

00:03:59.280 --> 00:04:01.400
It might even automate
or accelerate people.

00:04:01.480 --> 00:04:06.480
But we think that there's a whole side
to AI, an emotive side to AI that we're

00:04:06.480 --> 00:04:08.360
really just beginning
to scratch the surface of.

00:04:08.440 --> 00:04:09.480
What do I mean by that?

00:04:09.560 --> 00:04:11.960
We have folks like Alejandro who's here

00:04:12.040 --> 00:04:16.240
and can tell you more about what Runway ML is
doing in the text to video and video

00:04:16.320 --> 00:04:20.280
to video and A to B space,
all these kinds of things.

00:04:20.360 --> 00:04:24.800
But what I'm getting at is that there are
really, really emotive outputs from AI.

00:04:24.880 --> 00:04:27.120
These things can make us feel things

00:04:27.120 --> 00:04:29.480
and they can help us make
other people feel things.

00:04:29.560 --> 00:04:31.000
Why does that matter?

00:04:31.080 --> 00:04:36.080
Fundamentally, because we as people make
choices about the products and services we

00:04:36.160 --> 00:04:40.800
buy that we bring into our lives,
not just on the basis of their function,

00:04:40.880 --> 00:04:44.880
but in terms of what they make us feel
and how they satisfy us emotionally.

00:04:44.960 --> 00:04:49.440
So that's why we want to explore
emotive AI as a topic today.

00:04:49.520 --> 00:04:54.120
I think the place I want to go next is
maybe to you, Carly,

00:04:54.200 --> 00:04:59.120
to talk about the context
for these interactions, because I think

00:04:59.200 --> 00:05:02.080
we have certainly
gone through this before.

00:05:02.080 --> 00:05:05.120
I'm old enough to remember when the hot
new thing was Facebook,

00:05:05.200 --> 00:05:09.480
and we were talking about participatory
brands and consumer-generated content.

00:05:09.560 --> 00:05:12.320
I think context has always mattered

00:05:12.320 --> 00:05:15.480
and sometimes gets overlooked
in these rushes to solve new problems.

00:05:15.560 --> 00:05:18.880
And so maybe you could talk a bit about
how brands should think about

00:05:18.960 --> 00:05:23.960
and businesses should think about
the context of the interaction as they

00:05:23.960 --> 00:05:27.000
start to think about more emotive
interactions with generative AI.

00:05:27.080 --> 00:05:28.600
Hi, everybody.

00:05:28.680 --> 00:05:30.360
I think I'll start with a story to answer

00:05:30.440 --> 00:05:36.000
that question. About a decade ago,
I did a global ethnography with digital

00:05:36.080 --> 00:05:40.200
natives to understand what is it
that they expect out of technology.

00:05:40.280 --> 00:05:42.560
And some of the stories that they shared

00:05:42.640 --> 00:05:46.880
were things that I just never thought
someone would have that close

00:05:46.960 --> 00:05:50.480
of a relationship,
not having grown up with a computer

00:05:50.560 --> 00:05:54.440
myself, but their
relationship was so different.

00:05:54.520 --> 00:05:57.360
And so I ended up writing about artificial

00:05:57.440 --> 00:06:01.640
empathy, and I thought, well, gosh,
you need to have so much data

00:06:01.720 --> 00:06:05.440
in order to enable that sense
of knowing and understanding.

00:06:05.520 --> 00:06:09.960
And so when I think about the context
in which Emotive AI needs to present

00:06:10.040 --> 00:06:13.120
itself, I think first about,
what's its relationship?

00:06:13.200 --> 00:06:15.400
So it's an intelligent actor.

00:06:15.480 --> 00:06:19.720
It's going to change how you're going
to engage with a product or a service.

00:06:19.800 --> 00:06:23.120
And so there's a framework
that you could consider.

00:06:23.200 --> 00:06:24.560
What's its role?

00:06:24.640 --> 00:06:26.400
What's my reaction to it?

00:06:26.480 --> 00:06:28.480
What's my relationship to it?

00:06:28.560 --> 00:06:30.320
And how does it change my routine?

00:06:30.400 --> 00:06:32.280
So these four different R's.

00:06:32.360 --> 00:06:37.680
And the reason I think about that is you
need it to deliver on how you expect it

00:06:37.680 --> 00:06:40.240
to show up within the context
that you're designing for.

00:06:40.320 --> 00:06:45.280
So I'm sure there's a lot of product
designers or product managers or even

00:06:45.280 --> 00:06:48.360
technologists in the room,
and you're always thinking about,

00:06:48.360 --> 00:06:50.880
okay, what's the value proposition
that I'm going to deliver?

00:06:50.960 --> 00:06:52.600
And that means that there's a specific set

00:06:52.680 --> 00:06:54.880
of needs, someone's coming to your
product,

00:06:54.960 --> 00:06:59.320
and then you're going to deliver these
capabilities, and the AI is one of them.

00:06:59.400 --> 00:07:03.560
Well, how do I get it to emote the type

00:07:03.640 --> 00:07:06.760
of experience and have
the right type of engagement?

00:07:06.840 --> 00:07:10.600
And then when I think about context,
I think about public versus private.

00:07:10.680 --> 00:07:12.560
So if I'm going to have a conversation

00:07:12.640 --> 00:07:16.480
and something deeply knows me,
maybe I just want that to be between me

00:07:16.560 --> 00:07:19.920
and that AI, and maybe I want
it to behave in a certain way.

00:07:20.000 --> 00:07:23.440
Well, in order for it to do that,
it needs to know quite a lot about me.

00:07:23.520 --> 00:07:26.960
Then if I think about public instances,

00:07:27.040 --> 00:07:30.920
what's its role, relationship,
and the reactions that are going to get

00:07:31.000 --> 00:07:34.280
created of the individuals
maybe in that group?

00:07:34.360 --> 00:07:38.040
One of the jokes that I was actually
making with my neighbors recently was

00:07:38.120 --> 00:07:42.000
like,
I don't want it to replace

00:07:42.080 --> 00:07:46.360
the relationship that I have with you as
I've gotten to know everyone

00:07:46.440 --> 00:07:49.720
in the neighborhood,
but I would like for it to help me get

00:07:49.800 --> 00:07:53.960
to know new things about you so that I can
figure out what this crazy pickleball is.

00:07:54.040 --> 00:07:56.240
That sounds great.

00:07:56.320 --> 00:07:58.720
So, yeah, it's a long-winded way of saying

00:07:58.800 --> 00:08:04.440
that I think the younger generation thinks
about artificial empathy,

00:08:04.520 --> 00:08:11.600
and so the context in which AI can show up
is quite broad, but how it should show up,

00:08:11.680 --> 00:08:15.920
what it can do is very dependent
on the underlying technology,

00:08:16.000 --> 00:08:21.560
that being the data and then the ways
that you shape it, and then the framework

00:08:21.560 --> 00:08:23.080
you put around it when
you bring it to life.

00:08:23.160 --> 00:08:26.920
I think the other interesting challenge
then, if we have brand managers

00:08:27.000 --> 00:08:31.320
in the room or people who work
in that space, is wanting to present

00:08:31.400 --> 00:08:34.640
a brand personality
through these interactions.

00:08:34.720 --> 00:08:36.880
I think when we were talking a couple

00:08:36.960 --> 00:08:41.200
of weeks ago now, we were talking about
the difference between you have

00:08:41.200 --> 00:08:42.840
a personality,
your personality is your personality.

00:08:42.920 --> 00:08:45.000
But depending on the situation,

00:08:45.080 --> 00:08:49.120
your tone, your word choice,
whatever might be different.

00:08:49.120 --> 00:08:52.560
I'm curious if you have any thoughts about
how brands might think of that new

00:08:52.640 --> 00:08:56.800
potential for dynamism where the agent is
acting on its behalf and trying to live

00:08:56.880 --> 00:09:01.920
the brand's values and all of that,
but still being in tune

00:09:02.000 --> 00:09:05.960
with the moment and what's needed
there from a voice point of view.

00:09:06.040 --> 00:09:06.400
Yeah.

00:09:06.480 --> 00:09:10.040
I mean, contextual reasoning
is a high compute use case.

00:09:10.120 --> 00:09:13.040
I think that when you're talking about

00:09:13.120 --> 00:09:18.160
dynamism, it's about how dynamically
these agentic behaviors can be expressed.

00:09:18.240 --> 00:09:22.840
What you're mentioning is we all
talk about personality, right?

00:09:22.920 --> 00:09:24.760
And personality is your voice.

00:09:24.840 --> 00:09:28.480
It's innately who you are,
but your tone fluctuates.

00:09:28.560 --> 00:09:30.360
Today, maybe I'm a little bit

00:09:30.440 --> 00:09:35.520
lethargic, and so I'm going
to talk slower, or that changes.

00:09:35.600 --> 00:09:38.280
But sometimes whenever these models are

00:09:38.360 --> 00:09:43.400
trained to express as a brand,
it wants to reflect that, right?

00:09:43.480 --> 00:09:50.520
And so how somebody trains that,
it needs to be able to express.

00:09:50.600 --> 00:09:52.240
But then can it be dynamic?

00:09:52.320 --> 00:09:56.240
And that contextualizing,
can it know through multi-turn

00:09:56.320 --> 00:10:00.040
conversations to pick up and store
that memory and make those changes?

00:10:00.120 --> 00:10:02.120
These are high compute use cases.

00:10:02.200 --> 00:10:08.760
And so I think that there's the ambition
and intent of brand expression and voice

00:10:08.840 --> 00:10:13.560
and making sure that you're able
to create these agents that reflect that.

00:10:13.640 --> 00:10:18.520
But then there's also,
we as humans are incredibly dynamic.

00:10:18.600 --> 00:10:19.680
We are unpredictable.

00:10:19.760 --> 00:10:21.840
We change.

00:10:21.920 --> 00:10:27.320
And can those models change for that
context and express in the right way?

00:10:27.400 --> 00:10:28.080
Yeah.

00:10:28.160 --> 00:10:30.680
I wonder then, and maybe, Maz,
I'll start with you,

00:10:30.760 --> 00:10:36.080
but anyone feel free to jump in on this,
but I'm wondering about how more emotive

00:10:36.160 --> 00:10:39.920
AI experiences could
really fundamentally change

00:10:39.920 --> 00:10:41.960
the relationship between
the brand and the customer.

00:10:42.040 --> 00:10:48.880
You can imagine suboptimal scenarios.

00:10:48.960 --> 00:10:54.160
Here I'm thinking this happens even
with humans in the situation where you're

00:10:54.160 --> 00:10:57.640
calling into a customer care center
because you're freaked out about a bill

00:10:57.640 --> 00:11:00.480
you got that you weren't expecting,
and the person on the other line is not

00:11:00.560 --> 00:11:03.400
meeting you where you are
emotionally in the call.

00:11:03.480 --> 00:11:06.840
They're super chipper and you're
having the worst day of your life.

00:11:06.920 --> 00:11:08.920
Those things could really matter.

00:11:08.920 --> 00:11:12.440
It seems like there's more opportunity
to flex up and down those kinds of things.

00:11:12.440 --> 00:11:15.280
But I'm wondering how you're thinking
about that from a design point of view.

00:11:15.360 --> 00:11:20.080
Yeah, I think to put it really crudely,
I think right now we have relationships

00:11:20.080 --> 00:11:23.040
with apps, and we're going
to have relationships with agents.

00:11:23.120 --> 00:11:24.680
We're not going to call
them that, I think.

00:11:24.760 --> 00:11:26.920
But where the difference will come

00:11:26.920 --> 00:11:30.400
in and where I think we'll really feel it,
is that when you think about how you

00:11:30.480 --> 00:11:33.800
engage with a brand today,
it is forgetful.

00:11:33.880 --> 00:11:35.920
It is not continuous.

00:11:36.000 --> 00:11:38.720
At best, when you engage with a brand

00:11:38.800 --> 00:11:42.320
through an app or even in a physical
location or what have you,

00:11:42.400 --> 00:11:45.280
the best that you can hope for is
that they're going to pull up your file

00:11:45.360 --> 00:11:49.080
and they're going to maybe hopefully
remember what it is that you

00:11:49.160 --> 00:11:52.520
bought that you're having an issue with or
that you bought last time,

00:11:52.520 --> 00:11:56.200
and maybe you want to buy something or
change something about it, what have you.

00:11:56.280 --> 00:12:00.800
When we think about the possibilities
with AI, and especially emotive AI,

00:12:00.880 --> 00:12:05.720
the ability of AI to understand not just
your stated preferences, your purchases,

00:12:05.800 --> 00:12:11.800
and what have you, but your personal maybe
style in the context of clothing or

00:12:11.880 --> 00:12:14.640
cosmetics,
your financial goals and your risk

00:12:14.720 --> 00:12:18.600
aversion or your risk appetite
in the context of finance or any number

00:12:18.680 --> 00:12:24.040
of psychoemotional factors in the case
of health and wellness and fitness,

00:12:24.120 --> 00:12:29.120
we have that ability to not just remember,
but adapt to you and your experience as

00:12:29.200 --> 00:12:33.800
a customer on the basis of what we know
about you at every single moment

00:12:33.880 --> 00:12:37.880
in your experience as a customer,
not just at those moments when you get put

00:12:37.880 --> 00:12:40.120
in contact with a human
who can review your case.

00:12:40.200 --> 00:12:43.040
And maybe, hopefully,
sift through the information and maybe,

00:12:43.040 --> 00:12:46.000
hopefully, make a connection that is
relevant and meaningful to you.

00:12:46.080 --> 00:12:48.000
What we're getting at here is that agents

00:12:48.080 --> 00:12:53.120
can be that connective tissue,
that always-on relationship, to the point

00:12:53.140 --> 00:12:56.640
Carly was making about having a real
relationship dynamic with a brand.

00:12:56.720 --> 00:12:58.280
That is completely different than what we

00:12:58.360 --> 00:13:01.240
face now, which is where we
have relationships with apps.

00:13:01.320 --> 00:13:03.920
We will soon have
relationships with agents.

00:13:04.000 --> 00:13:06.320
That does seem like...
Oh, go ahead, Carly.

00:13:06.400 --> 00:13:09.840
I was just going to say,
Enter the data challenge, right?

00:13:09.920 --> 00:13:14.000
And users' perception
of what you explained, right?

00:13:14.080 --> 00:13:18.080
It's like, Do people want their
data used in that way as well?

00:13:18.160 --> 00:13:21.280
Yeah, I think that's always...

00:13:22.120 --> 00:13:25.360
Years ago when I was at Hall & Partners
and we were talking about participatory

00:13:25.440 --> 00:13:30.000
brands, a lot of the conversation was
like, How intimate of a relationship do

00:13:30.080 --> 00:13:33.240
you really want to have with every
brand you interact with?

00:13:33.320 --> 00:13:34.960
Some of them, you just want to give them

00:13:34.960 --> 00:13:36.560
your money, they give you
the product, it's over.

00:13:36.640 --> 00:13:38.720
That's the end of the interaction.

00:13:38.800 --> 00:13:42.080
But others do instill more

00:13:42.160 --> 00:13:47.080
of an emotional interaction or
have a more sensitive interaction.

00:13:47.080 --> 00:13:48.680
Maybe this is a good place for you to come

00:13:48.760 --> 00:13:51.280
in, Joe, talking about
a couple of things.

00:13:51.360 --> 00:13:54.120
One is your work is dealing with what

00:13:54.120 --> 00:13:57.560
could be extremely sensitive financial
information about your customers.

00:13:57.640 --> 00:13:59.240
The other thing that's interesting is it's

00:13:59.320 --> 00:14:03.440
a portfolio of products, and not all
of them are even in financial services.

00:14:03.440 --> 00:14:04.920
But maybe talk a bit about how you think

00:14:05.000 --> 00:14:09.040
about brand and context and interaction
across a portfolio of brands.

00:14:09.120 --> 00:14:14.160
Yeah, I love this conversation because I
think once we've crossed that line where it's

00:14:14.160 --> 00:14:16.560
not perceptible to humans anymore,
which I think we have.

00:14:16.640 --> 00:14:19.960
I think the designer's role in most

00:14:20.040 --> 00:14:24.000
companies is asking,
should we be doing this or not?

00:14:24.080 --> 00:14:27.480
Because I think we can't really see what

00:14:27.560 --> 00:14:31.680
the implications are to some of this
stuff, especially to some of our audiences

00:14:31.760 --> 00:14:37.800
or customers that will be extremely
vulnerable or able to be manipulated.

00:14:37.880 --> 00:14:40.240
From a financial context,

00:14:40.320 --> 00:14:46.640
why emotional AI is really interesting is
because if you think about your financial

00:14:46.720 --> 00:14:51.320
life and how you start it out and how you
want to be explained things in very simple

00:14:51.400 --> 00:14:58.160
terms and maybe in a tone that
manages your confidence and your trust.

00:14:58.240 --> 00:15:00.880
And then as your financial life

00:15:00.880 --> 00:15:04.520
transcends, maybe you become a business
owner, maybe you become very savvy.

00:15:04.600 --> 00:15:07.280
It's a really opposite end of the spectrum

00:15:07.360 --> 00:15:13.280
with how you want to be communicated to,
your financial savviness, all of that.

00:15:13.360 --> 00:15:17.040
But again, I go back to we're crossing

00:15:17.120 --> 00:15:20.640
this interesting chasm,
I think, right now.

00:15:20.720 --> 00:15:23.280
Just today, I was on my Instagram.

00:15:23.360 --> 00:15:28.480
My teenage girl is falling
in love with the ChatGPT voice.

00:15:28.760 --> 00:15:30.240
Good voice.

00:15:30.320 --> 00:15:33.240
Yeah, I was like, wow,
I never would have considered that.

00:15:33.320 --> 00:15:35.480
And I'm sure OpenAI didn't consider that.

00:15:35.560 --> 00:15:39.160
And I don't know how that will
play out in the years to come.

00:15:39.240 --> 00:15:43.160
But it's incredible opportunities.

00:15:43.240 --> 00:15:44.320
But at the same time,

00:15:44.400 --> 00:15:49.160
I think this line we're crossing right
now is really telling for humanity.

00:15:49.240 --> 00:15:51.560
Yeah.
I think the other interesting opportunity

00:15:51.640 --> 00:15:54.160
here, and maybe, Alejandro,
you can comment on this,

00:15:54.240 --> 00:15:59.440
is thinking about what this means beyond
just the tone and manner

00:15:59.520 --> 00:16:01.640
of an interaction,
what it means for storytelling.

00:16:01.720 --> 00:16:04.720
I mean, this is the space
that you at Runway really work in.

00:16:04.800 --> 00:16:05.440
Yeah.

00:16:05.520 --> 00:16:12.080
So for us, we like to think that it's
a little bit different from product

00:16:12.160 --> 00:16:17.360
or a service, where when you are seeking
a product or a service,

00:16:17.440 --> 00:16:22.920
on one side, you have
the provider that can tailor

00:16:23.000 --> 00:16:28.920
the experience for you based on your
previous purchases, maybe on your likes,

00:16:29.000 --> 00:16:32.080
on things that you have
searched before.

00:16:32.160 --> 00:16:37.400
And that's great for creating hyper-
personalized experiences, so that maybe

00:16:37.480 --> 00:16:42.880
the next time that I come there,
I can get exactly what I want with maybe

00:16:42.960 --> 00:16:46.760
less effort or a hyper
personalized experience.

00:16:46.840 --> 00:16:48.160
Whereas in storytelling,

00:16:48.240 --> 00:16:53.600
we believe there might be a different
scenario where perhaps hyper

00:16:53.680 --> 00:16:59.280
personalisation doesn't work the same way
that it works with a product and service.

00:16:59.360 --> 00:17:00.080
Why?

00:17:00.160 --> 00:17:06.480
Because I don't know how many of you
watch Succession, the show on HBO.

00:17:06.560 --> 00:17:10.720
Succession happened, the finale
happened on a Sunday, right?

00:17:10.800 --> 00:17:16.200
Next day, Monday, we were all talking
about what happened on that finale.

00:17:16.280 --> 00:17:18.520
It's likely that we will not be talking

00:17:18.600 --> 00:17:23.440
about personalized stories next day
at work because those are so tailored

00:17:23.520 --> 00:17:30.320
to you that maybe that social aspect
in storytelling is crucial for creating

00:17:30.400 --> 00:17:36.760
these more emotive connections
with storytelling.

00:17:36.840 --> 00:17:39.160
For us, I like to think a lot about

00:17:39.240 --> 00:17:45.840
hyper-diversification where,
instead of having stories tailored for you,

00:17:45.920 --> 00:17:51.440
what would happen if you have right after
the finale, you have a different angle

00:17:51.520 --> 00:17:54.040
of the finale that you can
also share with others?

00:17:54.120 --> 00:17:58.520
Maybe what if the same story is presented
to you by a different character?

00:17:58.520 --> 00:18:00.120
What if maybe the story never ends?

00:18:00.200 --> 00:18:02.280
Maybe it has a different ending,

00:18:02.360 --> 00:18:05.960
so maybe different ways of seeing
different angles of the same story

00:18:06.040 --> 00:18:10.440
that are maybe less of one to one
and maybe more one to many.

00:18:10.520 --> 00:18:13.920
Yeah,
it's interesting to think about everyone

00:18:13.920 --> 00:18:17.200
has an imagination of what happened
after the last scene of a show.

00:18:17.200 --> 00:18:20.520
And so what does that look like and how
does that spark new conversations?

00:18:20.600 --> 00:18:23.720
I think the other interesting thing about
the work you guys do is

00:18:23.800 --> 00:18:28.920
tools for creators, because I think a lot
of times the hype around AI can be very

00:18:29.000 --> 00:18:31.680
much like, Well, that's
the end of everybody's jobs.

00:18:31.760 --> 00:18:35.560
But there are other ways that people can
use these tools

00:18:35.640 --> 00:18:42.800
to tell great stories or to bring an idea
to life in a way that maybe previously...

00:18:42.880 --> 00:18:44.320
There's some great case studies on you

00:18:44.320 --> 00:18:47.400
guys' website about the thing that used to
take me five hours takes five minutes now.

00:18:47.400 --> 00:18:48.520
But maybe you can talk about how this

00:18:48.520 --> 00:18:50.760
really opens up possibilities
for creators as well.

00:18:50.840 --> 00:18:52.440
When we started the company,

00:18:52.520 --> 00:18:56.040
we wanted to create tools
for people to tell their stories.

00:18:56.120 --> 00:18:58.080
People who couldn't tell these stories

00:18:58.160 --> 00:19:02.560
before because of budget, because
of timing, because it takes time.

00:19:02.640 --> 00:19:07.720
There is a learning curve to learn how to
edit videos or tell stories.

00:19:07.800 --> 00:19:09.880
But not just for people,
not just for creators,

00:19:09.960 --> 00:19:15.800
also for brands and for enterprises,
how to package their story.

00:19:15.880 --> 00:19:18.960
But when we created the company,

00:19:18.960 --> 00:19:20.400
we started thinking we
come from that background.

00:19:20.480 --> 00:19:23.200
The three of us, the three
founders, met at Art School.

00:19:23.280 --> 00:19:27.360
We were doing these things
before we started the company.

00:19:27.440 --> 00:19:32.400
And for us, the creator in
the loop is crucial.

00:19:32.480 --> 00:19:34.520
It's something that we want to...

00:19:34.600 --> 00:19:37.560
We're creating tools at the end.

00:19:37.640 --> 00:19:43.080
We're enabling creators to tell their
stories as they want,

00:19:43.160 --> 00:19:49.280
things that they have planned in their
heads and that they want to communicate.

00:19:49.360 --> 00:19:51.280
So for us, it's

00:19:51.360 --> 00:19:57.960
all about building the necessary tools
for them to execute on those ideas.

00:19:58.040 --> 00:20:02.640
And it's never been created in a way
that we try to replace the artist.

00:20:02.720 --> 00:20:05.920
We're not thinking of a printer where I

00:20:06.000 --> 00:20:08.440
give you the script,
you give me a movie back.

00:20:08.520 --> 00:20:10.760
That's not what we aim to create.

00:20:10.840 --> 00:20:13.600
We're aiming to create tools that can help

00:20:13.680 --> 00:20:18.160
power these ideas and turn
them into realities.

00:20:18.240 --> 00:20:19.720
I heard the other day,

00:20:19.800 --> 00:20:28.160
Guillermo del Toro saying that the natural
state of a movie is not to happen.

00:20:28.240 --> 00:20:31.320
And when they happen,
it's actually a miracle.

00:20:31.400 --> 00:20:32.680
And that's true.

00:20:32.760 --> 00:20:37.840
And what we are trying to enable comes
from the fact that the three founders come

00:20:37.840 --> 00:20:40.520
from different places,
two from Chile and one from Greece.

00:20:40.600 --> 00:20:45.440
For us, the idea of
making a movie that lasts two hours is

00:20:45.520 --> 00:20:49.040
crazy because first,
you don't have the crew to do it,

00:20:49.120 --> 00:20:52.080
you don't have the budget,
you maybe don't have the ambition

00:20:52.160 --> 00:20:55.840
to create a two-hour feature film
because it feels so far away.

00:20:55.920 --> 00:21:02.160
But enabling the tools for you to get
quick access to that and quick feedback

00:21:02.240 --> 00:21:05.680
to visualize those ideas is the thing,
the products that we're building.

00:21:05.760 --> 00:21:09.200
I wonder about thinking about
personalization,

00:21:09.280 --> 00:21:13.120
thinking about storytelling,
thinking about unlocking creativity in new

00:21:13.200 --> 00:21:17.760
ways for people, the way
that the socialness of it plays out.

00:21:17.840 --> 00:21:22.160
I wonder, Carly,
what you think about that.

00:21:22.240 --> 00:21:24.520
If we're all living in generative worlds

00:21:24.600 --> 00:21:28.960
that are totally tailored to ourselves,
what do we talk about at the bus stop?

00:21:29.040 --> 00:21:32.160
What do we talk about at the PTA meeting?

00:21:32.240 --> 00:21:35.960
What are the things that- We were talking
earlier and I was like,

00:21:36.040 --> 00:21:41.920
I don't know why I joined the PTA board
and I'm a full-time working mother of two.

00:21:42.000 --> 00:21:44.520
I'm in negative hours right now, people.

00:21:44.600 --> 00:21:49.440
But I actually want to first remark
on something I think Alejandro was

00:21:49.520 --> 00:21:55.440
explaining is around this idea
of centaur systems, human plus machines.

00:21:55.520 --> 00:21:57.520
I, as a human, am able to do certain

00:21:57.600 --> 00:22:01.280
things, and the machine,
in and of itself, can do certain things.

00:22:01.360 --> 00:22:03.720
But the human plus the machine can

00:22:03.800 --> 00:22:06.320
ultimately create much
more interesting outputs.

00:22:06.400 --> 00:22:08.440
I think that collaboration

00:22:08.520 --> 00:22:14.000
and that dynamic is why I'm also
an optimist in terms of why these creative

00:22:14.080 --> 00:22:19.320
tools are so important to empower,
to enable, because centaur systems,

00:22:19.400 --> 00:22:22.760
more often than not,
when you also look at the data,

00:22:22.840 --> 00:22:28.840
produce incredible outputs, whether it be
in medicine, identifying cancer and so on.

00:22:28.920 --> 00:22:33.960
Then when you ask, Well, what does this
look like in terms of social dynamics?

00:22:33.990 --> 00:22:36.040
Let's say I have a personal agent and you

00:22:36.120 --> 00:22:40.240
have an agent, and those agents
are all talking to each other.

00:22:40.320 --> 00:22:44.440
I'm going to keep going back
to the compute because the whole reason we

00:22:44.520 --> 00:22:48.800
can have generative AI is because
of the advancements in compute.

00:22:48.880 --> 00:22:51.280
I just don't want to understate that.

00:22:51.360 --> 00:22:53.200
But I

00:22:53.440 --> 00:22:58.880
do think that it's about augmentation
and isn't about replacement,

00:22:58.960 --> 00:23:04.720
and that I hope, is about
amplification and deeper connection.

00:23:04.800 --> 00:23:11.200
I also think that you can get more by...

00:23:11.280 --> 00:23:12.800
I mean, we live in an information economy,

00:23:12.880 --> 00:23:17.040
and I can't sort through everything
that hits me all the time.

00:23:17.120 --> 00:23:19.680
I want to have deeper connections.
I want to be more present.

00:23:19.760 --> 00:23:25.680
I would love when I see my kids that we're
talking about our day and I'm not having

00:23:25.760 --> 00:23:29.760
to sort out all of these things across
the bazillion of apps that I use.

00:23:29.840 --> 00:23:32.000
Wouldn't it be great to have that?

00:23:32.080 --> 00:23:35.480
I guess I'm an optimist.

00:23:35.560 --> 00:23:37.200
There's maybe some pessimists in the room,

00:23:37.280 --> 00:23:39.920
and if you're a pessimist,
ask a question at the end

00:23:40.000 --> 00:23:43.440
or meet up for a drink at the bar
and have a discussion about that.

00:23:43.520 --> 00:23:48.080
But I think of these agents

00:23:48.160 --> 00:23:51.520
as additional actors,
and that's why I go back to that framework

00:23:51.600 --> 00:23:56.200
of relationships and roles and your
reaction and how you choose to engage.

00:23:56.280 --> 00:23:59.200
I think this brings us into a thing that I

00:23:59.280 --> 00:24:03.720
want to bring you in on, Saira, which is
what we're talking about is not just

00:24:03.800 --> 00:24:06.640
relationship, if we're going
to use this word relationship.

00:24:06.720 --> 00:24:09.000
One of the core elements that makes

00:24:09.080 --> 00:24:12.920
something actually a relationship
is some modicum of trust.

00:24:13.000 --> 00:24:16.000
At the very least,
even if you have a fairly transactional

00:24:16.040 --> 00:24:19.640
relationship, you trust that the
transaction will be the same every time.

00:24:19.640 --> 00:24:20.680
You trust that the other side will honor

00:24:20.760 --> 00:24:23.080
the transaction, you'll
honor the transaction.

00:24:23.160 --> 00:24:25.440
I want to bring in this idea of trust

00:24:25.520 --> 00:24:31.840
because I think that there is this
trust building that needs to be baked

00:24:31.920 --> 00:24:37.160
into how we think about this stuff,
in addition to the extremely expensive

00:24:37.240 --> 00:24:40.720
compute costs of bots
talking to each other.

00:24:40.800 --> 00:24:45.200
But setting that aside for the moment,
what are the building blocks of trust

00:24:45.280 --> 00:24:47.720
based on the work that you've done
with algorithms and other things?

00:24:47.720 --> 00:24:49.360
What are the things that brands
should be thinking about here?

00:24:49.440 --> 00:24:54.280
Yeah, I wanted to just pick up on, Carly,
what you were saying about augmentation,

00:24:54.360 --> 00:25:00.000
because I actually think that we
need to learn how to do that.

00:25:00.080 --> 00:25:03.280
Because if I think about it in much higher
stakes than what we're talking about,

00:25:03.360 --> 00:25:11.720
we're talking, let's say, to your point,
about cancer diagnosis or fighter pilots.

00:25:11.800 --> 00:25:19.000
When I'm looking at that AI, as a human,
sometimes I might under trust

00:25:19.080 --> 00:25:22.440
the technology or I might
over trust the technology.

00:25:22.520 --> 00:25:28.280
And we actually need to figure out
what the right augmentation looks like.

00:25:28.360 --> 00:25:31.360
And so that will take a long
time to figure that out.

00:25:31.440 --> 00:25:33.760
I don't think we're there yet.

00:25:33.840 --> 00:25:38.000
We have to figure out,
first of all, what are we building?

00:25:38.080 --> 00:25:41.920
How do we build that relationship
with human and machine?

00:25:42.000 --> 00:25:46.520
And then there's a lot of work that's been

00:25:46.600 --> 00:25:51.880
done on how do you actually add
more friction into the relationship.

00:25:51.960 --> 00:25:56.560
And the reason for that is you want
the human to be able to take a step back,

00:25:56.640 --> 00:26:00.000
stop and say, Is this actually
what I should be doing?

00:26:00.080 --> 00:26:02.640
And Joe, you were saying
that earlier with the designers.

00:26:02.640 --> 00:26:05.160
Exactly.
Should we actually be doing that?

00:26:05.240 --> 00:26:07.800
So you need lots of moments of extra

00:26:07.880 --> 00:26:12.320
friction along the way to make people
pause and say, Do I trust this?

00:26:12.400 --> 00:26:14.480
I was going to say not just designers,

00:26:14.560 --> 00:26:19.160
but, like, governance organizations
like the Data and Trust Alliance.

00:26:19.240 --> 00:26:25.720
I think this is why the AI Act that came
out in the European Union,

00:26:25.800 --> 00:26:31.160
it has a whole section on emotional AI,
and it puts most of it in the unacceptable

00:26:31.240 --> 00:26:35.880
risk or the high-risk category,
except in a few specific use cases

00:26:35.960 --> 00:26:40.080
for some of the reasons you just
mentioned, because these are very high

00:26:40.160 --> 00:26:44.720
stakes life or death decisions
that have to hold up in court.

00:26:44.800 --> 00:26:47.760
They have to hold up in front
of Congress, things like that.

00:26:47.840 --> 00:26:51.120
Can we double click, though,
on the trust element?

00:26:51.200 --> 00:26:55.560
Because when you think about
when you build trust with a person,

00:26:55.640 --> 00:27:01.880
it's off of consistent behavior
and a meeting of value systems to some extent.

00:27:01.880 --> 00:27:04.040
I actually wanted to ask you
this before we came down here.

00:27:04.120 --> 00:27:06.520
I was like, how do you define trust?

00:27:06.600 --> 00:27:11.480
Because for me, it needs
to be a shared value system.

00:27:11.480 --> 00:27:13.560
And I know you're not
going to disappoint me.

00:27:13.640 --> 00:27:15.800
And so you have my trust.

00:27:15.880 --> 00:27:18.520
And you don't get it, you earn it.

00:27:18.600 --> 00:27:20.080
And in order to earn something,

00:27:20.160 --> 00:27:22.840
that means that you're having
an engagement over time.

00:27:22.920 --> 00:27:25.320
But you need to adopt the AI.

00:27:25.400 --> 00:27:27.520
You shouldn't just blindly take it.
Absolutely.

00:27:27.600 --> 00:27:30.080
I think that's what you were getting at.

00:27:30.160 --> 00:27:31.840
That's a fabulous point.

00:27:31.920 --> 00:27:36.840
The Data and Trust Alliance is an alliance
of about 25 different companies,

00:27:36.920 --> 00:27:39.720
each of which are very
influential in their ecosystem.

00:27:39.800 --> 00:27:43.040
So huge companies think
Nike, CVS, et cetera.

00:27:43.120 --> 00:27:44.240
And to your point,

00:27:44.320 --> 00:27:48.600
when we talk about trust,
it is all about, these guys are making

00:27:48.680 --> 00:27:54.840
phenomenal tech, but do I, as a company
or as an organization, trust it?

00:27:54.840 --> 00:27:56.680
That means I need to have my own value

00:27:56.760 --> 00:28:01.840
system, and I need to be able
to understand whether that tech fits it.

00:28:01.920 --> 00:28:03.840
And so we're now just at the beginning

00:28:03.920 --> 00:28:06.680
of being able to understand,
well, if I look at a...

00:28:06.760 --> 00:28:08.560
I'm not going to be building a foundation

00:28:08.640 --> 00:28:13.720
model in my company because these things
cost $500 million to build.

00:28:13.720 --> 00:28:15.560
None of these companies are going to be

00:28:15.640 --> 00:28:19.680
making their own models, but they are
going to be using third-party models.

00:28:19.760 --> 00:28:23.600
And so then the question is,
which third-party models should I use?

00:28:23.680 --> 00:28:25.720
Which one do I trust?

00:28:25.800 --> 00:28:29.280
Which one aligns with my values,
my policies?

00:28:29.360 --> 00:28:31.040
How do I even vet that out?

00:28:31.120 --> 00:28:35.600
We're actually quite early
in being able to figure that out.

00:28:36.200 --> 00:28:38.480
So that's how we do it.

00:28:38.560 --> 00:28:39.720
And here's exactly...

00:28:39.800 --> 00:28:43.880
You just gave me my first perfect
segue from a panelist today.

00:28:43.880 --> 00:28:45.280
Because, Joe, I wanted
to bring you in on this.

00:28:45.360 --> 00:28:46.760
Because I feel like every new round

00:28:46.840 --> 00:28:50.160
of technology, one of the big
questions is, do we build?

00:28:50.160 --> 00:28:51.440
Do we buy?
Do we partner?

00:28:51.520 --> 00:28:53.120
I think particularly when you're thinking

00:28:53.200 --> 00:29:00.520
about a portfolio of brands,
each with their own contexts, customers,

00:29:00.600 --> 00:29:05.080
and stakes,
how do you think about whether to build,

00:29:05.160 --> 00:29:07.920
buy, partner,
how to approach this problem?

00:29:08.000 --> 00:29:10.040
Initially, we worked with partnerships,

00:29:10.120 --> 00:29:13.520
so licensing all the top foundational
models is where we started.

00:29:13.600 --> 00:29:18.000
But I think to Carly's point,
eventually it gets down to compute.

00:29:18.080 --> 00:29:23.800
And for a corporation of Intuit's size,
it gets very expensive very quickly.

00:29:23.880 --> 00:29:28.880
And at some point, we have to look at what
does it take to build our own foundational

00:29:28.960 --> 00:29:34.000
model, maybe one that specializes
in the domains that our customers expect,

00:29:34.080 --> 00:29:40.040
ones that we can
validate the authenticity with.

00:29:40.120 --> 00:29:42.920
So right

00:29:43.800 --> 00:29:47.520
now, we've started with essentially
a partnership model,

00:29:47.600 --> 00:29:52.280
but I think a lot of companies will
be moving into building their own models.

00:29:52.360 --> 00:29:56.800
I think Bloomberg here has their own
model that they've been building.

00:29:56.880 --> 00:30:01.600
And so I'd imagine that'll be
coming with lots of companies.

00:30:01.680 --> 00:30:02.560
I think it's going to be a really

00:30:02.640 --> 00:30:08.200
interesting ecosystem where some people
are going to be

00:30:08.280 --> 00:30:12.360
licensing a model and then training it
on their own data versus just building

00:30:12.440 --> 00:30:16.280
something from scratch,
which is like, why?

00:30:16.640 --> 00:30:18.000
But I ain't doing that.

00:30:18.000 --> 00:30:19.240
But other companies might.

00:30:19.320 --> 00:30:29.240
What are the challenges there for
brands that are just starting to dive

00:30:29.240 --> 00:30:32.480
into this space? Okay, so maybe, Maz,
you've started to think about this as you

00:30:32.480 --> 00:30:35.040
think about the clients you work
with wanting to create these more emotive

00:30:35.120 --> 00:30:38.440
experiences, but what's
the right way to get started?

00:30:38.520 --> 00:30:40.480
Yeah.
I mean, everything that you folks have

00:30:40.560 --> 00:30:44.880
been saying around relationship building
is really fascinating because number one,

00:30:44.910 --> 00:30:48.960
the brand, the business a little bit needs
to figure out their relationship to AI.

00:30:49.040 --> 00:30:55.240
What is it about our experience that we
think AI could usefully augment?

00:30:55.320 --> 00:30:58.960
And how do we define the role of our

00:30:58.960 --> 00:31:02.640
people versus the role of our agents
and the systems that are more automated.

00:31:02.720 --> 00:31:10.400
There is that aspect of really a little
bit of education and bit by bit adoption.

00:31:10.480 --> 00:31:12.360
Experimentation is critical.

00:31:12.440 --> 00:31:14.600
Something that we really advocate for as

00:31:14.680 --> 00:31:19.520
a design studio is make, test,
learn as a concept, very familiar to a lot

00:31:19.600 --> 00:31:23.480
of you, I'm sure, but
extra essential when you're considering

00:31:23.560 --> 00:31:27.280
the stakes of the kinds of technologies
that we're talking about here,

00:31:27.360 --> 00:31:31.360
especially as it relates
to engaging with customers.

00:31:31.440 --> 00:31:37.680
So moving on to the how do we approach
customers building relationships with our

00:31:37.760 --> 00:31:42.160
tools, with our systems that are
more AI-powered or even AI-overseen.

00:31:42.240 --> 00:31:44.560
I think it's the same way that you
build a relationship with a human.

00:31:44.640 --> 00:31:48.800
It's the same way that when you're first
dating somebody, you're not going to go up

00:31:48.800 --> 00:31:51.400
to a cabin in the woods with them
and take their tax advice.

00:31:51.480 --> 00:31:56.760
You're maybe going to say, Hey,
I'd like to get a little bit of a tailored

00:31:56.810 --> 00:31:59.760
to my knowledge level guide, a dummy's
guide to investing,

00:31:59.840 --> 00:32:05.280
or maybe I need to understand the basics
of saving for a house or what have you.

00:32:05.330 --> 00:32:08.480
You're going to start
small and you're going to build up and you

00:32:08.560 --> 00:32:14.000
need to trust
but verify your way to the level

00:32:14.080 --> 00:32:18.440
of comfort that you want to arrive at
with regards to your relationship with AI.

00:32:18.520 --> 00:32:22.200
I think for both brands,
they need to build their comfort and trial

00:32:22.280 --> 00:32:27.760
and learn what is an acceptable
error rate or what have you.

00:32:27.840 --> 00:32:30.080
What caveats do we put in place?

00:32:30.130 --> 00:32:31.880
What watermarks do we put in place?

00:32:31.960 --> 00:32:33.400
That is all essential.

00:32:33.480 --> 00:32:37.080
But then on the consumer side,
on the user-facing side,

00:32:37.160 --> 00:32:40.760
that relationship building needs to take
place incrementally and iteratively, too.

00:32:40.840 --> 00:32:42.520
I wonder, Alejandro,

00:32:42.600 --> 00:32:47.360
what you have seen at Runway with your
customers and how they onboard to the use

00:32:47.440 --> 00:32:51.920
of these tools and adopt them and feel
their way through how they want to use

00:32:52.000 --> 00:32:55.520
machine learning and AI
to create and tell stories.

00:32:55.600 --> 00:33:00.920
Yeah, we see a lot of different use cases,
yes,

00:33:01.000 --> 00:33:04.880
from a brand or enterprise perspective,
for narrative planning,

00:33:04.960 --> 00:33:12.720
for conceptualizing some of the ideas for
either a campaign or for a concept.

00:33:12.720 --> 00:33:14.880
They're exploring.
There's a lot of what Maz is saying,

00:33:14.960 --> 00:33:21.800
a lot of tinkering still of how
can I apply this to the workflows.

00:33:22.360 --> 00:33:31.560
Back to the conversation in terms of how
I can use this at my company, in terms

00:33:31.640 --> 00:33:37.720
of outsourcing a model or
maybe buying or using one.

00:33:37.800 --> 00:33:40.440
In our case, we have seen that for us it's

00:33:40.520 --> 00:33:46.560
crucial to invest in creating our own
models because as we're working

00:33:46.640 --> 00:33:50.640
with creators, we know
that control is fundamental.

00:33:50.720 --> 00:33:55.800
The more control we can provide over what

00:33:55.880 --> 00:33:58.800
they're seeking, the better
the tool we provide.

00:33:58.880 --> 00:34:01.120
So in order to have full control

00:34:01.200 --> 00:34:08.240
of the outputs, you need to understand how
the model works, how the model is created,

00:34:08.240 --> 00:34:11.400
what the architecture of the model is,
how the model is deployed,

00:34:11.480 --> 00:34:14.840
and how you build applications
on top of the model.

00:34:14.920 --> 00:34:18.280
So for us, in that case,

00:34:18.360 --> 00:34:24.280
building, knowing how the full stack
works, helps us to provide that degree

00:34:24.360 --> 00:34:29.280
of control that we can give
to our customers and users.

00:34:29.360 --> 00:34:34.320
Then when some users have some ideas
of maybe what we're talking about,

00:34:34.400 --> 00:34:39.080
context, where maybe I have some context
that wasn't part of the training data

00:34:39.160 --> 00:34:43.360
set of my foundation model,
that's when maybe it comes to a second

00:34:43.440 --> 00:34:48.000
layer of fine-tuning on top of that model,
then maybe I just need one extra layer

00:34:48.080 --> 00:34:53.520
on top of that to still retain the degree
of control that I can get from your base

00:34:53.600 --> 00:34:57.000
model plus your applications,
but I can add this second layer on top

00:34:57.080 --> 00:35:00.720
where the context that I
need to provide fits my needs.

00:35:00.800 --> 00:35:01.840
Yeah, go ahead, go.

00:35:01.920 --> 00:35:04.640
I'm just a visual person,
so as I was listening to people,

00:35:04.720 --> 00:35:07.840
It's like I have this diagram in my head.

00:35:08.120 --> 00:35:14.640
It's like when we're talking about
the relationship with AI,

00:35:14.720 --> 00:35:19.600
You have creative and proactive,
and you have left brain and right brain.

00:35:19.680 --> 00:35:24.520
It's like your analytical use
cases and your creative use cases.

00:35:24.600 --> 00:35:26.600
You have very proactive engagement,

00:35:26.680 --> 00:35:31.520
and you have very reactive in the sense
that I invoke access to certain things.

00:35:31.600 --> 00:35:35.640
And I think that that two by two

00:35:35.720 --> 00:35:41.320
and thinking about use cases within
that for brands is maybe a helpful way.

00:35:41.400 --> 00:35:46.600
And then marry that with when you're
asking about this whole model situation.

00:35:46.680 --> 00:35:49.080
Personally, I think there's a strong case

00:35:49.160 --> 00:35:54.480
that there are going to be
a few winners with foundational models.

00:35:54.560 --> 00:35:56.800
And I think there's a strong argument

00:35:56.880 --> 00:36:03.440
for fine-tuning, bring your own data,
and inference services because

00:36:03.520 --> 00:36:09.840
I just think the cost of entry,
to your point, is really high.

00:36:09.920 --> 00:36:12.840
Also the number of parameters that these

00:36:12.920 --> 00:36:16.280
larger models have are
just so significant.

00:36:16.360 --> 00:36:18.680
If we were here talking about narrow AI

00:36:18.760 --> 00:36:21.400
use cases, maybe it would
be a different answer.

00:36:21.480 --> 00:36:29.400
But with these LLMs and future multimodal
models, I think the argument

00:36:29.480 --> 00:36:32.160
for partnering is
going to be really strong.

00:36:32.240 --> 00:36:33.720
Yeah.

00:36:33.840 --> 00:36:35.280
Then I want to bring it back to the work

00:36:35.360 --> 00:36:40.320
you're doing at the Data and Trust Alliance, Saira,
because I think the pressure is very much

00:36:40.400 --> 00:36:46.360
on with this technology in a way that I
think it wasn't before to have regulators

00:36:46.360 --> 00:36:50.160
feel like they got their arms around it,
to have large companies who could be held

00:36:50.240 --> 00:36:54.280
liable for the decisions that the software
makes, to get their arms around it.

00:36:54.360 --> 00:36:58.160
I wonder if there are things that you've
learned, your team has learned

00:36:58.240 --> 00:37:02.320
from looking at algorithmic bias and other
things that might be applicable here when

00:37:02.320 --> 00:37:05.720
you're vetting partners, when a company
is considering which partners to go with.

00:37:05.800 --> 00:37:09.760
When we first started looking at this,

00:37:09.840 --> 00:37:18.040
we realized that actually what we need to
start assessing is, do I trust the data?

00:37:18.120 --> 00:37:20.480
Do I trust the models?

00:37:20.560 --> 00:37:24.720
And do I trust the people
and the processes that sit around them?

00:37:24.800 --> 00:37:26.400
So that might be at a vendor,

00:37:26.480 --> 00:37:31.480
but also in our company,
have we set it up in the right way?

00:37:31.560 --> 00:37:37.080
And so when we first did our very first
project,

00:37:37.160 --> 00:37:42.920
we went into the world of HR because it
is a very high risk application of AI.

00:37:43.000 --> 00:37:46.160
And we said, okay,
are you guys using AI in HR?

00:37:46.240 --> 00:37:48.600
Yes.
Are you planning on using it more?

00:37:48.680 --> 00:37:49.840
Yes.

00:37:49.920 --> 00:37:52.200
Are you building these things in-house?

00:37:52.280 --> 00:37:53.400
Are you buying them?

00:37:53.400 --> 00:37:54.440
Well, we're buying them.

00:37:54.520 --> 00:37:58.560
What are you concerned about
in the world of HR bias?

00:37:58.640 --> 00:38:03.240
Okay, well, are you asking any of these
companies their approach to bias?

00:38:03.320 --> 00:38:05.520
And the answer was no.

00:38:05.600 --> 00:38:09.480
So then we had to build a whole vendor
evaluation tool so that if you're bringing

00:38:09.560 --> 00:38:13.840
AI into your world,
you're not just bringing it on the basis

00:38:13.840 --> 00:38:16.920
of like, Okay, is this the right
cost or performance or whatever?

00:38:17.000 --> 00:38:18.560
But I'm asking you,

00:38:18.640 --> 00:38:24.280
do you know how to detect
or mitigate or monitor algorithmic bias?

00:38:24.360 --> 00:38:28.720
And so now we're finding that actually
somehow the Data and Trust Alliance is

00:38:28.800 --> 00:38:33.880
turning into Vendor Evaluation Central
because they're all...

00:38:33.960 --> 00:38:39.320
To your point, it's going to be a handful
of folks that are doing foundation models.

00:38:39.400 --> 00:38:41.400
And then if we're partnering with them

00:38:41.480 --> 00:38:46.880
to finetune it, you've actually already
trained the model with your data.

00:38:46.960 --> 00:38:48.800
I need to know what that was,

00:38:48.880 --> 00:38:54.760
because even if I'm giving you my data
to finetune it, it was already trained.

00:38:54.840 --> 00:38:56.640
So I need to know.

00:38:56.640 --> 00:39:00.000
And one of the most important questions
these companies should be asking actually

00:39:00.080 --> 00:39:04.520
is, what are the unintended
consequences of me taking this on?

00:39:04.600 --> 00:39:05.240
No.

00:39:05.320 --> 00:39:09.640
If you're not actually asking
that question to yourself,

00:39:10.200 --> 00:39:14.760
regulation or not, you might find
yourself in a very sticky situation.

00:39:14.840 --> 00:39:15.560
Yeah.

00:39:15.640 --> 00:39:18.440
I wonder if you have any reflections
on that, Joe, for the kinds of products

00:39:18.520 --> 00:39:22.880
and people that you're trying to serve,
thinking about.

00:39:22.960 --> 00:39:25.040
Yeah.
The thing that's immediately calling

00:39:25.120 --> 00:39:29.960
to mind is Mayor Adams' chatbot that's
advising you on the employment status

00:39:30.040 --> 00:39:36.120
of your bodega cat and giving
people some advice to break the law.

00:39:36.200 --> 00:39:39.720
But maybe a more serious
actor has some.

00:39:39.720 --> 00:39:39.800
Yeah.

00:39:39.880 --> 00:39:45.080
Are there any Mailchimp folks
in the house tonight, by any chance?

00:39:45.160 --> 00:39:46.360
No.

00:39:47.160 --> 00:39:52.320
The Mailchimp use cases are really
interesting, but they're some of the most,

00:39:52.400 --> 00:39:56.280
I think,
obvious ones for generative AI because

00:39:56.360 --> 00:40:01.120
managing email campaigns and all
the content that gets generated, fairly

00:40:01.520 --> 00:40:04.520
low risk other than
representation of your brand.

00:40:04.600 --> 00:40:07.400
That's very different than some other use

00:40:07.480 --> 00:40:12.640
cases like lending money,
where bias is a huge issue.

00:40:12.720 --> 00:40:13.640
I don't know if you remember a couple

00:40:13.720 --> 00:40:18.680
of years ago, I think Apple
got busted when they were doing payments

00:40:18.760 --> 00:40:22.600
acceptance, where they found out that the
data they trained on was biased,

00:40:22.640 --> 00:40:25.440
and they were eliminating all
these people from payments.

00:40:25.520 --> 00:40:29.000
So I think that's a case where

00:40:29.080 --> 00:40:34.840
the data that you use has to be
completely unbiased, run by an alliance or

00:40:34.920 --> 00:40:41.440
a third party, validated
just because of the stakes being so high.

00:40:41.520 --> 00:40:43.520
I think we're talking about data.

00:40:43.520 --> 00:40:44.920
There's also features, right?

00:40:45.000 --> 00:40:47.600
Features that make sense across
different data sets.

00:40:47.680 --> 00:40:51.560
One of the things...
I used to work at

00:40:51.560 --> 00:40:55.920
J.P. Morgan, and we built an ensemble model

00:40:56.000 --> 00:41:01.640
and a way to help with building trust and

00:41:01.960 --> 00:41:06.360
making it feel more approachable to use

00:41:06.360 --> 00:41:09.440
the output, because it was probabilistic,
was to include confidence intervals

00:41:09.520 --> 00:41:12.680
because there was a lot of noise,
for example, in the data.

00:41:12.760 --> 00:41:16.360
I think that you're never going
to have a perfect data set.

00:41:16.440 --> 00:41:20.600
I think you can do enrichment to make
it more consistent, more complete.

00:41:20.680 --> 00:41:22.520
There's underrepresentation in certain

00:41:22.600 --> 00:41:26.200
populations, which makes
information inherently biased.

00:41:26.280 --> 00:41:28.480
Being able to identify that.

00:41:28.560 --> 00:41:35.960
But on the topic of data and trust,
I do think that where you can design

00:41:35.960 --> 00:41:39.240
explainability, things like confidence
intervals when you're talking about

00:41:39.320 --> 00:41:45.040
financial data, is an interesting approach
to trust, because in a relationship

00:41:45.120 --> 00:41:48.520
you share something about
yourself to help build that trust.

00:41:48.600 --> 00:41:54.240
I think that's maybe a second element
outside of the consistency of engagement.

00:41:54.320 --> 00:41:56.600
Yeah.
Go ahead, Syra.

00:41:56.680 --> 00:41:57.800
Are you interested in that?
I'm sorry.

00:41:57.880 --> 00:42:00.880
I think I was just hallucinating
you having thoughts.

00:42:00.960 --> 00:42:03.320
I'm like an AI.

00:42:03.400 --> 00:42:09.760
One of the things I wonder about is we've
been now talking chiefly about,

00:42:09.760 --> 00:42:11.680
from the brand perspective,
the business perspective,

00:42:11.680 --> 00:42:14.280
the technologist perspective,
even a little bit the regulator's

00:42:14.360 --> 00:42:18.560
perspective,
I wonder about going back

00:42:18.640 --> 00:42:23.560
to the customer experience and how
much do we really need to know?

00:42:23.640 --> 00:42:26.360
I know that there's been headlines of late

00:42:26.440 --> 00:42:29.520
about what happens when you click
on that little info button and it tells

00:42:29.600 --> 00:42:33.480
you your prompt was transformed,
and you're like, That's not what I meant.

00:42:33.560 --> 00:42:36.160
I'm curious about how you're thinking

00:42:36.240 --> 00:42:40.400
about that from the customer
side of the trust equation.

00:42:40.480 --> 00:42:42.800
There's what you can do as designers

00:42:42.880 --> 00:42:46.720
and developers and brands, but what
is the other side of that equation?

00:42:46.800 --> 00:42:48.600
How do we avoid emotive AI becoming

00:42:48.680 --> 00:42:52.760
manipulative AI or people just
by default not trusting it?

00:42:52.840 --> 00:42:57.200
I'll open up to anybody
who wants to take a stab at that.

00:42:57.280 --> 00:43:00.440
I'll open it up with this: it's a debate

00:43:00.520 --> 00:43:03.720
that's ongoing, and I think you
brought it up around labeling.

00:43:03.800 --> 00:43:08.760
You said watermarked,
and then you talked about labeling.

00:43:08.840 --> 00:43:13.360
Some people think that customers need
everything labeled to know if it's AI.

00:43:13.440 --> 00:43:15.360
They're like, I need to know.

00:43:15.440 --> 00:43:19.600
Other people are like, Well, okay,
if I start to label everything,

00:43:19.680 --> 00:43:25.480
the things that don't get labeled,
is that inherently trustworthy content?

00:43:25.560 --> 00:43:31.280
People can be just as deceitful
and deceptive as technology.

00:43:31.360 --> 00:43:33.360
I mean, I think we've seen that, right?

00:43:33.440 --> 00:43:37.280
I think that there is work to be done

00:43:37.360 --> 00:43:42.120
on what does it mean for customers
engaging with this type of content.

00:43:42.200 --> 00:43:46.440
But if we also go back to the discussion
of emotive AI, it's like

00:43:46.520 --> 00:43:51.400
if AI is having you experience a feeling,
I mean, you were talking about

00:43:51.400 --> 00:43:54.120
storytelling, and every good story
should make you feel something.

00:43:54.200 --> 00:43:57.040
It has a beginning, a middle, an end.

00:43:57.040 --> 00:44:00.040
There's a climax, there's a protagonist,
and something happens.

00:44:00.120 --> 00:44:02.760
And I think that happens
in product experiences.

00:44:02.840 --> 00:44:04.560
You have the NUX.

00:44:04.640 --> 00:44:05.960
What do you do with the NUX?

00:44:06.040 --> 00:44:08.040
How are you setting expectations?

00:44:08.040 --> 00:44:09.040
That's the beginning.

00:44:09.120 --> 00:44:10.840
The middle is your engagement.

00:44:10.840 --> 00:44:11.960
What's happening there?

00:44:12.040 --> 00:44:13.200
And how is it learning?

00:44:13.200 --> 00:44:14.520
Is it a competent listener?

00:44:14.600 --> 00:44:17.560
Are you providing
affordances to get feedback?

00:44:17.640 --> 00:44:18.520
What's the end?

00:44:18.600 --> 00:44:22.120
Did I achieve my output, or were the
post-conditions met?

00:44:22.120 --> 00:44:23.320
And what is it doing with that?

00:44:23.400 --> 00:44:24.880
And is it evolving with me?

00:44:24.960 --> 00:44:27.640
So I guess I think that we're still trying

00:44:27.720 --> 00:44:32.960
to figure that out, because regulations say
label everything.

00:44:33.040 --> 00:44:35.800
But what happens when
something then isn't?

00:44:35.880 --> 00:44:37.080
Is it automatically authentic?

00:44:37.160 --> 00:44:39.480
No, right?
Do you have a thought?

00:44:39.480 --> 00:44:41.760
We were starting to talk a little bit
about... Yeah, we were talking about this.

00:44:41.840 --> 00:44:44.960
I think that as unsexy as it is,

00:44:45.040 --> 00:44:49.400
this is where things like standards
and certifications come in.

00:44:49.480 --> 00:44:54.920
There's this light in our room here.
And

00:44:55.000 --> 00:44:59.560
the amount of time it took
and the certifications it took to make

00:44:59.640 --> 00:45:06.080
sure that that light is hanging there
and we're all cool with it was insane.

00:45:06.160 --> 00:45:09.840
And it just shows you that we're
so early in this journey.

00:45:09.920 --> 00:45:13.080
We need standards, we need certifications.

00:45:13.160 --> 00:45:15.600
And at some point... Yes,

00:45:15.600 --> 00:45:17.840
there should be a lot of labeling,
but at some point we're going to forget

00:45:17.860 --> 00:45:21.160
all the labeling because we're
going to start to trust all of that.

00:45:21.240 --> 00:45:22.800
But we actually need to go through

00:45:22.880 --> 00:45:26.960
the process of figuring out
what needs to be labeled,

00:45:27.040 --> 00:45:30.640
what needs to be certified, what are
the standards we need, et cetera.

00:45:30.720 --> 00:45:32.960
And then I think we'll be on our way.

00:45:33.040 --> 00:45:35.520
I was just going to add,
I think we're in this weird phase, too,

00:45:35.600 --> 00:45:40.160
from an interaction perspective where
we're dealing with this legacy of digital

00:45:40.240 --> 00:45:44.800
assistants and chatbots where we've
all lost a lot of trust in them.

00:45:44.880 --> 00:45:47.120
Over the years.
And yet the primary use cases

00:45:47.200 --> 00:45:51.800
of interaction we see for the LLMs
is basically like a chat interface.

00:45:51.880 --> 00:45:56.520
So we're trying to build back up
that trust that we lost for decades.

00:45:56.600 --> 00:46:01.520
I think some of the most interesting
interactions maybe have yet to come onto

00:46:01.600 --> 00:46:05.640
the market or have yet
for us to adapt to them.

00:46:05.720 --> 00:46:11.560
If you think about language being
like 90% nonverbal, for instance.

00:46:11.640 --> 00:46:13.760
If you think about all the natural UI

00:46:13.840 --> 00:46:18.840
technologies that have been in their nascent
stages and never got widely adopted.

00:46:18.920 --> 00:46:22.840
And now that we have
Gen AI, I think it's just going to be

00:46:22.870 --> 00:46:25.520
a leap forward in terms
of what the possibilities are.

00:46:25.600 --> 00:46:33.160
And hopefully we move the interaction out
of this chat-based, prompt-based box. Luke?

00:46:33.240 --> 00:46:34.280
Yeah, totally.

00:46:34.360 --> 00:46:37.040
I think the conversation around

00:46:37.120 --> 00:46:41.080
factual content recommendations,
what have you, is super important.

00:46:41.160 --> 00:46:43.360
But a whole different frontier

00:46:43.440 --> 00:46:49.120
for Emotive AI is really in generative
interfaces and the way that we engage

00:46:49.200 --> 00:46:53.600
with experiences becoming more
personalized with the help of generative

00:46:53.680 --> 00:46:57.920
AI tools and something that Alejandro
can possibly speak to as well.

00:46:58.000 --> 00:47:02.560
But basically the idea that maybe
the content of the interface stays

00:47:02.640 --> 00:47:06.080
nice and regulated, nice and safe,
nice and the same, the same for everybody.

00:47:06.160 --> 00:47:10.840
But the way that it's presented,
depending on your knowledge level,

00:47:10.920 --> 00:47:14.120
your particular mood,
whether you're more of a visual person,

00:47:14.200 --> 00:47:20.600
you prefer to engage in things more in two
by twos or in text or in ones and zeros.

00:47:20.680 --> 00:47:25.160
Basically, that is a level of
personalization.

00:47:25.240 --> 00:47:28.160
I totally just snored it.

00:47:28.240 --> 00:47:32.520
I'll take care of it.
I think in two by twos.

00:47:32.600 --> 00:47:40.840
But yeah, I think that's a very safe
but incredibly potentially fruitful form

00:47:40.920 --> 00:47:44.840
of personalization that keeps us
on the straight and narrow as far as

00:47:44.920 --> 00:47:49.200
content, but really changes the medium
in a way that is just that much more

00:47:49.280 --> 00:47:51.720
compelling, that much
more accessible for you.

00:47:51.800 --> 00:47:53.800
And then hopefully, you as a user,

00:47:53.800 --> 00:47:57.440
if you're getting information presented
to you in a way that gels with the way

00:47:57.440 --> 00:48:01.480
that you like to pay attention and then
lean in, you're actually going to pay more

00:48:01.480 --> 00:48:04.800
attention to that information,
and you're going to take that information

00:48:04.880 --> 00:48:08.640
in and hopefully act in a way that's
more responsible and more trustful.

00:48:08.720 --> 00:48:11.080
Yeah.
Go ahead, Ali.

00:48:11.160 --> 00:48:14.600
I would like to add something to that.

00:48:14.680 --> 00:48:20.040
Yes, I guess reemphasizing the idea
that we are still very early.

00:48:20.240 --> 00:48:26.920
At Runway, we like to go
back to previous technological advances

00:48:27.000 --> 00:48:36.840
a lot, from the creation of
paintings to the camera to filmmaking.

00:48:37.160 --> 00:48:39.640
It always happened that we tried

00:48:39.720 --> 00:48:44.680
to fit these new technological advances
into what we already know how to do.

00:48:44.760 --> 00:48:49.080
We
used the camera to take photos of people

00:48:49.160 --> 00:48:55.640
like we used to paint them, or we
used to record or film plays.

00:48:55.720 --> 00:48:57.760
But then a few years in,

00:48:57.840 --> 00:49:02.960
we created cinema, or we used
the camera to create different art.

00:49:03.040 --> 00:49:09.800
Or then, Van Gogh was born because

00:49:10.800 --> 00:49:14.600
paint in tubes existed.

00:49:15.240 --> 00:49:17.200
I guess we're still in that phase today.

00:49:17.280 --> 00:49:20.360
We're trying to understand these systems

00:49:20.440 --> 00:49:25.280
in a way that we know how we
have been working in the past.

00:49:25.360 --> 00:49:27.520
We are trying to seek personalized

00:49:27.600 --> 00:49:33.600
experiences on top of the systems that we
already built 10 years ago.

00:49:33.680 --> 00:49:37.880
We have systems built, and now we're
trying to add a chat interface on top of that.

00:49:37.960 --> 00:49:39.280
Or we're trying to maybe think,

00:49:39.360 --> 00:49:43.600
maybe this UI doesn't need to be fixed
and maybe it can be generative and maybe

00:49:43.680 --> 00:49:49.440
it can respond to your
emotions or to your previous interactions.

00:49:49.520 --> 00:49:52.240
But the question to me is,
does it need to be like that?

00:49:52.320 --> 00:49:56.880
This is only a response to what we're
seeing today based on what we already

00:49:56.880 --> 00:49:58.840
know, or maybe it can
be completely different.

00:49:58.920 --> 00:50:00.680
We have talked about agents here.

00:50:00.760 --> 00:50:04.880
And maybe the whole concept is not
that you're going to go and interact

00:50:04.960 --> 00:50:07.720
with an agent every time
that you talk to a brand.

00:50:07.800 --> 00:50:10.240
Maybe you talk to one agent, which is

00:50:10.320 --> 00:50:14.680
your own agent, and your agent will
figure out things for you.

00:50:14.760 --> 00:50:18.280
And maybe the only
experience that you need to build is

00:50:18.360 --> 00:50:22.200
within your agent, and then you customize
it the way that you create,

00:50:22.280 --> 00:50:26.480
and then your agent figures out
how you interact with different brands.

00:50:26.560 --> 00:50:29.000
This is, of course, like a speculation.

00:50:29.080 --> 00:50:32.800
We don't know how it's going to play out,
but what I'm trying to say is in

00:50:32.880 --> 00:50:37.080
most of the past
experiences where we have seen

00:50:37.160 --> 00:50:42.440
technological transformations like this,
we first tried to plug it into what we

00:50:42.520 --> 00:50:46.120
already know, but it ended up
being really different things.

00:50:46.200 --> 00:50:47.200
Yeah.

00:50:47.280 --> 00:50:49.680
So I'm mindful that we've only
got a couple of minutes left.

00:50:49.760 --> 00:50:54.080
And so do we want to see if anyone has
a question or two before we wrap up?

00:50:54.160 --> 00:50:58.240
And I can pop.
No.

00:50:58.240 --> 00:50:59.520
Max.
Hi.

00:50:59.600 --> 00:51:04.280
I have a generative AI consultancy
for CPG brands and beauty brands.

00:51:04.360 --> 00:51:08.200
The biggest question that's really coming
out from a lot of these brands is

00:51:08.280 --> 00:51:12.160
in regards to labeling,
what will our customers actually think?

00:51:12.240 --> 00:51:15.280
The EU has obviously released legislation

00:51:15.360 --> 00:51:17.840
that generative AI
content must be labeled.

00:51:17.840 --> 00:51:21.000
It's going to be interesting to see
what happens here in the States.

00:51:21.080 --> 00:51:24.880
My question is really
around the FTC and advertising.

00:51:24.880 --> 00:51:26.920
What are going to be
the consequences for these brands?

00:51:26.920 --> 00:51:30.440
Is it going to be considered false
advertising if the content they're using

00:51:30.520 --> 00:51:33.240
to advertise their products
is with generative AI?

00:51:33.320 --> 00:51:34.400
Yeah, I don't know.

00:51:34.480 --> 00:51:36.440
Do you want to take this one?

00:51:36.760 --> 00:51:38.880
I mean, that's a tough question because

00:51:38.960 --> 00:51:45.120
you're talking about who's going
to take the blame for the content.

00:51:45.200 --> 00:51:47.000
I mean, that's why I think

00:51:47.080 --> 00:51:51.320
the labeling and showing if
it's first-party, third-party.

00:51:51.400 --> 00:51:55.160
I mean, the challenge is that there's
so many creator tools now.

00:51:55.240 --> 00:51:57.200
You talked about loving Midjourney.

00:51:57.280 --> 00:52:03.920
It's like, I don't think that they
put watermarks on their images.

00:52:04.000 --> 00:52:08.360
Yeah, right?
Then you're at the mercy, right?

00:52:08.440 --> 00:52:13.880
If somebody goes in and creates
that, it then goes to the distributors.

00:52:13.960 --> 00:52:16.280
I know I said I wasn't going to...

00:52:16.360 --> 00:52:19.040
This is my personal point of view.

00:52:19.120 --> 00:52:24.360
That's why I think, to begin with,
it is important to include labels

00:52:24.440 --> 00:52:28.320
and for providers to do that,
to start that education until people start

00:52:28.400 --> 00:52:32.680
to determine what is
the right framework to evaluate.

00:52:32.680 --> 00:52:34.960
Because I think people
are going to make things.

00:52:35.040 --> 00:52:38.080
And that's also why the provenance, the...

00:52:38.160 --> 00:52:39.640
What is it?

00:52:39.720 --> 00:52:42.520
C2PA, I'm going to mix it up,

00:52:42.600 --> 00:52:45.280
is so important because you're talking
about derivatives,

00:52:45.280 --> 00:52:49.120
and that's where it gets really confusing,
because if you create unique content

00:52:49.200 --> 00:52:51.720
that gets shared,
somebody scrapes that content,

00:52:51.800 --> 00:52:54.760
they then create a derivative, and then
someone else creates a derivative.

00:52:54.840 --> 00:52:57.640
It's like, how do you trace that lineage?

00:52:57.640 --> 00:52:59.400
And then who's responsible for that?

00:52:59.480 --> 00:53:00.520
And I think for now,
if you are a distributor, whether
00:53:00.600 --> 00:53:04.680
if you are a distributor, whether
it's 1P or 3P, you include a label.

00:53:04.760 --> 00:53:10.960
And it might feel clunky at first, but at
least we're doing that to support that.

00:53:11.960 --> 00:53:14.280
I understand.

00:53:17.440 --> 00:53:18.640
Hi, everyone.

00:53:18.640 --> 00:53:19.880
I just had a quick question.

00:53:19.960 --> 00:53:27.760
I think beyond labeling, what can consumer
education look like with respect to AI?

00:53:27.840 --> 00:53:32.280
I feel like we're
in a place culturally where we're

00:53:32.360 --> 00:53:38.480
not really focused on learning just what
we're using with social media and whatnot.

00:53:38.560 --> 00:53:43.880
And as obviously everyone has a background
in terms of what AI is doing,

00:53:43.960 --> 00:53:51.080
I'm just wondering on the consumer side
of things, how can we educate consumers on

00:53:51.160 --> 00:53:59.720
how companies are actually using
AI with respect to their data?

00:53:59.920 --> 00:54:01.000
To that.

00:54:01.080 --> 00:54:04.360
Yeah, I think this goes also about...
Sorry.

00:54:04.440 --> 00:54:07.920
My
perspective on this is around,

00:54:08.000 --> 00:54:14.080
I mentioned the NUX, the new user
experience, and how you meet them.

00:54:14.160 --> 00:54:18.440
You need to have engagement and education.

00:54:18.520 --> 00:54:23.440
If you meet somebody with a wall of text,
it's like, Oh, this is scary.

00:54:23.520 --> 00:54:28.920
I think the tone of the conversation
today is all about personalization.

00:54:29.000 --> 00:54:32.360
In order to achieve that,
that means that the data...

00:54:32.440 --> 00:54:35.240
This AI needs to know you.

00:54:35.320 --> 00:54:37.160
I think that there needs to be

00:54:37.240 --> 00:54:44.840
an understanding of the value exchange
and that data is encrypted and how it's

00:54:44.920 --> 00:54:52.040
used to train models and the sense making
that's being made by the LLM versus

00:54:52.120 --> 00:54:57.280
it's not like somebody's defined
how to identify you,

00:54:57.360 --> 00:55:00.680
but how do you demystify
that with such small real estate?

00:55:00.680 --> 00:55:02.400
If you're also starting to talk about

00:55:02.480 --> 00:55:06.280
different interaction paradigms where it's
not just with the screen and you don't

00:55:06.360 --> 00:55:11.040
have that affordance and maybe it's
in a different environment.

00:55:11.120 --> 00:55:12.520
I'm not a gamer.

00:55:12.600 --> 00:55:14.200
Are there any gamers out there?

00:55:14.280 --> 00:55:16.240
Yeah? Okay, I need
to try to become a gamer.

00:55:16.280 --> 00:55:17.480
I feel like it's like...

00:55:17.560 --> 00:55:21.880
But there's a ton of generative AI
experiences inside of games,

00:55:21.880 --> 00:55:24.240
but it's like, you're not
going to be like, Oh, hold on.

00:55:24.320 --> 00:55:28.920
This couch that you're about to sit on,
that was actually made by a machine.

00:55:29.000 --> 00:55:36.240
I think it's upon all of us to understand
how these things are created.

00:55:36.320 --> 00:55:42.400
When you do have user experiences, how are
you providing the affordances to educate?

00:55:45.960 --> 00:55:50.160
Hi, Bjorn Ostrad
on my third AI revolution.

00:55:50.240 --> 00:55:53.520
I've been at it for a while and found

00:55:53.600 --> 00:55:56.520
myself hilariously
future-blind on all of them.

00:55:56.600 --> 00:56:01.080
I'm wondering what you think the coolest
professions will be 10 years from now?

00:56:01.160 --> 00:56:05.040
Is that going to put the eye
head for this question?

00:56:06.120 --> 00:56:10.920
Well, I mean, obviously,
Meta employee and Runway employee.

00:56:11.440 --> 00:56:13.960
No, I mean, I think it's
an excellent question.

00:56:14.040 --> 00:56:21.240
I think the reality is that the barrier to
tools is just descending really sharply.

00:56:21.320 --> 00:56:23.680
I think Runway is a great example.

00:56:23.760 --> 00:56:26.400
We think about the ways in which

00:56:26.480 --> 00:56:30.920
access to information is just
becoming easier and easier.

00:56:30.920 --> 00:56:33.280
And obviously, we have the standards
and provenance pieces to balance.

00:56:33.360 --> 00:56:38.720
But your ability to have a personal tutor,

00:56:38.800 --> 00:56:45.760
then companion, then co-creator,
co-ideator for anything that you want

00:56:45.840 --> 00:56:50.600
to try your hand at is
really astonishing and really, truly new.

00:56:50.600 --> 00:56:53.480
And I think when it comes to AI
revolutions, third time's a charm.

00:56:53.560 --> 00:56:58.080
So I think that the question is less
about professions and more about tools.

00:56:58.160 --> 00:57:04.560
And I think that the tools are becoming
more general and more imaginative and are

00:57:04.640 --> 00:57:08.440
more capable of augmenting human
creativity and innovation in a way

00:57:08.520 --> 00:57:11.480
that makes your question excellent
but impossible to answer.

00:57:11.560 --> 00:57:13.920
I have a short answer, which is, I think, being

00:57:14.000 --> 00:57:18.880
a philosopher.
I think humans have ethics and we are very

00:57:18.960 --> 00:57:22.480
good at reasoning,
whereas machines are good at reckoning.

00:57:22.560 --> 00:57:25.320
So I think the other question is,

00:57:25.400 --> 00:57:29.560
what's the future of academic
institutions in 10 years?

00:57:29.640 --> 00:57:33.680
That is maybe the harder question to answer.

00:57:33.760 --> 00:57:38.680
I think professions are blending,

00:57:38.760 --> 00:57:44.120
so it's going to be hard to
put them in a box in some way.

00:57:44.200 --> 00:57:45.440
We are seeing, at Runway,

00:57:45.520 --> 00:57:49.520
we're seeing graphic designers turning
animators, we're seeing animators turning

00:57:49.600 --> 00:57:54.600
filmmakers, we're seeing photographers
turning writers, we're seeing this blend,

00:57:54.680 --> 00:57:58.120
specifically on the creative world,
this blend of I am this,

00:57:58.200 --> 00:58:03.120
but I'm also this now because, as we were
talking about enhancing

00:58:04.200 --> 00:58:09.960
creativity or empowering people to do
all the stuff, we're seeing I'm not...

00:58:10.040 --> 00:58:12.720
I come from a design background.
I'm not a designer anymore.

00:58:12.720 --> 00:58:14.120
Maybe I'm also a game designer,

00:58:14.120 --> 00:58:15.720
and maybe I'm also a programmer
or maybe something else.

00:58:15.800 --> 00:58:23.120
So I'm not sure if I will qualify for
one specific profession on its own.

00:58:23.200 --> 00:58:24.960
One thing that I'm super excited

00:58:25.040 --> 00:58:28.320
about, that I believe will happen, is in terms
of what we were talking about: building

00:58:28.400 --> 00:58:33.440
context, and context in some way
can refer to taste.

00:58:33.520 --> 00:58:40.240
And once you are building these things,

00:58:40.270 --> 00:58:43.520
specifically storytelling,
you can start thinking on context being

00:58:43.600 --> 00:58:48.720
worlds, and worlds can be like,
I want to create maybe a story or maybe I

00:58:48.770 --> 00:58:50.920
want to create a brand
campaign for a brand.

00:58:51.000 --> 00:58:56.480
What is the context that I need to curate,
accumulate, generate mix from different

00:58:56.560 --> 00:59:00.760
pieces in order to eventually
create outputs of that world?

00:59:00.840 --> 00:59:05.360
That world can be a system, and that system
allows me to create a website for a brand.

00:59:05.440 --> 00:59:08.920
That system allows me to create multiple

00:59:09.000 --> 00:59:12.880
video outputs or stories or
podcasts out of that world.

00:59:12.960 --> 00:59:15.080
I call them world designers.

00:59:15.080 --> 00:59:18.600
I don't know if that's going to stick
for long, but I am trying to coin that.

00:59:18.680 --> 00:59:23.800
And I think someone
that will... Coined here at NeueHouse.

00:59:23.880 --> 00:59:27.640
Someone who will add this context to what

00:59:27.640 --> 00:59:29.200
they're going to be
creating in the future.

00:59:29.280 --> 00:59:33.200
I'll take his wide one and go narrow.

00:59:33.280 --> 00:59:36.960
In my head, it's something like
an investigative data journalist.

00:59:37.040 --> 00:59:43.720
I think that the more we use models,
it's just going to compound over time.

00:59:43.720 --> 00:59:45.440
You're going to take this data and it

00:59:45.440 --> 00:59:49.360
compounds, and then you could add a model
or an app or a service on top of that.

00:59:49.440 --> 00:59:52.200
And all of a sudden it's like,
where did we come from?

00:59:52.280 --> 00:59:54.440
And I think we're going
to want to know that.

00:59:54.520 --> 00:59:59.720
And obviously, my background is as a
journalist, so that's how my brain thinks.

00:59:59.800 --> 01:00:02.720
But I do think that even...

01:00:02.800 --> 01:00:06.360
I was talking to somebody who's doing some
work in the world of media,

01:00:06.440 --> 01:00:11.480
and I was like, how are you going to be
doing fact-checking in this world?

01:00:11.560 --> 01:00:15.440
So the investigative
element of who are we?

01:00:15.440 --> 01:00:17.560
Where do we come from?
Where did you get that from?

01:00:17.640 --> 01:00:22.200
I think it's going to be a lot
more important than we think.

01:00:22.280 --> 01:00:24.320
I'm actually hopeful. In my former

01:00:24.400 --> 01:00:26.920
life, I was an entrepreneur,
small business owner.

01:00:26.920 --> 01:00:29.040
Are there any small business
owners in the house?

01:00:29.120 --> 01:00:30.400
Self-employed?

01:00:30.480 --> 01:00:32.080
Gig economy.

01:00:32.160 --> 01:00:36.720
Alejandro up here is an entrepreneur.
I just...

01:00:36.720 --> 01:00:38.520
Everything that...
Well designed.

01:00:38.600 --> 01:00:45.040
But I think if I can create a video story
now, if I can run my marketing campaign I

01:00:45.120 --> 01:00:48.640
mean, dear God, if I don't have to do
accounting and taxes,

01:00:48.720 --> 01:00:54.400
if all of that becomes taken care of
and the means to those tools in production

01:00:54.480 --> 01:01:00.040
is really cheap,
cost-effective, I just look forward to

01:01:00.120 --> 01:01:04.360
everyone starting a business
or working for themselves.

01:01:04.440 --> 01:01:06.840
I think it'll be incredible.

01:01:07.480 --> 01:01:10.440
So I'm hopeful for that.
Two more.

01:01:10.520 --> 01:01:13.600
We have time for two more?
Two more.

01:01:13.680 --> 01:01:14.960
Two more.
Okay, great.

01:01:15.040 --> 01:01:16.400
Who's next?
Yeah.

01:01:16.480 --> 01:01:18.240
Someone behind a pillar that I can't see.

01:01:18.320 --> 01:01:20.040
Oh, sorry.
Hi.

01:01:20.120 --> 01:01:27.160
So when we talk about
Emotive AI and in context of traditional

01:01:27.240 --> 01:01:32.600
family norms and stuff,
when it comes to kids or high schoolers

01:01:32.680 --> 01:01:38.360
that have this agent or a copilot or
something that's guiding them through life

01:01:38.440 --> 01:01:43.560
instead of necessarily their parents, has
there been any thought

01:01:43.640 --> 01:01:50.680
into the implications of how that could
restructure just society in general?

01:01:50.760 --> 01:01:54.880
If you have a figure

01:01:54.960 --> 01:01:59.520
that is not necessarily your parental
structure, but something that is guiding

01:01:59.600 --> 01:02:07.360
you that's overriding the advice
that is being given to you by a parent?

01:02:07.840 --> 01:02:10.720
Yeah, I haven't really thought about it,
but I'm sure it's going to be fine.

01:02:10.800 --> 01:02:12.320
Yeah.

01:02:12.400 --> 01:02:13.840
Next question.

01:02:13.920 --> 01:02:17.240
No, I think what you're... What
we're waiting on is really the...

01:02:17.320 --> 01:02:19.080
If we're at day zero,

01:02:19.160 --> 01:02:24.520
that's a very day one question to resolve,
because right now we're getting to grips

01:02:24.520 --> 01:02:27.640
with and experimenting and pushing
forward the capabilities of these models.

01:02:27.720 --> 01:02:29.600
But you're exactly right.

01:02:29.680 --> 01:02:32.640
I mean, you look at a good
example is Character.

01:02:32.720 --> 01:02:34.880
AI, which you may be aware of,

01:02:34.960 --> 01:02:41.360
but it's a really simple idea where you
are able to talk to these different agents

01:02:41.360 --> 01:02:44.120
that have been imbued with personalities
by literally saying,

01:02:44.200 --> 01:02:48.480
You are Julius Caesar, teach me history,
or You are Nikola Tesla, teach me science,

01:02:48.560 --> 01:02:54.920
or, You are my boyfriend now, and be
my boyfriend, with that delightful voice.

01:02:55.000 --> 01:03:00.400
And folks are spending a ton of time,
particularly younger folks,

01:03:00.400 --> 01:03:03.000
Gen Z and below, they're spending a ton
of time engaging with these agents

01:03:03.080 --> 01:03:07.320
and forming relationships
really free-form and really Wild West-y.

01:03:07.400 --> 01:03:12.680
And I think that a discussion about where

01:03:12.760 --> 01:03:17.280
AI occupies roles and maybe shouldn't
occupy roles in your life

01:03:17.360 --> 01:03:20.160
is a really critical,
maybe the most critical subset

01:03:20.240 --> 01:03:23.040
of the conversation we're finally
beginning to have about the role

01:03:23.120 --> 01:03:27.160
of technology in children's
lives and young people's lives.

01:03:27.240 --> 01:03:28.680
Because we're really late to the party.

01:03:28.760 --> 01:03:30.120
It's happening now.

01:03:30.200 --> 01:03:32.240
It's mostly through the lens of screen

01:03:32.320 --> 01:03:37.520
time and content on whatever platform
that you're consuming content on.

01:03:37.600 --> 01:03:41.960
But yes, as you rightly point out,
as we begin to reach a point where we can

01:03:42.040 --> 01:03:45.800
develop relationships,
invest emotion into, and ultimately,

01:03:45.880 --> 01:03:51.320
psychologically depend upon agents in AI
systems to some degree, yeah,

01:03:51.320 --> 01:03:52.960
that's going to need to be
a really active conversation.

01:03:52.960 --> 01:03:54.720
But one, I don't think we're even...

01:03:54.800 --> 01:03:58.120
I think it's day one
and we're at day zero still.

01:03:58.200 --> 01:04:04.200
I have an answer to that, which is,
first of all, I love the question.

01:04:04.280 --> 01:04:10.560
I think that I'm going to pick
up on Carly's philosopher point here,

01:04:10.640 --> 01:04:16.200
which is that if, let's say,
we have to have parental control for these

01:04:16.280 --> 01:04:19.600
agents, much like we were
talking about screen time.

01:04:19.680 --> 01:04:23.000
I think what it's going to make parents do

01:04:23.080 --> 01:04:28.600
is think about the value system that they
need to imbue those agents with.

01:04:28.680 --> 01:04:33.280
And so then the question is going to be to
every parent, what is your value system?

01:04:33.360 --> 01:04:37.200
What are the actual controls that you're
going to put onto that agent?

01:04:37.280 --> 01:04:41.800
I don't know if parents think about
that today, but I think tomorrow they're

01:04:41.880 --> 01:04:46.080
going to have to think about the values
they want their kids to grow up with.

01:04:46.160 --> 01:04:49.120
I guess we have time for one more.

01:04:49.200 --> 01:04:51.120
One more question.

01:04:52.800 --> 01:04:53.760
Thank you.

01:04:53.840 --> 01:04:59.400
I'm a software engineer, and I'm really
interested in the therapy industry.

01:04:59.480 --> 01:05:03.720
Actually, in my leisure time, I provide

01:05:03.800 --> 01:05:07.120
consulting, which is like
life coaching for my friends.

01:05:07.200 --> 01:05:12.840
And my question is, how will the emotive
AI change the therapy industry?

01:05:12.920 --> 01:05:14.200
Will it replace

01:05:14.280 --> 01:05:21.800
the therapist and the psychologist, or
at least take some share from that market?

01:05:22.280 --> 01:05:24.920
Yeah, that's an excellent question.

01:05:25.000 --> 01:05:27.240
I think bottom line,

01:05:27.240 --> 01:05:29.640
and one of the ways that I think it's
really important to think about AI

01:05:29.720 --> 01:05:34.560
and of emotive AI is that we still
need to maintain agency in the person.

01:05:34.560 --> 01:05:35.880
So when I'm thinking about therapeutic

01:05:35.960 --> 01:05:40.200
applications, I think
mood journaling and treating the AI as

01:05:40.280 --> 01:05:44.960
a way to express and extend what you're
feeling into something that

01:05:45.040 --> 01:05:49.920
you can reflect upon, that your therapist, your
trained professional can reflect upon.

01:05:50.000 --> 01:05:52.520
That's certainly where we think about it,

01:05:52.600 --> 01:05:56.560
where we hope it's heading,
and what we're pushing for is making

01:05:56.640 --> 01:05:59.840
sure we're not creating negative
dependencies, but instead,

01:05:59.920 --> 01:06:05.320
encouraging people to maintain, but not
just maintain, increase their agency.

01:06:05.400 --> 01:06:09.040
So that's self-expression,
self-reflection.

01:06:09.120 --> 01:06:13.880
That's going to be super important to,
again, obviously standardize and regulate.

01:06:13.960 --> 01:06:16.640
But we hope that it takes humans

01:06:16.720 --> 01:06:20.360
and doesn't put them in a less empowered
role, but actually a more empowered role

01:06:20.440 --> 01:06:25.320
to, again, self-express, self-reflect,
and hopefully heal in the case of therapy.

01:06:25.400 --> 01:06:28.880
I think it goes back to the centaur system,
though, and human in the loop.

01:06:28.880 --> 01:06:32.880
Like, loneliness. This was a big problem during
the pandemic, access to mental health.

01:06:32.960 --> 01:06:38.680
I think a big part
of self-care is social connections.

01:06:38.760 --> 01:06:45.160
I think that therapists in person provide
a different dimension than remotely.

01:06:45.240 --> 01:06:47.240
I haven't looked at the data,

01:06:47.320 --> 01:06:50.720
but I do think it just goes back
to that idea of, I think you even said

01:06:50.720 --> 01:06:53.520
human in the loop,
and I think a centaur system is so

01:06:53.600 --> 01:06:58.640
much more important than just
an emotive therapist on its own.

01:06:58.720 --> 01:07:04.000
I think I'll make just a call back to the
question about cool jobs of the future.

01:07:04.080 --> 01:07:07.000
This is technology that is really going

01:07:07.080 --> 01:07:11.360
to demand a deeper level
of understanding about humans.

01:07:11.440 --> 01:07:14.600
You're not going to A/B test
your way to the best design.

01:07:14.680 --> 01:07:16.920
It's not going to happen.
This is not a thing you're just going

01:07:17.000 --> 01:07:21.520
to pop up on user interviews and get
some feedback and know what to build.

01:07:21.600 --> 01:07:26.800
Some real deep anthropological,
sociological, psychological understanding

01:07:26.880 --> 01:07:30.560
of what people need, when,
and how to treat them ethically.

01:07:30.640 --> 01:07:33.560
I would piggyback
on philosopher and add ethicist.

01:07:33.640 --> 01:07:36.120
We need more data ethicists in the world.

01:07:36.200 --> 01:07:38.720
But I think this is a place where really

01:07:38.720 --> 01:07:42.680
getting to know people is going to be more
important than ever because we're trying

01:07:42.760 --> 01:07:48.480
to substitute an artificial intelligence
for human interaction in certain moments.

01:07:48.560 --> 01:07:52.280
If it's going to work, it's going
to need to feel really authentic.

01:07:52.360 --> 01:07:55.840
With that comes all
the scary potentialities.

01:07:55.920 --> 01:08:01.680
But in any event,
all really good questions and scary

01:08:01.760 --> 01:08:06.000
questions, some of them, but also really
optimistic potentials for all of us here.

01:08:06.080 --> 01:08:08.200
I want to thank everyone.

01:08:08.280 --> 01:08:12.320
I want to thank our panelists
for an excellent conversation.

01:08:12.400 --> 01:08:14.840
There are drinks to be had and more

01:08:14.880 --> 01:08:19.080
conversations to be had and this wonderful
space to take advantage of out of the rain

01:08:19.160 --> 01:08:22.280
and under a giant disco ball
or something over there.

01:08:22.360 --> 01:08:24.160
So thank you all.

01:08:24.160 --> 01:08:25.240
Thank you to the audience.

01:08:25.240 --> 01:08:26.440
Thank you to our panelists.

01:08:26.520 --> 01:08:30.200
And thank you to our hosts, too,
at NeueHouse for hosting us.

01:08:30.280 --> 01:08:33.040
Yeah.
Thank you.

