Learn the latest news about the AI revolution straight from California.
Meet Yann Caloghiris from Left Field Labs
During the TØFF film festival 2026, we had the pleasure of an unfiltered conversation with Yann Caloghiris, Executive Creative Director at Left Field Labs in California and the host of For Breakfast Club on YouTube.
Yann works close to the cutting edge of brand, product, and experience design – and he’s seen the latest AI wave from the inside, with insight shaped by work and networks around the big players of Silicon Valley. Expect sharp observations, real examples, and plenty of humor.
- What does “original” mean when everyone has tools?
- How do filmmakers stay relevant when the pipeline is reinvented?
- Where are the real opportunities – and the real traps?
Check out the interview with plenty of examples below.
The talk was designed for festival goers, filmmakers, producers, agencies, creatives, and anyone building a business around storytelling. Hosted by director and DOP Anders Jørgensen from Feber Film.
Transcript
Hello, Yann! How are you?
Hello, Anders. I’m great, thank you.
You can see it's nice and sunny here in California. It's February, but it's twenty-two degrees centigrade, so I can't complain. Most of my team is on the East Coast, and they're suffering through snowstorms right now, so I feel pretty snug.
And tell me, Left Field Labs, what is that?

Yeah, so maybe just a little bit of background. This is Left Field Labs; we are a creative tech company. That's what we say: we're a creative tech company. What we actually are is a design business that uses new and emerging technologies to give our clients a competitive advantage. A lot of what we do is for B2B companies, and we help them accelerate the adoption of new and emerging technologies. So they will come to us with... you've probably heard the acronym FOMO. They'll say: "I've heard about XR, I've heard about AR, I've heard about blockchain, I've heard about AI," obviously. And we will either help them adopt those technologies, or we'll help them create brand experiences, products, and platforms that leverage those technologies and give them a competitive advantage. Sometimes the answer isn't a new technology, it's just good design, but in most cases that's our sweet spot. We like to say that we take these big bets and make them real. So we don't really talk about them, and I realize the irony, because I'm talking to you right now, but we don't really talk that much about it. It's more that we take those technologies, we implement them, and we create measurable impact for our clients.
So it’s not like a normal
agency as such?
I mean, there’s some parts of it that… My background is, is in agency. Um, and so
there’s some parts of it that you would recognize in the sense that,
um, we still pitch a lot of creative.
Uh, we have members of the team that have transferable skills
that you would see in agencies. So we have content
creators, we have 3D designers, and things like that, but it’s the output that’s very, very different.
And then it’s also the fact that, um, we operate
more like a software company than a traditional agency.
A traditional agency will tend to go a bit more waterfall, you know, from a
project management standpoint, so they will, you know, very much
do this phase, then the next phase, then the next phase.
Um, we operate a lot more in an agile work stream.
So clients will come with
a challenge, a, uh… Sometimes it’s very ambiguous, so it’s not a very well-defined thing.
It might be something like, “We’re realizing that our customers are going to a competitor,” or it might be something like, “There is a new business on the
market that is creating an existential threat for us.” And so we will show up, we’ll analyze their business, we’ll understand what their challenges are, and then in many cases, we
will, you know, give them, uh, access to new
growth, new customers, or sometimes completely a whole new
business, uh, through experience design,
through, you know, operation design, and then also everything that we do with new
and emerging technologies.
And, and what’s your role in this?
So I lead creative, um, and strategy. Um, so really my role is to help, um,
help the client define, as articulately as we can, what the challenge is, come up withhow to create a solution that is on brand but is also gonna be from aperformance standpoint, deliver the results that they want, so kind of a bit like
what a consultancy would do. And then I have a team of creative technologists, so people who are one foot in creativity and, um, you know, it could be visual creativity, it could be, uh, experiential architecture, it could be product design, and then one foot in
engineering, in most cases. So they might be software
developers, uh, they might be software architects, um, you
know, writing TDDs, technical design documents, et
cetera. So we’re, like,
creating the in-between, between brands that are looking to step
into-… a brave new world of new and emerging technology, and
then the Silicon Valley-style firms themselves who
develop, you know, large language models or other technologies.
And we tend to be the people in between who make these– you know, who help these
businesses capitalize on these new innovations.
And what kind of clients are you working with?

Um, so most of our clients are in the B2B space. Yeah, but we are beginning to...

Can you name drop?

Okay, I'll name drop a couple. I would say our longest-standing client has been Google. With Google, we do everything from helping them build some of their more high-tech websites and online experiences to helping them develop products. We were involved very early in Project Tango, which is now part of their AR, augmented reality, platform. We also do things for Google I/O, which is their big annual event: if you go to googleio.com, you'll see there's a fun little puzzle. In many cases it uses AI tools, but they're game-based and designed to engage developers. Other clients include Salesforce, which, as you've heard, is the CRM company. We're helping them market, but also create proof points for what they call the agentic enterprise: AIs that aren't just chatbots but can perform tasks on behalf of organizations at scale, on top of what they mostly do, which is cloud-based CRM tools and platforms. We do work for Meta. We did a lot of work for them in XR, so the Meta glasses, which were released this year, and also the project code-named Orion, which is now public: XR glasses with the Llama AI model built in. So you can speak to the AI, and it's contextual: where you look with the glasses, the AI knows what you're looking at, and that opens up plenty of new, exciting experiences and opportunities to interact with the world. We've done a lot of work for Meta, and for Amazon. In the automotive space, we helped companies like Ford and Toyota adopt these new technologies and find new, exciting in-vehicle experiences. We helped Ford launch the Mach-E, their first ground-up EV, and we designed the center stack. We were one of many partners involved, but we helped design some hero features for it, things like that. I could go on forever, but I would say the common thread is companies that are at an inflection point, because technology, the market, and their core business are all being shaken, and they need a partner to help them define a vision and then build the first real proof points for it.
So that’s, of course, one of the reasons why I wanted to bring you
here to Tønsberg Film Festival, uh, is, of course,
how you kind of are in, in the middle of this, this
big… where, where, where it happens, basically, you know,
because the things that happen in Silicon Valley, they affect the whole
world, of course, and, um-
… after a few years, uh, changes that happen there often tends
to arrive here somewhat later. So it’s really cool,
so to hear, you know, what’s, what’s the atmosphere? What’s the vibe?
What’s the trend? What’s the zeitgeist in, in these big
corporate e-environments?
That’s such a big question. So, so I’m– you can probably hear
from my accent, I’m French and British, so I didn’t grew up in the US.
So, um, I’ve only been here just over ten years
174
00:10:00.032 –> 00:10:01.652
now, and
175
00:10:02.592 –> 00:10:04.452
it’s been through many changes. I mean, you’re right.
176
00:10:04.472 –> 00:10:08.412
Like, I would say, uh, California as
177
00:10:08.472 –> 00:10:12.432
a whole is the fourth largest economy, and if you look at the pie
178
00:10:12.552 –> 00:10:16.412
of where it creates most value, it is, you know,
179
00:10:16.432 –> 00:10:20.352
big tech by far. Um, and one of the interesting things
180
00:10:20.392 –> 00:10:24.052
is, I don’t live in San Francisco. Most of my clients do, and I’m
181
00:10:24.192 –> 00:10:25.972
regularly over there in the, in the Bay Area,
182
00:10:26.892 –> 00:10:30.552
but I am in the space where a lot of the content that I grew up with, the
183
00:10:30.632 –> 00:10:34.592
movies, you know… I, I live not far from the beach, where I remember
184
00:10:34.772 –> 00:10:38.052
seeing, you know, people in red swimsuits running in slow motion.
185
00:10:38.072 –> 00:10:42.012
You know, it sort of defines my childhood in some extents, a lot of the content
186
00:10:42.072 –> 00:10:45.452
that came from here. And so you’ve got two
187
00:10:46.372 –> 00:10:50.012
big worlds: one which is, in some respects,
188
00:10:50.072 –> 00:10:53.492
defining a lot of culture for the world, and one that’s
189
00:10:53.532 –> 00:10:56.412
defining a lot of tech and
190
00:10:56.532 –> 00:11:00.472
innovation, and they’re coexisting in
191
00:11:00.532 –> 00:11:04.372
this, you know, desert strip of land and creating all
192
00:11:04.412 –> 00:11:08.372
this wealth. What’s interesting is that
193
00:11:08.442 –> 00:11:12.172
they are ki- kind of odd bedfellows in some ways.
194
00:11:12.292 –> 00:11:16.272
Um, Silicon Valley has this, you know, belief that
195
00:11:16.332 –> 00:11:18.892
technology will save the world,
196
00:11:18.992 –> 00:11:22.852
um, that what you need is just some brilliant
197
00:11:22.892 –> 00:11:25.242
engineers and the right tools, and any problem is
198
00:11:25.312 –> 00:11:29.302
solvable. And Hollywood, um,
199
00:11:29.512 –> 00:11:32.072
is more like the legacy, um,
200
00:11:32.892 –> 00:11:36.692
you know, mega industry here, where it’s being a little
201
00:11:36.792 –> 00:11:40.292
bit frazzled by all this, a bit more than frazzled,
202
00:11:40.992 –> 00:11:44.682
and I’ll give you some, some practical examples. So we do…
203
00:11:44.682 –> 00:11:48.632
we, we do build, uh, AI post-production pipelines
204
00:11:48.652 –> 00:11:52.632
for our clients. Um, so that’s where both the worlds that we operate in
205
00:11:52.712 –> 00:11:55.652
come together. So brands will come to us and
206
00:11:55.752 –> 00:11:59.412
say, “We have a lot of content that needs to be
207
00:11:59.452 –> 00:12:03.292
brand accurate, that we spent hundreds of thousands of dollars, if not
208
00:12:03.372 –> 00:12:06.972
millions, on, um, but they’re fairly repetitive.” So
209
00:12:07.132 –> 00:12:10.181
think-… um, graphics
210
00:12:10.872 –> 00:12:14.662
or, um, content that is, I would call it, through the
211
00:12:14.732 –> 00:12:18.572
line, if you call it in marketing. So maybe not the above the line adverts
212
00:12:18.652 –> 00:12:22.032
you see at the Super Bowl, and not the stuff that is like a
213
00:12:22.052 –> 00:12:24.672
podcast or like a training video, but the bits in
214
00:12:24.732 –> 00:12:28.722
between. And we will build complete pipelines that
215
00:12:28.752 –> 00:12:32.412
can then allow them to generate their own ads and
216
00:12:32.452 –> 00:12:36.322
content on the fly with AI content completely, and
217
00:12:36.352 –> 00:12:39.092
it’s on brand, it’s approved. So,
218
00:12:40.132 –> 00:12:44.112
you know, um, because of NDAs, I can’t discuss which companies, but I can– I think
219
00:12:44.192 –> 00:12:48.092
I can say that you have to imagine that most,
220
00:12:48.192 –> 00:12:52.182
uh, companies currently in the US who are B2B, have something
221
00:12:52.352 –> 00:12:56.322
similar or are exploring it. Um, so I
222
00:12:56.352 –> 00:13:00.032
think maybe five, five years
223
00:13:00.072 –> 00:13:03.692
ago, there was the realization that these tools are
224
00:13:03.772 –> 00:13:06.692
here, and yes, they caught a lot of attention,
225
00:13:07.692 –> 00:13:10.892
but are they actual real business tools?
226
00:13:10.952 –> 00:13:14.712
You know, it looked a bit like a bad acid trip with most of it.
227
00:13:14.812 –> 00:13:18.632
Um, so the credibility of it was not quite
228
00:13:18.692 –> 00:13:21.002
there as a business tool. Um,
229
00:13:22.272 –> 00:13:24.912
over the past five years, the models have improved
230
00:13:25.472 –> 00:13:29.332
dramatically. The top hat tools that you
231
00:13:29.372 –> 00:13:31.532
sit on top of it and the pipelines
232
00:13:32.452 –> 00:13:35.972
have also improved, which means that creatives have more
233
00:13:36.032 –> 00:13:38.962
control over those tools. The
234
00:13:38.992 –> 00:13:42.452
legislation, um, copyright, etc., is
235
00:13:42.572 –> 00:13:45.672
beginning to catch up, although that’s trailing
236
00:13:45.752 –> 00:13:48.312
behind. So I would say,
237
00:13:49.972 –> 00:13:53.792
depending on who you speak to, there’s either acceptance
238
00:13:54.432 –> 00:13:58.332
that these are the tools that are here to stay, or
239
00:13:58.772 –> 00:14:02.452
there is still some malaise, I would say, with the more traditional Hollywood,
240
00:14:02.492 –> 00:14:06.392
which is saying, well… And you know, we had film
241
00:14:06.432 –> 00:14:10.392
directors here recently. We had a big event to do with Hollywood and AI here at
242
00:14:10.432 –> 00:14:14.402
the office. Private event, phones down, so I can’t discuss who attended, but I can
243
00:14:14.432 –> 00:14:17.842
talk about the headlines. So some directors were
244
00:14:17.851 –> 00:14:20.622
committed to using these technologies.
245
00:14:20.652 –> 00:14:24.442
So they might be working for a big studio, like a Universal or a Disney,
246
00:14:24.452 –> 00:14:28.312
etc., and they are building, uh, AI labs
247
00:14:28.592 –> 00:14:31.912
where they’re beginning to test and iterate on content.
248
00:14:31.932 –> 00:14:35.632
There are actually some movies right now in the theaters that have
249
00:14:35.732 –> 00:14:39.572
sequences that were generated with AI and then
250
00:14:39.652 –> 00:14:43.472
composited in post-production. They’re not vocal
251
00:14:43.532 –> 00:14:46.712
about it, but they are currently in theaters right
252
00:14:46.752 –> 00:14:50.672
now. There are some which, um, where they
253
00:14:50.712 –> 00:14:54.572
were vocal. If you, you know the movie, The Brutalist, um, that
254
00:14:54.632 –> 00:14:58.252
came out recently, the act- the,
255
00:14:58.372 –> 00:15:01.812
the, the, um, the biopic,
256
00:15:01.832 –> 00:15:05.671
basically, of the architect. The architect was Hungarian,
257
00:15:05.772 –> 00:15:09.612
and he couldn’t nail his accent when he was speaking Hungarian,
258
00:15:10.092 –> 00:15:13.652
so they used AI and trained it on his voice so he
259
00:15:13.752 –> 00:15:17.502
speaks perfect Hungarian. So those are
260
00:15:17.502 –> 00:15:21.322
examples where Hollywood is beginning to adopt these technologies, and they feel it
261
00:15:21.372 –> 00:15:24.112
improves their art form and the authenticity of the content that they
262
00:15:24.172 –> 00:15:25.542
create.
263
00:15:25.632 –> 00:15:26.201
Yeah, I think... but at the same time... Let's get back to core film storytelling in a moment; I was just going a bit more around the whole business world. So, what I've understood is... you know, is AI a bubble or not? Is it affecting the job market? How is the whole business around AI doing, not just in film but in all the creative industries? What do you see there?
So, I'm not going to give any advice on what shares to buy or anything like that; I'm not a financial expert. But the bubble question is one we obviously talk a lot about. And the answer is: yes, there's a bubble, and no, there isn't a bubble. Here's what I mean by that. Yes, there is a bubble in the sense that if you look at the investment made and the returns, not all businesses that have invested heavily in AI are going to make it out the other side, and I would compare this to the beginning of the internet. How many of you are still using AOL, right? So there is going to be a changing of the guard, and the investment has been so high that it's going to be incredibly difficult for a trillion-dollar company to recoup it. So you will see some change, and it's going to be a bumpy road, for sure.
But, yeah, because in terms of software, you know, using AI to write code... there have been studies saying that, for instance, some developers claim they're twenty percent faster now that they can get the AI to code things, but when they're actually measured, they're twenty percent slower. Or they get into dangerous territory where the AI produces so much code that they can't review it, or it becomes so complicated that they're basically shipping code they don't understand. So you're basically saying...
I mean, remember, the emergence of LLMs is only a few years in. It is the fastest-adopted piece of technology in the history of mankind; nothing has had this level of adoption. So let's just put that down: there is no precedent for this, right? The internet, the wheel, fire, et cetera, took years, in some cases tens of thousands of years, to be adopted like this. Within a matter of weeks, there were hundreds of millions of users.

At the same time, it's an emerging technology, and if we look back a little at how AI was built, if you go to the origin stories, a lot of these systems weren't designed to do what they're doing now. The large language model had very different applications when it started, right? For example, Google used a very early version of an LLM to predict what the next word was going to be in your search box. So when you typed "dog," it would predict "food." It wasn't analyzing other people's names or other people's searches; it was predicting what it thought your next word was going to be. That was the very beginning of the large language model. It's not intelligence in the sense of being fully context aware; it is just a very sophisticated mathematical tool that predicts the next word. Then the rationale was: well, maybe I can use that to predict in a translation context. I'm speaking to you in English; you might be speaking to me in Mandarin. You can't translate word for word; you want to translate whole sentences to capture the meaning. So large language models were then used for translation, and then someone said, "Well, hold on, wait a minute. Maybe I can build something that appears intelligent and predicts answers to a question." And the answer was yes: if you give it enough data, it can respond with a computational answer to your question. Then you keep going, you keep feeding it more data, you keep refining the model, and eventually you end up with something that can predict the next two thousand pages of code. But the origin story was a prediction tool, not a coding tool. So you're going to see some weird results, you know?

But that's not to say that the methodology, whether it's LLMs or whatever grows out of them, is in question. I think there's no one here doubting this, as if it were somehow a fad and we'll return to the way things were. I think it's consensus now that this is well afoot. Whether this technology or that technology becomes dominant, whether LLMs lead to AGI, artificial general intelligence, is a whole different story, but I think everybody's adamant that this is happening, this is here to stay.
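To make the "predict the next word" idea concrete, here is a toy sketch of our own (an illustration, not Yann's code): a simple bigram counter that, like the early search-box predictor he describes, suggests the word most likely to follow the one you typed. Real LLMs are neural networks trained over tokens at vast scale, but the basic principle of scoring likely continuations is the same.

```python
# Toy next-word predictor: count which word follows which in a tiny corpus,
# then return the most frequent continuation. An illustration only; real
# language models learn these statistics with neural networks, not raw counts.
from collections import Counter, defaultdict

corpus = "dog food is the food my dog likes so my dog eats dog food".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev_word, next_word in zip(corpus, corpus[1:]):
    following[prev_word][next_word] += 1

def predict_next(word: str) -> str:
    # Return the most frequent continuation seen in the corpus.
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("dog"))  # -> "food", just like the search-box example
```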
And, uh, I guess some people are panicking.

Yeah. And I was panicking a bit myself, if I'm quite honest, about a decade ago.

You want me to talk a bit about that?

Yeah, take me through your career, which is quite interesting.
Well, I don’t know if it’s interesting, but, uh, but thank you for saying that.
382
00:21:25.656 –> 00:21:28.396
I mean, I had a very classic,
383
00:21:28.496 –> 00:21:32.096
um… I, I studied m- media, and we studied
384
00:21:32.176 –> 00:21:36.116
together, which was amazing. Um, so I studied the, the film, film,
385
00:21:36.156 –> 00:21:39.776
video, graphic design portion, and then, um,
386
00:21:40.136 –> 00:21:43.756
the other part of my, uh, course was computer science.
387
00:21:43.816 –> 00:21:44.365
And, um,
388
00:21:45.376 –> 00:21:49.266
the rationale was that if I compare engineering with design and
389
00:21:49.356 –> 00:21:53.056
creativity, there might be a path for an interesting career
390
00:21:53.066 –> 00:21:56.746
somewhere in there, but I hadn’t really figured out what that was going to be.
391
00:21:56.816 –> 00:22:00.516
Um, and so I- that’s always been mixing the two.
392
00:22:00.616 –> 00:22:03.156
It’s a left brain, right brain type of approach.
393
00:22:03.276 –> 00:22:07.156
Um, can I apply engineering to design, and can I buy ar-
394
00:22:07.196 –> 00:22:10.296
can I apply artistic flair to technology and
395
00:22:10.316 –> 00:22:14.296
engineering? And over time, I kind of created a, a path for myself.
396
00:22:14.356 –> 00:22:17.796
I did, uh, traditional advertising for a while.
397
00:22:17.856 –> 00:22:21.826
I wanted to do film also and video. We, we did a short film together, and I
398
00:22:21.856 –> 00:22:25.516
used a lot of technology to do post-production and all sorts of things.
399
00:22:25.556 –> 00:22:28.986
And then over time, I, I realized that
400
00:22:29.196 –> 00:22:32.756
where I got excited about was where
401
00:22:32.816 –> 00:22:36.686
it’s in the real world. It’s where people, and
402
00:22:36.716 –> 00:22:40.316
technologies, and brands, and experiences all come together in the physical
403
00:22:40.396 –> 00:22:41.296
world. I felt
404
00:22:42.176 –> 00:22:45.916
like being able to create a transformational moment for someone in the
405
00:22:46.016 –> 00:22:49.976
physical world, to me, was very exciting and interesting,
406
00:22:49.996 –> 00:22:53.596
and so that led me down the path of… I did automotive for many years.
407
00:22:53.676 –> 00:22:57.456
I did, um, experience design for, like, big
408
00:22:57.496 –> 00:23:01.396
auto show spaces, um, product,
409
00:23:01.456 –> 00:23:05.216
you know, um, so apps and those types of technologies,
410
00:23:05.376 –> 00:23:08.956
XR, VR. So I really migrated, but applying the same principles
411
00:23:09.016 –> 00:23:12.496
of art to engineering and engineering to art.
412
00:23:12.596 –> 00:23:13.016
Um-
413
00:23:14.156 –> 00:23:17.906
And, and what’s, what’s your, what’s your, what’s your design philosophy in that?
414
00:23:18.856 –> 00:23:22.356
What’s your, what’s your main, your core thing that you try to
415
00:23:22.396 –> 00:23:24.996
infuse in these projects?
416
00:23:25.996 –> 00:23:29.856
I mean, I’m very much a people first.
417
00:23:29.976 –> 00:23:30.296
Um,
418
00:23:32.336 –> 00:23:36.236
I think for me, it’s, can I create something that is
419
00:23:36.296 –> 00:23:38.796
genuinely useful, genuinely
420
00:23:38.816 –> 00:23:40.926
delightful, um,
421
00:23:42.136 –> 00:23:44.536
and a net positive using new and emerging
422
00:23:44.576 –> 00:23:48.056
technologies? I mean, so I would say
423
00:23:48.096 –> 00:23:51.956
philosophically, but the… If I had one headline, I, I want the world to be
424
00:23:51.976 –> 00:23:55.556
more creative, a more creative place everywhere we go, I
425
00:23:55.576 –> 00:23:59.056
think. Um, in the world that we live
426
00:23:59.156 –> 00:24:02.556
today, we all know when we’re unhappy.
427
00:24:02.565 –> 00:24:06.156
Most of us, even if we don’t consider ourselves creative people,
428
00:24:06.256 –> 00:24:10.236
you’re more unhappy if you consume more content
429
00:24:10.276 –> 00:24:13.788
and experiences-… and you tend to be happier if you have the
430
00:24:13.828 –> 00:24:17.768
opportunity to be creative. And it doesn’t have to be art, anything
431
00:24:17.828 –> 00:24:21.548
is creative, but it just has– you’ve contributed something.
432
00:24:21.568 –> 00:24:24.948
I think we’re inherently creative creatures.
433
00:24:24.968 –> 00:24:25.348
And so
434
00:24:27.088 –> 00:24:30.288
ten years ago, um, I ran a program with Ford.
435
00:24:31.208 –> 00:24:34.878
It was a, um, a research program to try and
436
00:24:34.928 –> 00:24:38.708
identify what would be the in-car experience
437
00:24:38.848 –> 00:24:41.148
when cars became fully autonomous.
438
00:24:41.188 –> 00:24:44.948
So some of that work became Ford’s autonomous program.
439
00:24:44.968 –> 00:24:48.868
But I, I pitched to them that we should do a parallel program where we
440
00:24:48.968 –> 00:24:49.948
also research
441
00:24:51.428 –> 00:24:54.708
how will AI transform the discipline of
442
00:24:54.828 –> 00:24:56.588
design, research and design,
443
00:24:57.468 –> 00:25:00.848
thinking that it would just be a little bit of interesting PR.
444
00:25:00.948 –> 00:25:04.488
Um, and so we, we went through a four D process: uh,
445
00:25:04.498 –> 00:25:06.648
discover, design, develop, deploy.
446
00:25:06.668 –> 00:25:10.188
And at each of those steps, we had the latest in AI
447
00:25:10.308 –> 00:25:13.948
tools and algorithms and technologies, even like research
448
00:25:14.008 –> 00:25:16.788
papers that we had applied to some of that work.
449
00:25:16.828 –> 00:25:20.018
And at the end of the workstream, I was a little bit shocked, um,
450
00:25:20.108 –> 00:25:23.248
because I realized that roughly one-third
451
00:25:24.068 –> 00:25:26.748
of the work that I had built Ford for
452
00:25:27.968 –> 00:25:31.848
was– could be replaced by AI. And
453
00:25:31.928 –> 00:25:34.808
what I mean replaced, not
454
00:25:34.888 –> 00:25:38.728
automated, but certainly so heavily disrupted
455
00:25:39.348 –> 00:25:43.248
that it was difficult to charge the same amounts for those
456
00:25:43.328 –> 00:25:47.168
portions of work. And so that made me
457
00:25:47.288 –> 00:25:51.208
quite anxious about, selfishly, my career, but
458
00:25:51.228 –> 00:25:55.208
also the industry as a whole. An industry where people like us, who are creative
459
00:25:55.288 –> 00:25:58.368
people, who don’t really fit very well anywhere else,
460
00:25:59.488 –> 00:26:03.108
what’s our value if what we do is being
461
00:26:03.168 –> 00:26:06.318
eroded and being changed and being handed over to
462
00:26:06.318 –> 00:26:10.138
technologies? So I did a talk. I was invited at South
463
00:26:10.168 –> 00:26:14.028
by Southwest. I thought it was gonna be a small session, but like a thousand people
464
00:26:14.108 –> 00:26:17.808
showed up, so clearly, I wasn’t the only one with that
465
00:26:17.888 –> 00:26:18.628
anxiety.
466
00:26:19.828 –> 00:26:23.168
Um, and so and I’ve been, I’ve been on a mission since, to
467
00:26:24.048 –> 00:26:27.208
basically lean into the technology and not
468
00:26:27.328 –> 00:26:31.088
just, um… My, my goal is to
469
00:26:31.108 –> 00:26:35.048
not be on the receiving end of the technology, but to lean in
470
00:26:35.108 –> 00:26:35.568
enough
471
00:26:36.408 –> 00:26:39.308
that we become the architects of this technology.
472
00:26:39.368 –> 00:26:43.308
We influence how it should show up and how it should elevate
473
00:26:43.368 –> 00:26:47.148
what we do, um, instead of being in this position where we feel like we’re in
474
00:26:47.268 –> 00:26:49.528
competition with the technology.
475
00:26:49.588 –> 00:26:52.688
Um, so I’ve done a, a bunch of projects here.
476
00:26:52.708 –> 00:26:54.768
I’ve done a bunch of talks. I have a podcast.
477
00:26:54.778 –> 00:26:58.668
It’s only five months old, um, but where I interview people who have
478
00:26:58.708 –> 00:27:02.028
been disrupted, so in some cases, completely lost their revenue
479
00:27:02.108 –> 00:27:05.188
stream, but found new ways to adopt the
480
00:27:05.228 –> 00:27:08.568
technology, um, to create new revenues and still be
481
00:27:08.608 –> 00:27:09.728
creative.
482
00:27:09.768 –> 00:27:12.088
Great. So it's either change or die, basically?
So I don’t know if I agree that it’s adapt or die.
485
00:27:18.248 –> 00:27:22.128
I think, yes, there will be for
486
00:27:22.148 –> 00:27:25.768
sure, um, and there already is some
487
00:27:25.788 –> 00:27:29.398
things on our, on our… You know, when we, we
488
00:27:29.548 –> 00:27:33.028
bill hourly on our projects, there are some
489
00:27:33.128 –> 00:27:36.048
items on our services that just don’t exist anymore.
490
00:27:36.128 –> 00:27:39.638
Um, um, or they’re so heavily
491
00:27:39.708 –> 00:27:43.148
disrupted that they don’t really justify a full-time role
492
00:27:43.188 –> 00:27:46.088
anymore. However, at the same
493
00:27:46.208 –> 00:27:48.338
time, um,
494
00:27:49.328 –> 00:27:52.588
it is– we are seeing a productivity increase.
495
00:27:52.628 –> 00:27:55.648
So I know there’s… You, you mentioned the controversy around
496
00:27:56.048 –> 00:27:59.967
development, and a lot of developers are, you know, spreading the word
497
00:28:00.108 –> 00:28:03.348
that AI is creating a lot of noise. True.
498
00:28:03.388 –> 00:28:07.168
But if you look at, uh, tools that are just a few weeks old now, like Cloud
499
00:28:07.268 –> 00:28:10.238
Code, it’s a fantastic tool. It’s an agentic
500
00:28:10.288 –> 00:28:14.208
coder, uh, which means it doesn’t just write code, it can write
501
00:28:14.288 –> 00:28:17.708
across multiple platforms, it can do some backhand work.
502
00:28:17.768 –> 00:28:21.098
So it’s coming. Um, the–
503
00:28:21.608 –> 00:28:25.388
but we haven’t really seen a lot of layoffs on our side, on the
504
00:28:25.468 –> 00:28:29.008
developer side. Um, and the reason is because we pick
505
00:28:29.108 –> 00:28:32.768
developers who are also architects, um, they need
506
00:28:33.388 –> 00:28:36.688
to know what efficient code looks like, so that they can
507
00:28:36.848 –> 00:28:40.818
evaluate what the AI is doing. There are some areas
508
00:28:40.848 –> 00:28:44.748
that AI just doesn’t code in yet. Um, there
509
00:28:44.758 –> 00:28:48.588
are some areas that, for security reasons, uh, we don’t
510
00:28:48.628 –> 00:28:52.008
let AI code. Um, and then there are
511
00:28:52.108 –> 00:28:55.918
also, um, you know, being able to
512
00:28:55.948 –> 00:28:59.748
design an application that exists in the real world, you need to be
513
00:28:59.908 –> 00:29:03.348
context aware in a way that currently AI can’t.
514
00:29:03.528 –> 00:29:07.328
Um, for example, we helped to develop some of these XR
515
00:29:07.368 –> 00:29:10.548
glasses and experiences that have AI, plus
516
00:29:10.988 –> 00:29:14.548
context-aware content, so, uh, digital content
517
00:29:14.708 –> 00:29:18.658
overlaid on the real world. You, you need to understand how
518
00:29:18.708 –> 00:29:22.568
those things work. The AI needs to be able to interpret
519
00:29:23.348 –> 00:29:27.048
that right now you’re in the car, what’s more important
520
00:29:27.608 –> 00:29:31.508
is eyes on the road and not a text message
521
00:29:31.628 –> 00:29:35.428
with a picture that would take over your screen real estate.
522
00:29:35.438 –> 00:29:39.378
So things like that, that are obvious to us as humans,
523
00:29:39.428 –> 00:29:42.278
are not necessarily obvious to AI.
524
00:29:42.368 –> 00:29:46.088
So, um, increasingly, we’re finding that our
525
00:29:46.128 –> 00:29:49.568
developers are coding in specific areas that AI
526
00:29:49.588 –> 00:29:53.568
doesn’t work into or becoming… taking the ten thousand foot
527
00:29:53.608 –> 00:29:56.728
view. So this narrative that it’s adapt or
528
00:29:56.808 –> 00:30:00.668
die, is– has some truth. You do need to adapt, you
529
00:30:00.788 –> 00:30:03.357
do need to lean in. Um,
530
00:30:04.228 –> 00:30:08.208
and there are some roles that are disappearing, and a lot of post-production
531
00:30:08.668 –> 00:30:12.248
is going to move to AI, for sure.
532
00:30:12.368 –> 00:30:16.212
Um, but at the same time-… we’re also finding
533
00:30:16.912 –> 00:30:19.552
that especially in the content space, um,
534
00:30:20.572 –> 00:30:24.312
if you look at the work that, uh, through-the-line agencies are doing and
535
00:30:24.392 –> 00:30:28.232
content companies are doing, there is a clear fork in the
536
00:30:28.272 –> 00:30:32.112
road. You’re either embracing AI in your content, and that’s a
537
00:30:32.172 –> 00:30:36.152
certain type of content, so the big blockbuster movies, I think
538
00:30:36.172 –> 00:30:38.772
there is no doubt in my mind that they’re gonna adopt those technologies.
539
00:30:38.812 –> 00:30:42.652
Why should you pay $350,000 for the next Spider-Man movie,
540
00:30:42.712 –> 00:30:46.312
three, $350 million for the next Spider-Man movie when you can do it
541
00:30:46.432 –> 00:30:49.912
for less than 100? I think that’s going to be a clear
542
00:30:49.992 –> 00:30:53.172
area. But at the same time, content like
543
00:30:53.232 –> 00:30:56.472
this, content where people want to interface with
544
00:30:56.552 –> 00:31:00.432
people, be it in the real world or over content, those area
545
00:31:00.492 –> 00:31:04.472
heres are right now booming, and one of the areas that for us has been
546
00:31:04.712 –> 00:31:08.592
massive growth has been our work in experiences and experiential marketing,
547
00:31:09.292 –> 00:31:13.152
so where people meet in person. So it could be anything
548
00:31:13.252 –> 00:31:16.132
from themed entertainment to big
549
00:31:16.292 –> 00:31:19.292
conferences to sponsorship at sporting
550
00:31:19.372 –> 00:31:21.912
events. That’s becoming more
551
00:31:21.992 –> 00:31:25.312
important, in part as a
552
00:31:25.392 –> 00:31:29.372
by-product of the pandemic, where we took it away from each other
553
00:31:29.452 –> 00:31:32.722
for a short while, and now we’ve realized how valuable that
554
00:31:32.772 –> 00:31:36.392
- But also in this world where AI is taking
555
00:31:36.472 –> 00:31:39.952
over a lot of screen-based content, a lot of screen-based work,
556
00:31:40.852 –> 00:31:44.812
the more valuable interactions are the ones that we do in person that are more
557
00:31:44.832 –> 00:31:48.432
authentic. So, so it’s not
558
00:31:48.472 –> 00:31:52.452
necessarily die. I think there will be some shedding, but
559
00:31:52.492 –> 00:31:55.342
at the same time, there is an opportunity, um,
560
00:31:56.212 –> 00:31:59.892
but you have to lean into it, uh, and you have to do it now.
561
00:32:01.252 –> 00:32:04.992
So, I know you have to leave soon, but maybe we can return to the film part. I saw a video about this photographer, and on his website he had: "I'm a photographer who captures photons of events that actually happened." Which I thought was a really good positioning, sort of from the other side, you know, because we're all kind of blown away by these clips you see on Facebook or wherever. But there are very few examples where I see something that's AI and that fact isn't at the front of my mind. It flavours the experience, and most of the time it's just basically s**t, you know? I would never pay to watch AI slop. Because if you watch, say, an animation, you have this kind of... what do you call it... a suspension of disbelief.

Yeah, suspension of disbelief.

We have sort of an agreement with the creator that this is not real, this is animation, people can't fly, but we accept it, and we buy it. And I find that because AI is so close to the real thing, in a way it's kind of lying. It's pretending, but it's so close that it tricks us, and nobody likes being tricked or fooled.
I, I think you’ve hit the nail on the head, right?
602
00:33:49.212 –> 00:33:53.032
It’s- but that’s true of every piece of content,
603
00:33:53.092 –> 00:33:56.192
right? I, I- especially if you’re in the space of advertising and
604
00:33:56.232 –> 00:34:00.072
marketing. I’ve always believed strongly that there’s a clear
605
00:34:00.132 –> 00:34:03.952
line between convincing someone and
606
00:34:04.032 –> 00:34:07.992
coercing someone. To me, convincing someone
607
00:34:08.053 –> 00:34:11.932
is we both have the same information, and then I convince you to
608
00:34:12.033 –> 00:34:13.241
see the world the way I do.
609
00:34:14.252 –> 00:34:17.592
Coercing someone is, I hold back some of that
610
00:34:17.672 –> 00:34:20.792
information, and I convince you to see my point of
611
00:34:20.832 –> 00:34:23.493
view. It’s in those, in this latter
612
00:34:23.973 –> 00:34:27.752
example, that you feel robbed, you feel cheated,
613
00:34:27.812 –> 00:34:31.732
right? I was sold a car, but he didn’t tell me there was a, a leak in
614
00:34:31.792 –> 00:34:34.572
the, uh, in the, in the, um, in the gearbox, right?
615
00:34:34.752 –> 00:34:37.392
Uh, y- I, I was coerced into buying it.
616
00:34:37.432 –> 00:34:40.432
And I think the same is true for anything in the creative
617
00:34:40.513 –> 00:34:44.292
space. Um, if you are a photographer and you generate a
618
00:34:44.392 –> 00:34:47.993
spectacular image of an animal, and then you submit it to a
619
00:34:48.033 –> 00:34:51.752
competition for wildlife, you’ve coerced your audience.
620
00:34:51.772 –> 00:34:55.352
The trust is broken, and I do think that that’s a, that’s a clear
621
00:34:55.572 –> 00:34:58.232
line that can’t be crossed. At the same
622
00:34:58.292 –> 00:35:02.192
time, you know, we go watch Titanic
623
00:35:02.572 –> 00:35:05.652
knowing that we’re not filming, we’re not seeing the actual boat.
624
00:35:06.352 –> 00:35:09.992
We go see Marvel movies by the millions,
625
00:35:10.002 –> 00:35:12.532
knowing that we’re not seeing something real.
626
00:35:12.592 –> 00:35:16.502
So then the question becomes: Do I need to know
627
00:35:16.532 –> 00:35:20.292
that they spent $250 million and three years
628
00:35:21.112 –> 00:35:23.772
for the story to have more meaning? Perhaps.
629
00:35:23.812 –> 00:35:27.152
Maybe that was true also in the, in the ’60s with movies like
630
00:35:27.192 –> 00:35:30.772
Cleopatra, you know, that in today’s dollars would have been half a billion
631
00:35:30.832 –> 00:35:34.772
dollar. So maybe there’s some value in that, but at
632
00:35:34.812 –> 00:35:37.432
the same time, I think it’s gonna weed out, um,
633
00:35:38.752 –> 00:35:41.552
a lot of those types of movies that we went to see because they were
634
00:35:41.592 –> 00:35:45.562
expensive. You know, maybe that’s not such a bad thing. Maybe it’ll…
635
00:35:45.592 –> 00:35:49.172
if everybody can generate big spectacle movies, well, maybe that
636
00:35:49.312 –> 00:35:52.882
means that we’re gonna have to focus back on what is
637
00:35:52.952 –> 00:35:56.912
true and authentic, and that’s story, because who knows
638
00:35:56.952 –> 00:36:00.212
how it was made, um, and does that actually matter?
639
00:36:00.252 –> 00:36:03.972
Isn’t it, like, the connection we make with the characters, how it makes me feel,
640
00:36:04.012 –> 00:36:06.012
and how I come out of it that really matters?
641
00:36:07.992 –> 00:36:11.392
So why haven’t I, why haven’t I seen anything that really, that moves me
642
00:36:11.752 –> 00:36:15.312
and, uh, that’s sort of AI-made?
643
00:36:15.372 –> 00:36:18.892
Well, if you’ve been in movie, to movies in the past six
644
00:36:18.952 –> 00:36:22.777
months-… I guarantee you that es- I mean, I’m not
645
00:36:22.808 –> 00:36:26.648
talking like niche, you know, French, you know, nouvelle
646
00:36:26.668 –> 00:36:29.598
vague style movie. I’m talking like big budget movies.
647
00:36:29.648 –> 00:36:33.448
I guarantee you that there were parts, full sequences in some of
648
00:36:33.488 –> 00:36:36.588
them, that would’ve been generated, and you probably didn’t see
649
00:36:36.628 –> 00:36:38.488
- Um, so
650
00:36:39.968 –> 00:36:43.338
yes, you know, maybe you were coerced into something,
651
00:36:43.388 –> 00:36:47.308
thinking that they used a Flame suite and multiple hours of
652
00:36:47.448 –> 00:36:51.328
3D modeling. Maybe some of us in the room will be pissed off and say, “Hey,
653
00:36:51.448 –> 00:36:55.268
I wish they’d told me this was an AI sequence.” But when you’re talking
654
00:36:55.348 –> 00:36:59.288
about storytelling and sustaining disbelief, I feel like
655
00:36:59.328 –> 00:37:02.968
most of audiences just want to believe that it’s
656
00:37:03.008 –> 00:37:06.658
real for the short few hours that they’re spending with you in the
657
00:37:06.668 –> 00:37:10.008
theater. Who cares if it was a miniature 3D
658
00:37:10.088 –> 00:37:12.788
animation or AI generation, right?
659
00:37:12.828 –> 00:37:16.588
It’s, it’s if you’re showing me something and telling me this is real when it was
660
00:37:16.648 –> 00:37:19.708
not, that’s when the trust is broken for me.
661
00:37:19.768 –> 00:37:23.708
And obviously, if it’s poor, if it’s badly done, and the character’s got
662
00:37:23.808 –> 00:37:27.348
seven fingers and one eye that’s droopy, yeah, I
663
00:37:27.408 –> 00:37:30.888
mean, that’s just- that’s just poor use of the tools.
664
00:37:30.988 –> 00:37:34.508
Um, but that’s true for 3D animation, you know, uh,
665
00:37:34.608 –> 00:37:38.138
also. So but I would tell you that the, the tools right
666
00:37:38.228 –> 00:37:40.258
now, you know, there’s been some…
667
00:37:40.288 –> 00:37:43.818
If you look at a lot of the models right now, there’s open source models that
668
00:37:43.968 –> 00:37:47.068
anybody can download, you can put on your own machines.
669
00:37:47.168 –> 00:37:49.748
You can train the model and make it your own.
670
00:37:49.788 –> 00:37:53.708
You know, if you think of like Wan, for example, Wan 2.2, I think, or 2.3
671
00:37:53.768 –> 00:37:56.368
now. Download it, put it on your machine.
672
00:37:56.428 –> 00:38:00.388
If you’re a DOP, train your footage on it, and then you’ll be able to generate
673
00:38:00.448 –> 00:38:04.348
content that looks like your work, that you own, um, that you’re not feeding
674
00:38:04.428 –> 00:38:08.318
back to a, a company in Silicon Valley that can then generate off of your
675
00:38:08.348 –> 00:38:12.118
work. You become effectively the author, um,
676
00:38:12.528 –> 00:38:16.148
of work that looks like the way you like to shoot, and I’m seeing
677
00:38:16.188 –> 00:38:18.448
directors here in LA that are beginning to do this.
678
00:38:18.468 –> 00:38:22.268
And so they’re- it’s not yet used for,
679
00:38:22.368 –> 00:38:22.658
um,
680
00:38:23.548 –> 00:38:27.208
you know, movies that it’s maybe just a couple scenes, it’s not for full movies
681
00:38:27.288 –> 00:38:30.908
yet, but certainly in pre-production, people are doing
682
00:38:30.928 –> 00:38:34.308
complete pre-production movies in AI that look
683
00:38:34.368 –> 00:38:37.818
fantastic, that look very close to the finished thing, and they’re doing it on
684
00:38:37.868 –> 00:38:41.248
models that they’ve trained themselves on their own footage.
685
00:38:41.308 –> 00:38:45.177
So that’s a whole new area of, um, of
686
00:38:45.228 –> 00:38:46.668
opportunity for creative people.
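As a rough illustration of the workflow Yann describes (pull open weights onto your own machine and generate locally), here is a minimal sketch. It assumes the Hugging Face diffusers integration of the Wan text-to-video models and the checkpoint id shown, both worth verifying against current documentation; the step he highlights, fine-tuning on your own footage so the output matches how you shoot, is typically done with separate LoRA training scripts and is not shown here.

```python
# Minimal sketch: run an open-weights video model locally and generate a clip.
# The checkpoint id and diffusers support for Wan are assumptions to verify.
import torch
from diffusers import DiffusionPipeline
from diffusers.utils import export_to_video

pipe = DiffusionPipeline.from_pretrained(
    "Wan-AI/Wan2.1-T2V-1.3B-Diffusers",  # assumed hub id; check huggingface.co
    torch_dtype=torch.bfloat16,
).to("cuda")

# A DOP would first fine-tune these weights on their own footage (for example
# with LoRA training scripts) so that generations match their way of shooting.
result = pipe(prompt="handheld dusk exterior, anamorphic flare, slow push-in",
              num_frames=33)
export_to_video(result.frames[0], "test_clip.mp4", fps=16)
```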
Instead of asking, "What will AI do next?", I believe we should be asking, "How are we going to use AI to create the future that we want?" Join us for breakfast.
So, to finish off, going back to your YouTube channel, For Breakfast, where you've interviewed... well, who have you had on there, and what are the biggest takeaways from those conversations that you can hand off to us?
So, top line: I've had a famous car designer, who recently designed the new Mustang. I've had a music producer with a fantastic Malibu house who makes lo-fi music.

Today, we're talking to Jeremy Dalme, a successful music entrepreneur who saw his record label be completely disrupted by AI and who decided to reinvent the business of music and sponsorship with his new company, Acrylic. Join us for breakfast.
For several months, we were doing about $40,000 in just streaming revenue, which for an indie label that was established a year before is just...

Insane.

...insane. And because of that, our distribution partner gave us a little over half a million dollars as an advance. The problem is that just three months after we got that advance, in 2022, AI music was announced, and the playlists that we depended on grew in size. We started looking at these playlists and asking, "Why are we losing 75% of our revenue right now?" We couldn't recognize any of these artists. We didn't really know where they were coming from, but still, the result was the same. Producers were coming to my house saying, "Jeremy, what are we gonna do? I've gotta drive an Uber now." And me thinking, "Well, we're the label; I'm in bed with you. I don't know what we're gonna do." And that was really the seminal moment where the idea behind Acrylic was born. When our backs were really against the wall, I asked: well, what's the difference between millions of emerging artists and AI robots?
How did that moment feel? Because obviously you came out the other side, but did it take you by surprise?

I recall the moment very, very clearly and vividly.
When you got the advance, was there something in the back of your head saying, "This is sort of sunsetting," or, "This is the beginning of growth"? How were you seeing it?

Oh, when we got the advance, it was beautiful. I mean, we had an in-house artist, a designer, who was making all of our cover art, because a lot of these lo-fi guys made amazing music, but not all of them were into art and artwork and all that. We wanted these incredible covers, so we kind of created a world where we could make this beautiful art and have beautiful visual art and videos. So no, those two years were some of the best of my life, if not the best to this point. I felt like I had creative freedom. I didn't have a boss telling me, "You gotta do this."

But could you see a trajectory? Could you go, "This is it, this is gonna keep going up"?

Oh, yeah. 100%.
And so when that
772
00:42:30.424 –> 00:42:33.644
playlist started to transform, was it a shock?
773
00:42:33.664 –> 00:42:34.274
It was a shock.
774
00:42:34.344 –> 00:42:34.784
Complete shock?
775
00:42:34.804 –> 00:42:38.784
It was a complete shock. The announce of AI music, and the fact that the
776
00:42:38.824 –> 00:42:41.744
world was saying… And, and this- and not just the world, like, people were
777
00:42:41.824 –> 00:42:45.604
actually telling me, “Jeremy, like, your
778
00:42:45.644 –> 00:42:49.304
music, I can make it with a button now.” And I’d look at them and
779
00:42:49.344 –> 00:42:53.184
be, “Really? You can go talk to Paraguayan harpists
780
00:42:53.244 –> 00:42:57.024
and say, ‘Hey, let’s pair you with cool producers in LA-
781
00:42:57.064 –> 00:42:57.144
Mm
782
00:42:57.204 –> 00:43:00.814
… to make this music that no one has made before with these instruments that are,
783
00:43:00.844 –> 00:43:04.784
you know, hundreds of years old. And, you know, be in the front page of the
784
00:43:04.804 –> 00:43:08.084
Asuncion, Paraguay, um, arts section of their, their
785
00:43:08.164 –> 00:43:09.974
main, you know, newspaper-
786
00:43:10.144 –> 00:43:10.154
Mm-hmm
787
00:43:10.154 –> 00:43:13.584
… you know, and create opportunities for all these, these people around the
788
00:43:13.624 –> 00:43:13.814
world.’
789
00:43:13.814 –> 00:43:13.814
Right.
790
00:43:13.824 –> 00:43:17.524
Like, you can do that with robots?” And I refused to believe it, Jan.
791
00:43:17.584 –> 00:43:18.304
I refused.
Did you… I mean, I know the first time I saw a piece of AI video… You know, I studied film, I studied media. I wanted to become a director. I did some commercials, and I thought this was a career path for me.
Yeah.
And when I saw the first ones, I thought, really, um, it’s going to completely devalue the work. Even if I felt, as an artist, it didn’t have the depth, and I missed some of the humanness-
Mm-hmm.
… I was concerned that it was just going to devalue it as a whole, because it would muddy the water. You may have that one nice jewel of a human-crafted track, but if you have 20,000 others that are 80%, 90% of the way there, to the person who’s just got their headphones on at Starbucks and doing some work, the difference is a nuance.
Yeah.
It’s just gonna demolish the industry.
Yeah.
Did you think this is the end of music?
I th- I mean, the idea did cross my mind, but that is when I came up with the idea for Acrylic.
Yeah.
And, and it, it is really at that point-
Yeah.
… because we were all desperate. I remember we were running out of money. Um, we owed our distributor on the advance, and we were running out of options. And that’s when I, I literally, I was at my desk, and I got up.
Yeah.
I have a board behind me, um, a whiteboard.
Yeah.
And I put a line through it, and I put, “Millions of emerging artists,” on the left-
Yeah.
… “and AI robots” on the right. And I asked myself, “What is the difference between the two? How do we fight this?” And it came to me, and I was like, “Hold on. Hyperlocal-
Mm.
… hyper-engaged fan bases. AI robots don’t have fans, so no one cares about their music, but we can prove that people care about ours. Maybe it’s only 100 fans-
Mm-hmm.
… but those 100 fans are human, so we better bet on human IP, because this war is not over.” And that’s when, really, things changed. That’s when I had the idea for the Acrylic platform. Um, and then, you know, things grew from there.
I’ve had, um, you know, uh, who else have I had? I’ve had business leaders in software who… But I would say the common thread between all of them is they are people who were very successful, um, very committed to their career path, and then there was an inflection point in their career where AI stepped in and either made it very difficult or impossible for them to continue their business as it was. That’s the common thread, and it’s not a… I hope it’s not a depressing channel. The idea is, they’re all showing what they did to come out the other side. Um, so for example, the car designer: now, instead of spending hours rendering a sketch to try and convince his clients that this is the right design, he will do some rapid sketches and then use AI to create 20,000 variations, and then, using his artistic flair and craft, he will select those that are most valuable and then immediately generate them into 3D models, into fake TV commercials. So you can design your car and then see it in a commercial, or just driving on the street.
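To make that generate-then-curate loop concrete, here is a minimal sketch using the open-source Hugging Face diffusers library to fan a rough sketch out into many AI variations. The model name, prompt, file paths, and parameters are illustrative assumptions, not the tooling the designer in the story actually used.

```python
# Minimal sketch of the "rapid sketch -> many variations -> curate" loop
# described above, using Hugging Face's diffusers library (img2img).
# Model, prompt, paths, and parameters are illustrative assumptions.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # hypothetical model choice
    torch_dtype=torch.float16,
).to("cuda")

# Start from the designer's rough sketch.
sketch = Image.open("rough_car_sketch.png").convert("RGB").resize((768, 512))

for seed in range(40):  # scale this range up toward thousands of variations
    generator = torch.Generator("cuda").manual_seed(seed)
    image = pipe(
        prompt="sleek electric coupe, studio lighting, clean background",
        image=sketch,
        strength=0.6,        # how far each variation drifts from the sketch
        guidance_scale=7.5,
        generator=generator,
    ).images[0]
    image.save(f"variation_{seed:03d}.png")

# Generation is the cheap part; the curation step -- a human with taste
# picking the few frames worth pushing into 3D -- stays with the designer.
```

The design point is the same one made in the conversation: the expensive skill shifts from rendering to selection.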
And, uh, but what really began to transform the process, uh, for me and my team was the rise of these sort of rapid 3D modeling tools, like Blender and, uh, Maya and, um, now Gravity Sketch, which is the ability to sketch spatially with virtual reality and augmented reality. And, um, so a designer could go from sketching on paper to building a model, um, but now they can sketch directly in 3D.
Right.
And so, um, that sort of translation step from 2D to 3D is now irrelevant. You’re just going straight to 3D. And in doing so, we began to bypass a lot of the traditional workflows with clay models and working with digital sculptors, and, uh, the designers themselves were beginning to do more and more of their own sculpting and sending data directly to milling machines, and, um, only working with the clay modelers in sort of more of the finishing stages.
Right.
Yeah.
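As a rough illustration of that sculpt-to-machine handoff, here is a minimal sketch against Blender’s bundled Python API (bpy), exporting a sculpted mesh as STL, a common input format for CAM and milling software. The object name, the file path, and the use of the legacy STL exporter are all assumptions for the example.

```python
# Minimal sketch of sending a sculpt straight toward a milling workflow
# from Blender's bundled Python API (bpy). Object name and file path are
# illustrative; run inside Blender or headless via `blender -b -P export.py`.
import bpy

obj = bpy.data.objects["CarBodySculpt"]  # hypothetical sculpted mesh
bpy.ops.object.select_all(action="DESELECT")
obj.select_set(True)
bpy.context.view_layer.objects.active = obj

# Convert to a plain mesh so any modifiers are applied and the exported
# surface matches what the designer sees in the viewport.
bpy.ops.object.convert(target="MESH")

# Export the selection as STL, a typical interchange format for CAM tools.
# (Blender <= 3.x legacy exporter; newer builds expose bpy.ops.wm.stl_export.)
bpy.ops.export_mesh.stl(filepath="/tmp/car_body.stl", use_selection=True)
```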
And so it means that the challenge he had as a creative was, “The vision I have in my head, how do I convince the accountants and the marketers in this automotive group to see it the way I do?” And so the designers that succeeded there were the people who could sketch well. It wasn’t always the best idea, but they were the people who sketched the best. And now, using AI, you don’t have to be the best sketch artist. You have to have the best taste… you have to know what great looks like. You have to know when there’s a, a real white space where no one else is going, and you need to, you know, drill deeper on that one. And so the tools are here to accelerate, and I think, used well, they are an opportunity to make the world more creative.
I mean, at the same time, there is a place also where those accountants and marketers can generate their own cars themselves, and so far, it doesn’t sound as though they’re doing as good a job at figuring out what the right, you know, design is and what is culturally relevant and what is a, you know, a beautiful design and why. So, but that’s one example.
So, so, uh, having good taste and knowing your, uh, art history still matters?
Yeah.
And then also, you know, I, um, I was reading recently about the lost generation, which I think you and I are both a part of, which they call Xennials. So we tend to be born, I believe it’s ’78 to ’85, something like that. It’s a generation that was born without the Internet. We know what a dial-up telephone is. We remember having to walk to the television to change the station, but at the same time, our education and our career meant that we used the Internet and had to adapt to new technologies. So we’re not quite Gen Xers, who learned about the Internet later in life. We’re not millennials, who were born with social media and the Internet. We’re somewhere in between, and so I think that makes us very well-suited to helping adapt to change. But if I look at my kids and Gen Alpha, the generation coming after the millennials, they were born with AI, they were born with social media. They have no hang-up about it. Uh, they’re generating memes. They’re using it to express themselves, um, and they have no qualms about it, because they haven’t experienced a world where a Disney movie was hand-drawn. They’ve seen them as 3D characters all their lives. And so I think we might be the generation that feels this anxiety the most, um, because we remember a world where it didn’t exist. Uh, but at the same time, we’re probably also the generation that will help make the transition, because we have a foot in before and after. But my gut tells me that the next generation coming isn’t gonna have the same qualms about it, um, and will adopt it freely. It’s hard to know what that means for society, or where value is created, where purpose lies, what it would mean to be creative if you’re co-creating together with AI. But at the same time, I, I know that the hang-ups we have, um, are ours and not necessarily the next generation’s.
Yeah. Okay, so Yann, uh, I know you have to shoot, so, uh, thank you so much for, uh, being here in Tønsberg, uh, with us for this chat.
Yeah, I mean, my pleasure. I, I wish I could be there in person. I’ve been to a few film festivals, and I, I love the energy and the, the passion that people have for storytelling, and I would say to the people in the room who are worried about AI: focus on that. The storytelling still matters. How we connect with each other still matters, um, whatever tools show up. And my podcast is designed to be a platform for creatives to talk to each other, to exchange ideas, and to, you know, adapt to the change. So, you know, feel free to reach out. I’m sure Anders will give you my contact details.
Excellent. Thank you, Yann.
All the best. Thank you.