1
00:00:00,560 --> 00:00:06,260
you're watching The Context, it's time
2
00:00:02,560 --> 00:00:08,519
for our new weekly segment AI
3
00:00:06,260 --> 00:00:11,679
[Music]
4
00:00:08,519 --> 00:00:13,360
decoded welcome to AI Decoded, that time
5
00:00:11,679 --> 00:00:15,360
of the week when we look in depth at
6
00:00:13,360 --> 00:00:18,400
some of the most eye-catching stories in
7
00:00:15,360 --> 00:00:21,039
the world of artificial intelligence and
8
00:00:18,400 --> 00:00:24,640
we begin with the New York Times, which reports
9
00:00:21,039 --> 00:00:26,960
Google has been fined $271 million by
10
00:00:24,640 --> 00:00:29,000
France's competition watchdog for
11
00:00:26,960 --> 00:00:31,519
failing to broker agreements with media
12
00:00:29,000 --> 00:00:35,079
outlets for using their content to train
13
00:00:31,519 --> 00:00:38,079
its AI tech which leads us to a possible
14
00:00:35,079 --> 00:00:40,079
solution after leading AI developer
15
00:00:38,079 --> 00:00:43,239
OpenAI told the UK Parliament it's
16
00:00:40,079 --> 00:00:46,160
impossible to train leading AI models
17
00:00:43,239 --> 00:00:48,079
without using copyrighted materials a
18
00:00:46,160 --> 00:00:50,120
group of researchers says there is an
19
00:00:48,079 --> 00:00:52,359
alternative after releasing what's
20
00:00:50,120 --> 00:00:54,640
thought to be the largest AI training
21
00:00:52,359 --> 00:00:57,879
data set composed entirely of text
22
00:00:54,640 --> 00:01:00,640
that's in the public domain not under
23
00:00:57,879 --> 00:01:02,760
copyright AP News looks at the UN
24
00:01:00,640 --> 00:01:05,080
general assembly vote on what would be
25
00:01:02,760 --> 00:01:06,320
the first United Nations resolution on
26
00:01:05,080 --> 00:01:08,960
artificial
27
00:01:06,320 --> 00:01:10,880
intelligence and The Verge Tech website
28
00:01:08,960 --> 00:01:12,840
has a video of one of their journalists
29
00:01:10,880 --> 00:01:16,000
having a real-time conversation with an
30
00:01:12,840 --> 00:01:17,840
AI avatar that responds to human speech
31
00:01:16,000 --> 00:01:19,960
currently it's an experimental program
32
00:01:17,840 --> 00:01:21,840
developed by video game developer
33
00:01:19,960 --> 00:01:24,479
Ubisoft and we'll show you a clip of
34
00:01:21,840 --> 00:01:27,240
that a little later actually in the FT
35
00:01:24,479 --> 00:01:29,119
an Uber Eats delivery worker who after
36
00:01:27,240 --> 00:01:31,158
becoming fed up with the company's app
37
00:01:29,119 --> 00:01:33,880
consistently making errors decided
38
00:01:31,158 --> 00:01:36,000
to rewrite the code to fix the issues we
39
00:01:33,880 --> 00:01:38,840
have the author of that article with us
40
00:01:36,000 --> 00:01:41,079
and she'll tell us a bit more and AFP
41
00:01:38,840 --> 00:01:43,399
news features novelist Salman Rushdie who
42
00:01:41,079 --> 00:01:45,600
says artificial intelligence tools may
43
00:01:43,399 --> 00:01:47,920
pose a threat to writers of thrillers
44
00:01:45,600 --> 00:01:50,320
and science fiction but they lack the
45
00:01:47,920 --> 00:01:53,280
originality and humor to challenge
46
00:01:50,320 --> 00:01:55,920
serious novelists and another artist
47
00:01:53,280 --> 00:01:58,360
musician James Blunt says he felt
48
00:01:55,920 --> 00:02:00,439
humiliated at how bad the results were
49
00:01:58,360 --> 00:02:04,920
when he experimented with AI to see if
50
00:02:00,439 --> 00:02:08,440
it could create lyrics accurately in his
51
00:02:04,920 --> 00:02:10,560
style well with me here is Madhumita Murgia
52
00:02:08,440 --> 00:02:12,599
the Financial Times's artificial
53
00:02:10,560 --> 00:02:14,160
intelligence editor thanks very much for
54
00:02:12,599 --> 00:02:16,519
coming on the program thanks for having
55
00:02:14,160 --> 00:02:18,560
me right lots to get through quite a
56
00:02:16,519 --> 00:02:20,720
busy week I think we can probably almost
57
00:02:18,560 --> 00:02:22,680
say that every week now let's
58
00:02:20,720 --> 00:02:26,560
start with the New York Times France
59
00:02:22,680 --> 00:02:28,920
fines Google amid AI dispute with news
60
00:02:26,560 --> 00:02:30,640
media so this is a kind of broad issue
61
00:02:28,920 --> 00:02:33,000
that we're going to come up against
62
00:02:30,640 --> 00:02:34,920
quite a lot and already have but a
63
00:02:33,000 --> 00:02:37,120
seemingly significant moment what's
64
00:02:34,920 --> 00:02:38,720
happened here yeah so this is you know
65
00:02:37,120 --> 00:02:40,319
there's been as you say a long running
66
00:02:38,720 --> 00:02:42,640
dispute between Google and news
67
00:02:40,319 --> 00:02:45,640
organizations about how Google has been
68
00:02:42,640 --> 00:02:48,840
using links, news media, newspaper
69
00:02:45,640 --> 00:02:50,560
articles and so on this particular fine
70
00:02:48,840 --> 00:02:54,120
they've said that Google has failed to
71
00:02:50,560 --> 00:02:56,040
negotiate fair deals and has used
72
00:02:54,120 --> 00:02:58,000
data from newspapers to train their
73
00:02:56,040 --> 00:03:00,519
large language models, the chatbots that
74
00:02:58,000 --> 00:03:02,640
many of us have been using so this is
75
00:03:00,519 --> 00:03:04,519
part of a wider ruling that they've made
76
00:03:02,640 --> 00:03:06,720
and said Google hasn't negotiated in
77
00:03:04,519 --> 00:03:09,519
good faith and saying it's failing
78
00:03:06,720 --> 00:03:12,360
to inform publishers of the use of their
79
00:03:09,519 --> 00:03:15,080
content for their
80
00:03:12,360 --> 00:03:19,760
software isn't that something that we're
81
00:03:15,080 --> 00:03:22,159
kind of all the big AI companies
82
00:03:19,760 --> 00:03:24,560
all in the same boat here is everything
83
00:03:22,159 --> 00:03:27,360
kind of trained the same way absolutely
84
00:03:24,560 --> 00:03:30,159
it's not just Google, OpenAI too, you
85
00:03:27,360 --> 00:03:32,159
know they are funded by Microsoft
86
00:03:30,159 --> 00:03:34,439
primarily and they also have one
87
00:03:32,159 --> 00:03:36,599
of the most powerful models which um
88
00:03:34,439 --> 00:03:38,959
powers ChatGPT which people might have
89
00:03:36,599 --> 00:03:41,319
played around with that too is a large
90
00:03:38,959 --> 00:03:43,760
language model trained on a huge corpus
91
00:03:41,319 --> 00:03:46,400
of data found online the New York Times
92
00:03:43,760 --> 00:03:48,879
itself has actually sued OpenAI and
93
00:03:46,400 --> 00:03:51,400
is currently in this court case with
94
00:03:48,879 --> 00:03:53,239
them saying that they've illegally used
95
00:03:51,400 --> 00:03:55,040
their copyrighted material and
96
00:03:53,239 --> 00:03:56,840
that's kind of another ongoing battle on
97
00:03:55,040 --> 00:03:58,760
another front interesting and so we're
98
00:03:56,840 --> 00:04:01,239
going to stick with exactly this theme
99
00:03:58,760 --> 00:04:04,760
because they are now claiming this is
100
00:04:01,239 --> 00:04:08,840
Wired here's proof you can train an AI
101
00:04:04,760 --> 00:04:11,480
model without slurping copyrighted
102
00:04:08,840 --> 00:04:13,599
content um not often you get slurping in
103
00:04:11,480 --> 00:04:15,760
a headline but I like it uh so what's
104
00:04:13,599 --> 00:04:17,600
this story about so this is kind of look
105
00:04:15,760 --> 00:04:20,279
this is the other side of the coin right
106
00:04:17,600 --> 00:04:22,400
because the leaders of these companies
107
00:04:20,279 --> 00:04:23,840
you know OpenAI in particular has said
108
00:04:22,400 --> 00:04:25,680
there's no other way to build these
109
00:04:23,840 --> 00:04:28,400
models if we want really powerful
110
00:04:25,680 --> 00:04:30,400
language models if we want chatbots we
111
00:04:28,400 --> 00:04:33,080
need to use the words on the
112
00:04:30,400 --> 00:04:35,320
internet we need this data um but the
113
00:04:33,080 --> 00:04:37,039
the research that's come out of here um
114
00:04:35,320 --> 00:04:38,400
it's a group of researchers backed by
115
00:04:37,039 --> 00:04:40,560
the French government they've shown that
116
00:04:38,400 --> 00:04:43,600
actually there are other ways there are
117
00:04:40,560 --> 00:04:45,880
alternatives um you can make data sets
118
00:04:43,600 --> 00:04:48,520
that power maybe smaller models but for
119
00:04:45,880 --> 00:04:50,199
specific use cases um an example they
120
00:04:48,520 --> 00:04:51,560
give here is for a law firm for
121
00:04:50,199 --> 00:04:53,440
example so if you're trying to build a
122
00:04:51,560 --> 00:04:55,720
language model specifically to help with
123
00:04:53,440 --> 00:04:57,199
lawyers and the work that they do and
124
00:04:55,720 --> 00:04:59,400
this could be true in scientific
125
00:04:57,199 --> 00:05:01,840
research or any sort of vertical that
126
00:04:59,400 --> 00:05:04,800
they want to apply it to you can
127
00:05:01,840 --> 00:05:07,280
use you know less data and it can be
128
00:05:04,800 --> 00:05:09,720
paid for interesting so that's one
129
00:05:07,280 --> 00:05:12,880
potential solution but again on those
130
00:05:09,720 --> 00:05:14,759
narrow use cases or narrower use cases
131
00:05:12,880 --> 00:05:18,160
because it says in here that yeah the
132
00:05:14,759 --> 00:05:20,639
data set is tiny compared to what lots
133
00:05:18,160 --> 00:05:22,960
of the other language models were kind
134
00:05:20,639 --> 00:05:25,199
of based on so surely there must be some
135
00:05:22,960 --> 00:05:27,360
kind of difference in quality of outcome
136
00:05:25,199 --> 00:05:29,120
or maybe not maybe you
137
00:05:27,360 --> 00:05:31,680
don't need that you know obviously the
138
00:05:29,120 --> 00:05:33,440
internet the entire swell of information
139
00:05:31,680 --> 00:05:36,440
that we all put out there from Reddit
140
00:05:33,440 --> 00:05:38,360
posts to you know Amazon comments and
141
00:05:36,440 --> 00:05:40,120
reviews um but there's also a lot of
142
00:05:38,360 --> 00:05:42,199
noise in that data right there's
143
00:05:40,120 --> 00:05:44,039
a lot of rubbish on the internet as
144
00:05:42,199 --> 00:05:46,759
everyone can attest to so it's not
145
00:05:44,039 --> 00:05:48,880
necessarily clean good quality data so
146
00:05:46,759 --> 00:05:50,800
there is still an open
147
00:05:48,880 --> 00:05:53,360
question of whether it's quantity or
148
00:05:50,800 --> 00:05:55,280
quality um and I think that there is an
149
00:05:53,360 --> 00:05:58,520
argument to be made for good quality
150
00:05:55,280 --> 00:06:01,360
data interesting right let's move on to
151
00:05:58,520 --> 00:06:04,600
a bit of high level regulation now
152
00:06:01,360 --> 00:06:06,360
because it's hugely important there are
153
00:06:04,600 --> 00:06:08,440
as we've been witnessing over the last
154
00:06:06,360 --> 00:06:11,960
kind of six months especially these big
155
00:06:08,440 --> 00:06:15,599
attempts uh at regulation the United
156
00:06:11,960 --> 00:06:18,160
Nations now hoping to get in on the act
157
00:06:15,599 --> 00:06:20,880
what's going on there so today the
158
00:06:18,160 --> 00:06:22,800
General Assembly is set to vote on
159
00:06:20,880 --> 00:06:25,160
what's going to be the first United
160
00:06:22,800 --> 00:06:27,240
Nations resolution on artificial
161
00:06:25,160 --> 00:06:29,000
intelligence um the goal they're hoping
162
00:06:27,240 --> 00:06:31,199
that it will be unanimous and the
163
00:06:29,000 --> 00:06:33,159
goal really is to bridge inequities
164
00:06:31,199 --> 00:06:35,759
between um you know the developed
165
00:06:33,159 --> 00:06:37,840
Western world and countries in
166
00:06:35,759 --> 00:06:39,479
the global South, the developing world, and
167
00:06:37,840 --> 00:06:40,960
make sure that those countries have a
168
00:06:39,479 --> 00:06:42,880
seat at the table when it comes to
169
00:06:40,960 --> 00:06:45,080
developing the technologies and it isn't
170
00:06:42,880 --> 00:06:47,520
just places like the United States where
171
00:06:45,080 --> 00:06:50,159
these companies are based that you know
172
00:06:47,520 --> 00:06:51,800
see all the upside of the technology
173
00:06:50,159 --> 00:06:54,639
interesting yeah because we're picking
174
00:06:51,800 --> 00:06:57,159
up these these quotes from AP here and
175
00:06:54,639 --> 00:07:00,440
they're saying they want to make it safe
176
00:06:57,159 --> 00:07:02,240
secure, trustworthy, all of those broad
177
00:07:00,440 --> 00:07:04,319
principles that I think most people
178
00:07:02,240 --> 00:07:05,960
would kind of subscribe to but really
179
00:07:04,319 --> 00:07:08,639
interesting picking up on exactly what
180
00:07:05,960 --> 00:07:11,479
you were talking about there it's
181
00:07:08,639 --> 00:07:14,840
distributing the potential benefits not
182
00:07:11,479 --> 00:07:17,240
just having a few companies in a few
183
00:07:14,840 --> 00:07:19,919
countries and therefore the populations
184
00:07:17,240 --> 00:07:23,240
and people of those countries
185
00:07:19,919 --> 00:07:26,560
benefiting but making sure it's across
186
00:07:23,240 --> 00:07:28,560
the world that's a laudable aim I'm
187
00:07:26,560 --> 00:07:30,879
sure but probably easier said than
188
00:07:28,560 --> 00:07:32,159
done oh absolutely I mean
189
00:07:30,879 --> 00:07:34,639
they're doing this
190
00:07:32,159 --> 00:07:37,639
so that they can help AI kind of achieve
191
00:07:34,639 --> 00:07:39,240
the UN development goals for 2030 which
192
00:07:37,639 --> 00:07:40,759
they're far behind on so they're hoping
193
00:07:39,240 --> 00:07:42,960
this can address questions you know
194
00:07:40,759 --> 00:07:45,479
inequities in health care education and
195
00:07:42,960 --> 00:07:47,520
so on but these are big problems and I'm
196
00:07:45,479 --> 00:07:50,000
not sure AI is the magic bullet to all
197
00:07:47,520 --> 00:07:51,319
of it but it's a step it's a step uh
198
00:07:50,000 --> 00:07:53,479
yeah some of the examples detecting
199
00:07:51,319 --> 00:07:55,639
diseases predicting floods helping
200
00:07:53,479 --> 00:07:57,000
farmers training the next generation of
201
00:07:55,639 --> 00:08:01,080
workers that's what they want to make
202
00:07:57,000 --> 00:08:04,000
sure is equally spread across the globe
203
00:08:01,080 --> 00:08:05,720
right next story um before we talk about
204
00:08:04,000 --> 00:08:07,840
it we're going to actually see it
205
00:08:05,720 --> 00:08:10,199
because we're going to take uh a moment
206
00:08:07,840 --> 00:08:13,199
to watch this example of a real-time
207
00:08:10,199 --> 00:08:15,479
conversation with an AI Avatar this is
208
00:08:13,199 --> 00:08:17,680
not yet released it's a creative
209
00:08:15,479 --> 00:08:19,960
experiment by video game developer
210
00:08:17,680 --> 00:08:23,560
Ubisoft that suggests the company has
211
00:08:19,960 --> 00:08:26,680
made small breakthroughs in AI real-time
212
00:08:23,560 --> 00:08:28,759
emotion and animation contextual
213
00:08:26,680 --> 00:08:31,080
awareness and collaborative
214
00:08:28,759 --> 00:08:35,760
decision-making let's take a
215
00:08:31,080 --> 00:08:35,760
look tell me one word why I should
216
00:08:38,279 --> 00:08:42,838
stay
217
00:08:39,959 --> 00:08:45,040
passion you can feel it can't you all
218
00:08:42,838 --> 00:08:47,640
right that burning desire to make a
219
00:08:45,040 --> 00:08:50,240
difference to fight for what's
220
00:08:47,640 --> 00:08:51,519
right that's what drives us and that's
221
00:08:50,240 --> 00:08:53,440
what will keep you going through the
222
00:08:51,519 --> 00:08:55,800
tough times and trust
223
00:08:53,440 --> 00:09:00,720
me there will be tough
224
00:08:55,800 --> 00:09:03,040
times but together we can overcome them
225
00:09:00,720 --> 00:09:03,880
I have too much to lose I have a family
226
00:09:03,040 --> 00:09:08,079
two
227
00:09:03,880 --> 00:09:08,079
daughters really should I be doing
228
00:09:10,519 --> 00:09:15,959
this family is everything
229
00:09:13,200 --> 00:09:19,160
Nelson and that's why we're fighting to
230
00:09:15,959 --> 00:09:22,160
create a better world for our loved ones
231
00:09:19,160 --> 00:09:24,399
we can't continue living in fear and
232
00:09:22,160 --> 00:09:27,000
oppression it's a risk but it's a risk
233
00:09:24,399 --> 00:09:29,519
worth taking for a brighter
234
00:09:27,000 --> 00:09:31,839
future and don't worry we'll have each
235
00:09:29,519 --> 00:09:31,839
other's
236
00:09:32,240 --> 00:09:36,680
backs have you lost anybody close to
237
00:09:39,680 --> 00:09:43,279
you
238
00:09:41,279 --> 00:09:46,000
yes I've lost
239
00:09:43,279 --> 00:09:48,839
many my son my friends my
240
00:09:46,000 --> 00:09:51,000
home but every loss only fuels my
241
00:09:48,839 --> 00:09:53,560
determination to keep
242
00:09:51,000 --> 00:09:54,760
going and through the resistance I found
243
00:09:53,560 --> 00:09:57,399
a new
244
00:09:54,760 --> 00:09:59,720
family together we can make a difference
245
00:09:57,399 --> 00:10:03,000
and prevent others from experiencing
246
00:09:59,720 --> 00:10:05,360
the same pain right slightly strange to
247
00:10:03,000 --> 00:10:07,560
watch that what was going on so that is
248
00:10:05,360 --> 00:10:09,920
an avatar powered by AI and what's
249
00:10:07,560 --> 00:10:12,320
unique about it is it's not scripted so
250
00:10:09,920 --> 00:10:14,160
it's just responding to you in real time
251
00:10:12,320 --> 00:10:16,399
conversation that's what's kind of new
252
00:10:14,160 --> 00:10:18,560
about this and a bit stunted but you can
253
00:10:16,399 --> 00:10:20,440
see the kind of potential there for for
254
00:10:18,560 --> 00:10:22,399
the next generation of games it's quite
255
00:10:20,440 --> 00:10:23,880
haunting wasn't it watching it which I
256
00:10:22,399 --> 00:10:25,640
suppose is part of the desired effect
257
00:10:23,880 --> 00:10:28,040
but if that is basically still in the
258
00:10:25,640 --> 00:10:29,839
kind of concept phase that seems I don't
259
00:10:28,040 --> 00:10:32,200
know pretty impressive to me even if it
260
00:10:29,839 --> 00:10:34,519
is just a kind of fancy chatbot at the
261
00:10:32,200 --> 00:10:36,800
moment it's still deeply impressive
262
00:10:34,519 --> 00:10:40,240
right let's move on to your story FT the
263
00:10:36,800 --> 00:10:43,320
delivery rider who took on his faceless
264
00:10:40,240 --> 00:10:45,440
boss is the headline what's this so This
265
00:10:43,320 --> 00:10:47,120
is actually from my book which
266
00:10:45,440 --> 00:10:49,120
came out today which is exciting but
267
00:10:47,120 --> 00:10:50,760
this is about it's called Code
268
00:10:49,120 --> 00:10:52,760
Dependent and it's about human beings
269
00:10:50,760 --> 00:10:55,120
who've been impacted by AI systems
270
00:10:52,760 --> 00:10:57,120
unexpectedly and Armin Samii who's the
271
00:10:55,120 --> 00:10:59,600
Uber Eats driver in this story found
272
00:10:57,120 --> 00:11:01,279
that he was being underpaid consistently
273
00:10:59,600 --> 00:11:02,920
or in one case at least and he wanted to figure
274
00:11:01,279 --> 00:11:04,519
out whether this was a problem that occurred
275
00:11:02,920 --> 00:11:07,040
again and again and because the
276
00:11:04,519 --> 00:11:09,959
algorithms that kind of govern work on
277
00:11:07,040 --> 00:11:11,519
gig apps are so opaque you know
278
00:11:09,959 --> 00:11:13,440
the people who work for these apps have
279
00:11:11,519 --> 00:11:15,600
no idea why they're being paid what
280
00:11:13,440 --> 00:11:17,480
they're being paid he had to hack it he
281
00:11:15,600 --> 00:11:19,600
essentially built a tool that could
282
00:11:17,480 --> 00:11:20,880
figure out how far he was traveling and
283
00:11:19,600 --> 00:11:23,079
therefore how much he should have been
284
00:11:20,880 --> 00:11:25,040
paid he then made this free for other
285
00:11:23,079 --> 00:11:27,200
Uber drivers to use and they've all
286
00:11:25,040 --> 00:11:28,440
been using it around the world to figure
287
00:11:27,200 --> 00:11:30,800
out you know whether they've been
288
00:11:28,440 --> 00:11:33,399
underpaid that is an
289
00:11:30,800 --> 00:11:35,320
absolutely incredible story I wish we
290
00:11:33,399 --> 00:11:37,720
had more time for it unfortunately we
291
00:11:35,320 --> 00:11:39,360
don't last issue we've got to move on
292
00:11:37,720 --> 00:11:43,200
because we are unfortunately nearly out
293
00:11:39,360 --> 00:11:47,040
of time Salman Rushdie AI only poses a
294
00:11:43,200 --> 00:11:49,800
threat to unoriginal writers and also we
295
00:11:47,040 --> 00:11:52,680
can look at at the same time James Blunt
296
00:11:49,800 --> 00:11:55,279
humiliated by generic AI versions of his
297
00:11:52,680 --> 00:11:57,240
lyrics these are artists what's going on
298
00:11:55,279 --> 00:11:59,360
here so this is really about creativity
299
00:11:57,240 --> 00:12:02,120
and whether artificial intelligence can
300
00:11:59,360 --> 00:12:05,320
ever be creative right whether that's
301
00:12:02,120 --> 00:12:07,920
artists you know voiceover actors
302
00:12:05,320 --> 00:12:10,920
writers journalists like us ask you know
303
00:12:07,920 --> 00:12:12,959
can we be replaced by Ai and so far the
304
00:12:10,920 --> 00:12:15,240
answer seems to be no it's pretty
305
00:12:12,959 --> 00:12:18,079
generic James Blunt actually finds it
306
00:12:15,240 --> 00:12:20,360
you know humiliating as he says nothing
307
00:12:18,079 --> 00:12:23,199
like his real lyrics um so I don't think
308
00:12:20,360 --> 00:12:24,680
it's there yet if it's coming to
309
00:12:23,199 --> 00:12:26,440
replace us well so yeah you didn't have
310
00:12:24,680 --> 00:12:27,920
to bring journalists into it at the end
311
00:12:26,440 --> 00:12:30,120
no one was talking about them until you
312
00:12:27,920 --> 00:12:31,800
brought it in right we're out of time thank you
313
00:12:30,120 --> 00:12:34,040
so much for coming in and talking us
314
00:12:31,800 --> 00:12:38,959
through that brilliant stuff that's it
315
00:12:34,040 --> 00:12:38,959
we'll do this again same time next week