WEBVTT

1
00:00:00.450 --> 00:00:04.680
<v Kristin Alford>Welcome everyone to this next session as part of the Adelaide Festival of Ideas.</v>

2
00:00:04.681 --> 00:00:06.180
My name is Dr. Kristin Alford,

3
00:00:06.210 --> 00:00:10.560
and I'll be chairing the panel with Dr. Fiona Kerr and David Hobbs this morning.

4
00:00:11.460 --> 00:00:13.470
A couple of bits of housekeeping as we start,

5
00:00:13.471 --> 00:00:18.030
please make sure your mobile phone is on silent. But you're very,

6
00:00:18.031 --> 00:00:22.620
very welcome to tweet, Instagram, LinkedIn, Facebook,

7
00:00:22.980 --> 00:00:27.180
whatever you'd like to do. And the hashtag is #ADL

8
00:00:27.181 --> 00:00:30.240
FOI. The Twitter handle is the same, and it's the same for Instagram.

9
00:00:31.230 --> 00:00:33.930
Unauthorised recordings of any kind are not permitted during the session though.

10
00:00:33.931 --> 00:00:38.340
So please don't video. And today's session is being audio recorded by Radio

11
00:00:38.341 --> 00:00:42.180
Adelaide on behalf of the Adelaide Festival of Ideas for future broadcasts.

12
00:00:42.750 --> 00:00:44.070
So there's something to keep in mind.

13
00:00:44.071 --> 00:00:46.860
If you do ask any questions as well, you will be recorded.

14
00:00:47.580 --> 00:00:49.860
So I'm delighted to be joined by David and Fiona.

15
00:00:50.700 --> 00:00:52.860
David is a researcher at Flinders University.

16
00:00:52.861 --> 00:00:55.260
He has a long history in rehabilitation engineering,

17
00:00:55.440 --> 00:00:57.210
looking at assistive technologies.

18
00:00:57.711 --> 00:01:01.110
And when we were talking before some of the projects he was talking about where

19
00:01:02.040 --> 00:01:03.900
that's better, sorry, I can hear a squeal.

20
00:01:04.170 --> 00:01:09.090
Some of the projects we were talking about were his PhD thesis, in particular on

21
00:01:09.091 --> 00:01:12.840
developing and testing gaming technologies for children with cerebral palsy and

22
00:01:12.841 --> 00:01:15.690
how you might use technologies to improve body function.

23
00:01:17.640 --> 00:01:21.300
Dr. Fiona Kerr is an industry professor at the University of Adelaide.

24
00:01:22.140 --> 00:01:24.270
As a neuroscientist and systems engineer,

25
00:01:24.271 --> 00:01:27.060
she's been working a lot in that human technology interaction space,

26
00:01:27.061 --> 00:01:30.690
across many sectors, especially in defense and health. And when we,

27
00:01:30.720 --> 00:01:32.040
when we were talking earlier, she said,

28
00:01:32.041 --> 00:01:34.110
one of her other interests is building new brains.

29
00:01:34.440 --> 00:01:37.380
I had to check whether that was artificial brains or human brains,

30
00:01:37.381 --> 00:01:41.430
but it is about building the capacity of the human brain in some of her other work.

31
00:01:41.460 --> 00:01:44.880
So we might actually start with you, Fiona, a little bit around: what,

32
00:01:45.030 --> 00:01:48.060
what particularly are you thinking about when you're looking at artificial

33
00:01:48.061 --> 00:01:49.650
intelligence at the moment,

34
00:01:49.651 --> 00:01:52.860
how does that relate to your current area of work with a range of industries?

35
00:01:53.660 --> 00:01:58.010
<v Fiona Kerr>Right. So one of the things I'm looking at is how do we shape each other,</v>

36
00:01:58.040 --> 00:02:01.850
hence looking at how we build our own brains, how do we shape technology?

37
00:02:02.060 --> 00:02:05.630
How does technology shape us? And therefore,

38
00:02:05.631 --> 00:02:07.820
how should we be shaping the world that we want?

39
00:02:08.330 --> 00:02:13.250
Part of the thing that drives me is that we are technologized and we're going to

40
00:02:13.251 --> 00:02:14.270
be technologized.

41
00:02:14.330 --> 00:02:18.890
And instead of the conversation that very often starts to become very

42
00:02:18.891 --> 00:02:21.620
simplistic, which is, you know, it's going to be a terrible thing.

43
00:02:21.650 --> 00:02:23.780
It's going to be a wonderful thing.

44
00:02:24.080 --> 00:02:27.560
We tend to oversimplify and polarize that kind of whole discussion,

45
00:02:28.010 --> 00:02:31.420
instead of being able to say the question shapes the systems.

46
00:02:31.421 --> 00:02:36.290
So if we can be more clear about what we will want,

47
00:02:36.291 --> 00:02:37.970
sorry, what we want the world to look like,

48
00:02:38.240 --> 00:02:43.130
and then how can AI and technology enable that, we can actually get

49
00:02:43.131 --> 00:02:45.620
much better and much different answers.

50
00:02:45.980 --> 00:02:50.210
So a lot of my work is around understanding how humans

51
00:02:50.360 --> 00:02:54.890
neurophysiologically impact each other and are impacted by

52
00:02:55.520 --> 00:02:58.820
technology so that we have a much better understanding of how to design and

53
00:03:00.700 --> 00:03:05.020
the technology that therefore can make our lives better and enable what we want

54
00:03:05.021 --> 00:03:09.820
to do as humans who care and connect in the future. And not,

55
00:03:10.180 --> 00:03:13.840
instead of being driven, feeling like we're being driven by this, you know,

56
00:03:13.841 --> 00:03:18.190
runaway bus, which we just have to, to get used to.

57
00:03:18.611 --> 00:03:21.310
And so often I hear people saying, we just have to,

58
00:03:21.311 --> 00:03:24.640
you just have to learn to live with it and learn to work with it. No,

59
00:03:24.760 --> 00:03:29.620
we have to be clear about what we want, the way that we want to live and work.

60
00:03:30.510 --> 00:03:32.860
And then AI, because AI is quite neutral,

61
00:03:32.861 --> 00:03:37.060
although it does favor those who control and design it.

62
00:03:37.480 --> 00:03:40.180
So how do we control it then? How do we design it?

63
00:03:40.510 --> 00:03:45.070
Because it can be a fantastic enabler or it can be something which makes our

64
00:03:45.071 --> 00:03:48.970
life worse. It's very much up to us. So that's, that's what I deal with.

65
00:03:49.710 --> 00:03:53.970
<v Kristin Alford>Is there anything different in terms of that enabling technology around</v>

66
00:03:53.971 --> 00:03:55.500
artificial intelligence specifically,

67
00:03:55.501 --> 00:03:58.920
or is this a common theme with many of the technologies that have

68
00:04:00.240 --> 00:04:02.400
hit us in the past? You know, like,

69
00:04:02.430 --> 00:04:05.130
like the internet or like a social media wave,

70
00:04:05.131 --> 00:04:06.600
is there something particularly different around that?

71
00:04:07.140 --> 00:04:11.040
<v Fiona Kerr>Yes, there's a few, well, there's lots, but there's a couple of major areas.</v>

72
00:04:11.041 --> 00:04:14.460
One is that it is all pervasive. So AI is, is global.

73
00:04:14.520 --> 00:04:17.940
And we don't think globally as people, we tend to be quite regional.

74
00:04:17.941 --> 00:04:22.680
So I work in different countries and people think about the problems around AI

75
00:04:22.710 --> 00:04:24.150
quite differently in different countries.

76
00:04:24.630 --> 00:04:27.870
So I'm on the steering committee in Finland for designing their artificial

77
00:04:27.871 --> 00:04:29.400
intelligence program for the future.

78
00:04:29.730 --> 00:04:32.820
And the discussions are quite different in a country like that.

79
00:04:33.090 --> 00:04:38.010
Then I also work in the US, and then where AI will go in the

80
00:04:38.011 --> 00:04:42.660
US. So one of the things that's different is scope and size, reach and speed.

81
00:04:43.110 --> 00:04:47.370
Another difference is humans are electrochemical bags. That's what we are.

82
00:04:47.490 --> 00:04:50.280
It's a very romantic way of thinking about ourselves, but that's what we are.

83
00:04:50.760 --> 00:04:55.620
And with AI, once you become part of something that's quite

84
00:04:55.621 --> 00:04:58.830
immersive with AI, you know, gaming, VR,

85
00:04:58.890 --> 00:05:03.330
even AR, it has a very significant

86
00:05:03.331 --> 00:05:07.770
impact on our brain and on our physiology, our neurophysiology.

87
00:05:08.100 --> 00:05:13.020
So it's much stronger in impact than various other technologies because it's,

88
00:05:13.260 --> 00:05:16.090
it's, we are, well, we,

89
00:05:16.230 --> 00:05:18.510
all of the issues that you read about all the time,

90
00:05:18.750 --> 00:05:21.780
we become addicted much more easily. It's quite invasive.

91
00:05:22.320 --> 00:05:25.590
We get neural patterns and dopamine requirements and all that sort of thing that

92
00:05:25.591 --> 00:05:29.970
hooks us in with that. So there's that whole area of AI. So I guess it's,

93
00:05:30.050 --> 00:05:32.730
it's a few, it's a few of those things. It's,

94
00:05:32.740 --> 00:05:37.200
it's more immersive and more invasive than a lot of other types of technology.

95
00:05:37.380 --> 00:05:41.340
And the way we use it, because we have, most of us have a, you know,

96
00:05:41.430 --> 00:05:43.620
a multifunctional device in our pockets,

97
00:05:44.010 --> 00:05:48.520
which is the worst and the best of AI, it's,

98
00:05:48.521 --> 00:05:51.600
it's intimate and keeps us connected and,

99
00:05:52.140 --> 00:05:56.970
and gives us communication and information. It also keeps us isolated,

100
00:05:57.140 --> 00:06:00.890
gives us terrible information and sequesters data that we need and makes us

101
00:06:00.891 --> 00:06:03.680
think really superficially. And so we've, you know, we've got,

102
00:06:03.770 --> 00:06:07.550
we've got everything in that and we have to get much better at actually managing

103
00:06:07.551 --> 00:06:08.384
it.

104
00:06:08.620 --> 00:06:12.010
<v Kristin Alford>Thank you. That 'electrochemical bag' is my new favorite insult.</v>

105
00:06:12.011 --> 00:06:14.200
I'm going to use it when I go home and the girls haven't tidied the house.

106
00:06:15.310 --> 00:06:19.630
I think what you said there around the size and scope and the strength of the

107
00:06:19.631 --> 00:06:23.510
impact is, is a really interesting one. And it leads me into talking,

108
00:06:23.680 --> 00:06:26.770
talking with you, David, a little bit more about your work to start with. Just,

109
00:06:26.771 --> 00:06:30.610
I mean, when we were talking earlier, it was around you know,

110
00:06:30.970 --> 00:06:34.990
the fact that the sort of work that you do often leads the sector, and

111
00:06:34.991 --> 00:06:39.190
technology does have the ability to be stronger in impact for people who may

112
00:06:39.220 --> 00:06:40.450
present with a disability as well.

113
00:06:40.451 --> 00:06:42.490
Can you talk a little bit about the sort of things that you've been doing?

114
00:06:42.910 --> 00:06:43.391
<v David Hobbs>Absolutely.</v>

115
00:06:43.391 --> 00:06:47.410
So I suppose the field of rehabilitation engineering and assistive technology is

116
00:06:47.411 --> 00:06:51.940
about developing technology to harness and to give people the empowerment that

117
00:06:51.941 --> 00:06:53.710
we have, if we do not have a disability.

118
00:06:53.711 --> 00:06:57.880
So it's about facilitating and giving them access and function and a

119
00:06:57.881 --> 00:06:59.020
conversation before. In fact,

120
00:06:59.021 --> 00:07:02.620
there's been a Twitter conversation in the last week, is that I don't think the

121
00:07:02.621 --> 00:07:07.450
general population actually realize the benefit that the general population has

122
00:07:07.600 --> 00:07:08.800
from the disability sector.

123
00:07:09.280 --> 00:07:12.520
And to give you some concrete examples with that, things like text prediction.

124
00:07:12.910 --> 00:07:17.860
So actually predicting what you might be saying next comes from the AAC or

125
00:07:17.861 --> 00:07:21.190
the augmentative and alternative communication area, where people with speech-generating

126
00:07:21.191 --> 00:07:24.400
devices would like to type and hence speak a lot faster.

127
00:07:24.670 --> 00:07:26.770
So that technology has been around for a long, long time.

128
00:07:27.040 --> 00:07:30.100
And now we're benefiting from it when we send SMSs and emails,

129
00:07:30.101 --> 00:07:32.140
and even typing words to search engines,

130
00:07:32.470 --> 00:07:36.240
And text-to-speech and speech-to-text, again from the community where people were

131
00:07:36.940 --> 00:07:37.120
nonverbal.

132
00:07:37.120 --> 00:07:40.570
And so they're communicating in different media and that it's being spoken.

133
00:07:40.571 --> 00:07:42.400
And we're now seeing that come in Adobe,

134
00:07:42.401 --> 00:07:46.390
you can have Adobe read the document to you in the general population. Doors that

135
00:07:46.391 --> 00:07:49.330
open and close, you know, Star Trek would have started that in the movies,

136
00:07:49.331 --> 00:07:50.920
science fiction, science fiction,

137
00:07:50.921 --> 00:07:53.560
crossing over into the technology that we benefit from every day.

138
00:07:53.590 --> 00:07:58.450
So there's a great paper actually out there from a guy called Chris Law called

139
00:07:58.451 --> 00:08:01.240
'The technology in your cell phone wasn't invented for you'.

140
00:08:01.510 --> 00:08:06.310
And that paper actually plants the seed for what's called universal design.

141
00:08:06.460 --> 00:08:10.660
Universal design is about considering the end user and all their capabilities on

142
00:08:10.661 --> 00:08:14.350
the whole spectrum, and how you should be designing so everyone can access, use,

143
00:08:14.351 --> 00:08:17.860
and actually get the best from that particular item, be it a space,

144
00:08:17.890 --> 00:08:20.530
so a public space or a device or technology.

145
00:08:20.830 --> 00:08:24.580
And he outlined some fantastic examples of where that technology has come from

146
00:08:25.090 --> 00:08:27.130
with some of the examples I've given you to say, well, look, you know,

147
00:08:27.131 --> 00:08:29.740
we're all benefiting from this because it was designed correctly.

148
00:08:30.010 --> 00:08:33.460
The intended user or the spectrum of users was considered appropriately.

149
00:08:33.730 --> 00:08:36.430
So we can all benefit from it in a general sense. Hmm.

150
00:08:36.960 --> 00:08:40.180
<v Kristin Alford>That's really interesting. So, so in my, in my job, I am director of MOD,</v>

151
00:08:40.181 --> 00:08:43.660
which is just across the road and it's a space where we're trying to inspire

152
00:08:43.661 --> 00:08:47.290
young adults about the role that science and tech can play in their lives.

153
00:08:47.320 --> 00:08:50.380
And so we've been very conscious about trying to design a space that is

154
00:08:50.381 --> 00:08:51.070
inclusive,

155
00:08:51.070 --> 00:08:53.380
but I'm also reminded, when you've been talking about text to speech.

156
00:08:53.381 --> 00:08:56.070
We have an animatronic head at MOD called Josh.

157
00:08:56.880 --> 00:08:58.830
And Josh is modeled on a 17 year old boy,

158
00:08:58.860 --> 00:09:01.140
and he's kind of in a dreamlike state and we're going, oh,

159
00:09:01.141 --> 00:09:03.990
if only we could do this with him, if only we could do this. And when you,

160
00:09:03.991 --> 00:09:05.940
when you, when you're talking it's yes,

161
00:09:05.941 --> 00:09:08.250
he doesn't have the ability to speak on his own,

162
00:09:08.251 --> 00:09:11.910
but if we've got text to speech, we can make him speak. We've got, you know,

163
00:09:11.911 --> 00:09:16.050
we're using all of those technologies to sort of animate this robot in

164
00:09:16.051 --> 00:09:16.411
very,

165
00:09:16.411 --> 00:09:19.860
in very interesting ways that rely on technologies that have come through that.

166
00:09:20.390 --> 00:09:23.810
<v David Hobbs>So technology is, I think, brilliant and fantastic in that aspect.</v>

167
00:09:23.811 --> 00:09:26.840
And just to give you some examples of where you might get some of this mismatch.

168
00:09:27.050 --> 00:09:30.530
So some of the first speech generating devices that came out and you hopefully

169
00:09:30.531 --> 00:09:32.240
are all familiar with, say, Stephen Hawking,

170
00:09:32.241 --> 00:09:35.810
who used one of these devices to talk for him was when they first came out,

171
00:09:35.960 --> 00:09:38.420
you have to realize these devices came from the US.

172
00:09:38.600 --> 00:09:41.030
And so if you use a speech generating device,

173
00:09:41.060 --> 00:09:44.390
it would have an American male adult voice.

174
00:09:44.630 --> 00:09:48.830
Now just think about that one system being plugged into any speech-generating

175
00:09:48.831 --> 00:09:52.220
device used by young children, young females, older females.

176
00:09:52.300 --> 00:09:55.310
And so you get this mismatch between what's being said and the personality of

177
00:09:55.311 --> 00:09:57.140
the person, which is what you can. Oh.

178
00:09:57.240 --> 00:09:59.570
<v Kristin Alford>And I think one of the things that we've found with Josh is some people are</v>

179
00:09:59.571 --> 00:10:01.970
really creeped out by him because he looks really human,

180
00:10:01.971 --> 00:10:04.670
even though he's a robot and some people are creeped out by him because he

181
00:10:04.671 --> 00:10:07.430
speaks with an Australian accent and they've never had texts.

182
00:10:08.600 --> 00:10:09.200
[inaudible] That's right, because.

183
00:10:09.200 --> 00:10:10.033
<v David Hobbs>It's not Siri.</v>

184
00:10:10.720 --> 00:10:14.210
And so there's a big push actually in Australia now to have an Australian

185
00:10:14.211 --> 00:10:17.450
language, but not just an Australian language, actually a young female,

186
00:10:17.510 --> 00:10:19.010
a young male, a teenage female.

187
00:10:19.011 --> 00:10:23.390
It's about getting that closer connection between you and your device,

188
00:10:23.391 --> 00:10:26.990
because for the disability sector and for assistive tech users in general,

189
00:10:27.110 --> 00:10:28.970
there has to be this connection because, you know,

190
00:10:28.971 --> 00:10:31.880
people talk about their wheelchairs as an extension of themselves.

191
00:10:31.881 --> 00:10:34.970
It gives me the mobility. So you want all your devices to be like that.

192
00:10:34.971 --> 00:10:37.880
And I think the voice is an important one because then the voice is

193
00:10:38.240 --> 00:10:41.450
the connection to who you are and how you say things. So that's important.

194
00:10:41.930 --> 00:10:43.700
<v Kristin Alford>So we've had a little bit around, I guess,</v>

195
00:10:43.880 --> 00:10:45.480
artificial intelligence and your interests,

196
00:10:45.730 --> 00:10:48.920
and a little bit around tech leading in the disability sector,

197
00:10:48.950 --> 00:10:52.460
but how do we put them together? Which is my big question when I'm,

198
00:10:52.461 --> 00:10:54.380
when I spoke to Fiona and David, well,

199
00:10:54.410 --> 00:10:58.730
what does AI mean when you put these things together in terms of design and

200
00:10:58.731 --> 00:11:02.390
technology and how it's shaping us? So how do we, how do we put those things.

201
00:11:02.390 --> 00:11:04.910
<v David Hobbs>Together? I think that's a good question.</v>

202
00:11:04.911 --> 00:11:08.010
I'll let Fiona answer it in just a moment. What I am finding, interesting.

203
00:11:08.060 --> 00:11:11.270
We talked about this before is the disability sector is,

204
00:11:11.870 --> 00:11:15.740
has been hamstrung in many areas because of size.

205
00:11:16.190 --> 00:11:19.220
And so your economies of scale are typically not there like the general

206
00:11:19.221 --> 00:11:22.190
population, which means the costs of devices and items,

207
00:11:22.191 --> 00:11:26.810
are sometimes exorbitant and unattainable. And so what I'm seeing now,

208
00:11:26.840 --> 00:11:27.740
and what is the,

209
00:11:27.741 --> 00:11:31.460
the big influence in the last couple of years is where AI and big data and other

210
00:11:31.461 --> 00:11:35.120
things have been an influence, such as Google Home, for example, Google Mini.

211
00:11:35.420 --> 00:11:38.420
So now that's taking all of those years of speech recognition,

212
00:11:38.421 --> 00:11:42.350
machine learning, and your ability to use your voice to command.

213
00:11:43.040 --> 00:11:45.770
Fifteen years ago, excuse me, when I was working at Novita,

214
00:11:45.771 --> 00:11:49.100
that was called environmental control units, and they were very, very expensive,

215
00:11:49.101 --> 00:11:52.130
but now it's a lot more achievable. And autonomous vehicles.

216
00:11:52.490 --> 00:11:54.160
So the ability to hop into a vehicle,

217
00:11:54.430 --> 00:11:57.280
if you cannot drive, and then have that vehicle take you to where you want to

218
00:11:57.281 --> 00:11:58.090
go. I mean,

219
00:11:58.090 --> 00:12:01.390
where we're just entering that phase now with trials around Adelaide and things

220
00:12:01.391 --> 00:12:02.500
like that and overseas,

221
00:12:02.830 --> 00:12:07.750
but we're seeing what I would consider to be assisted devices or assistive

222
00:12:07.751 --> 00:12:11.050
features now coming into the mainstream, and the cost is coming right down,

223
00:12:11.051 --> 00:12:11.884
which is fantastic.

224
00:12:12.690 --> 00:12:15.450
<v Fiona Kerr>I'll probably take a bit of a bigger picture then.</v>

225
00:12:16.830 --> 00:12:21.660
So one of the main things that I spend time talking about and advising on

226
00:12:21.661 --> 00:12:26.250
is how you make sure that when you want to use any of the new technologies

227
00:12:26.580 --> 00:12:29.310
that the drivers for it align with,

228
00:12:29.340 --> 00:12:31.800
with what we actually want it to do.

229
00:12:32.310 --> 00:12:35.520
So one of the main ways that you put them together is to,

230
00:12:35.600 --> 00:12:40.500
to try and think about what is the question. So, so your example, you're like,

231
00:12:40.501 --> 00:12:42.330
what you're doing, the work you're doing is,

232
00:12:42.870 --> 00:12:46.680
is asking a good human centric, positive question.

233
00:12:47.070 --> 00:12:49.890
Cause one of the things to keep thinking about is AI is,

234
00:12:50.070 --> 00:12:53.700
is it's a goal based optimizer.

235
00:12:54.300 --> 00:12:58.760
It will do everything to achieve the question that we ask

236
00:12:59.640 --> 00:13:03.960
and to create the goal path, the optimist, or sorry,

237
00:13:03.961 --> 00:13:05.730
the optimal goal path.

238
00:13:06.150 --> 00:13:09.780
So if you ask it a question which is positive and human centric,

239
00:13:10.260 --> 00:13:13.980
AI is fantastic at doing that and helping us get there.

240
00:13:14.550 --> 00:13:15.810
If we ask it a question,

241
00:13:16.020 --> 00:13:20.940
which is for the benefit of some and often to the detriment of

242
00:13:20.941 --> 00:13:24.270
others, AI is just as good at doing that.

243
00:13:24.750 --> 00:13:29.370
And very often we're just kind of sleepwalking into letting the second

244
00:13:29.820 --> 00:13:32.640
happen. So, you know, if, if it's just profit driving it,

245
00:13:32.641 --> 00:13:36.030
then it's perfect at making profit for those people who ask the question.

246
00:13:36.540 --> 00:13:37.290
So what we,

247
00:13:37.290 --> 00:13:40.590
one of the things that always is in the back of my mind with how we put them

248
00:13:40.591 --> 00:13:44.010
together is how do we make sure that the question we're asking in the first

249
00:13:44.011 --> 00:13:47.220
place is for the benefit of humans.

250
00:13:47.580 --> 00:13:50.700
And then the other thing is how do we make sure, I guess it's because it's my,

251
00:13:50.720 --> 00:13:55.620
my middle area is how do we understand the neurophysiological impact on the

252
00:13:55.621 --> 00:13:58.470
human of the technology as well as the other way around?

253
00:13:59.250 --> 00:14:03.540
And if we're really clear about that, then whatever we build,

254
00:14:03.570 --> 00:14:07.950
we will be able to try and build it to maximize the benefits.

255
00:14:08.040 --> 00:14:13.020
So for me, a couple of examples I'm directly involved in in the U S for example,

256
00:14:13.350 --> 00:14:16.890
are the design of a family bot. Family bots you'll see in a few years,

257
00:14:17.340 --> 00:14:21.090
they're something which is supposed to be in families to assist with the run-

258
00:14:21.091 --> 00:14:24.030
around; they look like a teardrop with a kind of a screen on the top and big eyes.

259
00:14:24.390 --> 00:14:29.220
And they're there to seemingly assist parents to communicate with their children

260
00:14:29.280 --> 00:14:34.110
and to help them to teach their children, you know, moral and ethical behavior.

261
00:14:34.680 --> 00:14:38.130
If you actually know what the brain needs around,

262
00:14:38.131 --> 00:14:42.720
that, it is a lot of very direct interaction between the child and the parent.

263
00:14:43.020 --> 00:14:45.870
A lot of retinal eye lock, a lot of touch,

264
00:14:46.290 --> 00:14:50.610
a lot of amygdala and hippocampus stimulation through looking at gazing into

265
00:14:50.611 --> 00:14:53.150
people, you, the other child, the child's eyes and the parent's eyes.

266
00:14:53.330 --> 00:14:57.350
So that really direct eye and gaze connectivity

267
00:14:57.890 --> 00:15:02.420
stimulates the parts of the brain that increase the capability for communication

268
00:15:02.430 --> 00:15:06.890
skills and, and understanding good behaviors. Whereas what the,

269
00:15:06.930 --> 00:15:11.780
the bot tends to do is decrease that, because it either keeps you away from the

270
00:15:11.781 --> 00:15:15.080
child or it keeps everybody looking at a screen.

271
00:15:15.620 --> 00:15:19.940
So my way of trying to deal with that is not to say to the people that design it

272
00:15:20.000 --> 00:15:20.271
well,

273
00:15:20.271 --> 00:15:22.820
you're going to create a bunch of psychopaths who never look at each other.

274
00:15:23.840 --> 00:15:24.650
It's to say,

275
00:15:24.650 --> 00:15:28.910
let's have a look at what the brain actually needs in order to do that.

276
00:15:28.940 --> 00:15:33.530
And so how do you make your device increase direct eye-gaze touch and

277
00:15:33.531 --> 00:15:37.340
connection between the parent and the child instead of decreasing it,

278
00:15:37.341 --> 00:15:38.570
which it is doing at the moment.

279
00:15:38.930 --> 00:15:43.790
And so often it's just that they haven't thought about it. Technologists are

280
00:15:43.820 --> 00:15:47.480
very technologically optimistic in general, and I'm a technologist, but,

281
00:15:47.660 --> 00:15:51.410
but many, many think that you can just technologize everything, you know,

282
00:15:51.620 --> 00:15:53.710
there are some things you can't technologize, or there,

283
00:15:53.750 --> 00:15:54.890
when you technologize them,

284
00:15:54.891 --> 00:15:58.550
you have to take into account the interaction between the human and the

285
00:15:58.551 --> 00:16:02.810
technology in that situation. So once they get that extra piece of information,

286
00:16:03.290 --> 00:16:07.070
then they, they're usually very happy to go away and,

287
00:16:07.080 --> 00:16:10.160
and it's a terrific new puzzle. So one of the ways is to,

288
00:16:10.580 --> 00:16:15.260
is to have maximum information in trying to do those things so that you do

289
00:16:15.261 --> 00:16:19.610
get really good outcomes because you're coming off a base that understands what

290
00:16:19.611 --> 00:16:23.180
the technology does, understands how the human works,

291
00:16:23.390 --> 00:16:26.900
and then understands how to put them together in a really positive,

292
00:16:26.901 --> 00:16:27.740
functional way.

293
00:16:28.210 --> 00:16:31.630
<v Kristin Alford>It reminds me of the discussion often around artificial intelligence and</v>

294
00:16:31.631 --> 00:16:35.830
jobs of the future and work, we, where we sort of hear these, these numbers,

295
00:16:35.831 --> 00:16:40.410
that 40% of the jobs that we currently have won't exist. And you know, and,

296
00:16:40.411 --> 00:16:43.360
and that what makes us human then starts to become really important.

297
00:16:43.361 --> 00:16:44.590
But when you interrogate that,

298
00:16:44.591 --> 00:16:47.350
that's the sort of things that we don't currently value like care work,

299
00:16:47.410 --> 00:16:52.180
or teaching work, or and so it's, it's that,

300
00:16:52.210 --> 00:16:54.130
it's that question about sort of saying, well,

301
00:16:54.131 --> 00:16:58.990
what can AI do that helps solve a lot of the problems, so that humans really

302
00:16:59.140 --> 00:17:01.420
could be better off spending their time doing something else.

303
00:17:01.600 --> 00:17:04.180
But then we also need to answer that question, I think, as you have said; well,

304
00:17:04.181 --> 00:17:06.400
what, what, where should they be spending their time?

305
00:17:06.610 --> 00:17:10.570
Because the example that you've just given me is that parents gazing into

306
00:17:10.960 --> 00:17:12.550
their children's eyes should be where they're spending their time,

307
00:17:12.551 --> 00:17:15.010
as opposed to doing the washing up or fiddling around,

308
00:17:15.040 --> 00:17:18.790
you know mowing the lawn or whatever else.

309
00:17:18.791 --> 00:17:20.350
You can probably get a bot to do more easily.

310
00:17:20.680 --> 00:17:24.310
<v Fiona Kerr>That also gets us to another question, which is around again, it's a choice.</v>

311
00:17:24.311 --> 00:17:26.830
Do we want quality partnerships with AI,

312
00:17:27.040 --> 00:17:30.040
or do we want quantity partnerships with AI? And again,

313
00:17:30.041 --> 00:17:32.050
I see different trends in different countries.

314
00:17:32.440 --> 00:17:36.600
So the quality partnership is using AI to... it's

315
00:17:36.601 --> 00:17:39.250
something like the surgeon who's got AI clipped on their glasses,

316
00:17:39.520 --> 00:17:41.680
it's operating the person's operating,

317
00:17:41.681 --> 00:17:45.850
sees something untoward, can ask the AI to give them the really good data.

318
00:17:45.940 --> 00:17:49.060
AI is fabulous at aggregating data really quickly and giving it to them,

319
00:17:49.890 --> 00:17:53.820
but the surgeon still decides because the surgeons got years and years of

320
00:17:53.821 --> 00:17:54.720
knowledge chunked up.

321
00:17:54.721 --> 00:17:59.250
So that's a beautiful partnership between the knowledge and wisdom of a

322
00:17:59.251 --> 00:18:03.600
human and the capacity of AI to give it really relevant

323
00:18:03.601 --> 00:18:07.710
information on the spot. Perfect. Quantity is your, you know,

324
00:18:07.711 --> 00:18:09.570
your Mechanical Turk. You say,

325
00:18:09.600 --> 00:18:14.130
we can take this job and we can make 90% of it automated

326
00:18:14.400 --> 00:18:17.790
with 10% left that we need humans to do. So we'll put that up and let you bid.

327
00:18:18.720 --> 00:18:20.940
And so you come in really cheaply to do that.

328
00:18:21.060 --> 00:18:22.770
You have to do it hundreds of times,

329
00:18:22.800 --> 00:18:24.720
hundreds of people are doing that hundreds of times.

330
00:18:24.721 --> 00:18:26.880
So we've got this kind of battery hens situation.

331
00:18:27.300 --> 00:18:32.190
And if you complain and feel like you haven't got a social system anymore and

332
00:18:32.191 --> 00:18:34.350
that sort of thing: tough, just get up and, and, you know,

333
00:18:34.380 --> 00:18:35.580
we'll have the next person in place.

334
00:18:36.030 --> 00:18:39.580
So that's the drive for profit and it's the quantity, sorry,

335
00:18:39.581 --> 00:18:41.370
I call it the quantity relationship.

336
00:18:41.730 --> 00:18:45.450
So those discussions are going on right now in different countries.

337
00:18:45.480 --> 00:18:50.160
And depending on how proactive we are, we can really change that discussion.

338
00:18:50.190 --> 00:18:55.080
You know, we can really get involved in driving quality relationships with

339
00:18:55.081 --> 00:18:55.914
AI.

340
00:18:56.080 --> 00:18:59.060
<v Kristin Alford>What, what does, what does quality or quantity look like in your sector,</v>

341
00:18:59.090 --> 00:19:00.050
perhaps David,

342
00:19:00.051 --> 00:19:03.680
when you're thinking about the sorts of questions you want to be designing for?

343
00:19:04.490 --> 00:19:07.880
<v David Hobbs>So the first example you gave actually reminds me a lot of how I treat Google</v>

344
00:19:07.881 --> 00:19:08.300
Maps.

345
00:19:08.300 --> 00:19:11.750
Like I might ask Google Maps to give me an indication of time and where I want

346
00:19:11.751 --> 00:19:13.700
to go, but then it's my input on top of that. Actually,

347
00:19:13.701 --> 00:19:15.560
I want to take this route because I know that's a better route,

348
00:19:15.590 --> 00:19:19.010
even though the time might be different because the optimizing engines within

349
00:19:19.011 --> 00:19:22.100
Google Maps are probably doing it for time and less congestion and things like

350
00:19:22.101 --> 00:19:22.760
that.

351
00:19:22.760 --> 00:19:27.320
So I suppose what that highlights to me is the heterogeneity of the population

352
00:19:27.321 --> 00:19:27.981
that I work with.

353
00:19:27.981 --> 00:19:31.370
So even though there might be the umbrella of disability and the different

354
00:19:31.371 --> 00:19:33.080
populations within that the,

355
00:19:33.081 --> 00:19:36.140
the key thing is the person is always so intimate to that process.

356
00:19:36.141 --> 00:19:37.850
And so that could be the allied health professional,

357
00:19:37.851 --> 00:19:40.850
or whoever's working with the individual or groups of individuals.

358
00:19:41.150 --> 00:19:45.170
And so you always need that discerning, clinically based input

359
00:19:45.680 --> 00:19:48.380
because you can use the information you have. In fact,

360
00:19:48.470 --> 00:19:50.720
you will fall back on that information to many, many times,

361
00:19:50.721 --> 00:19:53.150
but it's going to be the association you have with that person.

362
00:19:53.151 --> 00:19:56.510
You've been working with them for 15 or 20 years, you know, their lifestyle,

363
00:19:56.511 --> 00:19:58.520
you know, their families, you know, how they interact, you know,

364
00:19:58.521 --> 00:20:00.770
their work environments or you know,

365
00:20:00.771 --> 00:20:03.320
that this person might be more susceptible to say a pressure injury.

366
00:20:03.321 --> 00:20:06.140
And so you're going to nominate this pathway compared to that pathway.

367
00:20:06.141 --> 00:20:10.580
So I can't see those finite or those

368
00:20:11.330 --> 00:20:13.430
minutia details being taken away by AI,

369
00:20:13.460 --> 00:20:16.610
because I think it's that personal connectivity that you have to have, with

370
00:20:16.640 --> 00:20:19.730
clinical insight and knowledge, but it's the information around it.

371
00:20:19.760 --> 00:20:22.550
They just shape hopefully a narrow tunnel that you can go down.

372
00:20:22.730 --> 00:20:25.520
So you don't use this much of it, maybe you use this much,

373
00:20:25.521 --> 00:20:27.470
but still use that personal insight and judgment.

374
00:20:27.930 --> 00:20:29.540
That's where I see it being most effective.

375
00:20:29.930 --> 00:20:32.420
<v Kristin Alford>And Fiona, you sort of said, countries are doing this differently.</v>

376
00:20:32.421 --> 00:20:34.790
Have you got some examples from, from where you've been,

377
00:20:35.630 --> 00:20:37.910
where you've been advising or where you've been looking at?

378
00:20:38.720 --> 00:20:41.180
<v Fiona Kerr>So in Finland, and in fact,</v>

379
00:20:41.181 --> 00:20:44.780
the first question that I was asked when I briefed the steering committee, which was

380
00:20:44.781 --> 00:20:48.850
just starting when I was over there speaking on partnering with artificial

381
00:20:48.851 --> 00:20:52.030
intelligence for a human-centric future, and the,

382
00:20:52.070 --> 00:20:56.320
the very first question wasn't around any form of technology.

383
00:20:56.980 --> 00:21:00.280
I think they were interested in talking to me because I was originally an

384
00:21:00.281 --> 00:21:03.280
anthropologist. And so the very first question was, oh, great.

385
00:21:03.281 --> 00:21:08.140
So how do we align Eastern, Western and Middle Eastern values in order

386
00:21:08.141 --> 00:21:12.790
to be able to shape AI globally in a, you know, in a positive way.

387
00:21:12.791 --> 00:21:16.180
And I thought, oh, yes, I want to be here and do this. And, you know,

388
00:21:16.510 --> 00:21:18.280
because that was, that was the very first thing.

389
00:21:18.640 --> 00:21:23.590
So the questions over there are very often around how do

390
00:21:23.591 --> 00:21:27.580
we ensure that? So what are the drivers, what are the political drivers?

391
00:21:27.580 --> 00:21:31.840
What are the economic drivers? Because we are going to lose 40, 50% of jobs.

392
00:21:32.380 --> 00:21:34.360
But what does that mean? But in that country,

393
00:21:34.930 --> 00:21:38.200
because they've already got a very strong

394
00:21:38.980 --> 00:21:41.240
understanding that the,

395
00:21:41.241 --> 00:21:45.250
the social capital is what creates the, you know,

396
00:21:45.310 --> 00:21:47.800
the very much the health of that society.

397
00:21:48.040 --> 00:21:50.080
They're already redistributing money across there.

398
00:21:50.081 --> 00:21:53.500
They have very good social systems. They pay teachers and nurses really highly.

399
00:21:53.501 --> 00:21:56.440
So they actually understand the whole thing you were making the point about

400
00:21:56.470 --> 00:22:00.100
earlier. So the conversation's there, they've already got a global wage.

401
00:22:00.760 --> 00:22:02.920
And so what we were talking about was kind of,

402
00:22:03.130 --> 00:22:05.410
you're a few steps ahead because you're having discussions around,

403
00:22:05.620 --> 00:22:08.950
how do you ensure that people get a basic wage,

404
00:22:08.980 --> 00:22:12.700
but they still feel like, like positive humans.

405
00:22:12.730 --> 00:22:15.610
So humans need to feel worthwhile. They need to feel useful.

406
00:22:15.820 --> 00:22:17.050
They need to feel connected.

407
00:22:17.320 --> 00:22:21.190
So you can't just give a global wage and let someone sit on a couch.

408
00:22:21.970 --> 00:22:24.400
Whereas in another country that I'm dealing with,

409
00:22:24.610 --> 00:22:26.980
I can already see some of the,

410
00:22:27.150 --> 00:22:31.930
the rules and some of the designs and the discussions are around.

411
00:22:31.931 --> 00:22:33.490
How do you, how do you basically,

412
00:22:33.491 --> 00:22:36.790
how do you use AI as a soporific so that people will just not get bored when

413
00:22:36.880 --> 00:22:37.840
they're sitting on the couch?

414
00:22:38.170 --> 00:22:42.790
It's a totally different kind of discussion and depending on what's being

415
00:22:42.791 --> 00:22:45.580
designed. So there's a lovely example. Two examples.

416
00:22:45.581 --> 00:22:47.920
I just came because I just came back yesterday.

417
00:22:48.370 --> 00:22:49.450
Something that you were talking about,

418
00:22:49.451 --> 00:22:54.040
David. Mount Sinai has got a beautiful example of people in the

419
00:22:54.041 --> 00:22:58.120
local area that are suffering levels of dementia and they,

420
00:22:58.510 --> 00:23:02.260
where they want to use various kinds of apps to be able to deal with that.

421
00:23:02.710 --> 00:23:05.320
But what they understand really well is that it's the,

422
00:23:05.321 --> 00:23:10.030
the direct human interaction that you have to loop into that all the time.

423
00:23:10.210 --> 00:23:14.200
So there's a beautiful example of people come from their local area and every

424
00:23:14.201 --> 00:23:19.180
week they check in and they get their vital signs taken by one of four young

425
00:23:19.210 --> 00:23:23.980
local people that are taking the data to make a longitudinal,

426
00:23:24.010 --> 00:23:25.210
a long-term database.

427
00:23:25.660 --> 00:23:28.900
The fact that those people are coming and actually seeing a human being and

428
00:23:28.901 --> 00:23:31.090
interacting with them and getting to know those four,

429
00:23:31.480 --> 00:23:33.580
and they're being looked at they're being touched,

430
00:23:33.670 --> 00:23:35.890
their C-tactile fibers are being stimulated in their skin.

431
00:23:35.891 --> 00:23:39.400
When that, when the vital signs are being taken, they,

432
00:23:39.420 --> 00:23:42.430
they have all sorts of really positive outcomes.

433
00:23:42.730 --> 00:23:47.450
Whereas when they just use the app, you don't get that nearly as much.

434
00:23:48.200 --> 00:23:52.160
So it's those sorts of things of being awake to those, you know, those,

435
00:23:52.220 --> 00:23:55.640
those things that are really important. When do you put the human in the loop?

436
00:23:55.790 --> 00:23:59.090
When do you need to reinforce what's going on technologically?

437
00:23:59.570 --> 00:24:01.670
And when do you have experts as well,

438
00:24:01.880 --> 00:24:06.710
that know exactly how to use the technology and when, and when not to use it.

439
00:24:07.010 --> 00:24:09.050
And it makes a gigantic difference.

440
00:24:09.460 --> 00:24:10.840
<v Kristin Alford>I most, I guess like my,</v>

441
00:24:10.900 --> 00:24:14.620
my area is mostly around futures work, and I'm also reminded, I was listening

442
00:24:14.621 --> 00:24:17.830
to Marita Cheng speak at something last year where she was talking about an

443
00:24:17.831 --> 00:24:22.330
example of an AI that had been developed to work out the difference between

444
00:24:22.840 --> 00:24:23.291
hotdogs and legs.

445
00:24:23.291 --> 00:24:25.390
I don't know if you remember that meme from a couple of years ago,

446
00:24:25.810 --> 00:24:29.770
but somebody has had to feed the AI with pictures of hotdogs and legs,

447
00:24:29.980 --> 00:24:33.160
which means there's a human at the end of that, feeding the data in.

448
00:24:33.220 --> 00:24:34.053
And it's like, well,

449
00:24:34.270 --> 00:24:38.080
is that really what we want humans to be doing in this equation? Is there,

450
00:24:38.081 --> 00:24:39.250
is there a better role?

451
00:24:39.251 --> 00:24:42.130
And I think what you've just explained is that there are better roles.

452
00:24:42.430 --> 00:24:44.020
And so if we're looking at the quantity, we can,

453
00:24:44.050 --> 00:24:48.280
we can have jobs for people where you're feeding AI pictures of hotdogs and

454
00:24:48.281 --> 00:24:48.970
legs,

455
00:24:48.970 --> 00:24:52.720
or you can have jobs for people where they're actually providing a benefit for

456
00:24:52.721 --> 00:24:56.170
health or a benefit that, that moves beyond just the training of the,

457
00:24:56.230 --> 00:24:57.670
of the AI or the data. Yeah.

458
00:24:57.910 --> 00:25:01.570
<v David Hobbs>I think that's important, to recognize where the expertise lies. A couple of years ago</v>

459
00:25:01.571 --> 00:25:05.020
we had Rodney Brooks give a presentation, obviously a very famous person for

460
00:25:05.021 --> 00:25:08.140
robotics, the vacuum cleaners, et cetera. And he would say that they,

461
00:25:08.320 --> 00:25:11.410
their robotic devices can do many, many things. They're on production lines,

462
00:25:11.890 --> 00:25:15.250
but there's a group from the US who got some research funding and they trained

463
00:25:15.251 --> 00:25:18.940
a robot. I think it was either to fold a tablecloth or to fold a shirt.

464
00:25:19.090 --> 00:25:23.890
And it took something like five years to teach the robot, to do a task,

465
00:25:23.891 --> 00:25:25.090
which we can do in seconds.

466
00:25:25.150 --> 00:25:28.570
Yet it can assemble a BMW in fractions of a second with high precision.

467
00:25:28.870 --> 00:25:31.480
And his was all about, well, where is the best application?

468
00:25:31.481 --> 00:25:34.690
Let's think about it. It's not a global paint tool. Here's where you do it,

469
00:25:34.780 --> 00:25:36.850
there are roles for it, and there are roles that should be considered for others.

470
00:25:37.210 --> 00:25:39.370
<v Kristin Alford>I dunno, folding a fitted sheet would be great though.</v>

471
00:25:41.590 --> 00:25:45.610
<v Fiona Kerr>A lot of videos on that, but that's, that's a really critical thing is,</v>

472
00:25:45.910 --> 00:25:49.330
is when is the human better? When is the technology better?

473
00:25:49.600 --> 00:25:52.420
And when is the combination better or when does it not matter?

474
00:25:52.810 --> 00:25:56.020
And there were things that we just don't look at enough.

475
00:25:56.050 --> 00:25:57.910
I guess a lot of my work is around when,

476
00:25:58.150 --> 00:26:00.730
when is the human more effective and efficient? You know, we, we,

477
00:26:00.910 --> 00:26:04.240
and we talk about examples of, of, of nursing of care,

478
00:26:04.480 --> 00:26:09.070
where especially if a child or a patient is either in pain or upset

479
00:26:09.520 --> 00:26:13.060
because of the way that your body changes in that situation,

480
00:26:13.110 --> 00:26:17.320
chemically, five minutes looking at a human, especially one, you know,

481
00:26:17.321 --> 00:26:22.300
and trust, changes your physiology within that five minutes really quickly

482
00:26:22.660 --> 00:26:24.730
for the patient, your immune system changes,

483
00:26:24.731 --> 00:26:28.690
it downplays all sorts of cognitive stress markers.

484
00:26:28.930 --> 00:26:32.620
It increases your serotonin uptake. It decreases cortisol.

485
00:26:33.670 --> 00:26:38.650
And yet what we do is either they just get tablets instead of having

486
00:26:38.651 --> 00:26:43.350
five minutes with a nurse who can just calm them down or we're starting to bring

487
00:26:43.351 --> 00:26:46.230
in technologies to kind of nicely restrain them.

488
00:26:47.220 --> 00:26:50.550
So those are the sorts of things where if we understand that,

489
00:26:50.670 --> 00:26:51.780
in fact humans know,

490
00:26:51.840 --> 00:26:56.220
we know that a cuddle and just looking at someone, a cuddle with your child, or

491
00:26:56.221 --> 00:26:59.100
just looking at someone and spending the time with your hand on their arm

492
00:26:59.101 --> 00:27:02.400
quietly, reassuring them, makes a big difference.

493
00:27:02.401 --> 00:27:04.530
Humans know that because that's how we work.

494
00:27:04.740 --> 00:27:09.720
So getting the science to explain that that's really critical is an important part

495
00:27:09.721 --> 00:27:12.440
of then saying, so what is the role here of technology? And that's,

496
00:27:12.441 --> 00:27:16.620
that's all we really should be doing is saying what's the role of each in any

497
00:27:16.621 --> 00:27:19.890
given circumstance. So that we're really smart about using both.

498
00:27:20.960 --> 00:27:24.230
<v David Hobbs>If I could just add, about 15, 20 years ago,</v>

499
00:27:24.290 --> 00:27:27.500
I was part of a group that put an application in to try and really improve some

500
00:27:27.501 --> 00:27:32.000
access to funding for what we would have called environmental control units to

501
00:27:32.001 --> 00:27:35.390
control devices around the home. So you can be much more independent.

502
00:27:35.391 --> 00:27:37.910
You can do the doors, curtains, et cetera, et cetera.

503
00:27:38.270 --> 00:27:40.810
And one of my occupational therapist colleagues just reminded me.

504
00:27:41.050 --> 00:27:44.180
It didn't really, it wasn't a strong caution, but just reminded me: look,

505
00:27:44.480 --> 00:27:46.760
it's not carer replacement. Okay.

506
00:27:46.790 --> 00:27:49.190
It's actually just providing that level of independence.

507
00:27:49.240 --> 00:27:52.790
The person can maybe have greater self-esteem, et cetera, et cetera,

508
00:27:52.820 --> 00:27:56.390
greater quality of life. And that means the carer can be doing other things.

509
00:27:56.570 --> 00:28:00.560
So it's not changing the channel on TV or doing this. It's the conversation,

510
00:28:00.561 --> 00:28:02.360
making a cup of tea, doing the other things.

511
00:28:02.361 --> 00:28:05.090
And so we should be reminding funding bodies,

512
00:28:05.091 --> 00:28:07.880
particularly that they're not carer replacements,

513
00:28:07.910 --> 00:28:10.430
they're augmenting aspects of that sort of thing.

514
00:28:10.431 --> 00:28:12.710
So it's not going down your balance sheet and going all right.

515
00:28:12.711 --> 00:28:14.870
I can put that person somewhere else. It's actually just saying, well,

516
00:28:14.871 --> 00:28:17.300
that person might be spending more quality time,

517
00:28:17.570 --> 00:28:18.890
but it's not a replacement device.

518
00:28:19.550 --> 00:28:20.990
<v Kristin Alford>That quality is a good one. And.</v>

519
00:28:20.990 --> 00:28:23.630
<v Fiona Kerr>That's the, that's the difference in discussions in different countries.</v>

520
00:28:23.990 --> 00:28:27.590
Some don't even go to the, so how do we get the human out of the equation?

521
00:28:27.890 --> 00:28:29.180
They are having a look at the,

522
00:28:29.390 --> 00:28:34.250
how do we give them time to do the stuff that humans are really good at? And we,

523
00:28:34.310 --> 00:28:37.280
you know, I love technology. I mean,

524
00:28:37.281 --> 00:28:40.760
I've been a really I've been lucky to also be part of using it,

525
00:28:40.761 --> 00:28:43.820
to make things that let people do things they're never able to do.

526
00:28:44.210 --> 00:28:48.080
And David's my kind of, you know, like pin-up boy of how to do this. And,

527
00:28:48.500 --> 00:28:52.250
but as long as we just make sure that we, you know, we utilize both,

528
00:28:52.280 --> 00:28:55.070
then they're just, they're both quite miraculous.

529
00:28:55.380 --> 00:28:57.500
<v Kristin Alford>And, and I guess the, the feedback loop. So, you know, when you're,</v>

530
00:28:57.560 --> 00:29:00.980
when you're talking about developing gaming to help children with cerebral

531
00:29:01.490 --> 00:29:03.410
palsy, it's, it's the,

532
00:29:03.620 --> 00:29:08.570
does the game make an improvement on various factors? Not just one or,

533
00:29:08.571 --> 00:29:11.600
you know, there, there was a reinforcing effect from, from doing that.

534
00:29:12.380 --> 00:29:14.960
<v David Hobbs>So as we were having a conversation before,</v>

535
00:29:14.961 --> 00:29:19.130
so my PhD work was about developing an accessible gaming system to enable a

536
00:29:19.131 --> 00:29:22.550
child with a hand impairment due to cerebral palsy to play computer games.

537
00:29:23.030 --> 00:29:25.640
But that, wasn't the question. The question was,

538
00:29:25.641 --> 00:29:28.310
how can we engage children to use both their hands?

539
00:29:28.550 --> 00:29:31.670
Because a child with an impairment would typically use their dominant or strong

540
00:29:31.671 --> 00:29:34.070
hand and they'll ignore their hand, which doesn't work as well.

541
00:29:34.520 --> 00:29:38.090
And through lots of conversations and conceptualization, that ended up being a

542
00:29:38.091 --> 00:29:41.530
gaming system that we developed and the gaming system showed that it actually

543
00:29:41.531 --> 00:29:44.380
could improve hand functions. So the hand, which didn't perform as well,

544
00:29:44.381 --> 00:29:48.790
actually improved. So it gave better function, as we've talked about.

545
00:29:49.450 --> 00:29:52.450
We actually had other situations where that gaming system,

546
00:29:52.451 --> 00:29:55.360
because it was accessible when once it went into the families' homes,

547
00:29:55.690 --> 00:29:58.840
actually improved sibling interaction and participation.

548
00:29:59.140 --> 00:30:03.400
So parents were commenting on sibling rivalry because they had a common platform

549
00:30:03.401 --> 00:30:04.234
they could play on now,

550
00:30:04.240 --> 00:30:07.720
which wasn't disadvantaging the child with an impairment. So, you know,

551
00:30:07.721 --> 00:30:10.060
everyone loves to beat their brother and sister. I know I did.

552
00:30:10.061 --> 00:30:11.950
So why not give everyone that opportunity?

553
00:30:12.460 --> 00:30:15.910
Parents were scared to hop on the system because their child was so good at it.

554
00:30:15.911 --> 00:30:18.490
So they didn't want to be shown up that they couldn't beat their own child.

555
00:30:19.300 --> 00:30:23.410
But one of the most profound experiences that came out of that was it created

556
00:30:23.411 --> 00:30:28.030
such an interactive environment that we had one boy actually

557
00:30:28.031 --> 00:30:31.330
talk more during the trial. So he was non-verbal before the trial,

558
00:30:31.510 --> 00:30:35.830
or he was non-verbal, and he was achieving so well in the gaming system and

559
00:30:35.831 --> 00:30:39.670
getting through many of our games and many of the levels that when his sister

560
00:30:39.671 --> 00:30:41.530
who was able-bodied was playing the gaming system,

561
00:30:41.531 --> 00:30:42.820
she couldn't get to the same level.

562
00:30:42.970 --> 00:30:45.850
So he was coaching her and telling her what to do, do this, go here.

563
00:30:45.851 --> 00:30:48.220
This is how you get, this is how you solve that particular puzzle.

564
00:30:48.490 --> 00:30:52.360
So he became more verbal. Now, if we all sat down nine years ago,

565
00:30:52.390 --> 00:30:53.350
when we started the system and said,

566
00:30:53.351 --> 00:30:55.750
how can we make a child with a disability talk more?

567
00:30:55.751 --> 00:30:58.090
We would not have arrived at a gaming system. So it's this,

568
00:30:58.091 --> 00:31:01.600
these ripple effects that came out of it. And it's about designing things,

569
00:31:01.601 --> 00:31:03.220
right at the beginning. So universal design,

570
00:31:03.221 --> 00:31:04.660
it's about asking the right questions.

571
00:31:04.661 --> 00:31:07.450
And it's about looking on the peripheral for these other effects,

572
00:31:07.660 --> 00:31:09.040
these edge effects that you can see as well.

573
00:31:09.900 --> 00:31:10.733
<v Kristin Alford>Thank you.</v>

574
00:31:10.890 --> 00:31:15.330
I think that the value of this discussion really is around how do we,

575
00:31:15.360 --> 00:31:19.050
how do we shape artificial intelligence and social interaction in a way that

576
00:31:19.051 --> 00:31:22.470
gives us what we want? And I think there's two things that you can do.

577
00:31:22.770 --> 00:31:24.750
I'm inspired by the fact that every, you know,

578
00:31:24.751 --> 00:31:28.200
some of the technologies that we've seen develop means that a lot of these are

579
00:31:28.230 --> 00:31:32.930
decreasing the price and entry level for people to have interactions.

580
00:31:32.931 --> 00:31:35.010
So I'm thinking about 3D printing,

581
00:31:35.011 --> 00:31:37.980
and I'm thinking about being able to publish your own work online and all of

582
00:31:37.981 --> 00:31:38.880
those sorts of things.

583
00:31:39.090 --> 00:31:42.360
I think there is a low entry point for us to start to be involved in developing

584
00:31:42.390 --> 00:31:45.300
AI that we want. And so taking power as a citizen to,

585
00:31:45.600 --> 00:31:47.850
to develop that ourselves is, is one thing.

586
00:31:48.240 --> 00:31:50.400
And I'm going to pick up also what you said, Fiona,

587
00:31:50.580 --> 00:31:53.640
if everybody at the levels of politics is economists and lawyers,

588
00:31:53.641 --> 00:31:58.320
then they rely on us as non-economists and non-lawyers to give them input

589
00:31:58.321 --> 00:32:00.360
about some of those other diverse perspectives.

590
00:32:00.361 --> 00:32:04.680
So I guess there is a call for active citizenry so that you are informing the

591
00:32:04.681 --> 00:32:05.730
people who represent you,

592
00:32:05.731 --> 00:32:09.210
that there are things that we want and that there are things that we are expecting

593
00:32:09.211 --> 00:32:09.901
them to deliver.

594
00:32:09.901 --> 00:32:13.500
As we look at how we adopt the technology and those things matter to us because

595
00:32:13.501 --> 00:32:16.080
they are the things, the conversation, the care,

596
00:32:16.260 --> 00:32:20.550
the face-to-face contact that make us human. So with that,

597
00:32:20.551 --> 00:32:23.310
please join me in thanking David Hobbs and Fiona Kerr.

598
00:32:26.460 --> 00:32:26.460
<v 3>[Inaudible].</v>

