Welcome to the AI in Education podcast with Dan Bowen and Ray Fleming. It's a weekly chat about Artificial Intelligence in Education for educators and education leaders. Also available through Apple Podcasts and Spotify. "This podcast is co-hosted by an employee of Microsoft Australia & New Zealand, but all the views and opinions expressed on this podcast are their own."

Jan 8, 2020

In this first episode of 2020, Dan and Ray reflect on the last podcast series and look at the heroes and villains of Artificial Intelligence in education for the year ahead and beyond. Will home automation become mainstream? Is this the time that true digital transformation will actually happen in our sector? Will pervasive AI work its way into everything we do? And will those black boxes with hidden algorithms be a thing of the past? Hold on to your hoverboards, let's go.

TRANSCRIPT For this episode of The AI in Education Podcast
Series: 2
Episode: 1

This transcript was auto-generated. If you spot any important errors, do feel free to email the podcast hosts for corrections.

Hi everybody. This is the first podcast of a new decade. Ray, welcome to the AI in Education podcast.
Thanks Dan. I'm very excited. I think the first series was really quite awesome. I thought there were 13 episodes in the first series, but did you leave the edit room unlocked? Because there was a Christmas one.
There was a Christmas one. Ums and ahs.
Ums and ahs. I thought we'd left those all behind us.
Never.
That cutting room floor is very busy.
Okay. So, we managed to make our way through 14 episodes of a first series of an AI in Education podcast that really started as a twinkle in your eye and a twinkle in my eye of, well, let's do it and see how it works. It feels like we've been learning all over the course of the last few months as we've been working out how to do this.
Yeah, it sure does, doesn't it? You know, when I was looking back at some of the topics, we had the big wrangling around AI for good and AI for evil. I enjoyed those two episodes a lot.
I had a lot of fun. I'm still reminded how evil you really are, Dan.
Really? Yeah. And there were a lot of references to 80s movies in there as well as we were going through. But when I was reflecting on the year and the series, I thought the accessibility one was really good.
I loved that
with Troy, and also bringing the story to life from David Kellerman at UNSW was fantastic.
What that has made me realize is that it's the passion. It's not necessarily about the technology. It's not about the amazing, you know, insights you can do with data. It's about somebody taking the understanding of what you can do with the technology and putting a passion behind it, which is what you see with David Kellerman. It's what you saw with Troy. Troy was passionate about how do we make inclusive learning for everybody? David is passionate about how do I make my students' experience better? It's the secret ingredient. Did you know at Microsoft we talk about tech intensity?
Yes.
Do you know, the bit that's missing in that recipe, the bit that just absolutely fuels the boom, is probably the passion as well.
It's very true. It's a good point. So when we're looking at the year ahead and our heroes and villains, I think that's what we should do in this podcast: bring things to bear for the year ahead and, I know it's very hard in technological terms, look ahead and see what we think is going to happen in the next year to 18 months and what listeners can expect in the next 12 months.
Okay. Well, let's do that because we are at the dawn of a new decade. How exciting is that? Heroes and zeros, Dan.
Heroes and zeros. Fantastic. So, let's start with the goodies, the heroes. And interestingly, I think some are going to overlap here, because some of our villains or zeros might jump back into our heroes if used the right way. So, when we're looking at our heroes, where would you like to start thinking about your main heroes coming up over the next year?
Oh, look, if we look forward and review what's going to change, I think we're on the doorstep of digital transformation, and let me go deeper into that. The problem we have at the moment in a lot of organizations, and I think education is similar, is that what we've tended to do in the last decade is digitize existing processes. And what AI allows us to do is to actually do processes differently. And so if there's one thing that I think we're going to start to see really strongly over the next 10 years, it's a genuine digital transformation. You know, in the same way that Uber transformed the transport business by rethinking the end-to-end process, we haven't really done that, I think, in education. We've still got big points in time that are defined by old processes and old silos within the organization.
So, what's the tipping point for that now then? Because we've been talking about digital transformation for a while. So, what do you think is the impetus for this coming year and decade to push that really into the forefront?
The thing that I've seen in other industries is that it's external pressure. It's external pressure that has then caused a change. So if you think about it, the taxi industry is only responding to Uber; they didn't want to make change beforehand. And so in highly regulated industries, as education is, the change tends to come from external pressure. So in higher education, it's probably going to come from a reform of the market, where we've seen forecasts that the number of learning hours is going to double. Well, in today's systems and processes, that's going to be really tricky to do. That's going to be lifelong learning and lots of short courses. And so, how do you do that in an institution where you're basically built around selling a four-year product to students and you've got a long recruitment process, when the course itself might only last three weeks or six weeks? You can't have a six-month recruitment process or a 40-day admission process in order to decide if you're allowed to do a six-week course.
So education itself is becoming more agile. Therefore, the processes need to become more agile to keep up with it.
Yeah. And it's driven by the opportunity that the nation has to re-educate itself on an ongoing basis. So no longer can we say you have an education at the beginning and then get on with your career or life; we're going to be constantly relearning, just as you and I know, because that's the cycle we're on: we've got to stay ahead of these whippersnappers because they can learn technology faster than we can. But that's going to be the same across the whole economy. That's going to force a change in the way that we think about a student lifecycle. And so that then is going to cause problems at the institutional layer when you say, well, what are our processes to be able to support that? So in higher education, an example of digital transformation will be rethinking the student journey and using AI within that in order to shortcut the processes.
So it's partly about digital transformation, but it's also about the use of artificial intelligence in order to be able to optimize the processes.
Yeah. And from a K12 point of view, where do you see that transformation?
Well, if only we had a K12 person in the room.
Yeah, that's interesting. Where do you see it?
I was interested in your point there, because all of education is process heavy, so there are processes in a K12 setting too. But I can see the transformation happening more quickly in higher ed because people are doing more, shorter courses: let's do a six-week course in AI. Whereas when you look at transforming the school system, and I don't want to be a stick in the mud here, the tricky thing we've always had is the terms and the semesters. Now, we are seeing some schools who are trying to break that boundary by fitting timetables around project-based learning and things like that, but generally we're still geared towards the standardized testing model.
So let's talk about standardized testing then, because that's probably the bit where the digital transformation is going to happen.
Yeah.
Because why do we have an end-of-education-career test? You know, why do you go through 8 years?
Yeah.
And then it's all dependent on whether you've got a cold on the last day.
Exactly.
So, I can understand historically the reason for doing that, but it's probably not right going forward.
And so, there will be a lot of debate about it. There will be a lot of different opinions, but probably by the end of the decade, we'll have a completely different way of assessing students and their skills. Now, that's a bold statement, isn't it? That's one of those statements like, you know, the IBM statement of everybody will have a PC in their house.
Was that the right quote or not? I forget.
It was the right quote with the wrong company.
It wasn't Bill Gates, was it?
It was Bill Gates.
Was that Bill Gates? Bill Gates wasn't IBM. Remember that, Dan?
Well, you know, there was a nice IBM quote as well, though, wasn't there?
So, you can't even get digital history right, let alone predictions. No. I'm pretty sure that 10 years ago I would have predicted the change in the assessment system to have happened this decade. So, it's really disappointing that we're still in the old model.
Agreed.
But my goodness, if it hasn't changed, then it's not going to keep up with the situation we've got. I mean, I feel terrible when I see the reporting around PISA tests and around the SATs and the NAPLAN tests that say our kids can't read and write as well as they used to be able to, that students aren't progressing. That is a load of bull, because when you see a student today, their communication skills, their collaboration skills, their technology skills are way, way better than I could ever manage even 10 years into my career. So what we're failing to do is measure the skills that our students are being valued for.
And the old message about, well, back to basics in order to get them to learn to read and write properly. Sure, that is important, but it's not the only thing. And sometimes the exam system seems to think it is the only thing.
Yeah, that's a great point. You know, I look at my own kids and the presentations they do in primary school these days. It's, you know, absolutely leagues ahead.
So, we've got to learn how to assess that. We've got to make it a continual process. AI has got a great role to play in what that digital transformation of that process is going to be.
Totally. Totally agree. Thanks for going down that K12 alley with me there because it's really interesting.
Can I stop you just for a second there because we have got people that might be outside of Australia or the US listening to this.
I'd never heard the phrase K12 until I joined Microsoft.
Yeah.
And I'd worked in education for 20 something years before I joined Microsoft. I'd only ever known it as schools. So
maybe we should talk about schools.
Yeah. Exactly. I agree, you know, schools, K12, same thing. Well, one of the things from a school point of view, if I'm looking at heroes myself for the next year and then extrapolating up for the next decade, and we've alluded to it in a lot of our podcasts already, is that AI is going to be pervasive. But what we need to do is unpick some of the trends in that area. And I think one of my heroes for the next decade, and even the next year, where I think we're going to see a big push, is around that kind of magic or secretive AI. And I don't mean it in a bad way. I mean it in terms of the stuff that's in the background, the really small things that AI does really, really well. I was at a school yesterday, and one of the things that blew one of the teachers' minds when I was talking through some of the Office products was the fact that when you insert an attachment, the AI inside the email shows you the last 10 documents you've been working on, because the AI is saying, well, the chances are that if you're emailing somebody, the document you're going to attach is going to be one of the ones you were just working on, like that spreadsheet or whatever it may be. And that completely blew her mind, but it was so subtle
that it just felt intuitive. And when we've looked at some of the ethical issues with AI and the way some of the articles in the media at the minute are portraying it, it's almost like an embedding of AI under the covers. I think we're going to see a lot more of that hidden AI to support us, you know, and it's in everyday life as well as in education, isn't it? The AI in cars that's picking up your driving habits and the way you're swaying across the lines and things like that.
So, I guess the thing that is interesting there is the language we use around it, because, you know, secret and hidden
absolutely
and magic have different meanings in different contexts. And so I think where it's going to be secret magic, and it's going to be surprises, is when it does something and you go, that's smart, how does it do that? The Arthur C. Clarke moment of any sufficiently advanced technology being indistinguishable from magic. And then the other side of that is where people use it in a way that's sneaky, or you think that the technology is becoming a little bit
Inclusive? Exclusive?
Exclusive.
Yeah. So the example is, when I go to send an email and I want to attach a document, it remembering the last 10 documents I used and saying, is it one of these? That's really, really good. But when I go to social media and it remembers the last 10 websites I visited and the last 10 products I looked at and I get adverts for them, I don't like that. So I guess it's about being sensitive to the situation.
But I think you're right. We're going to see a lot more small things that are helping people in education just achieve the next job that they want to do, better and better.
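As a rough illustration of the recency heuristic Dan describes, and not how any Microsoft product actually implements it, the idea reduces to ranking a user's recently edited files by timestamp so the most likely attachments surface first. The function name, file names and dates below are assumptions made up for the sketch.

```python
from datetime import datetime
from typing import List, Tuple

def suggest_attachments(recent_files: List[Tuple[str, datetime]], limit: int = 10) -> List[str]:
    """Rank files by most recent edit time and return the top candidates.

    A simple recency heuristic: the documents edited last are the ones the
    user is most likely to want to attach to the email being written now.
    """
    ranked = sorted(recent_files, key=lambda item: item[1], reverse=True)
    return [name for name, _ in ranked[:limit]]

# Illustrative usage with made-up file names and timestamps.
files = [
    ("budget.xlsx", datetime(2020, 1, 7, 16, 30)),
    ("lesson-plan.docx", datetime(2020, 1, 8, 9, 5)),
    ("old-notes.txt", datetime(2019, 11, 2, 14, 0)),
]
print(suggest_attachments(files))  # ['lesson-plan.docx', 'budget.xlsx', 'old-notes.txt']
```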
And what other heroes do you see in the next 10 years?
Probably the the other one is home automation.
Look,
we did talk about that.
We talked about it in AI for evil.
Yes.
And I think I probably have a slightly less optimistic view of it than you do.
Okay.
I don't have any home automation products in my home. But as things start to mature and as we get into a situation where you can actually control things as a consumer in one way, where I don't have to go and buy 12 different standards of systems, where I don't have an app for my light bulb and an app for my Christmas tree lights and an app for my music. If it all comes together into one thing, and I know that's where we're heading, and the recent announcements around that have been really exciting, that I think is going to unlock home automation. Because my goodness, you don't want to spend all your time being the network debugger on your home network.
And to segue into one of my sort of heroes: one of the things that I'm seeing, just a trend currently with CIOs and the IT technical sides of education, is that where we are in education with lots of point products, I'm seeing a slimming down of those point products. You know, five or ten years ago it was which app is the best for maths, which app is the best for English or a particular area of the curriculum, whereas now we're trying to bring things together. Because we are seeing, from a technical side, and I know this doesn't affect student outcomes directly, but indirectly, an integration of vendors in lots of schools. Where currently, like we've said from day one here, and we'll explore this in further episodes as well, you've got siloed school information systems, learning management systems, productivity suites; everything's in the same boat, but never the twain shall meet. So I think we've got a really interesting opportunity where schools are starting to bring things together and starting to look at where the data is and how that combines. And your interesting segue there was the frustration that I get, you know, because I use Amazon Alexa and some other people will use the Google systems of home automation. Which bulb do I buy? Do I buy the bulb that's compatible with this or the bulb that's compatible with that? Are my curtains going to close? Are my lights going to come on? What's going to happen if I buy ones which are not compatible with the systems I've got? And it becomes carnage.
Could you imagine that we would have had this conversation 5 years ago? It's like, which light bulb should I be buying to be compatible with my home automation solution? This is bonkers, Dan.
It is, but home automation devices are going to be very popular this Christmas, just as they were the year just gone. I'm sure listeners to this podcast are probably saying, I know exactly the feeling, I've now got a starter kit for home automation. And that's really pervasive. But I do think that's going to have an impact on schools because of that aggregation of suppliers and vendors for point products. I think the data sets inside schools are becoming more aggregated. The classic example we were talking about the other day, and it would be good if you can articulate this on the podcast actually, was lecture capture. If I'm working for a lecture capture company, I'd be saying to a school or university, this is the way you'd personalize learning. But it's bigger than that, isn't it?
Yeah, that's right. I think the story you were telling was around, well, how do we free the data? People have to become interested in how do I help free the data so that people can use it for value. So the example of lecture capture is: if you go and talk to a lecture capture company, they will think that the way you personalize learning, the way that you predict what's going to happen in the future, the way that you optimize everything for the student, starts with the data about have they watched the lecture. You know, it starts with the data that's in their system. If you go and talk to an LMS company, they'll tell you that the way you personalize learning, the way that you deliver a journey for a student to excel, starts with did they log into the LMS, did they look at their assignment, and how long were they editing things. And then if you talk to us, and look at our perspective around MyAnalytics in Outlook, the thing that makes you an awesome employee, and presumably we'd say the same around a student, the thing that personalizes the journey, is have they logged into their email and how many meetings have they attended. And the reality is it's none of those things and it's all of those things. If we get fixated on the silo that we look at, we think that our silo is the answer. And so free the data, look at the data holistically, and that I think is genuinely going to unlock the use of AI, and the journey of using data to help AI to help us improve.
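A hedged sketch of what looking at the data holistically, rather than silo by silo, could mean in practice: joining engagement signals from several separate systems on a shared student identifier before any analysis. The systems, column names and values below are assumptions for illustration, not a real institution's data model.

```python
import pandas as pd

# Illustrative, made-up engagement signals from three siloed systems.
lecture_capture = pd.DataFrame({"student_id": [1, 2], "lectures_watched": [12, 3]})
lms = pd.DataFrame({"student_id": [1, 2], "logins": [40, 8], "assignments_opened": [9, 2]})
email = pd.DataFrame({"student_id": [1, 2], "messages_read": [55, 20]})

# Join on the shared student identifier so no single silo "is the answer".
holistic = (
    lecture_capture
    .merge(lms, on="student_id", how="outer")
    .merge(email, on="student_id", how="outer")
)
print(holistic)
```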
And just to finish on one of the heroes as well, I think one of the things we're going to see more of in K12 is some of the companies which have siloed data off starting to share more data, open that data up more, and give people and systems more access to the data that they've been sitting on, even though in the past it's been a gold mine that they've sometimes sat on or haven't actioned.
I wonder if we're going to start to see data consortia, that you're going to see two ways of working. One is, let's just free the data, and the second is, well, let's work with other organizations to maximize the value of bringing two sets of data together. You know, a student management system company and a learning management system company putting that together and starting to say, "Well, we're going to give you the best interpretation of that." Either way, it's going to be a completely different way that the industry works in the future than it has in the past.
Yeah. And we know how standardization of data and things have gone in education previously, so hopefully this could be a tipping point. I won't talk about any acronyms. I can see the pain in your eyes, right?
Just remembering history there.
Yeah, some of those conversations are still going on. So, we've looked at the heroes; what about the zeros or villains in this area? Where do you see some of those starting to arrive in the next year to 10 years?
So, in the world of AI, I think we're already starting to see the emergence of disquiet and unease around facial recognition, and I think that is going to grow, and we're going to get into a pretty serious conversation about how we're going to use it in a positive way while avoiding the negatives. So, facial recognition is probably the number one thing that, you know, we're going to need to get on top of very early in education as well as elsewhere.
So, you mean getting on top of it in terms of the fact that, say, doing attendance in an exam hall in a university is a good use, but then some of the uses where you might be looking at emotional behaviors of students, where's that pushing the boundaries?
And also recognizing it's not perfect, because with facial recognition, I was reading a federal report this morning from the US, and they've worked out that some facial recognition systems are less than 1% effective with people of color or women. So if you start to use facial recognition to do something like bank tellers, or in education, you know, to sort your coffee order when you get to the back of the queue so it's ready for you when you get to the front of the queue, well, what happens if certain groups in society can't take advantage of it? That facial recognition toilet roll dispenser in China, imagine if your face didn't work with that. You know, that's pretty deep societal consequences. So, I think facial recognition is going to stay top of the contentious topics for the next year and probably for the next decade. And I think what that means from an AI in education point of view is we probably should be treading pretty carefully when thinking about scenarios in that area, because let's face it, it's exciting and it makes a fascinating demo, but there's probably so much more that we could do with other AI systems to deliver significant benefit in education.
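To make that fairness point concrete, here is a minimal sketch of the kind of check such reports imply: measuring a recognition system's error rate separately for each demographic group rather than only in aggregate. The records and group labels below are invented purely for illustration and don't reflect any real system or dataset.

```python
from collections import defaultdict

# Illustrative records only: (group, predicted_match, actual_match).
records = [
    ("group_a", True, True), ("group_a", False, True), ("group_a", True, True),
    ("group_b", False, True), ("group_b", False, True), ("group_b", True, True),
]

errors, totals = defaultdict(int), defaultdict(int)
for group, predicted, actual in records:
    totals[group] += 1
    if predicted != actual:
        errors[group] += 1  # count misrecognitions per group

for group in totals:
    # An aggregate accuracy figure would hide the gap these per-group rates expose.
    print(group, f"error rate = {errors[group] / totals[group]:.0%}")
```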
Yeah, that's a fair point. I think one of the villains for me, I suppose, is black box AI. So the AI that isn't transparent, where you can't see the algorithms, it just works; you know, an out-of-the-box proprietary piece of software, for example, that might support you with assessment or whatever it may be. I think that's where we're going to get a bit of pushback as well. I think the more people listen to podcasts like this and other podcasts around education, and the more principals, senior leaders, IT managers and business decision makers in universities start to understand the implications of AI and the privacy and security elements of that, then the questions will be asked of those systems, and I think the algorithms are going to need to be made more transparent. And obviously with machine learning, where the algorithms are invented on the fly, we need to make sure that we understand how those machine learning algorithms are programmed as well.
Yeah, and it's an ethics and fairness thing as well. It's understanding where it is okay to use an algorithm that you don't know how it works, and where it isn't. You know, if it's a life-changing decision, you need to understand how the decision was made. If it's a decision about excluding a pupil, if it's a decision about admitting somebody to university, you need to be able to understand how that was done and be able to explain it to other people. But if it's a decision about which learning resource you present to a user next, that's not a life-changing decision. That's a journey decision. It's a bit like, I don't care how Netflix works out what kind of movies I like, but it seems to do it really well and it's fine. But if it picks a bad one, it's not a life-changing thing for me. It might be an evening-changing thing.
I've had plenty of those. You're just sitting down for Netflix thinking this is going to be awesome and yeah, it's a bit of a dud.
Black box AI, so non-explainable AI, is fine in a whole load of circumstances, but I think as a profession we need to be able to evaluate and understand when it's good, when it's bad, when it's okay, and when we need human intervention in decisions, because you don't want AI going off and making decisions that have got life-changing consequences without humans being involved and being able to understand them. So absolutely agree: black box AI, if it's not going to be a zero, it's going to remain contentious for a while until we truly understand when it's okay and when it's not okay.
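A tiny sketch of the policy being described here, framed as code purely for illustration: route life-changing decisions to an explainable model plus human review, and allow an opaque recommender only for low-stakes journey decisions. The decision names and routing labels are assumptions, not any product's actual behaviour.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    name: str
    life_changing: bool  # e.g. exclusion or admission vs. suggesting the next resource

def route(decision: Decision) -> str:
    """Illustrative policy: high-stakes decisions need an explainable model
    plus human sign-off; low-stakes ones may use an opaque recommender."""
    if decision.life_changing:
        return "explainable model + human review"
    return "black-box recommender is acceptable"

print(route(Decision("exclude a pupil", True)))         # explainable model + human review
print(route(Decision("suggest next resource", False)))  # black-box recommender is acceptable
```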
Yeah. Yeah, that's fair. And then I think one of the last ones, when we were talking about this previously, was around unlocking the data that we don't collect. I think that's an interesting one as well. What are your thoughts around that data, the unknown unknowns and the known unknowns?
So yeah, I think it's really interesting. There's a lot of data that we've structured and that sits in our systems that we know how to use, but there's a massive amount of data we're not collecting. There's a massive amount of data that we don't necessarily know how to use. So probably we need to be more aware of what that data is. We need to be thinking about, are we collecting the right things? Are we collecting things that may be useful in the future that we're currently ignoring? But equally, on the other hand, are we collecting things that really shouldn't be used, or that aren't going to be used and just become a legacy risk when you've got a petabyte of data sitting somewhere on people?
So yeah, that whole thing around, well, what is all the data and how do we get the value from it? The risk, the zero side of the decade, is that we just believe in hoovering up everything and we don't actually do anything with it. I think the opportunity is we use the right things in the right way to deliver value back, whether that's to a teacher or a student or an institution or a country.
So some of these zeros, I suppose, can flip into heroes depending on the way they're used.
It's being aware of the downside risk and the upside potential.
So it's a bit like when you do a SWOT analysis: you look at the opportunities and the threats and you say, how do we change them? How do we turn them into benefits? And I think it's the same with this. With facial recognition, it would be: how do we understand the good and the bad, avoid the bad and use it for the good? It's the same with data: how do we avoid the downside consequences of data and how do we turn it into an upside of delivering benefit? I tell you what would be good for the next podcast, right? Because we've talked a lot about where we think the trends are. And it'd be good to have you comment as well: reply to us on social media and let us know what you think about the future of AI. It'll be good to listen to everybody's points of view on that. But for the next podcast, I think it'd be quite interesting to actually think about that data we don't collect, and to work through one of these problems: actually sit down and think, well, what are the big decisions? Let's pick a big decision in education, follow that through, and ask how somebody would implement an AI solution around that to solve that problem.
Yeah, I think that's a really good discussion for next week, especially as you think about one of the big elements of AI today: it learns from history. You have to give it the data on the past for it to be able to predict the future. And so yeah, let's have that discussion about what that data is, what it could be, what data is being used in other industries, and see if there's a comparison in education.
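To illustrate "give it the data on the past so it can predict the future", here is a minimal, hypothetical example of fitting a model to historical records and then predicting new cases. It assumes scikit-learn is available, and the numbers are made up for the sketch rather than drawn from any real dataset.

```python
from sklearn.linear_model import LogisticRegression

# Illustrative only: past observations (hours of study) and outcomes (passed or not).
past_hours = [[2], [4], [6], [8], [10]]
past_outcomes = [0, 0, 1, 1, 1]

# The model learns from history...
model = LogisticRegression().fit(past_hours, past_outcomes)

# ...so it can make predictions about new, unseen students.
print(model.predict([[5], [9]]))
```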
Great. And it's been great today. Thanks, Ray, for looking at these heroes and villains of AI for the next year to 10 years. It's good to look out of the boat sometimes. And I really do feel that there are green shoots showing in lots of these big thorny issues that seem to have been lurking in education for a while, and hopefully over the next 10 years we'll see some solutions for them.
And hopefully also some of that home automation stuff, all of that IoT AI, becomes smart. Because what nobody can see is that we've been sitting in the dark for the last five minutes because the AI has decided this room is unoccupied. Let's see if that's fixed in the next decade as well then.
I doubt it. Cool.
See you next week.