Welcome to the AI in Education podcast with Dan Bowen and Ray Fleming. It's a weekly chat about Artificial Intelligence in Education for educators and education leaders. Also available through Apple Podcasts and Spotify. "This podcast is co-hosted by an employee of Microsoft Australia & New Zealand, but all the views and opinions expressed on this podcast are their own."

Oct 14, 2020

In this podcast we discuss AI, security, innovation and scale with Mike Reading from Using Technology Better and Blake Seufert, IT Manager at McKinnon Secondary College. They also run the fantastic Outclassed podcast.

In today's episode we also mention this survey report released by Monash Uni.

 

________________________________________

TRANSCRIPT For this episode of The AI in Education Podcast
Series: 3
Episode: 11

This transcript was auto-generated. If you spot any important errors, do feel free to email the podcast hosts for corrections.

 

 


Welcome to the AI in Education podcast. Hi Ray, how are you?
I'm good, Dan, good, and good to hear your dulcet tones again, and good to be back talking AI. I hope you're surviving school holidays.
Yeah, just about. It's been great to get out and about in New South Wales and have a little wander around with the kids. How about yourself?
Yeah, exploring the backyard. Absolutely. Absolutely. I went about as far in New South Wales as you can go without being in Queensland. I went up to Byron. Um, but yeah, beautiful time. Good to be away, but also, you know, kind of good to be back at work as well because there's lots of fun things going on.
Yeah, totally. So, for this uh week's episode,
what we decided to do, if you remember last time, is take people from outside and have a chat with them. We spoke to Brett Salakas and Rob McTaggart in our last podcast. Yeah. And this time I've got out on the road and gone to grab Mike Reading from Using Technology Better, who's a kind of training partner. Mike came to us and said, hey, you know, we're doing heaps of stuff with schools around AI, it'd be great to have a chat. I thought, yeah, let's grab Mike and one of his colleagues, Blake Seufert, who's the IT manager of McKinnon Secondary College. They'll introduce themselves when we get into their part. They also do the Outclassed podcast, where they talk about relevant things in education. And because they bring unique aspects, one from an IT, security and innovation side, and one from a training and scale side, I thought they'd be two interesting people to talk to.
That sounds interesting, and I love that. What do they call that? It's a crossover episode, where two stories are going to merge: the Outclassed podcast and the AI in Education podcast are going to come together and create a whole new one. So that should be exciting. And I think, you know, these couple of episodes we've done now, talking to teachers and talking to those actually in education, it's really important to think about what we're doing to be prepared for that future. You know, we've talked about this in the past, and just this week, Dan, I got a report come through from Oxford Insights, a global report on the AI readiness index for countries around the world. And of course Australia's on there, and Australia ranks 12th out of, I think it's 172 countries, which in principle seems pretty good. Yeah. You know, 12th out of 172, but then of course when you look at the 11 in front of us and those behind us, actually Australia's got some work to do to get up that scale and really contribute in a meaningful way. We should put the report link in the show notes, but what it really points out is that Australia's got really good vision and intention for AI, and we're pretty strong on ethics and all the things that matter about it. But actually, what are we doing around innovation, human capital and kind of building the future? That falls to our teachers. So I think it's really important, and I'm looking forward to hearing what the guys you're going to speak to today have to say. Right,
cool. Let's hit the tape. Thanks guys for joining today. Really, really appreciate it. You're kind of a key part of my PLN. I've known you guys for quite a while and I know you're doing some amazing stuff.
It's great to be here, Dan.
Thanks for having us. Thanks, Dan.
No problem. So, can you tell us a bit about your context, guys?
I can go first. Um, so I'm from McKinnon Secondary College. I've been there around 14 years, and I've been lucky enough to explore and see many innovations, from us becoming a Google reference school to being at the forefront of technology, and having the scale to do so as well given our size; we're a very large school. So yeah, really enjoying the challenges there. And in the course of those years I've also started my own software company. I have a big interest in the edtech market as well, not just the consumer side but the production side too, building good software and things like that. And I also started a consortium, the Educational Technology Consortium; I helped to get that off the ground, and it's been running for I think about five or six years now, where schools meet and we try and solve these kinds of conundrums and problems, and debate things, and come up with, you know, solutions.
Sounds like a great impact you're having there, Blake. How big is your school, by the way? You said it's really large.
Uh, yeah, we have 2,300 kids on one site, about 200 staff, and we're just in the midst of building a new campus down the road, which is going to support our growth for another thousand or so students over however long.
Yeah.
Wow, that's great. So, Mike, how about yourself?
Yeah. So, I'm the director of Using Technology Better. It's a company I started in 2008 while I was teaching in the New South Wales school system. I used to be a high school science teacher, and also taught history and geography for a number of years through a weird twist of fate. And yeah, I became very interested in how technology could really help students in terms of their engagement and motivation. I was doing some training around student engagement and motivation before technology really became a thing, and then when we saw the cloud come, and real-time collaboration, I could see the ways in which we could use that tech to really spur students to learn. So yeah, I launched Using Technology Better. We're now based in Australia and New Zealand, work in Asia-Pacific and a little bit in the US, and we've got this podcast that Blake and I have been doing this year. We call it the Outclassed podcast, and we talk around technology and leadership and so on. So it's great to bring the two worlds together, Dan.
Yeah. No, absolutely. I've listened to your podcast as well. It's fantastic, so keep up the good work with that. Now, you recently shared some findings of a Monash Data Futures Institute report with me before this, Mike, the other day, and I looked through it, especially the executive summary; it's a really large report. But it was really interesting for us on what people actually were aware of. There's very much an awareness of AI: I think it said nine out of ten people are aware of the term, but the majority of the public in Australia consider themselves to have little knowledge of what it really means. Most people think it's about robots taking on work, taking our jobs, taking over the world. And actually they had some interesting findings around people's knowledge of AI. What did you take from that report?
Yeah, I think there were two things that stood out for me. One was that they were saying that, as people were going through the survey, they seemed to understand a little bit more about what AI was and how it interacts in society. And they said that 43% of all the people taking the survey, and I think there were 2,000 people who took it, so let's say a thousand people give or take, changed their mind about being opposed to AI the more they understood it. So basically they started off being nervous and opposed to it, and then the more they realized what it was, they started to move towards a neutral or supportive stance. So they pointed to a real need for education, for the wider general public, I guess, to even understand what AI is and how it interacts with society.
Yeah, absolutely. So Blake, from your point of view, where do you see that leading? What projects are you doing from an AI point of view in your school?
Uh, so, you know, AI at our school: what are the use cases for it, those kinds of things. I think about that a lot, and how we get students using it. I think the big challenge is around the literacy of AI and how to actually put data into it. I mean, AI is really about crunching data, so how do we get that data prepared and pipelined in? We're doing a project at the moment at the school. We have an in-house developer helping us look at ways to use our school data, which we have so much of, and how we can corral that into something useful, not just for teachers but for students, for administrators, for student managers; making sure we can put interventions in place and be proactive about our support and guidance with students. So that's sort of where I see it being used. The challenges are around how we get it in the classroom, how we get students engaged in AI.
Um, that's a harder challenge, and I think, you know, there are a lot of apps out there; there's the facial recognition stuff. We'll probably talk about a whole range of different things that come under that umbrella of AI. But is just talking about it enough? In my view, you should understand it to some degree; I think there should be more than just a surface-level "oh, okay, this is what this does", because it is such a nuanced thing. It isn't just like understanding what a smartphone is. AI is this sort of invisible force behind so much of what we do at the moment. We can show them some of that front-end stuff, the facial recognition, but that's probably having less of an impact on our economies, on the way we live, on what we do day-to-day, than something like the AI algorithms that push suggestions from visiting the supermarket or tracking our shopping experience. That stuff's far more sophisticated and is kind of on another level, in a sense, in learning what we're doing and what we're going to want to do. And, you know, you hear those stories out of the US where some teenage girl got sent an email about, here's some shopping for you from Walmart, things you might want, and they were babies' nappies and things like that, and the girl and the dad were horrified. Yeah. They went to the
went to, I don't know, the regulatory body or something, and then they found out the girl was actually pregnant, and none of them knew, but Walmart did. You know, before they did. So
uh, you know, some scary things like that. But I think, broadly, we want to be involving students in challenges like that, and trying to integrate that into the curriculum. I don't think it needs to be a subject of its own; I think it just needs to be integrated, like technology is supposed to be, whether it is or not, integrated across the curriculum. It's a skill you come out with, because if you know how to write a little Python library and crunch 200 million lines of data, rather than kind of looking at some graphs and trying to draw some conclusions with your gut feelings, you're going to have a much better chance of being employed. You know, we talk about employability and job skills. I think that's really important for this generation to understand.
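[Editor's note: a toy illustration of the "write a little Python and crunch the data instead of eyeballing graphs" idea Blake describes, in the spirit of his school-data intervention project. All names, records and the threshold below are invented for the example, not anything from McKinnon's actual system.]

```python
# Flag students whose attendance rate falls below a threshold, so staff
# can intervene proactively. Purely illustrative data.
from collections import defaultdict

def attendance_rates(records):
    """records: iterable of (student, was_present) pairs -> {student: rate}."""
    present = defaultdict(int)
    total = defaultdict(int)
    for student, was_present in records:
        total[student] += 1
        if was_present:
            present[student] += 1
    return {s: present[s] / total[s] for s in total}

def flag_low_attendance(records, threshold=0.9):
    """Students whose attendance rate is below the threshold, sorted by name."""
    return sorted(s for s, rate in attendance_rates(records).items() if rate < threshold)

records = [
    ("alex", True), ("alex", True), ("alex", False), ("alex", True),
    ("sam", True), ("sam", True), ("sam", True), ("sam", True),
]
print(flag_low_attendance(records))  # ['alex']  (3/4 = 0.75 is below 0.9)
```

The same few lines scale from eight rows to millions, which is the employability point being made: the logic, not the data volume, is the hard part.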
I love the way you mentioned Python then. Like, for me, when I first started teaching computer science and things, Python was a bit of a rage in the UK, and I was like, what junk is this? It's like the worst programming language ever. I'm like, oh god, you could change variables halfway through the program, you know; it just wasn't my traditional computing background. And nowadays we're paying so much money to Python developers in Microsoft, because all of our ML
runs across Python and R and things, and I'm like, oh my god, I had no idea. You know, I was using it on a Raspberry Pi and I was like, oh, this is giving kids such bad programming practices. But on that front, you know, we do an AI for Good Challenge, as you guys know, and I'll put some links in the show notes. I think, when you said about context there, during that AI for Good Challenge students came up with some really good ideas and had good opportunities to think about the use of AI and the ethical use of AI, and that just because you can do something doesn't mean you should do something. And that kind of connects in with the fact that that report we were talking about earlier says the public agrees with the need for industry guidelines and legislation, but there's also high-level support for humanitarian and environmental challenges and social good in AI, which is quite an interesting one as well.
Yeah, I think it was only 12% of people who were actually opposed to the thought of AI being developed. So, you know, the appetite is really high, even though we spoke earlier about the fear, the kind of "it's going to take away jobs", and, you know, we're going to have AI driving trucks and that's one of the biggest industries in Australia, these kinds of fears around it. But even so, that's really surprisingly positive.
Yeah, absolutely. So when we think about security in this area, and the ethical elements of AI from a security point of view: Mike, have you got any thoughts? And Blake, we'll come to you in a second. But Mike, from your point of view, as somebody going in and speaking to teachers a lot in school systems, have you got any thoughts on that security aspect to all of this?
Yeah, I think if you look at that report, that was the other thing that really stood out to me: there was just such a high-trust model in governments or regulatory bodies to look after the good of us, the general population. So basically I think people are just like, well, there's people way smarter than me that can do this. For me, I think security is an interesting one, because we've got students who have grown up in a world of security issues. We look at it because we didn't have it; there's data, and all the stuff we're putting our information into, goodness knows how many apps on our phones and different websites and so on. We didn't have that, now we do, and we've got something to compare it to, but our students don't, and so they seem to be a little bit more blasé about it.
And it's interesting; just, not yesterday, I think the day before, a whole bunch of COVID results were sent to the wrong people via text message. So it's not just AI where we're getting these security issues. It's like, you know, people's medical reports are now being texted to the wrong person.
Yeah. And that's the, you know, the notifiable breaches. People can get heavily fined now for leaking data out. And I know in the corporate environment we're doing a lot of training around that. But I look at that report every quarter, the one that comes out about the mandatory notifiable data breaches, and education, I think, had 44 last quarter, and the majority of those were people, you know, sending emails or sharing documents they shouldn't, and things like that, which is quite interesting. Look, it's always good to look for trends, and sometimes we see things in the news which are kind of quite scary, but then a lot of the stuff's kind of low level, yet actually quite serious in lots of cases.
Go on, Blake.
I was just reading the IT news yesterday, and they were talking about a breach that affected 100,000 teacher accounts from K7 Maths. On the surface it looks really scary, but when you read into it more, you know, no plain-text passwords were exposed.
Um, but the passwords were all encrypted, right? Okay, they could be decrypted depending on the encryption level that was used, but again, you know, the media aren't reporting the encryption, and to me that's kind of critical information. If your password's being put out there, well, is it crackable? Is the encryption so heavy on it that it would take 25 years, or is it just a basic level of encryption done in-house that's going to be crackable in, you know, a few hours? That matters, because if you're a target, if you're a public profile, or you've said things that this company or these bad actors don't like, you know, they can crack that and get into your account before you even know, potentially.
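[Editor's note: to make Blake's "a few hours versus 25 years" point concrete, here is a back-of-envelope sketch using only Python's standard library. The attacker guess rate and the iteration count are assumptions for illustration, not figures from the K7 Maths incident; real sites would typically use bcrypt, scrypt or Argon2, of which stdlib PBKDF2 is the closest stand-in.]

```python
import hashlib
import os

def pbkdf2_hash(password: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """A deliberately slow, salted password hash (stdlib PBKDF2-HMAC-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

def days_to_exhaust(alphabet_size: int, length: int, guesses_per_second: float) -> float:
    """Days to try every password of the given length over the given alphabet."""
    return (alphabet_size ** length) / guesses_per_second / 86_400

# Assume an attacker managing ~10 billion raw hash computations per second
# (a made-up but plausible GPU figure), against 8-char lowercase passwords.
fast = days_to_exhaust(26, 8, 10e9)            # fast unsalted hash: fractions of a day
slow = days_to_exhaust(26, 8, 10e9 / 600_000)  # each PBKDF2 guess costs 600k hashes

salt = os.urandom(16)
stored = pbkdf2_hash("correct horse battery", salt)
assert pbkdf2_hash("correct horse battery", salt) == stored  # right password verifies
assert pbkdf2_hash("wrong guess", salt) != stored            # wrong password doesn't

print(f"fast hash: {fast:.4f} days; slow KDF: {slow:.0f} days to exhaust")
```

The only difference between "hours" and "decades" here is the per-guess cost, which is exactly the detail Blake says the media leave out.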
Yeah. And I think, you know, being involved in edtech for quite a while, there were a lot of companies previously, and there are some companies these days, that store usernames and details about the students and things like that locally. A lot of that technology now, like using Azure Active Directory or some other identity provider, helps you single sign-on to that. Because, as you know anyway, Blake, in your role, you don't really want to be sitting on the credentials of a load of kids, or teachers, if you can help it. But yeah, you're absolutely right; a lot of the time there's a lot of misinterpreted stuff in the papers, because that's what sells papers and makes people click on the ads, and nobody reads the context, or sometimes they don't have the technical knowledge to actually ask the right questions. You know, it was almost laughable watching the Senate try to grill Mark Zuckerberg from Facebook a year or so ago; the questions they were asking him were just absolutely ridiculous. They weren't anything to do with the business model he'd built, and the real questions were skirted around, because people didn't really know what to ask him, you know.
Are you suggesting our regulators might be out of touch?
Well, the American ones certainly are. Um, it was, you know, it was
I think my favorite one was when one of the senators was asking, I think it was Google's CEO, about something that was on the iPhone, and he had to say, that's not a phone we produce.
Yeah, that's right. Yeah, I know. And that sums it up really, doesn't it? Because all the policies and things that are happening, and the problem, say, from a tech company's point of view, whether it's a Microsoft, Apple or Google, is we're inventing these things. We might invent a new machine learning model, we might be inventing something around facial recognition, something new, but universities can't teach students quickly enough, because it takes a year or two to get curriculum in place. So we're missing out on that workforce, on training people up. And I suppose, Mike, from your point of view, it's how we get scale, right? Like, scale is very hard in this area, isn't it?
Yeah. And I think you've hit the nail on the head when you say universities can't train our students fast enough to be working in this industry. Looking at the education industry, we've had multiple meetings with different universities, teacher education centers around Australia in particular, about, well, let us help you. We'll come in and, for all your trainee teachers, we'll get them Google and Microsoft certified, so that when they hit the classroom, they're ready to roll. And the universities are like, "Sounds great. Can't do it." And we're like, "Why can't you do it?" And they're like, "Oh, we've got a review board, and we've already had our curriculum mandated, and it's too hard to change, and everybody's too busy." So I'm like, well, these students have been through a school system that's probably not up to date, they're then going into a university which can't bring them up to date, and then they go back around the cycle again into a school system that's not up to date. So, you know, we need to really break that cycle to help that scale and that change.
And leading on from our last podcast, Mike, we were talking about Google and, you know, other big names entering into education, offering six-month courses that are going to cost around $600 US rather than tens of thousands. You can see why they've done that. These companies want the skills, but also the industry is ripe for disruption. They move too slowly; they're not able to be agile enough to keep up.
Yeah.
Yeah. Yeah. But schools, interestingly; like, you know, when I used to work in one of the unis in Sydney, it does take a long time to get curriculum in, and I felt that pain from the vendor side as well. But it's really interesting in terms of schools, because, like yourself, Blake, and the teams that you're working with, you're doing some really innovative stuff. I just think back to a school I was working with in Adelaide almost four years ago. We used Excel, and the machine learning inside Excel, to predict breast cancer. You know, you put in certain features, your age, your weight, your height, your gender, and certain things in there, and it had a model in the back end that they'd created. It was really good. And that's brilliant stuff. Sometimes the things you're doing in school, for example, would be way beyond what they'd be doing in some universities, which is pretty amazing.
Yeah. And I think, if I look at my role from a high level, it's to be preparing students for the world that awaits them. And I think, first of all, just having an environment that's a monoculture, just this one way of doing things, is really bad, and that's why we want to encourage debate and have a breadth of knowledge; you know, teach arts and maths and science and all those things. That's why we have a wide-ranging curriculum, to give people a taste of things. And part of it, when I look at the role, is thinking: okay, you've got six years here, you've got four-plus years at uni, that's ten years, and we're going to get you at year seven and give you a Word doc and tell you to email it to someone? That's not going to be relevant in 12 years. It's a tough thing. We've kind of got to be on that bleeding edge, so that hopefully some of the skills you learn here are actually transferable and relevant to the workforce you're going to enter in ten-plus years, or 12 or 13 if you want to go down doctorates and things like that.
Yeah. No, absolutely. Absolutely. So, when we talked about security a bit earlier, we talked about whether the tech companies can even be trusted, I suppose. So trust is quite an important one, and at scale, being able to think about AI and knowing what's inside the box, what's inside that algorithm, and also where my data is and whether it's safe, is really important. Have you started to approach that with any of the staff in your school, Blake, or Mike, in the school systems you work with?
Yeah. One of the things we kind of have to do is have some kind of controls over privacy. Obviously, security and privacy are different things, but they often get conflated into one topic; security can impact privacy, for sure. But on privacy generally, one of the things we've had to institute in our induction process is basically a mini-course, if you like, of: hey, here's how to think about data privacy for students. Here's how to think about your account as well, because all the things that you have in your account could betray the data of your students if it were to be
leaked. And we have to have things in place that say, you know, I understand what it means if I sign up for a third-party app. I don't want to stifle innovation; in fact, my big thing is about giving power back to the teachers, re-professionalizing the teachers, giving them that opportunity to innovate. So we don't want to stifle innovation, but at the same time we don't want to have data just flowing everywhere and have breaches and these kinds of things happening at school level, because that will definitely stifle innovation, especially in a government school; you can get into trouble with that. So we have to do some kind of upskilling around, you know, what is data security, and what are the sorts of pieces of data you need in those apps. Like, if you're going to use an app, fine, you can use it, but if you don't need to put the students' addresses and dates of birth in there, don't. And if you can sign in with Google, sign in with Google; that's going to protect everyone's accounts a lot better than just a username and password. So there are things like that, and two-factor authentication, on the security side. But yeah, we have to do a little bit of work on the front end, when staff come in, to say, yes, I've done that training, I understand what the terms are of me using other apps that aren't school-provided and vetted,
because we still want to encourage that, even though these small apps can often have breaches and things. You know, the impact of that might not be very much, and the likelihood might not be very much, so we don't want to stop everyone from ever innovating just because something might happen one day.
Fair enough. What about your thoughts on that, Mike?
Yeah, we spend a lot of time just on the fundamentals, even just showing teachers things like LastPass, for instance, so they have a different password for every account, they're not writing it down and sharing it around, and they're not reusing the same password. So just showing them little things like that, how to create secure passwords and teach their students to do the same, is something we spend a lot of time on. Because if you extrapolate across to where things really are in terms of security in a school, they're pretty lax, I think. And even little things like clicking on phishing emails and bringing down your whole account for the school, and things like that. So we spend a fair bit of time on the fundamentals, and just hope that the IT providers are doing the work, like Blake is doing in the background, to make sure things are as secure as possible on the infrastructure side of things.
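[Editor's note: for readers who want something hands-on for the "fundamentals" Mike describes, Python's standard `secrets` module is designed for security-sensitive randomness (unlike `random`) and makes a nice classroom password-generator exercise. A minimal sketch; the alphabet and default length here are arbitrary choices:]

```python
import secrets
import string

# Characters to draw from; any reasonably large, varied set works.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def make_password(length: int = 16) -> str:
    """Generate a password using a cryptographically secure random source."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(make_password())  # e.g. 'q7K!vR2p$Xz9mW4b' (different every run)
```

A natural follow-on discussion for students: how the size of the alphabet and the length of the password multiply into the total search space an attacker has to cover.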
Yeah. No, absolutely. So, looking at the ethics and the security aspects of AI, we've unearthed quite a few examples as we've gone. What are the practical things that teachers or educators listening to this could actually do to start getting their kids involved in thinking about this further? Like you were saying there, Mike, about really trying to get people thinking about the technology, where can we start?
Yeah, that's a great question. I think for me it comes back to where we find integration back into the curriculum they're already doing, so we can start little debates on all sorts of things. You can take an app, for instance, like Photomath. Yeah, Photomath, the app that lets you take a photo of any math equation, and then it will automatically tell you how to find the answer step by step, and then give you the answer; the same sort of technology that's sitting inside OneNote, in terms of students being able to put in a mathematical equation and have OneNote solve it for them. So, just because you can, should you? And you could start bringing things out from there. And I think the other part is, I've got a friend who's quite high up globally in AI, and he lives here in Queenstown, and his son's the same age as mine. I tell this story very often when I'm talking to teachers, because they start to get a bit nervous about all of this: you know, what do our students need, and how do we get our students prepared, and all of this sort of stuff. And I'm like, well, what are you doing with your son after hours, outside the curriculum? Because whatever you're going to do, I'm just going to copy that. And he says, you know what, you don't really need that much. You just need a good, well-rounded curriculum, where students understand data, they understand ethics, they understand history, because we can learn from the past, and where people know what type of data to put into AI to get the result they need. So he's saying, so long as you've got a well-rounded curriculum, your kids are going to be fine going into the future. So I think just giving them opportunities to think through ethical decisions like that is part of that development of character, but then also the development of the capabilities that sit underneath that character.
Yeah. What about yourself, Blake? What do you think?
I don't know, I'm still listening, I think. But broadly, if I take a macro view of it, I think it's like any other revolution, the industrial revolution, the technological, you know, silicon revolution. It's going to affect people in different ways, and I think the more, like Mike said, we can have a conversation around that, the better. And it's going to affect every subject, whether it's recolorizing historic video or photos for your history class, or whether it's allowing you to go past human investigation and analyze data in different ways for physics or maths projects and things like that. I think it will progress us as a species, so I think there's tremendous opportunity there. But how we integrate it, how we get it on the ground, I think that's just going to be a case of how we're going with edtech, which is probably a good example. And if you ask me, where is the edtech revolution in schools? I don't think we've fully seen that or borne that out yet. We're sort of being forced, in a way; our hands are being forced to keep up with what's happening in the corporate world, and what our employers are wanting, what skills they're wanting from our students who are graduating. So I think a lot of it is going to be pushed, fortunately or unfortunately, depending on how you look at it, from corporate interests and what they're looking for in terms of skill sets in graduates from university and things like that. But part of it has to rest on the school as well, to implement it in a way that makes sense, in a way that's digestible. And I think with AI that's a big challenge.
Um, in the same way, you know, putting a laptop in the hands of every kid has been a big challenge, and schools today, you know, still struggle with that, particularly during remote learning and all those challenges.
Yeah. Mike, where do you see the future of AI in education?
I see it as being incredibly positive, in terms of teachers being able to use AI and machine learning to inform teaching and learning. There's a company out of Auckland called Soul Machines, and they're building kind of like artificial intelligence robots, in a sense, but they're very sensory. They've got like a neural network. I'll put some links to some YouTube videos in the show notes for you as well, but they've created a teacher, and, you know, like you guys talk a lot about narrow and broad AI and so on, this is very narrow AI. The teacher has been trained to teach students about weather and energy. It was bought by one of the big power companies here in New Zealand. And so it can sit down, present information to a student, ask the student questions, scaffold the knowledge appropriately. They're at the point where they can detect frustration and excitement and the energy and the emotion behind it, and then change the way they're presenting the information based on that. So it's learning and it's becoming more responsive. So extrapolate that out five to ten years and I can very much see artificial robots, or virtual teachers, in a room doing the rote learning side of things, you know, being able to scaffold up and down according to the students' needs, and that then releasing the teacher into more of a coaching, support, mentoring kind of role. And so with all of that then comes the whole question of: we've got so much data in our schools and we just don't even know what to do with it. So, you know, at the same time,
yeah, do we want teachers to be almost data scientists? So how can we use machine learning or AI to produce predictive or, you know, useful reports, where teachers don't need to know everything behind the scenes but they know how to make judgments based on them? So, I see it as very positive going forward. Um, yeah, I'm quite excited for it, to be honest.
Cool. How about yourself, Blake? You're kind of innovative and you're really leading the way with your kids there. What are your general thoughts, though?
Um, I think, you know, even now we're already seeing that a lot of these learning analytics packages apply some kind of machine learning to, you know, make predictions and things like that. That's sort of in its infancy. I think if you look at where schools are going to be at with data collection in five to ten years, it's not going to be just looking at your NAPLAN results or your, you know, your GPA. You're going to be doing more of the analytics work that companies would do in terms of tracking profiles and, you know, profiling your students and making sure that you're not leaving kids behind. And I think there are so many benefits to that, in terms of, you know, giving kids a great, well-rounded curriculum, giving them access to the right supports when they need them, and identifying those needs before they happen, being more proactive than we've ever been before. I think that's a big benefit that's going to come out of it. In terms of taking over the teacher, I can pretty much dispel that. I don't see any future in which there's a robot teaching the class, because if you look at the data that we're processing right now, the biggest factor is the teacher. It's not the subject you choose that improves your chances of getting a good score, it's actually the teacher. So that right there tells you something: okay, you can have kids that are passionate about a subject, but if you don't give them the right teacher, it's not going to work. And vice versa, you can have kids that aren't passionate about a subject, give them the right teacher, and it does work.
Um, that's true.
Yeah, we see that in our data sets every time we do this. So, I think that's really telling and kind of reassuring, I suppose.
Yeah, I remember there's research by Dylan Wiliam around, I think, you know, I'm going to completely mangle his research here, but the way he presented it was that there's more variability inside a school than between schools. So what he was basically saying was it matters who is teaching you, not which school you go to. Um, so it was really interesting, and that kind of corroborates your findings as well there.
Yeah. And, you know, like Mike was touching on, we have so much data: we're data rich, but we're information poor. So, how we present it matters. We can do Power BI and Google Data Studio dashboards, and whatever you call them, student dashboards with all the data on them, as much as we like, but ultimately the teacher knows the kid after a while. What I think helps in this case, more for the student, is to give them the data in a way that they understand, in a way they can manipulate as well. So one of the sort of experiments we're working on is, hey, can we predict where you're going to be by the end of Year 12, say what your ATAR score, your sort of end-of-school score, is going to be. And then we can look at all your inputs. We can take attendance, we can take effort in class from your progress reports, we can take your raw score on your learning tasks and all the, you know, the outcomes you're doing throughout the year. And then what you do is you say, well, here's where you sit now, and give them some sliders and say, go play with that, see what happens. If you, you know, improve your attendance, if you do more of this, if you change your subjects, do more maths subjects, that's going to work better for your model at the moment. So I think there are some interesting ideas there in terms of giving kids a little bit more of the power to make good choices. At the moment, I think a lot of the data is kept very close to the administration's chest, and they look at it and they make decisions and determinations. I think we're going to see that flip on its head, almost, where most of the data is going to be served up to kids in a really palatable way, and the teachers will just use it for that kind of early warning and proactive use, to get kids the supports they need.
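[Editor's note: the "sliders" idea Blake describes could be sketched as a toy what-if model. This is purely an illustration, not the school's actual system: the weights and the 0-to-1 inputs are entirely made up, and a real model would be trained on historical student data.]

```python
# A toy linear "what happens if I move the attendance slider?" model.
# All weights are hypothetical, for illustration only.

def predict_score(attendance: float, effort: float, raw_score: float) -> float:
    """Predict an end-of-school score (0-100) from three inputs,
    each expressed as a fraction between 0.0 and 1.0."""
    weights = {"attendance": 20.0, "effort": 30.0, "raw_score": 50.0}
    predicted = (weights["attendance"] * attendance
                 + weights["effort"] * effort
                 + weights["raw_score"] * raw_score)
    return round(predicted, 1)

# Moving the attendance "slider" up changes the prediction:
baseline = predict_score(attendance=0.8, effort=0.7, raw_score=0.6)
improved = predict_score(attendance=1.0, effort=0.7, raw_score=0.6)
print(baseline, improved)  # improved attendance gives a higher prediction
```

In a real deployment the interesting part is the trained model behind the sliders, but the student-facing idea is exactly this: adjust an input, see the predicted outcome move.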
That's fantastic. Well, it's been brilliant chatting to you guys today. So, what do you think, Lee?
Well, it's always really interesting, Dan, to hear people from the, I want to say that word again, pedagogy side of the world, because they just have such a different view, and, you know, I just have immense respect for teachers and educators who are really grappling with this problem: there's so much they want to teach our children, so much stuff that we need to teach children, but it falls on so few people, you know, this sort of tight group of educators and teachers, to figure out how to navigate that path. And I think about a couple of comments they made, you know, just the pace of technology movement and how important technology is becoming. It must be so difficult. So it's super interesting to hear their thoughts. I guess you enjoyed the discussion there as well?
Yeah, yeah, totally. You know, the thing that gets to me with it all is how many teachers I speak to, and it came up as well with Mike, you know, talking about scale, because it's something that Mike is very passionate about, scaling these things out. There are heaps of educators, and in enterprises there are heaps of people, who are really innovative. As we know from the commercial world, whether it's the health service or departments of education or other government departments, it's certain departments and certain innovators who really drive that change and that transformation. And it's the same in education. You get a teacher, or an IT manager like Blake, who's really keen to innovate, really keen to try different things, who is aware of all of the security and privacy aspects but also wants to innovate and really cares about where the kids are going to go in the future. But that scale is really hard. So when you take Blake, or a really innovative teacher, out of that environment, things just kind of go back to the norm. And we see that in enterprise as well. You know, if an IT manager moves on, things slow down; if a new IT manager comes in, sometimes things speed up. But sometimes, like you're saying, there are only a few people within an organisation or within a school who actually drive a lot of the change. So that scale element, for me, is something that is really difficult to fathom and actually achieve.
Yeah, no doubt about it. And, you know, again it's testament to the impact that teachers have; I know personally, and many people probably, just underestimate how much influence they have on these things. And the scale challenge, I guess, also shines a light on the value of the human capital in this process. There was a point in the conversation, I forget, it might have been Mike who mentioned it, this idea that there's always this concern that AI is about taking away human jobs, you know, driving the truck I think was the example he gave.
And I think it just really continues to shine a light on this idea that AI is a tool to amplify human capital and amplify human capability. A teacher who can take advantage of AI to better understand the needs of, or the demands of, or the challenges of individual students, that helps that teacher scale somewhat, you know, and gives them the opportunity to be more things to more students. But you know what else was apparent to me in the conversation? It was bigger than AI. You know, I know this podcast is about AI, but clearly there are a lot of associated problems with technology generally. I think, you know, security, which they talked about: you were giving them examples of the risks of being an owner of security and data, and the privacy risk that goes with that. And when you put all that together and look at it from a teacher's perspective, there really is a lot they have to stay relevant with and stay up to date with, in order to do justice to ensuring that students understand technology broadly, and AI specifically, as a skill or a learned capability for the future. Yeah.
Yeah. And, I suppose, with technology it's very different to other subjects. I'm just thinking generally here, but, you know, the curricula that are released, say in Australia or in the UK or the US, are kind of released and revamped and made relevant, and obviously governments drive that. But things like maths, you know, area: my son at the minute is doing his test on area, volume, algebra and something else, and it's quite straightforward, and that's the same in the US, it's going to be the same in the UK. That content is there, and it's standard content, and sometimes the weighting of it might change, but the technology curriculum changes vastly; you know, there's new content. Maths does change at the university level, where new things are constructed, when we talk about quantum physics or whatever it might be, but, you know, it's very seldom changed as a classical science, or art, or whatever you want to call it. Whereas with the technology curriculum, it's very hard for teachers to innovate, because that curriculum comes through slowly, because the curriculum's got to be standardised, because you've got to test people against each other, and, you know, it takes years to come through, and the pace of change of technology is on a monthly basis. So it's very hard for teachers to catch up.
Yeah, I think the guys were talking about this idea, this new wave, and I know Microsoft does it, and Google and others do it, where big tech or industry tries to inject itself into that curriculum. And, you know, that's good, and I think it's great that we do, because we do bring a level of relevance to what's going on today. But it's not perfect and it's not scalable, and
it almost gets me thinking that, you know, when I was at school, far too many years ago, it was traditional learning. You did English, maths and sciences and geography and history, and technology was a thing you might have done as an adjunct class. You know, you'd go and spend half an hour a week playing with computers. That's obviously changed a bit now, and, you know, kids are using computers more often. But I do wonder now, if technology is so intrinsically part of what daily life is, you know, in terms of not just work but the things we do on social media and devices and all that kind of stuff, and I don't know the answer, but is it something now where technology almost needs to be infused in the way in which all curricula are delivered? Not a curriculum of its own, but actually, you know, when you're teaching, say, maths or English or geography, technology is a tool. So instead of looking at geography and learning about, like, tectonics or, you know, different rock types, you're learning about how technology might help us better predict, say, you know, geological changes in a particular area. And it just infuses that technology thinking into each subject, versus just saying to students: right, here you learn geography, here you learn maths, and in that other class you're going to learn about Python and, uh, you know, algebraic services for computation of large numbers or something,
and it feels different. It's not part of everything else. I often found that when I was teaching myself, because a lot of the things the kids would be really enthused about: when you're teaching some of the dry topics in computer science, like cryptography, you could really bring that to life with stories, and there's a lot of stuff around history and a lot of stuff around maths in cryptography. And then, you know, when you start to talk about security and public-private keys and all the big prime numbers and things like that, it starts to become relevant again. And it's the same with algebra and programming. You know, I did teach maths, but when I was in school I really didn't like algebra at all. I don't think it was my teacher; I just wasn't really switched on to that particular subject. But then, when I started to do computer programming and started to put variables in, you know, A equals B times C, it suddenly clicked. I was like, ah, okay, this is just abstracting the numbers.
It's application. It's the application of a theory to a practical, real-life thing.
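[Editor's note: the "click" Dan describes, algebra as abstraction over particular numbers, is exactly what a program variable does. A minimal illustration, ours rather than from the episode:]

```python
# The algebraic identity A = B * C works for any B and C.
# A function makes that abstraction concrete: the same rule,
# applied to whatever numbers you pass in.

def a_equals_b_times_c(b: float, c: float) -> float:
    a = b * c  # the "A = B * C" Dan mentions
    return a

print(a_equals_b_times_c(3, 4))    # 12
print(a_equals_b_times_c(2.5, 2))  # 5.0
```

The variable names stand in for the numbers, which is precisely the step from arithmetic to algebra.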
Totally. Yeah. And I think the ethics side of it as well bleeds into a lot of that pastoral care and the way that people should be, like Blake mentioned in the podcast, about that discerning use of technology for the future, being job ready, and thinking about these things, you know, thinking about the security and privacy. It shouldn't just be down to IT. There was a good analogy from the UK. I remember when I was teaching, at one point somebody said to me, and I can't remember who it was, so if they listen to this podcast I'm sorry I'm not attributing it, but basically they said: if you get a problem in school with a knife, if a kid comes into school with a knife, god forbid, you don't take them to the technology teacher and say, well, you do knives, can you deal with this person? You know, everybody deals with it. And what we had in the UK was, if somebody had an issue at home with social media, or was being bullied on Instagram or whatever it might have been, they'd come to IT as if it's an IT problem. You know, IT can deal with the technology side of it, whereas actually it's a problem for everybody. And I think that's an interesting one: spreading that IT and technology education out across the school is a good idea. Absolutely.
Well, I think so. I mean, it's easy for us to say here on the podcast without really thinking about the application of it. And, you know, Mike and Blake really shine a light on the real challenges: these big, bold ideas are wonderful to have, but in practical terms, as a teacher in any kind of school, in any kind of school system, there's a reality to that. You know, you're one person and you have to assimilate all of that knowledge and then help your students and propagate it across the school. And, you know, I get it. It's not much different to being in the commercial world like you and I are, where you can have a great idea, but time and focus limit your ability to impact enough people. The scale point that you made up front, it's hard to do. And I think we should acknowledge that challenge.
Yeah. The other one that jumped into my mind as you were speaking there was Conrad Wolfram. He did a famous talk; he's part of the company behind the Wolfram engine and things like that, a pretty amazing mathematician. And he was talking about the fact that computation has changed now and allows us to do really interesting things with data, but in the classroom we're really going back to basics, which is fair on one hand. But he was very much along the lines of: let's put all that data into the kids' hands, let's use real-life data, let's do modelling, let's do simulations, let's show analytics, and really use big data and use computing to actually move maths further on, rather than stagnating in the historical plane of maths. It's interesting.
Yeah, it's a fascinating area, you know. And I think there was a comment made at the end where, you know, they disagreed with each other about whether robots in the classroom are a good thing or a bad thing, or will or won't work.
It's an interesting philosophical discussion, but I think it does raise this issue: we don't want to see a world where school is driven by an ultimately commercial outcome. And I know it wasn't necessarily what they were thinking, but this idea that, you know, Microsoft or Google or anyone else starts to craft a curriculum for schools, because then you end up in this world where it's "today, kids, we're going to learn about quantum computing, brought to you by the IBM school of quantum computing". Suddenly you're back to that narrow, well, here's the only outcome you're going to learn. It's this dystopian future where everything is a commercial entity. I'm sure I've seen something like that in a movie somewhere. But I think that's a
you know, I get that concern about big industry getting too deeply involved in the educational sector. I can see the challenges.
Yeah, no, absolutely. Thanks for your comments, Lee. And, um, yeah, let's see what we're going to do in the next episode. Maybe we can explore that pop-culture-esque element of AI, as you just mentioned there, with the lecture sponsored by Pepsi. Let's have a bit more of a deep dive on that next episode.
I reckon that sounds like a fun idea. Let's do it. Cool. Thanks.