Welcome to the AI in Education podcast with Dan Bowen and Ray Fleming. It's a weekly chat about Artificial Intelligence in Education for educators and education leaders. Also available through Apple Podcasts and Spotify. "This podcast is co-hosted by an employee of Microsoft Australia & New Zealand, but all the views and opinions expressed on this podcast are their own."

Feb 26, 2020

Does the experience of a retail flagship store have a lesson for education? How does the way that retail blends online and in-store experiences through data and applying artificial intelligence provide lessons for education organisations? Two questions we set out to answer during this podcast.
 

We look at the advances of AI in the retail sector, and learn about things that are relevant to education. It starts with a fascinating conversation about the difference between Amazon and Walmart, and how a retailer that has traditionally been strong in grocery lines is able to compete against a new style retailer that has a digital business model. Marcy tells the story about how they are able to gain a data advantage through their customer relationships made through physical stores, and how they turn that to a competitive edge using artificial intelligence. This has interesting parallels to education, especially when you consider the encroachment of digital education and tools.

 

Marcy also talks about Starbucks and their use of artificial intelligence, and how they take flagship stores that are 'analogue, and pure theatre' and blend that into a digital relationship for their local stores, where a large proportion of their growth is driven through a mobile app. Of course, we talk about coffee buying preferences (Marcy's almond cappuccino with cinnamon on the top) and how AI helps them to personalise recommendations based on location, not just on your order history. It was also interesting to hear about deliberate decisions to not be digital in their flagship stores, where they designed experiences to be about spending time and switching off, rather than speed and efficiency.

 

When we got to the subject of the 'dream team' within an organisation to make best use of artificial intelligence and data-centric thinking, Marcy had a clear perspective, backed up by data, on the need for organisations to become skilled at this right across the organisation, rather than leaving it within the domain of IT or a Chief Digital Officer. As Marcy puts it, "IT leadership can't be in the corner, it needs to be infused across the organisation".

________________________________________

TRANSCRIPT For this episode of The AI in Education Podcast
Series: 2
Episode: 7

This transcript was auto-generated. If you spot any important errors, do feel free to email the podcast hosts for corrections.

 

 


Hi, welcome to the AI in Education podcast. I'm Dan.
And I am Ray.
Hi Ray. How are you today?
I'm great, thank you.
Fantastic. If you remember, in the last episode we talked to Dr. Nick Woods about healthcare. Looking at these different people who are doing fantastic work in AI has worked really well. But I was looking at your calendar recently, Ray, and I noticed that you've scheduled a meeting with Nicole Marson, our industry lead for public safety and national security. What is the link there with education, Ray?
Well, do you know, I've been really pleased with the conversations we've been having. It has been interesting finding the parallels between something happening in healthcare and education, or something happening in national parks on the other side of the country,
and what's happening in education. And you might wonder, public safety and national security, what's that got to do with education? Well, it is really interesting, because in talking to Nicole recently, and one of her partners, around a project, they have this project, SOA, which is about being able to make sense of huge troves of investigative data, to be able to bring all of that data together and work out patterns and pictures. And I know that when I saw that, I immediately thought about university researchers who are trying to make sense of huge amounts of data and looking for patterns and pictures within that data. So, I know there's some relevance. Okay.
So, anyway, had a chat with Nicole.
Yep.
She agreed to meet me. Now, look, the audio is a little bit echoey because we had to meet in the secret underground bunker, but listen to what Nicole had to say.
Hi, Nicole. Thanks for joining me. Can you just tell me a little bit about who you are and what you do?
Thanks, Ray, and I really appreciate the opportunity to have a chat today. Look, my role is the public safety and national security industry executive. So that looks after everything from law enforcement, courts, corrections, intelligence, judiciary and defense. I act pretty much as a thought leader in that area, working with our partner ecosystem, looking at our technology set, discussing with our customers what their problems and challenges are, and looking at what the opportunities are for us to make a difference in that sort of industry.
So when I think about artificial intelligence, and I think about all of those things you talked about, I think about Robocops and robo-warriors and all of those kind of scary scenarios that we see. Yeah. But I know you're doing some fascinating work with your customers about using artificial intelligence to help them solve some age-old problems. I mean, the one I came across was Project SOA, which was fascinating. So maybe, for the benefit of everybody else, because I've heard a little bit about it, do you want to tell us a bit about SOA, how it started and what it's doing?
Yeah, absolutely. In fact, it was probably one of the most exciting things I've worked on in my entire career at Microsoft, which is nearly a couple of decades. But you know, it's where we really felt that we were able to make a difference. The problem, and we were working with the head of a computer crime division within a state law enforcement agency, was the explosion of digital evidence. So if you think about it, nearly every crime that's committed these days has some sort of digital footprint. It could be a text, it could be a photo, it could be a voicemail, an email, a Facebook posting, a financial transaction. There is a whole remit of digital pieces of evidence now. And the issue that police agencies are faced with, the conversation started off with storage, but it was how do they investigate all of that? How do they look at all of that, and then how do they draw the insights from it,
and then how do they also look at it again against another investigation? Are there any other connections? And then how do you take it to another division? If you've got a sex crime division looking at it, are there associations with, like, organized crime? And then it's even bigger: how do you then draw from that evidence to look at interjurisdictional crimes and international crimes? The thing is, criminals don't have any boundaries. They don't have jurisdictions like law enforcement does.
Policing these days is having more and more challenges, not just storing it, but how do they work with these huge, huge files of digital evidence?
I can imagine, because even just, you know, one phone seized might have all of that information on it, and I'm guessing in complex investigations they're looking at multiple phones and multiple emails, and how they bring all that together.
Yeah, you can throw CCTV footage in there, too. You know, sitting down and looking at all the CCTV footage that may have been captured around a particular event, looking for somebody as part of that. Yeah, it's massive. There's a good example that I can give. There was one, and it's been in the press, and I can't go into too specific details, but we're talking about a crime that was committed nearly 30 years ago, a serious crime. The digital evidence and footprint that was given to police to go through was 35 terabytes of data.
Wow.
That's nearly 30 years ago. So, think about what that explosion of evidence now looks like. There is a case that we're working on at the moment where I know there were six phones and three SIM cards, and we're talking about, again, terabytes of data that investigators and analysts have to browse through, and that's just a couple of phones.
So, that's interesting, because in education we talk about big data, and we don't really have that much big data in mainstream education, but we do have some, and researchers working in universities will be dealing with all kinds of massive data sets in the same kind of way. So what is it that SOA does that helps investigators?
So first of all, it's the consumption of data and storage of data. And look, I mean, interestingly enough, going back probably four or five years ago, I couldn't even have had the conversation with a customer about how we potentially could help them with that. So, you know, storage of the data in a secure manner is firstly the most important thing.
Yeah.
And that's where, you know, the management of our data centers and being protected was a bit of a game changer, but really it is about deriving insights from this, right? And that could be everything from language translation. So in the recent investigation that we were going through, over 80% of the data that was seized off the phones was actually not in English,
right
so for law enforcement to basically go through this evidence, and SOA is based on analyzing legal evidence, legally seized evidence,
Yeah.
For them to get secure translators takes not only time but is extremely costly.
So some of the quick wins were just doing the processing and transcription of text messages and posts and phone calls, translating it into English in a way that an investigator or analyst could then look at and derive insights. One particular example that we were looking at, and we have actually done a benefit case on this in building the business case, was the language translation for this particular case. In normal circumstances it would have taken over 12 months, right, for that data set and that amount.
There was everything from slang as well, and of course the more data we put in, the smarter it gets. But there were 16 languages that were used in this investigation. The translation was done in three and a half minutes.
So even just a core capability like translation, which is an AI feature we've talked about on this podcast for use in education, being able to listen to a lecturer or a teacher and then translate it into a student's home language. You're effectively doing it the other way around, but it's the same technology, used in a very different way.
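For readers curious what a bulk-translation step like the one described might look like in code, here is a rough sketch against the Azure Translator v3 REST API. The endpoint shape and response format are real, but the key, region and helper names are illustrative placeholders, and this is not the project's actual pipeline.

```python
import json

# Base endpoint for the Azure Translator Text API (v3).
ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"

def build_request(texts, to_lang="en", key="YOUR_KEY", region="YOUR_REGION"):
    """Build the URL, headers and JSON body for one translate call.

    The service auto-detects each item's source language, so a mixed
    batch drawn from many languages can go through the same call.
    """
    url = f"{ENDPOINT}?api-version=3.0&to={to_lang}"
    headers = {
        "Ocp-Apim-Subscription-Key": key,       # placeholder credential
        "Ocp-Apim-Subscription-Region": region,  # placeholder region
        "Content-Type": "application/json",
    }
    body = [{"Text": t} for t in texts]
    return url, headers, json.dumps(body)

def parse_response(payload):
    """Pull (detected_language, translated_text) pairs out of a v3 response."""
    return [
        (item["detectedLanguage"]["language"], item["translations"][0]["text"])
        for item in payload
    ]
```

A real pipeline would POST the body to the URL (for example with the `requests` library) and feed the JSON response through `parse_response`; no network call is shown here.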
Exactly. Exactly. And I mean, you know, once it's in a manner that they're able to consume, then it's easier, obviously, for them, or the system, to basically derive insights and connections within the data set.
I think that's one of the most interesting things about this work: the ability to connect information together and to see a picture of a relationship, or to be able to link
Yeah.
aspects.
It's pretty scary, actually. I mean, when you look at some of the agencies that we're working with, law enforcement agencies, they know, and I think it would be the same for researchers, they know that the data they've got is an asset that they're probably not exploiting to the full potential that they can or they should. It's unfortunate that when we look at most major events, there's always been a little bit of data or a little bit of information that may have been sitting in another agency or another division that could have made a difference to the outcome of a particular tragic event. So how do we bring all that data together, and how do we surface up those connections and relationships and insights? And that's where, with the agencies, and for reasons I can't name all the instances of our customers, we're helping them take all their data,
and I'm talking just under a hundred petabytes
of data that they've had sitting there on static disk, and putting it into the system to derive insights. And it's quite interesting what's happening, because we're starting to see some fruition of the data and the ingestion of it, and it is moving towards helping solve cold cases that are decades old.
So it's interesting because I've got a question about timing. So in education for example we see there are lots of use cases
but we're in the early days of getting started.
Yeah.
It sounds like in your area, the kind of public safety and national security area, you're ahead of the curve in terms of doing it. But why do you think it's happening now? What is the difference now compared to five years ago? What's changed? And thinking specifically from an AI perspective, have things changed that make a difference?
Well, look, I think it's two parts. One is the amount of data that we have to deal with is exploding at a rate that a human just can't deal with. And as I mentioned before, in our industry, criminals don't have that sort of issue around jurisdictions and policies and things like that,
and law enforcement, in particular intelligence agencies, have to get ahead, right, and they always feel like they're behind,
but to be fair, I don't think the technology was there to enable them to do it. So I think it's a convergence of need, not want but need, they have to get ahead of this, but also the technology now being mature enough and secure enough that they can basically derive those insights or share that data in a protected way.
So Nicole, I imagine your dream team, when you're putting together projects around AI, might be that you want somebody that really understands the technology, but then I'm guessing you need what I think of as the gumshoe detective, the person that's got the practical experience, together.
It's funny you should say that, Ray. It comes back to a memory, probably about five years ago, where an assistant commissioner gave a presentation to some of the Microsoft field. I actually asked him to come in and have a chat to the account executives and account technology strategists about how to engage with police, and the conversation, and understanding the business. And the presentation that he gave was basically three slides. The first slide was a picture of a dolphin. And he looked at this and he pointed to it and he goes, "A dolphin. This is, you know, something that's incredibly smart but can't talk to humans. Microsoft, that's you."
And I'll never forget that. And I laughed my head off, because I thought, "Yeah, okay. I can understand that. That's true." But then the second picture he put up was a picture of a goldfish. And he said, "This is your typical law enforcement officer. Oh, shiny new thing." And around it goes again. Oh, shiny new thing. Oh, shiny new thing. So, continually being distracted by the latest and greatest new shiny thing. And that's, in some ways, how a lot of shadow IT projects pop up. But the third one was a picture of a knight, a knight in shining armor. And that picture depicted the fact that that's your translator, and that's generally somebody in the business that understands the benefits of the technology and what it can bring to the business and to the business problems they've got. So he's a translator, speaks both dolphin and goldfish, and it really resonated. To this day, I still hear people that were in that presentation talk about it. So it's finding that person within the business that can understand the impact of how technology can help them move forward, bring everybody together and do the translation. And that's when we see acceleration, I suppose, a bit of a mindset opening up about the art of the possible.
Yeah, and I find that an interesting scenario, because that's similar in education. You need that person that can understand the true potential. Not "we're going to take process X, like read a document faster", but the examples you gave us with SOA, where it's "we're going to relate the photos to the text messages to the information we find in emails".
And that's something that is a completely different process that's unlocked by transforming something in a digital way.
Exactly. And where we've had great success in engagements with the business is where they've come in with the view of "we're not going to tell you how we do it today, because we want you to come with new ideas and bring in other aspects of what you've seen work in other industries". So we don't have a fixed mindset on the approach, and we're willing to try things out, and if it doesn't work, well, we fail, and we fail fast, and we learn from it and we move on to the next bit.
And I guess stay focused on the objective and the outcome rather than the process that we're in today.
Exactly. Because particularly in government and agencies and law enforcement, some of these business processes are decades old. Some of the policies are even older. So it is sometimes challenging for the mindset, for us from a technologist's perspective, but also for the customers who are saying, "Oh, this is the way we've always done it."
So it's interesting because I thought we were going to spend all our time talking about facial recognition because that's where I see the headlines going.
Ah, yeah. There's a lot of interesting comment going around facial recognition. I mean, facial recognition is a hot topic in public safety and national security. Facial recognition and the services that are part of our AI services are used in the likes of SOA, where it's used for digital evidence. It can help, you know, identify people that have relationships with known offenders, or people on watch lists, so to speak, and that can help solve crimes or prevent crimes. That's the ultimate outcome. But that's where it's being used legally. There are some sensitivities about how it should be used, and that's why it's very important, when we look at the ethics of AI within Microsoft, to ask whether it should be used for surveillance. You know, we've got to think about how that is impacting individuals' privacy and human rights, for example. So just because we can do it doesn't mean we should do it. I've been involved in several projects whereby, you know, we've reviewed what we're trying to achieve with the customer, with the partner, and with our corporate and external legal affairs office and our AI ethics committee. And there have been a couple of instances where Microsoft has said, "No, we do not believe that this is right, that this is the ethical use of AI, and it goes against some of our principles." So, we have actually stood up and said no, we won't allow our technology to be used in those instances.
And and I've seen public reporting on cases like I think it was one of the police forces in California where we didn't want it to be used because we knew that it wasn't as effective with certain minority groups and therefore could lead to overtargeting or whatever. I guess it's that surveillance state thing that
Yeah, it is. I mean, that's the bias, because at the end of the day, facial recognition is trained off images. So it comes down to the data set that it is learning from to identify individuals, and so, you know, we have to be very mindful of that, particularly around diversity and inclusion, and we want to make sure that no particular group is targeted. So that's where we have to be particularly careful,
and then you talked about "you can do something, but should you do something". That's, I guess, where the goldfish and the dolphins get into an interesting conversation.
Yeah, I think that's when the sharks come in, probably the lawyers. So, look, I mean, there are avenues where facial recognition is being used, but not on the Microsoft technology set.
Yeah.
And it's interesting, even in New South Wales, and I'm going to be watching this quite closely, where they've started to use cameras that have been put into the infrastructure on the roads and bridges to identify the use of mobile phones in cars.
That's not something that Microsoft has been involved in. But, you know, is that actually impinging on the privacy of an individual when they're in their own vehicle?
And I've certainly seen scenarios in universities globally, for example, where they've installed cameras, and then, because you've got a camera feed, you can start to get a little bit of scope creep, to go, "Oh, well, we could do this and we could do this." And suddenly you start counting how people move around campus, but then you start identifying people, and
yeah,
you just plug in another piece of technology. And so I guess the conversation that we've had on this podcast before is about responsible AI, making sure that
we, working with our customers, have guidelines and frameworks in place to make sure it's being used well. So I guess you probably talk about that more than we do in education, because you're probably more at the bleeding edge of the use of technology.
Yeah. And to be honest, it's where we're finding there's more gray areas, too.
We talk about when we work with police, should facial recognition be used, for example, at a major event and the potential, you know, identification of known individuals that are on a terrorist watch list.
So, we have to sort of sit down and work those out, but we have a process. That's the great thing: Microsoft has a very clearly defined process that we follow to determine, is this an ethical use of AI, and it's not just facial recognition either,
yes,
but is this an ethical use of AI.
So that's been a great conversation. I've got a really interesting insight into the use of AI in your field, but I know that there's an overlap, isn't there, between the world of education and the world of public safety and national security. We've done a lot of projects in the past where we're thinking about things like child exploitation, where there are projects that cross the two boundaries.
Yeah. Yeah, we do. There is a lot of crossover into education in particular, but also into health and other areas of government as well. Child safety is very important to us. We've had a Digital Crimes Unit for several decades, and the Digital Crimes Unit developed the Child Exploitation Tracking System, which we donated to police all over the world. There is another organization that has taken up that product now and taken it further. PhotoDNA, again, we donate that to law enforcement. It helps protect children at risk when images of children are being shared, and helps remove those images from the internet. And now we've got the establishment, very recently, of the new digital safety office. So there are new technologies coming out of that, a combination of law enforcement and Microsoft and education and government all working to protect children, everything from cyber bullying right through to child exploitation and human trafficking, and how we can support in those areas as well. So that's part of the public-private partnership, where I think Microsoft shows a great amount of responsibility in ensuring that we provide the safest environment for those that are using it online.
And I guess you're working at the leading edge, but also, I don't know if this is the right word, the slightly more scary edge of some of the scenarios. But then the things that you do there, I've seen end up in our products. So if I think about a school, we have a filtering system that allows you to check an image to see if it's got adult content in it. So that, you know, a child might upload something into a learning management system and not understand the picture they're putting up there. Or to check for bullying or inappropriate language in text
that is being uploaded, and things like that. So we take some of that highly complex stuff used by the public safety and national security agencies and turn it into, not quite consumer, but education-grade services. So, you know, there is a flow through.
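To make the idea concrete, here is a deliberately tiny, toy stand-in for the kind of text screening described above. A real learning platform would call a managed service (for example, Azure's content-moderation capabilities mentioned later in the conversation); the blocklist and function names here are purely illustrative.

```python
import re

# Hypothetical blocklist of bullying terms, for illustration only.
BLOCKLIST = {"idiot", "loser"}

def screen_text(message, blocklist=BLOCKLIST):
    """Return any flagged terms found in a message, lowercased and sorted.

    A platform could run a check like this before a student's post goes
    live, and route any hit to a teacher for review instead of publishing.
    """
    words = re.findall(r"[a-z']+", message.lower())
    return sorted(set(words) & blocklist)

def is_safe(message):
    """True when no flagged terms are present."""
    return not screen_text(message)
```

In practice a managed service also handles misspellings, images and context that a simple word match cannot, which is exactly why these capabilities are exposed as cloud services rather than blocklists.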
Oh, absolutely, and you know, the great thing about it is, when we embed those safety features, so to speak, into our products, our partners then take our platform and take it to the next level. That's where, you know, things like SOA were developed, and some of the new systems that we're seeing being developed to help and protect people online come from our partner ecosystem. And that's where it's really, really interesting. You know, we work with some open source intelligence organizations that have taken our platform and are using AI to monitor things on the social media networks as well.
I am so glad I grew up in the age before social media.
I know. I know. Yeah. Can you imagine? We didn't have smartphones around. And I'm just thinking, gosh, thank goodness for that. So, you know, privacy is a big thing, and that's always front of mind for us.
Yes. Yeah. It's interesting. Dan and I have talked on the podcast before about the ethical framework and using AI responsibly. It's probably something we'll come back to, you know, in future months. So, it'll be interesting to catch up with you in six months' time, find out what the latest things are, and whether some of those technologies like SOA have made the leap from investigation into maybe research investigations in universities, where they're dealing with similar amounts of data. So I can see how the stories you told us, and the way that technology is being used, have some relevance to the conversation we're having in education.
Absolutely. In fact, it's funny you should mention that. Even in the last couple of weeks, we've had a few customers that are totally outside of the public safety and national security industry come to us and say, "Hey, we want to learn more about SOA. We think it's applicable in this scenario." And funnily enough, one of those is education. So I can't say too much more at the moment, but probably in the next few months we can certainly share how that's being used in that and other areas of government.
Fab. Well, that's exactly why we wanted to sit down with you and your other industry colleagues, just to understand what's happening in other industries that's relevant to education. So it's been great to be able to sit down with you. Thanks, Nicole.
It's been a pleasure, Ray. Thank you.
Well, Ray, that was fantastic. I really loved that podcast interview, and just looking back, a few things jumped out at me. Obviously, we introduced the podcast and the interview by talking about SOA, and listening to that with Nicole was fantastic. So, what were your thoughts?
Well, look, so many parallels to education. The huge explosion in the volume of data, the fact that there are so many different types of data. I mean, some of the examples she talked about there wouldn't be exactly the same in education. But that explosion of different types of data is an issue that is affecting education as well. Whether that's in university research, where you've got somebody dealing with terabytes and terabytes of data, or whether it's in a school, where you've got data from your student administration system, your timetabling system, your learning management system, your documents, all of those things, trying to put them all together and make sense of them.
And I think that's one of the trends that we were looking at. I think we picked it up in one of our previous podcasts, when we were looking at trends ahead for the year, one of the things about unstructured data. Listening to Nicole there, you know, the fact that police these days are collecting a heck of a lot of data on individual investigations, even years and years ago, 35 terabytes or whatever the amount was that she was quoting, goodness knows how much it is today with all the social media accounts that people have got, and so on. So, the tools we use to analyze unstructured data are going to be more prevalent, to allow us to make use of those data sets and the unstructured data that's available to us in education and government.
I couldn't agree more, and as those tools get more sophisticated, I think we'll have better tools as people inside education that come to us from other industries. But the question I've got for you, Dan,
are you a
I know what's coming yeah
are you a goldfish or are you a dolphin, Dan? And just to remind you, the dolphin,
the incredibly smart one that can't deal with humans,
the goldfish goes around the tank every 8 seconds. Oh, look, there's something new. Oh, look, there's something new. Oh, look, there's something new.
Yeah, it's interesting, isn't it? I think really I'd be the knight. Right.
Of course you would be, Dan.
Because of my job and my strategic thinking in terms of the work that I do with schools. However,
And that, Dan, is why I didn't offer you knight as a choice. Are you the dolphin or the goldfish, then?
I'm the goldfish, Ray. I love shiny things. But obviously, from a Microsoft point of view, you can see the analogy there when she was talking about, you know, some people being the intelligent ones that can't communicate, easy for me to say, "can't communicate", or I really can't communicate.
You are the dolphin, Dan.
I'll stop talking, Ray. But yeah, you're right, and that was a really interesting analogy. I'm thinking of TeachMeets and things like that, where you've got lots of new technologies that we see regularly, and that's really inspiring. You need the people to innovate. You need the people to go, "Wow, look at this piece of technology."
But you need the knight. You need somebody that sits there and goes,
"Guys, guys, the problem we're trying to solve is dot dot dot." So that you stay focused on that because there are too many times when you become enraptured by the technology and you just get really focused on that.
And we'd say the CIOs and the business decision makers who are listening to this are the knights, aren't they? They're the ones that make that decision, learn about that technology, and can implement the things that the goldfish bring to them.
But the other thing that was really key out of that story that Nicole was telling is that actually you need the dream team. You don't just need a knight. You actually need a goldfish. You need a dolphin. You need people that understand the technology. You need people that understand that sense of innovation. Especially in education. You need an educator to be able to put their hand up and go, I'll try it.
And and then you need that person or that team that then stays focused on the outcome you're trying to achieve and that's not a technology outcome that's a human outcome.
Yeah, absolutely. We have to discuss the facial recognition elements that she brought up.
Do you know, I honestly expected that more of the interview was going to be focused on facial recognition, because there are so many headlines around it. I think I got more subtlety in the conversation with Nicole than I'd expected, in the sense that there are different uses for it. So using it to check a fact is very different from the surveillance use.
Yeah. And we mentioned that a couple of episodes ago, didn't we, when we were talking about the use of it to check people in exam halls and things like that, very different to checking a group of students and their emotional state or their age or their gender.
Yeah. What I took from it from the way that Nicole was talking about it in public safety and national security which we can use as a bit of a guideline for thinking about it in education as well would be something like a the can I see your papers please the document check the the student that walks into the hall to say is this a student that's in our student management system you know that's a that's a particular problem probably for universities rather than for schools. But that kind of checking is probably much much more acceptable than saying we're going to put cameras all over the campus and we're going to plot everybody's route around campus. Now you might do that in order to look at crowds and to go we need more coffee shops over there or we need more working areas over there. But to do it down to the facial recognition, the surveillance level,
that seems to be the area where, from a public safety and national security point of view, it's a no-no. And so that same conversation probably should happen in education, because I reckon we're talking about the same thing.
And it's a fine balance, isn't it? That's one of the reasons why we started this podcast: the ethical dilemmas you get, from a business point of view and from a societal point of view, when you look at, say, analysing images. If you were investigating a criminal, for example, and you're looking at their phone, you can picture-match images, or go through and find particular objects in images and correlate them. These days people will take thousands, if not hundreds of thousands, of photographs on their phone and never delete anything, so it's an almost impossible task for a police officer to do that manually. In that context, getting a machine to do it is really the only way you can efficiently find images, and faces within images, and correlate that data. Yeah. And, you know, we've done projects like this for a long time. PhotoDNA, which is what, 10 or 11 years old now, is a technology for finding images of child exploitation, allowing police forces to efficiently find those and intervene and act. That's been around for a long time, but clearly AI is now being used in other ways as well. So the announcement that Nicole mentioned, which is around AI to help spot grooming conversations. You know, that's kind of secret.
More subtle, isn't it?
Yeah, it's secret squirrel stuff in a way, because it's things that are made available only to law enforcement agencies, but then some of that stuff does end up in our products as well. So I know in Azure there's a service which is about being able to spot adult content in uploaded images, or inappropriate language. And that's a system that can be used by, for example, people making learning platforms or learning management systems. If you've got an open conversation platform that you're building on the web somewhere, you need to have those capabilities, because a student may not know something is inappropriate to post, or they may just post the wrong thing. And so we take those technologies being developed in a world that is completely different from our world of normal classroom education, but then they have applications within education as well. And that's the really cool thing I love about some of the AI conversations we have: we have a conversation over in healthcare and you can immediately see an application in education, and you see that technology
education gets the benefit of it being developed for all of these other scenarios and then you also get
Yeah, I must say the parallels with Nicole's interview were quite stark. I thought there were some great examples, and they really shone in terms of how well we could correlate them with examples we could see, especially around students' use of social media and the amount of data we generate. The fact is, I suppose, that the police will be seeing that front line from a consumer point of view much more quickly, so there's the data that they've got to analyse, and then the parallel for me, looking straight at the amount of data that we've always seen in schools, is just absolutely great. So I was really glad you went to interview her; it was excellent.
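Ray mentions an Azure service that learning-platform builders can use to screen uploaded images or text for inappropriate content. As a purely illustrative sketch of what wiring that in might look like, here is how a developer could assemble a text-screening call; the region, key placeholder, endpoint path, and parameters below are assumptions based on the public Azure Content Moderator REST API, not a tested integration.

```python
# Hypothetical sketch: preparing a text-screening request for the Azure
# Content Moderator "Screen" endpoint, e.g. to vet a student's forum post
# before it appears on a discussion board. Nothing here is sent over the
# network; the function only assembles the pieces of the request.

def build_moderation_request(text, region="australiaeast", api_key="YOUR_KEY"):
    """Assemble (but do not send) a text-screening request."""
    url = (f"https://{region}.api.cognitive.microsoft.com"
           "/contentmoderator/moderate/v1.0/ProcessText/Screen")
    headers = {
        "Content-Type": "text/plain",
        "Ocp-Apim-Subscription-Key": api_key,  # per-resource secret key
    }
    params = {"classify": "True"}  # request classification scores
    return {"url": url, "headers": headers, "params": params, "body": text}

req = build_moderation_request("a student's forum post")
# To actually call the service you would POST this payload with, for
# example, the `requests` library, then inspect the Classification
# scores in the JSON response before allowing the post to publish.
```

In a real learning management system this check would sit in the submission path, with a human-review queue for borderline scores rather than automatic blocking.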
Think back three months, Dan. I remember in the AI for Evil podcast you talked about facial recognition for taking a register, and we were on two different sides of the line on that one. What did this make you think about that scenario?
Um, it's an interesting one. I still think it depends what the process is, doesn't it? And for facial recognition in terms of taking the register, I still think there's room for that, because that type of process isn't that invasive. Again, I think it's very important for the people using that technology to be clear and transparent about why it's being used. If it's just being used to take the register, and you have got a legitimate purpose for that, for example, if you've got a split campus and students could literally be offsite and you don't know if they're on a particular site when there's a fire, or it'll create efficiencies for teachers, then it's acceptable, but it needs to be done in the right way.
And if it's releasing time to allow a more human interaction in the morning, one that isn't just 20 minutes spent taking a register,
but instead is focused on, you know, the kids,
that's a good thing.
And I think this is what we're going to be hitting, you know, in the government sphere, in education, in healthcare, as we start dancing along that balancing act of the application in particular industries. For example, attendance in a school may be one use of the technology that might be acceptable, but attendance tracking in a public place or a hospital, with the same technology, might be seen as taboo, or not acceptable.
And do you know what's great? Each of these conversations, looking into a particular industry, is getting us below the surface of the
glamorous and dramatic headlines
into what is actually happening, and it's kind of a different story to the one you would read in the press, I think.
It was pretty glamorous though, I loved that, I have to be honest. That's one of my favorite examples of the use of AI and big data so far.
Well, good news: I know somebody that I can introduce you to, Dan.
Fantastic.
It's a date. Okay, well, that's great. I think we're learning a lot from having these conversations. I'm going to go and see if I can find somebody completely unrelated to education that we can learn from next.
Great. Looking forward to it, Ray.