Welcome to the AI in Education podcast with Dan Bowen and Ray Fleming. It's a weekly chat about Artificial Intelligence in Education for educators and education leaders. Also available through Apple Podcasts and Spotify. "This podcast is co-hosted by an employee of Microsoft Australia & New Zealand, but all the views and opinions expressed on this podcast are their own."

Mar 4, 2020

This week's podcast is a deeply insightful interview with Zanné van Wyk, Head of Data and Analytics at the Catholic Diocese of Maitland-Newcastle, who has a background in both school and tertiary education data analytics and in using AI for predictive analytics. Zanné discusses her and her team's role in delivering actionable predictive analytics for the Diocese. The podcast explores the variety of data sources needed (including coal prices, to predict dropouts), and why you need a different data architecture for reporting/dashboards than you do for predictive analytics. And we also discover the three critical dependencies for delivering effective business value: a data analytics strategy, executive support, and a governance framework.

Overall this is a fascinating podcast interview with somebody who's working at the pointy end of AI and data in education, and shares what's been learnt along the journey. Find out more about Zanné and her work at https://www.linkedin.com/in/zannevanwyk/

 

________________________________________

TRANSCRIPT For this episode of The AI in Education Podcast
Series: 2
Episode: 9

This transcript was auto-generated. If you spot any important errors, do feel free to email the podcast hosts for corrections.

Hey Dan, it's another week. It's another AI for Education podcast.
Can't wait. This is great.
Well, you don't have to wait. It's another week. It's another education podcast.
Brilliant.
Look, I think we've had a great four or five weeks of interviewing some really interesting people, talking about how AI is being used in other industries, and it's really fascinating that the parallels exist there to education. But you know what we said last week? We need to go and find some people in education.
So Dan, I think it's time you got your car keys and headed off to go and talk to people in education and bring some stories back for us to chat about.
I sure will. So I've gone out to interview, well, I've done this already, Ray. I've gone out and interviewed Zanné van Wyk from the Maitland-Newcastle Catholic Diocese. She's a data scientist there, and she's done some fantastic work over the past couple of years. So, we're going to hear all about it now.
Hi, Zanné. Welcome to our AI in Education podcast. Tell us a bit about yourself and your path to your current role.
Hi Dan, thank you so much for giving me the opportunity here. I am a data analytics strategist, and at this point I'm the Head of Data and Analytics at the Maitland-Newcastle Diocese, and my job is to find the balance between governance and value of data analytics at the Diocese.
Well, a small job then.
Yes. No, it is quite an ambitious role, I must admit. But fortunately, with the executive support that I have at the Diocese, I can appoint the right people with the relevant experience. I learned a lesson very early on in my career, which is to appoint people who are more clever than I am, and that has helped me in this job. I do have 30 years' experience delivering data analytics across a broad spectrum of industries, but just by default, over the last five to seven years I started migrating towards education. For the first section it was in higher education and universities, and now it is school education, and the Diocese as a broader community as well, which has been very interesting.
Have you seen that change over that period of time? Have there been any trends that have happened?
Yes, I was surprised to see the maturity of data analytics in education as a whole, that is, tertiary education as well as schools. I thought your universities would be more advanced when it comes to data analytics, and I was very surprised to see that the maturity was actually quite low. I just thought it might be because the right data analytics strategies weren't in place. So it wasn't that there weren't the relevant skill sets or expertise, or even resources in terms of, you know, your technologies or those types of things. It was the fact that there weren't compelling data analytics strategies in place, and that hopefully will change, you know, over the next couple of years.
Yes, absolutely. So before Australia, were you working in data analytics globally, in other countries?
Yes, I did work in other countries in data analytics. I worked in environmental sustainability, for example, where we calculated carbon emissions and did predictions on carbon emissions for landfill gas methodologies. I also worked in retail, in insurance, and, as I said, many other industries. Yeah, telecommunications.
That's a fantastic background. I suppose there are similarities across those industries, when you've been working in them?
There were similarities, but there were also clear signs when something wasn't working, and the key things that came up were a lack of a strategy, that there wasn't a data analytics strategy in place, that they didn't have executive support, and then a governance framework.
So if those three components weren't in place, it was very, very difficult to deliver business value through data analytics.
Yeah. So thinking about that, and when you're looking at your current role, or for the people listening to this podcast who are maybe thinking about their own strategies, I suppose because of the speed of the analytics and the AI elements within there, when looking at a strategy, where do you start?
So I started by spending the first few months understanding what the maturity of the organization as a whole was, and I do that with every industry or every organization I walk into, even those that say they are more advanced. I need to understand what the baseline is, where we're working from, and once I have an understanding of that maturity, I will start to develop the data analytics strategy. And it needs to have ambitious goals. I mean, I can make it easy for myself and say let's deliver dashboards and reports, because that's an easy goal,
but I don't think that's necessarily a compelling strategy. You need to make sure that within those goals you have clear focus, understand what the objectives are, but also understand what the challenges are, and then with the road map define the actions, either to overcome the challenges or achieve the objectives, to deliver the value creation. So it is important to understand the strategy as a whole. Just putting a 40-page presentation up, having all these big plans without a practical approach to how you are going to change or implement those plans, is not good enough, especially when it comes to data analytics.
That seems like a really great approach. When you're actually thinking about the AI element of the analytics side of it, and the kind of black box, as it were, and the algorithms that you're starting to produce, how do you then, with your strategy, sell that into the business?
Well, I settled that within the business when it came to the objectives. I made sure that the data analytics steering group, which we established as part of the governance framework, understood that my intent was not to deliver dashboards and reports. They've got very good operational systems already in place within the schools' environment, or CSO, and these operational systems do deliver fantastic dashboards and analytics. So there was no reason for me to reinvent that wheel. It didn't make sense to me. Also, I wasn't interested in appointing a team of dashboard builders. I would get bored within the first six months, and I'm sure they would too. So I made clear that executives across the board, not only in schools but the chief financial officer, the CIO and everyone else, understood that the objectives were not to deliver dashboards and reports, but to deliver that self-service analytics and BI capability, and most importantly, to deliver actionable predictive analytics.
Right? So that's a really interesting approach, and I think I see a lot of people looking at dashboards, and when you speak to people about data science, they go straight to the dashboard. This is a very different approach, isn't it?
Yes, it is a very different approach, but it's important to have that approach to make sure that everyone understands what we're working towards, and why we need certain skill sets in place, why we need to appoint them, and why we need a certain architecture in place. Because the architecture you deliver for dashboards and reports is a completely different type of architecture from the one you would deliver for predictive analytics.
So, how would you start to build your team then? From your point of view, what would your ideal team look like around you to deliver that strategy?
Well, I knew exactly what I wanted my team to look like, and I'll talk to that in just a moment. What was interesting, though, is that usually when you appoint a team, you would start with the obvious team members first, and then move to your more specialized team members later. And that was also how I documented it within my road map, because from a data science perspective, we were only supposed to start dropping that into the road map later this year. But I was in a fortunate position that I knew of a PhD student in data science, and I knew she became available last year, so I went to the executives and said to them, maybe this is an opportunity to bring her in earlier and just start working on ad hoc data science and advanced analytics projects, to start delivering that data literacy across the organization, or showcasing it, almost.
And it was most probably the best decision I made. So within the team I first appointed a data scientist,
then I appointed a BI specialist, and the BI...
Can I just unpick for a second, before we go to the BI specialist? So, a data scientist. What were you looking for in a data scientist? You know, you're quite heavily ingrained in the data science world. What do you need as a data scientist? What type of skills would they need to have?
So for me, data science is not about the technology. It's about the methodology. How you apply scientific methodologies in order to test a hypothesis.
So that was what was important for me, and I've worked with her before, so she understood how I worked. A good example is that within a strategy, if someone comes up with a plan or a strategy, you can test that plan or strategy using data science. And that was what I wanted to bring into the Diocese, so that from now on, when we plan and strategize, we're not just going to plan and strategize based on our executive experience, or our gut feelings, or what I might have seen in this report or that report, or what someone told me. Once we've decided what that strategy and plan is, we might, no, not might, we will use advanced analytics and data science to test whether that strategy and plan will be effective. So you test that hypothesis, and if that hypothesis is positive, then you can deliver a predictive model from there to measure the strategy against, so that can potentially become your KPI or your...
It's all kind of measurable, yeah.
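Zanné's point about testing a strategy as a hypothesis can be sketched with a simple permutation test. This is only an illustrative example with invented numbers, not the Diocese's actual method or data: suppose we want to check whether attendance really improved after a hypothetical intervention, before turning that improvement into a KPI.

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical attendance percentages before and after an intervention
# (invented numbers, for illustration only).
before = [62, 65, 70, 68, 64, 66, 71, 69]
after = [75, 78, 74, 80, 77, 76, 79, 81]

observed = mean(after) - mean(before)  # the effect we want to test

# Permutation test: if the intervention did nothing, shuffling the
# before/after labels should often produce differences as large as
# the observed one.
combined = before + after
n = len(before)
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(combined)
    diff = mean(combined[n:]) - mean(combined[:n])
    if diff >= observed:
        extreme += 1

p_value = (extreme + 1) / (trials + 1)
print(f"observed difference: {observed:.2f}, p-value: {p_value:.4f}")
```

A small p-value says the improvement is unlikely to be chance, which is the kind of evidence that can then anchor a KPI or a predictive model, rather than relying on gut feel alone.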
So the next person was the BI...
specialist?
Yeah.
Yes. So there were two key additional team members that I had to look at and find. One was a BI specialist and the other was a data engineer, and the skill sets between the two of them were around understanding the architecture, obviously the integration of the various source systems into this data warehouse environment, but then also the model, the dimensional model, designing that model. I was fortunate to get a BI specialist who understood that a model is not designed by technology. A model is designed by looking at the people and the processes, and designing the model from that analysis. So the BI specialist was appointed to do specifically that, and at this point I think there are about 27 processes that we're busy analyzing from a student life cycle perspective, in order to understand the data stories within each of those processes. That will help us get a better understanding of how that multi-dimensional model should look, which the data engineer will integrate the various systems into.
And I suppose, taking your point at the beginning, that feeds right back into your initial discussion about the goals of the organization you're working with. So it all feeds back together.
Yes, absolutely.
And when you mentioned governance there earlier on, would that be under your kind of remit, or is that a bigger picture?
No, the data analytics steering group is part of the overall data analytics governance framework, and the steering group has full executive support. So we've got your chief financial officer, your CEO, your CIO, your head of HR, and all your directors within the agencies and departments within the Diocese are represented in the data analytics steering group. It's their responsibility to help drive this strategic intent across the Diocese, but also to prioritize the projects within the program of work. So I don't go and say, listen, I think the next project should be X. They have to decide on it, and they decide what the next one is, and one of them at executive level becomes the business sponsor as well for that project.
I see. And that makes so much sense, doesn't it? And this is really useful for the listeners, because I think everybody's coming to this from a different point of view, so having a strategy and an idea of how to approach this themselves is fantastic. When you got your team together, what are the kind of interesting, or any surprising, insights that you've seen when you've been analyzing data, when you've been using AI and machine learning in the back end?
An interesting insight that I got when I started working here at the Diocese, which wasn't a result of AI, but which started the discussion on why we should look at data science and potentially machine learning and AI and bigger data sets, was when I just tried to get my head around what data is available and what's happening with the data and those types of things. I noted that in some schools, and not all years, so not every academic year, some years, some schools in year 10 lost boys. And I was completely confused by it. Where are these boys going? And I thought, okay, maybe they're going to very posh boys' schools somewhere else, but it just didn't fit well in terms of the narrative of, you know, everything around the schools.
And then once I started having more discussions with your senior leaders, and those with years and years of experience of the school environment and the data being collected, it turned out there's a direct correlation between boys leaving our school system in year 10, and not only the Catholic schools, the other schools in the area as well, and coal prices.
Wow.
Yes. So when
right
prices go up in coal, the mines have these job opportunities for these boys, and they go for them. And now that I've started having discussions with the Hunter Research Foundation, who are doing similar research within this area in order to plan going forward for this area, they also started to understand that we have that happening within schools.
Wow, that's fascinating, isn't it?
It is. And that almost opened the door for me, or not almost, it did open the door for me, to start having discussions with directors in schools and say, but what if, imagine we can now start using coal price data and aligning it with, or integrating it with, our school data, and have a better understanding in terms of how to plan going forward and how to make sure we retain those students,
you know, and just have a discussion about how we can retain them. But there was also another interesting story that we heard while we were analyzing the people and processes across the student life cycle, the complete opposite of what was happening at the end of last year with the terrible fires in New South Wales: there were floods, and students had to write the HSC and didn't have access to electricity or internet, so they couldn't study. So then we're saying, but hang on, maybe we should start looking at weather patterns and pull that data into these environments as well, and get a better understanding of why certain outcomes happen, where we weren't quite sure, maybe it was weather related. And I do believe if we start analyzing assessments within schools, not only HSC but all years, it will show an impact on the students. What happened last year in terms of the fires might not be that the students didn't have electricity, but just in terms of student well-being, because it's such an important component of our understanding of this context of student information, in order to retain these students. Yes.
Yeah. And I suppose there's a wealth of information inside school systems, but what you're alluding to there as well is that there's a lot of data outside school systems that we can correlate with some of the internal data, to bring it together. How are you finding access to data these days, in the Australian context, from external sources or third parties? Is that a problem, or are there certain things that are easier to get than others?
It's always the case. There are always certain things that are easier to get than others. But I think most organizations that have either developed source systems, or are part of, you know, selling these systems, do understand that unless they have that integration component as part of the product they sell, they're going to have a difficult time going forward, and that does make a big difference. And also, there are many ways to get data. So where you can't find it through the source that you might think would be the obvious one, there might be another way to get the relevant information. So it's just being creative in terms of how you develop these models and integrate the various data sources. But isn't that what data analytics teams are about? It's about innovation and being creative and just thinking out of the box. So just find more than one way to solve a problem. If the first one isn't an option, let's go for plan B and C.
Absolutely. One of the things that has come up time and time again when we're talking about AI and analytics in education is the ethics, and even the bias, when algorithms and things are being created. What are your thoughts, particularly from a strategic point of view, on ethics and bias in AI?
That is such a good question.
Yeah, we should do another episode on that specifically.
No, but it is a very, very important question. I think for everyone in data analytics, you know, it doesn't matter what industry you are in or what level of seniority you have within the organization, it's our responsibility to ensure we have an ethical compass, you know, and bake it into the solutions we deliver as the data analytics team. The way that we've done it at the Diocese of Maitland-Newcastle is to have a keystone within our data analytics strategy. So why are we doing it? What's the overall reason why we're doing it? And the keystone in the data analytics strategy at Maitland-Newcastle is data for good, and it's to harness the power of data analytics to make more informed and better decisions in our quest to help communities flourish. That is what our key focus is. So if any department or agency comes to us and proposes a project that we have to work on, or recommends a project that we should work on, and they don't have that data for good component within the proposal, we won't even look at it.
Yeah.
So that turned finance completely on its head, because a potential future project is obviously around finance intelligence, and the CFO said, I'll find the right one. I'll find the one. But you force them also to start thinking about how we're going to ensure that ethical compass is baked into the solutions that we...
Absolutely, especially when you're bringing in other data around the student, and from other care institutes. Because, I'm sure, and I know we've had conversations about this in the past, in any organization in education, you've got a student, and every time something comes up about a student, you know, in any country in the world, if a student has an issue with anything, there's usually an indicator from some system. Whether it's inside the school system, whether it's attendance data, whether it's something to do with medical information, whether it's something to do with their mental health, and often that data isn't collected. You know, we dance this fine line between ethics and where we use things and how we use them. So it's not just that we can use it, it's should we use it?
Absolutely. But it's also part of the services that you deliver as an organization. So at the Diocese, we do have your CSO department, or as an agency, but we also have Catholic Care, who play a very important role around that well-being, not just for communities but also at student level. I mean, they've got foster children who are in our schools, but they also have counselling and learning and teaching support, and that type of functionality or resources within Catholic Care. It is important to understand that by having that information available, we can make decisions and plan, and have tailored intervention plans in place for students who might be at risk.
Yes. Yeah, understood. So, in terms of the areas of concern that you see around AI, is there anything that worries you when you look at AI in the press? Are there things that are worrying you, or do you think that the way you've set all of this up with your governance strategy can address that?
I am concerned about, I'll call them operational systems, that have these black boxes of predictive analytics built in within them. And the reason why I'm concerned might not always just be around the ethical component of it. But as I mentioned before, artificial intelligence, or predictive intelligence, is about methodology. And part of the methodology is to understand which lead indicators, or which variables, have an impact on an outcome, on something that's being measured or predicted.
And in order to strategize and plan, you need to understand which of those variables carry more weight. But sometimes they can also carry a weight that has a negative impact on your outcome. So if you pull the wrong lever, it might go in the wrong direction. And if you don't understand the methodology and the, I mean, some people call it an algorithm, I would rather prefer to call it a model, you might make wrong decisions, which can have a significant impact on the outcome.
And I suppose that's where your team would then look. I suppose the job of data science teams is to then modify that model, because presumably the model might actually need to be changed and analyzed. Just because you've created it one year, it may need to be changed, you know, the next year.
Absolutely, and monitor it, and make sure it's still relevant, and if not, adjust it accordingly.
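Zanné's "pull the wrong lever" point is about knowing the sign and size of each variable's weight in the model. A minimal sketch, with invented weights and feature names purely for illustration (a real model would be fitted to data, not hand-written):

```python
# Hypothetical linear risk model: each weight says how strongly a lead
# indicator pushes a student's dropout-risk score up (+) or down (-).
# All names and numbers here are invented for illustration.
weights = {
    "absence_days": 0.08,       # more absences -> higher risk
    "coal_price_index": 0.05,   # higher coal prices -> higher risk
    "wellbeing_score": -0.10,   # better well-being -> lower risk
}

def risk_score(features: dict) -> float:
    """Weighted sum of lead indicators (a stand-in for a real model)."""
    return sum(weights[name] * value for name, value in features.items())

base = {"absence_days": 10, "coal_price_index": 60, "wellbeing_score": 7}
print(f"base risk: {risk_score(base):.2f}")

# Pulling the right lever: improving well-being lowers the score.
better = dict(base, wellbeing_score=9)
print(f"after well-being intervention: {risk_score(better):.2f}")
```

If a practitioner misreads the sign of a weight inside a black box, they can push a lever in the wrong direction; inspecting and monitoring the model, rather than treating it as an opaque algorithm, is what guards against that.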
Yeah, fantastic. So thank you so much for all these insights. I suppose we should end on a positive note as well. What are the most exciting things that you see on the horizon with AI, and what are you most looking forward to working with?
For me, the most exciting thing that's arising in AI is the fact that the environments are becoming available. Previously, it would be difficult to get access to these environments, or you needed a significant amount of money or resources in order to implement them, and that has become just so much quicker, because technology is just becoming more and more available and more easily accessible. So that's exciting for me, because I think it used to be a roadblock, the reason why we couldn't do this at an earlier point in time. It was just access to the relevant technologies, and that has certainly changed over the last couple of years.
How do you actually keep up with that yourself, you know, keep up to date with this knowledge? Because obviously, as a Microsoft employee, I'm looking at the Microsoft stack all the time, but there's a lot of stuff that's been updated from a data science point of view. How do you keep track of how things are going? Are there places to look? Are there places that you'd recommend to our listeners to be aware of, to keep up to date?
So for me, it's about engagement. Internal engagement, internal to the organization, but also external engagement. So engaging with your peers and others that you've seen or heard have done, you know, excellent work. Contact them, phone them, set up a meeting with them, travel to them, meet them in person, just to be able not only to benchmark yourself, but also to better understand what opportunities there are out there, and also the challenges that are out there. So for me, external benchmarking is incredibly important. I try to stay in touch with people that I used to work with, continue to work with, or just meet at conferences. For example, at the data analytics conference that I just attended, I once again met fantastic stakeholders within data analytics, and I go to them: do you mind if I keep in touch? Do you mind if I give you a phone call and find out more? And it's amazing. No one is scared to talk about it. Everyone's actually almost thirsty for more information and more advice and more discussions around this.
So, it's much easier than you would think.
Yeah, well, that's fantastic. Thank you so much for joining us today, Zanné, and sharing all your insights with our listeners. It's been fascinating, and I think this is really ramping up our podcast now. We've had a lot of thought leaders on, but you're actually doing stuff on the ground and actually getting things done in K-12, and from your university background space as well. So I would love to chat again soon and see how your projects are going, but thank you so much for joining us on our podcast.
Thank you so much. I appreciate the opportunity.
So, Ray, what do you think about that interview?
Oh my goodness, we got so much to talk about, Dan.
Another podcast almost, just analyzing the strategies Zanné was using.
Okay, so let's talk fast.
Yes.
So I thought that was fascinating because we covered so many different topics in that what 25 minutes
But there were some real highlights for me. The first one was around the discussion about governance right at the beginning, and also around the value of data and analytics and staying focused on that. That kept coming up throughout the interview: focusing on the value of what it is that's being delivered. And I picked up on something that Zanné said very early on, which is you need three things to be in place to be successful. She said you need a data analytics strategy, you need executive support, and you need a governance framework. And putting it in those very simple terms allows you to look at it and go, okay, have we got those three things in place? Because she was very clear, I thought, whether implicitly or explicitly: if you haven't got those three things, you haven't got an AI strategy.
Yeah. And she did say that there were a lot of people doing work with BI and AI in education that she'd seen, and they're doing some very capable stuff, but actually there was no strategy underpinning it, so it wasn't sustainable.
Yeah. And I think where it really helped from her perspective is that she came across with a very clear purpose, and that clear purpose was about delivering actionable predictive analytics. And I know it seems weird, me saying those three words very slowly, actionable predictive analytics, but delivering those things was a clear focus. It wasn't let's build reports, let's look at this data. It was predictive analytics. So how does this data help us to tell the difference that we're going to see in the future, and how does it help us to predict what's going to happen in the future? Because then you can start to intervene.
When she was talking about the sustainability of those dashboards, I think that came out quite strongly as well, because you don't want to be just creating new dashboards for dashboards' sake. We can look at a graph all day, but it's about, well, what does that mean for that student, and what can I do to increase the grades of that student, or increase the well-being of that student, or increase whatever it may be telling me, or fix a problem.
It's also the maturity of the thinking. We talk about data-driven decision-making, and the cultural change within Microsoft to become a data-driven decision-making organization, but Zanné went further than that, which was: yeah, sure, we can see the data, we can make a decision, but we then make a decision about what we're going to change in the organization. And in her role, with her team, they're then able to test the change, because they've got the predictive analytics that will tell them the result of the change that they're going to make.
Yeah. And when you talked about the team there, I think that was interesting, because for me the good thing about this interview was it was quite focused. So like you said, you had the three elements at the beginning, and then she went into the fact that the team was made up of herself, a data scientist, a BI specialist, and a data engineer. So she was quite explicit about who she needed and why she needed them. It wasn't just somebody that's interested in BI and creating dashboards. It was very deliberate, the way she'd put these people in to hypothesize, and actually what she said about prioritizing projects for those people was really interesting as well.
Yeah. And I think we were almost hearing a framework in there, one that I was writing down as I was listening to it. A framework for how you do these things well. Here are the things that must be in place to start with. Here's the purpose that must be in place. Here's the team that must be in place.
The bit that I recognized as the bit you can't nail down in advance is: here's the data we're going to need. Because the examples she talked about around the data would be something you would never set off at the beginning of a project knowing, that you're going to need to know coal prices.
You know, that was a really fascinating insight, because as she was starting to talk about coal prices, I was thinking, oh yeah, I know what happens when coal prices go down, fewer people can afford to send their children to private school, was my assumption. And then it actually turned out that when coal prices go up, that's the problem, because kids drop out because they can go and earn $150,000 driving a truck around a mine. For me, that was an eye-opener. And I think at the beginning, when you set out a data project, you'd say, well, we need all this data from our systems, and you might need a bit of address data and stuff like that. But nobody would have said, oh yeah, we've got to have coal prices in there if we want to predict this outcome in the future.
Yeah. Then she talked about weather patterns and things, and obviously because of the bush fires recently, who knows what data you'll need. You know, with all the analytics we used to use, some of the data points in England, I remember I used to say it's the year 11s, when they were doing their GCSEs in the UK, every day you take off is equivalent to, like, half a point at GCSE level, or whatever it might have been at that point. The bush fires in Australia, I know that happened over the summer, but there are still communities struggling to get back on their feet there. And she was talking about internet access. So it's really interesting, the way that they're bringing extra sets of data in.
Yeah. So I think you've got to have an open mind about the data that you might be using. You've also got to have an ability to bring that data in, and that's why you need the people around you with the skills to do that. Zanné talked about the fact that the data architecture for BI and dashboards is different to the data architecture for predictive analytics, and I think that's true, because if I think about some of the projects that I've done around BI, it's very structured, known well in advance. In fact, the first thing you do on any BI project is sit the manager down and ask them to define what the report is going to look like. Well, in reality, in most projects you get about 200 users defining the 2,000 reports they want, and then you tell them you can give them 10.
But yeah, in data science it's very different. You know, I've been doing this project for a long time now: this is great, I'm going to apply this to NAPLAN results, so what data do I need to be able to predict NAPLAN results for a school? We've talked about swimming pools, we've talked about adult education levels, all of those things. It turns out it's non-education data that becomes a better predictor than education data. And so when you're in a project like Zanné's, you might get a surprise piece of data that's going to tell you a story that you're not going to be able to unlock without it.
And the other interesting thing she said was that people could bring them projects, and that they had that data-for-good mindset. What were your thoughts on that? Can you think of some examples of projects that you've been asked to do which have run in that spirit?
Yeah. So that was in the whole discussion about ethics,
which I think you've got to have when you're thinking about AI. And she talked about their keystone, the "why are we doing it," that keystone being: it's data for good, it's about improving things. And I think having that focus matters. She talked about leadership being engaged and involved and being sponsors for projects, but that data-for-good thing keeps coming back: this is a great project, but how is it improving things?
yes
Although we're talking about data, we're talking about AI, we're talking about all these kinds of projects, actually what we were talking about was how do you use data for good, how do you use it to improve learning. And I thought that was a unifying force across all of the conversations that you and Zanné had in that discussion.
Yeah, absolutely. And to bring things to a head at the end of that interview, she was talking about the integration of the different data sets she gets, and then also the worry, which we've talked about before, about black box AI. As a data scientist, it was good to hear her perspective on the models. We talked about it with Nick Woods as well, around the way that models in healthcare would change between different countries, and the way you needed to really try to understand what that model did. She put a very different spin on that, so it's quite interesting.
So, let's just for a second focus in on that phrase, black box AI, and explain what that means. To me, black box AI is: you put a coin in the top and your result comes out at the bottom, and you've got no idea what happened in the middle, because there's some complex algorithm that does all of this.
Give us an example in education then.
Oh well, so, predicting which students are going to drop out. You give it a whole load of data and it comes out and says this student is 80% likely to drop out. But if you don't know the data that underpins that, you can then go and make decisions based on it that fly in the face of it. The other one, and it's not really a black box, but my goodness, it probably is to most people,
is university rankings. Yeah, everyone in Australia fixates on university rankings almost as much as they fixate on house prices. And what is fascinating is that many of the actions you would take to improve your university actually harm your position in the rankings.
So for example, if you decide to limit the proportion of international students you have in order to make sure every student has the best experience, that can actually damage your rankings. The more international students you have as a proportion, the higher up the rankings you go, but it might lead to a lower than optimal educational experience.
She was mentioning the levers that were being pulled. And if you don't know the model and the algorithm, then you could pull the wrong lever, or not really understand what the levers are doing.
Which is the most important factor when it comes to preventing year 12 dropouts?
It's coal in some cases in Zanné's area, but in other areas it'll be a different thing. And knowing those things matters. It's not just putting a whole lot of stuff into a sausage machine that comes out with a sausage; you want to know how it's made and what the most important ingredients are.
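One common way to see the ingredients, rather than just the sausage, is to rank a model's inputs by how much they matter. This is a minimal, hypothetical sketch using a deliberately crude measure (the correlation of each feature with the outcome) on synthetic data; a real project would inspect a trained model's own feature importances instead:

```python
import random

# Illustrative only: synthetic data in which "coal_price" drives dropout
# risk and "attendance" does not. Ranking features by their correlation
# with the outcome is a crude stand-in for proper feature importance.

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 500 synthetic student records.
coal_price = [random.uniform(50, 150) for _ in range(500)]
attendance = [random.uniform(0.6, 1.0) for _ in range(500)]
# Dropout becomes likely when the (made-up) coal price is high.
dropout = [1 if c > 100 and random.random() < 0.8 else 0 for c in coal_price]

importance = {
    "coal_price": abs(pearson(coal_price, dropout)),
    "attendance": abs(pearson(attendance, dropout)),
}
ranked = sorted(importance, key=importance.get, reverse=True)
print(ranked[0])  # coal_price
```

The same idea scales up: whatever model sits inside the box, surfacing which inputs carry the signal is what lets you pull the right lever.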
Yeah.
Well, I thought when we got to the end of that, and she talked about the excitement of the simplification of AI, I guess that's where, you know, my feeling is that every day, every week, every month, there's something happening with AI that becomes more accessible. And so I need fewer and fewer highly technical skills in order to use AI, but what I do need is skills more attuned to the business outcomes that we're trying to achieve.
Yeah, absolutely. I think the simplification of AI is one of those things now that makes it accessible for everybody as well.
Do you remember that quote? I can't remember who said it, which is awfully embarrassing. It's not Bill Gates, because I got that quote wrong in episode one.
No, it's the quote about how when it's pre-production, it's called AI, but when it's in production, it isn't called AI anymore, it's just called the thing, like Siri or whatever it is. And I think it's the continuation of that journey. We're not thinking fancy AI stuff. We're thinking the thing that helps us predict which students are going to succeed, the thing that helps us predict which students we're going to intervene with, the thing that helps us predict which learning resource we give to a student next to deliver an optimal journey. Yeah,
that simplification process where we can start talking about the outcome rather than the data science bit and the machine learning bit and all that kind of stuff. You know, that's the journey we're on. We've been doing this podcast for six months. In six months, some of that technology has completely changed, and I guess in six months' time it will have changed again. But that's the journey: taking this high-science, very technical stuff and turning it into something that is usable for a teacher, a student, a leader in education.
I tell you what, Ray, I'm going to go and find somebody else, because that interview with Zanné was fantastic. So I'm going to go find somebody else and I'll see you next time.
Great. Looking forward to it. Thanks, Dan.