Welcome to the AI in Education podcast with Dan Bowen and Ray Fleming. It's a weekly chat about Artificial Intelligence in Education for educators and education leaders. Also available through Apple Podcasts and Spotify. "This podcast is co-hosted by an employee of Microsoft Australia & New Zealand, but all the views and opinions expressed on this podcast are their own."

Nov 27, 2019

In this week's podcast we interview tech guru Sulabh Jain from the Cloud Collective. Sulabh's recent work has involved technical development of the AI used for the personalised learning support discussed in last week's interview with Dr David Kellerman at UNSW Sydney. David talked about the work he'd done on personalising learning using AI for his Engineering students, and this week Ray and Dan spend some time finding out how that was developed, and the role of data in making it successful. As Sulabh and his team developed chatbots, query engines, and knowledge discovery machines, they used troves of data to unlock information to help students.

They also discuss what's next on the project list, and discover there's some very smart assessment automarking in development to deal with hand-drawn engineering diagrams from students.

 

You can find Sulabh on LinkedIn here:

https://www.linkedin.com/in/sulabh-jain-b40a329/

And you can read more about the Question Bot project, and the Cloud Collective work with David Kellerman, here:

https://www.cloudcollective.com.au/bots-for-education/

 

TRANSCRIPT FOR The AI in Education Podcast
Series: 1
Episode: 10

This transcript and summary are auto-generated. If you spot any important errors, do feel free to email the podcast hosts for corrections.

This podcast excerpt, featuring hosts Dan Bowen and Ray Fleming with guest Sulabh Jain of Antares/Cloud Collective, delves into the technical development and broader impact of an AI-driven learning platform at UNSW, pioneered by Dr. David Kellerman. The discussion highlights the Microsoft mission of empowering every person and every organisation to achieve more, contrasting this with technology that replaces humans. Sulabh, the "digital brains" behind the platform's coding, explains the agile, iterative approach taken to build a collaborative learning community in Microsoft Teams, initially focusing on connecting students and teaching assistants (TAs). The platform evolved to include the QuestionBot, which uses AI to self-learn from discussion threads and provide students with personalised resources, including one thousand custom study packs generated by analysing student performance data. The conversation also explores future applications of machine learning, such as automarking complex, handwritten engineering drawings, and emphasises that scaling this solution requires not only technology but also organisational readiness and cultural change, with strong commitment from academics.

 

 

 

 

 

Welcome to episode 10 of the AI in Education podcast. It's me, Dan Bowen,
and me, Ray Fleming.
And so today we've got a special guest, Ray. Would you like to introduce him to us?
Okay. Well, yes. We've got... well, you remember last week?
Yeah.
We had Dr. David Kellerman from the UNSW School of Mechanical Engineering, talking about all the things that they've done in teaching and learning to help improve the student experience.
Sure do.
Using AI. Well, this week we've got Sulabh. Sulabh is from the Antares team, or the Cloud Collective team. Sulabh is the digital brains behind the coding of the work that David did.
Fantastic. So, this is going to be technical today, is it?
Oh gosh, I do hope so.
It's been a good week this week in Sydney. We've had Satya in town, haven't we?
That's right. So, Satya was in town last week, and in fact David was part of Satya's event. Sorry, we should do the "who's Satya?" bit first.
Oh yes, Satya.
Satya, who's your boss's boss's boss's boss's boss, and my boss's boss's boss's boss's boss. So Satya is the CEO of Microsoft, who we both work for.
Yes.
And he was over here to do a series of events in Sydney and in Canberra. And one of the things that he did was we ran a conference about the future of technology and what it's going to mean to our lives. And it was really fascinating, because Satya did an hour-long keynote at the beginning of it and then invited Dr. David onto the stage.
Oh, fantastic.
And so David talked a lot more about the story that he told us in the podcast, which we then released about an hour after he walked off stage.
That's fantastic, isn't it? I suppose, just to start this off today, one of the things that resonated for me was obviously the impact. Satya always talks about the technology that, from a Microsoft point of view, we're using, and, to quote our mission, empowering every person and every organization on the planet to achieve more. It made me reflect, when we're talking about AI, on what I generally do in my own...
So Satya says to graduates, when they are thinking of applying to Microsoft, yeah,
you come and work at Microsoft to make other people cool. If you want to be cool, go and work somewhere else; if you want to make other people cool, come and work for us.
And then we've got that mission statement about empowering every person and every organization on the planet to achieve more, and to me that makes sense. It's like, we're not doing a lot of work in self-driving cars, for example, because that's about replacing humans with technology. What we're doing is a lot of work that helps people to achieve more. So that's about making other people cool rather than making yourself cool.
Yeah.
What's your mission statement, Dan?
Well, my mission statement generally...
As a personal individual?
It's an interesting question. I know we did some work on this ourselves; we had some work in Microsoft itself working through our own mission statements. For me personally, and in my job, I suppose it was about using my life and my job to the fullest, and having an impact on others and society. So impact was the big thing that jumped out at me when I did that exercise a year or so ago. How about yourself?
Well, I guess if I had one word, it would be stories. A lot of what I do is about how do we take complex topics
and turn them into things that make sense to a lot of other people. And I guess, you know, if I think about a skill I have, it's around that story piece. It's about taking complex stuff and making it much simpler to understand and more relatable. Because all too often technology's not relatable.
True. True. How about yourself, Sulabh?
Very similar to what you just said, Ray. So, solving problems by applying technology in very simple steps. Quite often when we have a look at a complex problem, it's so easy to get overwhelmed by what we're trying to achieve and all the different moving parts in the problem. But if you break it down and you apply technology in very gradual steps, you can actually see meaningful outcomes.
So my career has always been education technology. Dan's career has always been education and technology. What's your background?
Okay, great question. So, I did software engineering at the University of Sydney, and then I started working for the University of Sydney. So I actually got to work in an academic institution and see how the institution works. I then moved on to consulting, and I've been in consulting ever since; I've been working for Antares for the last six and a half years.
Right. Okay. So, we heard all about David last week.
Yeah, we did. So, interestingly, what do you think his mission would have been then?
Wow, that's fascinating. I really should ask him this question, shouldn't I? But I don't know. I think his mission is probably about helping his students to succeed, because he talked about every one of his students getting through his course.
Yeah.
And so he wants them to be successful as engineers, but then I remember he was talking about engineers changing society, wasn't he?
You know, it's the: "I want the people that I teach to be the people that are building the future and improving our world as we go on." Did you get that from him?
Yeah, I did. I think he was quite clear about that, wasn't he? And I think that's why it aligned so much with the Microsoft mission about empowering everybody. And I liked your analogy about the cars; it is very much about empowering his students to achieve, which was core to his main goal there.
And so we had him on last week because a lot of the work that he was doing is driven by AI under the covers. It's the artificial intelligence services that he's using that help him to deliver that to his students. But it didn't really start in an AI world. I remember when I was first working with David, the stories were about collaboration and engagement between students. It was about creating a community of learners. He's often talked about the fact that you have 500 students in a lecture hall, but it's really 500 islands that aren't connected. And so his first point was about how do we connect those islands together? How do we create that connection between the students? A lot of the work that he did, I'm pretty sure, was in Teams, about creating that collaborative community between students, and the AI piece was only really part two, once he'd got that collaboration in place.
Yes, very good point. So let's unpick that, because the conversation last week was very much around the impact, which is great. That's what we want: humanistic education and all of those big, complex topics.
Yeah, absolutely. So we really need to think about what technology was underneath that, what AI was underneath that. So, from your point of view, from that technical side, what were the technologies underpinning those, and how did he do this?
It started with a problem where David was teaching courses of 500-plus students, and he wanted to use a collaboration tool. He asked the students to start using Teams, and he got tremendous engagement on Teams. Now, because he got a lot of excitement and a lot of students asking questions in Teams, he created his own problem, as he's said in the past, and he wanted to ensure he was able to engage with each student and answer each student's questions. So it really started by us automating that: ensuring that, as an example, when a student asked a question, all the teaching assistants were informed, tagged into the conversation, and able to have the conversation with the students.
So it was like a connections element.
Correct. It was basically bringing everybody together in a conversation.
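For readers who want a feel for the mechanics here: the routing Sulabh describes, where a student's question automatically pulls the teaching assistants into the thread, can be sketched in a few lines. This is a hypothetical illustration only (the real solution was built inside Microsoft Teams); the roster, channel names and message shape below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Message:
    channel: str   # e.g. the class team channel the question was posted in
    author: str
    text: str

# Hypothetical roster mapping channels to the TAs who look after them.
TA_ROSTER = {
    "Week 3 - Beam bending": ["@alice_ta", "@bob_ta"],
    "Week 4 - Torsion": ["@carol_ta"],
}

def tag_tas(message: Message) -> str:
    """Return the question text with the channel's TAs mentioned up front,
    so a student question is never left hanging."""
    tas = TA_ROSTER.get(message.channel, [])
    if not tas:
        return message.text
    return f"{' '.join(tas)} New question from {message.author}: {message.text}"

if __name__ == "__main__":
    q = Message("Week 3 - Beam bending", "student42",
                "How do I find the second moment of area here?")
    print(tag_tas(q))
```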
Yes. Okay. And so once everybody's together in that conversation, where did you go next? I suppose this developed as it went?
Correct, absolutely. So what we started doing was observing students very closely, and we put ourselves in a student mindset. All of us have graduated from universities, so we were able to put ourselves in that mindset and really think about how a student would react to a particular feature, or, if we did this, how a student would respond. Would the student game the system? Those are the kinds of things we started to think about, and we started to really quickly pivot and iterate. We'd roll out something and then see the student response. Sometimes it would work, sometimes it wouldn't, and then we'd continually iterate. I remember developing one feature, as an example, around likes and reactions. We were thinking of rewarding the students based on the number of likes they had, and then we started to realize the students were starting to game the system a bit more. So we took that off, as an example, but there are many similar examples.
So you're developing these days in an agile way. That's quite interesting generally, because I think we see lots of schools and technology companies now starting to develop slightly differently. Previously it was around a statement of work, where you'd go in and complete one particular problem; these days, especially with AI, things start to develop in a slightly different way, around agile. What are your thoughts on that?
Absolutely. The main thing with this is having all the stakeholders on the same page. We had a very clear vision and a very clear goal: we wanted to build a student learning community, a student-to-student learning community and a student-to-teacher learning community. So whenever we were thinking of building something, we had that vision and that goal, and that basically allowed us to pivot. Every single time we had to make a decision, there was a very clear line on whether we wanted to build something or whether we didn't.
So what we see when we hear David's story is a complex learning solution at its end point. Actually, it's not even at its end point; it's a mature stage in the journey. But there have been a series of stages on that journey. The first stage, from what I've just heard, is how do we help the connection between the students and the teaching assistants, the TAs. When you've got 500 students and they want to ask you a question, even just helping to connect them to the TAs saves time and gets students more connected. And I guess that's part of the journey we talked about in the early days: what do you need to make AI work? You need some data to be able to train the AI to recognize a pattern, and in this case the data is: here's the students, here's the TAs. But then the next stage was, if you're creating a massive increase in the volume of conversations, and therefore questions that are coming up, how do you deal with those questions? And that's the bit that really interests me, because that's the bit where you went from a simple application of technology into how do we use AI to make this work. I know in talking with David, his starting point was: we've got all of this data, all these questions that students have asked us before, that we could use to train a bot or an AI agent in order to be able to answer questions from students.
Yeah, correct. So that's basically what led to the birth of QuestionBot. What we realized when students started to engage in different conversations was how many different problems, and how many different types of questions, we were being asked, and the same questions that were asked one semester could be asked again the following semester. So we really wanted to capitalize on that; we didn't want to lose those conversations we were having. So we built the QuestionBot that could self-learn: as questions were being discussed and answered, the bot was self-learning. When a similar question was asked the next time, it had the capability to come back and say, well, I think this question has been answered before, and if you want to read the thread on this discussion, we'll point you to it. That was basically the birth of QuestionBot. We then started to add on a lot more new features, again learning from students' behavior. We started to realize that students were taking photos of their assignments and asking questions: how do I solve this problem, or what do you mean by this? So we started to put QR codes on assignments, and the bot was able to read those QR codes. We then started to iterate, and we worked out that we didn't want to always answer students' questions; we wanted them to learn. So we started providing them hints, and the bot started to have one-on-one chats with the students. And then we started to tap into other knowledge bases. We started to tap into Stream, as an example, as a knowledge source: where a particular problem was discussed in a lecture, we were able to search in that knowledge base and answer students' questions, or point them to exactly when that discussion was.
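The self-learning behaviour described above, matching a new question against threads that have already been answered and pointing the student at the existing discussion, is essentially a similarity search over past conversations. The sketch below uses TF-IDF cosine similarity to show the retrieval idea only; it is not the QuestionBot implementation, and the archived threads are made up.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical archive of previously answered discussion threads.
answered_threads = [
    "How do I derive the bending moment diagram for a cantilever?",
    "What does the lecturer mean by 'second moment of area'?",
    "Where can I find the lab test marking rubric?",
]

vectoriser = TfidfVectorizer(stop_words="english")
thread_matrix = vectoriser.fit_transform(answered_threads)

def find_similar_thread(question: str, threshold: float = 0.3):
    """Return the most similar answered thread, or None when nothing is close
    enough, in which case the question would go to the TAs instead."""
    scores = cosine_similarity(vectoriser.transform([question]), thread_matrix)[0]
    best = scores.argmax()
    return answered_threads[best] if scores[best] >= threshold else None

print(find_similar_thread("Can someone explain the second moment of area?"))
```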
So when you were thinking about adding those knowledge bases in, were there certain things you then had to go back to and make sure of? Say, for example, in those videos, that they were being set up in the correct way, that they were being tagged correctly? Were there certain things you had to almost close the loop on, to bring it back together?
Yeah, always. When we're using data we have to make sure it's in a consumable format and that it's got the right metadata associated with each student. Particularly, our aim and goal was to provide that personalized student experience. So whatever data we were collecting, we also had to map it and match it with the student metadata we already had. David did a really good job of actually collecting all of this data that we were able to correlate with, and then we built the algorithms around it.
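That "map and match" step, tying each newly collected signal back to the student metadata already held, is at heart a join keyed on the student. A minimal sketch with pandas, using invented column names and values:

```python
import pandas as pd

# Student metadata already held (columns invented for the example).
students = pd.DataFrame({
    "student_id": ["s001", "s002", "s003"],
    "cohort": ["2019-T3", "2019-T3", "2019-T3"],
    "attendance_pct": [92, 61, 78],
})

# Newly collected activity data, e.g. per-topic quiz results.
quiz_results = pd.DataFrame({
    "student_id": ["s001", "s002", "s003"],
    "topic": ["beam bending", "beam bending", "torsion"],
    "score": [0.85, 0.42, 0.67],
})

# Join on the shared key so every new signal is tied back to a student record,
# which is what makes the later personalization possible.
combined = quiz_results.merge(students, on="student_id", how="left")
print(combined)
```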
Fantastic. And you talked about that personalization piece there as well, because I think that was something we didn't cover a lot in the last episode, but it comes up when you look at his fuller video and some of his online talks. That personalization piece at the end was really interesting, because from an educational point of view that's where you do take the time. When you talk to teachers generally, or lecturers and academics, marking and assessment is something that really grates on them, and educators are so time-poor. So can you explain to us how that personalization...
And I'll tell you the bit I'm really interested in: when he was on stage after Satya,
yeah,
David said, "We produced a thousand personalized study packs." That's the bit that got the round of applause in the room:
we knew where students were succeeding, we knew where students had gaps, and we produced a thousand personalized study packs. That was one of those moments.
Exactly. And so, again going back to data, David was collecting a lot of very useful data. We had data on each topic every week, we had data on lab tests, we had data on block tests, we had data on student attendance, and we had historical data. So we were able to pull all of that data and associate it with the student metadata, to then work out whether a student was really good at this particular topic but not so great at that particular topic. That then allowed us to build those individualized custom study packs for every student, where they could focus on topics that are harder and that they haven't been doing very well on, rather than spending so much energy on topics they have been doing very well on.
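As a rough illustration of how per-topic scores could drive those individualized study packs: rank a student's topics from weakest to strongest and attach revision resources for the weak ones. The topics, threshold and resource lists below are invented; the real pipeline drew on weekly topic data, lab tests, block tests, attendance and historical results, as Sulabh describes.

```python
# Hypothetical per-topic scores for one student (0.0 to 1.0).
topic_scores = {
    "beam bending": 0.85,
    "torsion": 0.45,
    "shear stress": 0.30,
    "column buckling": 0.70,
}

# Hypothetical mapping from topic to revision resources.
resources = {
    "torsion": ["Lecture 6 recording", "Tutorial set 4", "Worked example 12"],
    "shear stress": ["Lecture 8 recording", "Tutorial set 5"],
}

def build_study_pack(scores: dict, threshold: float = 0.6) -> list:
    """Collect resources for the topics a student scored below the threshold,
    weakest topic first, so revision time goes where it helps most."""
    weak_topics = sorted((t for t, s in scores.items() if s < threshold),
                         key=lambda t: scores[t])
    pack = []
    for topic in weak_topics:
        pack.extend(resources.get(topic, [f"Revision notes: {topic}"]))
    return pack

print(build_study_pack(topic_scores))
```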
What about where that then connects into machine learning? So if you've got that data and you can personalize that learning (Ray struggling with a particular topic, say), that's quite a tactical use of the information. But strategically, when did you, or when can you, start to connect machine learning into this to predict?
You want to be careful, Dan, because every time we do an example of a struggling student, we use me. I'm going to be painting you as even more Dr. Evil going forward.
So where does machine learning fit into this?
Yeah, so we're using machine learning algorithms basically to work across a large data set that captures the relationships between the various scores a student has been graded on, as well as the different metadata we've already collected for each student. We've actually taken this a bit further most recently and created a proof of concept using machine learning to automark students' assignments. What we've been able to do with that is mark complex, hand-drawn engineering drawings. We've been able to predict what the student's scorecard should be, and we've been able to pre-populate that in Teams Assignments. We've actually found errors in human-marked assignments as well.
Wow, that's fantastic. I recall research from maybe five or ten years ago saying that human markers are very inconsistent, and so machine marking is more accurate and more consistent. But I think the issue is that, as humans, we all think we're above average. You know, go and ask a hundred drivers: every single driver is an above-average driver. We tend to rate our capabilities higher. So this issue around automated marking is really interesting, because it's quite a contentious human issue, not just a technology issue. Every time I hear the stories David is telling about what is being achieved within those two-thousand-student courses at UNSW mechanical engineering, I think about the next part of the problem, which is that there are 17,000 students in mechanical engineering. He's only got a seventeenth of the students coming through his year one course. There are 60,000 students at UNSW. There are 1.3 million Australian university students. There are hundreds of millions around the world. So how do you scale from one person passionately driving a project that is making a difference? How do you take that and scale it out? Because that has always been the problem in education technology.
Yeah, very good question. So we have received a lot of interest, locally and internationally, from a number of education institutions who have been wanting to do this similar sort of thing once they saw what David had done for the School of Mechanical Engineering. We've created a learning and teaching assessment, which basically allows us to learn the business objectives and the key goals of the organization, and what they're looking to do in terms of improving the student learning and teaching experience within that education institution. We conduct a number of assessments related to data and AI, and related to Teams, security and identity, which are all part of the overall journey we've been on with David. That allows us to then come up with a road map and a blueprint on how this can be done at scale for a number of courses across the university, and we're working with a number of universities already on this plan.
It's really interesting because, like many projects I guess we've all been involved in, sometimes the technology can be the easiest bit of it. It's the organizational change, the cultural change, getting the users to change that can be the biggest challenge. In the case of what's happened with David, David is the user as well as the person building the thing. But it's then, how do you scale it out? I remember in the early days of interactive whiteboards, some users made a massive difference, and somebody then said, well, let's put one in every classroom in the country. And of course the same passion didn't go with it, because people weren't on board the journey in the same way the initial users were. So when you think about a university wanting to take the lessons from this and then scale it out, we can do the technology piece reasonably easily.
Yep. Relatively easily.
but what you were talking about in that assessment is about the organizational readiness.
Correct.
And is there also an issue around the data as well? Like I said, there's a database of questions that have been answered for students that gives you the information you need to train. How do you find other organizations in their view of the data they've got available to use?
Yeah. So every organization uses different systems. They store data differently, they have a different architecture, and so on. But to your point, this solution is only successful, we believe, when an academic is involved and when deans are involved. It's not an IT project; well, it's an IT project from a data point of view, for us to be able to connect different systems together, but from a business point of view it's really all about academics. The data part and the technology part we can solve. As I said, we do the learning and teaching assessments, where we build a technical understanding of how the different parts are connected, or should be connected. But we really start the learning and teaching assessment by getting everybody on the same page: getting the TAs, the lecturers and the course coordinators all on the same page regarding what the overall objective is. If they're committed to it, then we commit with them, and we start to work hand in hand, in partnership.
And that data source thing: often, when I'm in a conversation with a university around data, they'll think about their traditional data sources, you know, the student information system that tells you which classes and groups students are in. But it sounds like there's a lot more, softer, data being used in your work.
Correct. So yes, we are using the data from the learning management systems and so on. We are using information from the student calendar systems, we are using students' enrollment information, but we are also using information on how they're currently using Office 365, as an example: how they're already communicating on Teams, what their identity looks like, and whether there are any security considerations we need to look at when we are looking at student data. All of those are different aspects of the solution.
And I think one thing that jumped out at me when David was talking last time was that integrated approach. He was talking about the kitchen analogy, which is interesting: getting different ingredients from different kitchens. And I think we're in that position in education. Do you think, just generally from that project you've done, do you see systems bringing data together now? Where do you see that going? If you're advising universities, or educational institutions like schools, about what they should do with their data, based on the fact that you've seen the output and the impact, what would your advice be?
Yeah. So whenever we're speaking with any education institution, we're speaking with the end goal in mind, and that allows us to really take that holistic view. Universities and education institutions are by nature complex. Every faculty, every school has its own rules. They have different funding mechanisms, they have different numbers of students. All of that comes into the equation. So I don't believe there is a single rule to connect all of the systems together, or to solve all the problems. But if you look at the problem by faculty, for the specific courses you're looking to roll the system out to, you can then take that holistic approach and apply it to a handful of faculties or a handful of courses, and then you can use that to scale it at a larger magnitude.
I think there's also an interesting conversation about what the data is, within a university context for example. So do you know what the biggest database in a university is? Aside from, say, the Square Kilometre Array or a big research project, what typically is their largest database?
School information system?
No.
Uh, video?
Yeah, it's the lecture recordings. It's petabytes of data of lecture recordings, but people don't tend to think of that as data because it's a video, it's a lecture recording. But if you turn that into data, suddenly you've got a wealth of stuff. And that's what you're doing, isn't it? I think David's using Stream to create the videos. Correct.
And then Stream is turning that into data by saying, okay, what are all the concepts talked about, what's every word that's said in here? And turning things like video into data allows you to then actually start treating it as data.
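Ray's point is worth pausing on: once a recording has a transcript with timestamps, "where was this discussed?" becomes a search problem. A toy sketch of pointing a student to the moment a concept was covered, with an invented transcript:

```python
# Hypothetical lecture transcript segments: (start time in seconds, text).
transcript = [
    (120, "Today we start with the definition of shear stress in a beam."),
    (540, "The second moment of area tells us how stiff the cross section is."),
    (1310, "Let's work through a torsion example on a circular shaft."),
]

def find_moment(query: str):
    """Return the start time of the segment sharing the most words with the
    query, so a student can jump straight to that point in the recording.
    Returns None if no segment shares any words with the query."""
    query_words = set(query.lower().split())
    best_time, best_overlap = None, 0
    for start, text in transcript:
        overlap = len(query_words & set(text.lower().split()))
        if overlap > best_overlap:
            best_time, best_overlap = start, overlap
    return best_time

seconds = find_moment("when did we cover the second moment of area")
print(f"Jump to {seconds // 60}m{seconds % 60:02d}s in the recording")
```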
Extremely powerful, isn't it? And that's where that integrated approach, the tools that people are using and the way that data is set up, is really important. Yeah.
Yeah. And I think it requires an expanded way of thinking about the assets that are there. I mean, maybe it's not the case with universities, maybe it's not the case with schools, but certainly with a number of organizations I work with, I'm pretty sure that their data is worth more than the organization. You know, some companies I've worked with, gosh, their data is worth way more than all of the people and everything else put together, because the data has got incredible insights in it. And I think the same is true in education organizations and teaching institutions: the data that is there is incredibly useful, but probably hasn't been captured, because we think about the data as, as you did, the student information system, what's in our LMS...
but actually the data that's being captured is huge
So where is this technology going to take us? If you think about one of the most complex issues, one of the things that keeps most deans up at night is the marking. The amount of money that is being spent on contract marking is absolutely ridiculous,
and that's where we think we can really use machine learning and artificial intelligence to solve that problem in simple steps. Now, you can't use machine learning to automark everything, but there are certain types of questions and certain types of courses that it can be really easily applied to, and it can actually take on a lot of the workload of the faculties.
That's so interesting. On that point, you mentioned earlier on that you've started to put automarking in for engineering, for actual handwritten work. That's phenomenal. When I think OCR, optical character recognition, I think of taking in text you've written in simple alphabets and numeric information. Can you tell us a little bit about the future of that marking in terms of quite complex engineering and mathematical notations?
Yeah, so machine learning is now almost available at your fingertips. There are already pre-built models that you can apply, and if you pass thousands of data sets to the machine learning, it's then able to predict, for example if you're drawing an engineering diagram, how a particular curve should be measured, where the x-axis is, where the y-axis is. So we can use those pre-built models and really apply them to solve those problems.
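To make the curve idea concrete: once a hand-drawn curve has been digitised into points (the hard part, which the pre-built vision models handle), the grading step can be as simple as comparing those points to the reference solution. The sketch below is a toy version of that final step only, not the proof of concept Sulabh describes; the tolerance and mark scale are invented.

```python
import numpy as np

def mark_curve(student_points: np.ndarray, reference_points: np.ndarray,
               tolerance: float = 0.5, max_marks: int = 10) -> int:
    """Score a digitised hand-drawn curve against the reference curve using
    mean absolute deviation: a perfect trace earns full marks, and marks fall
    off linearly until the deviation reaches the tolerance."""
    deviation = np.mean(np.abs(student_points - reference_points))
    fraction_correct = max(0.0, 1.0 - deviation / tolerance)
    return int(round(max_marks * fraction_correct))

x = np.linspace(0, 10, 50)
reference = x ** 2 / 10                                   # the model-answer curve
student = reference + np.random.normal(0, 0.05, x.shape)  # a near-perfect attempt

print(mark_curve(student, reference))  # high marks for a close trace
```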
Wow, that's fantastic.
It's taking some of the magic out of it. The thing I hear most commonly from my colleagues when they've heard David Kellerman talking about teaching engineering at UNSW is, and I see this quite often online, people go, "Wow, I want to go back to uni to study engineering now." And part of it is about the science, and part of it is about the art of teaching and everything. It's fascinating to realize that actually you're putting this technology layer in place that is removing some of the drudge and providing more time to add the value. If I think about marking, it's really interesting that you talk about the cost of marking and contract marking, because what I think the true cost of marking is, is that every time I'm having to put a number or a tick by something, I'm losing time that I could be providing feedback on something. Feedback is the thing that improves the future. Marking is the thing that tells you how you went.
And that's absolutely critical. It's quite interesting you mention that: I was speaking to a university on Friday last week, and it was quite an interesting conversation. I showed David Kellerman's Ignite speech and we were talking about the ways that could be implemented, and there was an interesting debate around: does that mean there's no need for TAs? Does that mean we don't need academics? You know, one of our questions, do we need teachers anymore? And it was very much about the diversion of the resource, and using the resource for better impact, rather than removing that resource per se.
I think we've probably talked about this before; if we haven't talked about it on the podcast, I've talked about it heaps before. I believe that the Silicon Valley view, which is that we can get rid of teachers because teachers are the variable, so if we can replace teachers with robots then education's going to be great, so misunderstands the role of a teacher as a mentor, a person who is there to set an enthusiasm for learning for life, not just to impart knowledge into students. And so I guess, for me, this whole story is about a passion for teaching and learning, not a passion for technology to allow me to not have to do a bunch of things. It's about technology enabling a whole new set of scenarios and giving people the time to add value, rather than just
absolutely
giving people more time to do something else. It's how do you add value? Feedback is an essential part of the marking process, because that's the thing that improves things.
And if you think about it from a student point of view, if I can get my grades much earlier, if I know what I'm good at and what I'm not good at, it gives me more time to prepare for my final exam. So, you know, automatically it's making a difference to my life as a student.
Yeah. And making you a better engineer, as David would say; he wants people to be changing society and changing the way they travel and interact with technology and mobile phones and whatnot. So, for our listeners, where do they go next?
So, like I said, we are conducting a number of learning and teaching assessments at the moment. So, for institutions who want to take on this journey, please do reach out through Ray and Dan, and we'll be happy to discuss with you what your journey looks like, what your business objectives are, what your drivers are, what your strategic plan really is, and how we can then come in and use technology in simple ways and leverage the data that you already have to really improve and provide that personalized student learning experience.
I think that's a really important point, because, you remember, at the end of the last podcast David said, "And we're giving it away for free." That was kind of his message: it's going to be on GitHub and you can go and do it. It's not there yet, by the way. So, I'm still waiting for that link...
depending on when you listen to the podcast.
It's still not there. But what I think, actually, is that technology is only part of the problem, or only part of the solution. We can give you a technology thing, but it doesn't really address the complete thing, which is: how do you get the culture change? How do you get people responding and doing things in the right way? And so that assessment piece isn't just about "have you got the data, are you ready for the technology, have you got this piece of technology, tick, tick, tick, there you go". It's about how you get the change across the organization. It's as much about people as it is about technology. And sometimes having somebody in, having that independent conversation and asking some of the tough questions based on what's happened at other places, is the bit that helps you make the transformation. Certainly we've seen that around loads of other technologies over our careers, haven't we?
Yeah, absolutely. Thank you so much for coming in today and speaking to us about that technology road map and the way you've worked through it. It's nice to hear David's point of view, and it's nice to hear the technological point of view, to see how that actually came to life. And on behalf of all the students at UNSW who, I'm sure, have their personalized learning journeys supported by you guys and the technology you've put in place: thank you.
No, thank you for having me. Thank you. Really appreciate it.
Thank you. And you talked about machine learning models; I almost feel like next week we should go back a stage, to almost episode one or two, and talk about some of the technologies and relate them back, because now we've got some great examples. We had some early discussion about the technology; maybe we should just put those two things together in the next episode.
Yeah, that sounds fantastic. Thanks.
Thanks, Dan. See you next week.
Thanks again.