The Edge

AI Development In Process with Yuna Buhrman and Sinem Aslan

February 22, 2024 ISTE Season 2 Episode 13

Join Georgia and Jessica as they explore artificial intelligence, how it has developed over time, and some considerations about how we develop AI usage in our society. Joining them this week are AI experts Yuna Buhrman and Sinem Aslan.

Georgia Terlaje: [00:00:00] It's time for The Edge, a podcast brought to you by ISTE community leaders. Whether you're a seasoned educator, a visionary administrator, or a passionate education enthusiast, fasten your seatbelts, because this podcast is tailor-made for you. Get ready to embark on an exhilarating journey as our community leaders take you behind the scenes and into the dynamic world of education.

Georgia Terlaje: The episodes ahead will unveil stories from the front lines, showcasing the relentless dedication and innovation that fuel the transformative field of education. Buckle up and brace yourself for an adventure. Coming up today, we're going to be discussing AI in the classroom. I'm one of your community leader hosts, Georgia Terlaje.

Georgia Terlaje: I'm a TK-5 instructional coach and educator of 35 years. And I'm here with my favorite partner in crime, Jessica Pack.

Jessica Pack: Thank you, Georgia. You are absolutely my favorite, too. And even though no one can see us, we happen to be matching in some fabulous cheetah print today. [00:01:00] So, I'm loving it. I'm Jessica Pack, a middle school teacher and an ISTE author.

Jessica Pack: Today's episode is going to be pretty awesome, because we're going to keep building our understanding about the uses of AI in the classroom, which I think we can all agree is a pretty hot topic. We are very fortunate to be joined today by some incredibly knowledgeable folks. First, let's welcome Yuna Buhrman, who is a learning professional who currently works at Intel. Yuna, thank you so much for being here today.

Yuna: My pleasure. Thank you.

Jessica Pack: And you brought along another special guest, correct? 

Yuna: Correct.

Jessica Pack: Would you mind introducing them for us?

Yuna: Absolutely. I am so excited to introduce my colleague here at Intel, Sinem Aslan.

Yuna: Sinem is a senior research scientist here at Intel Labs.

Georgia Terlaje: Welcome to The Edge, Yuna and Sinem. Could you give us a little insight into how you met? Like, what's your origin story?[00:02:00]

Yuna: I've never been asked for an origin story. I feel like I should be wearing a cape or some kind of superhero thing, but it's really very straightforward.

Yuna: I have been in education and learning for many years. And when I started at Intel, much like everybody else, I got onto LinkedIn and thought, hmm, who's around here? Who are my peers here? And I came across Sinem's name as a fellow educator in education and technology. And I reached out to her and said, hey,

Yuna: you and I have a very similar background, can we connect? And she's a fabulous person. She did. And we've been discussing education and technology and its application in a variety of fields, not just higher ed. Her research is in elementary schools as well. And that, I would say, is our fantastic origin story.

Jessica Pack: That's awesome. Well, thank you so much for both of [00:03:00] you just giving your time today and your expertise. We're super interested in how AI has already started to sort of transform the traditional classroom experience. And we're wondering what types of examples you've seen and the impact AI has had in the classroom from your point of view.

Sinem Aslan: So maybe I can get started. I like that you used the word "started," because that's really where it is right now in terms of AI; it's just started to transform classroom experiences. And if you look at it, there is a lot of research and development going on in this space.

Sinem Aslan: But apart from that, we also see many products coming out that are readily available for our students and educators to use. And if you look at the spectrum, there are many applications, from translation to language learning. I don't know, maybe you [00:04:00] have seen some examples, but now a teacher who doesn't know Turkish, for instance, can teach a Turkish student using generative AI technologies, through video technologies.

Sinem Aslan: There are also some applications around writing. You have probably heard about Grammarly, which helps users while they are writing an essay, suggesting different ways to craft the essay, et cetera. And there are some applications in early childhood education, including one of the projects that we are working on, Kid Space, and maybe we will talk more about that in this conversation as well.

Sinem Aslan: And there are also some tools that help teachers, including creating lesson plans, right? Helping them create lesson plans for teaching specific subjects. And of course, one of the biggest [00:05:00] areas is personalized learning, which has strong connections to

Sinem Aslan: one-on-one tutoring. So we see a lot of conversational agents coming up, including Khanmigo by Khan Academy. These are all different applications that are kind of emerging. But as I mentioned, I think there is going to be exponential growth in the coming years. So, do you want to add anything else?

Yuna: I love that you brought up the Khanmigo aspect, because it's one of those things where Khan Academy has been around for a while and it's offered to everybody around the world. And what Khanmigo is, it's like a little chatbot, and it will encourage the students. So if you enter a response, the student might see a message like, "That was great.

Yuna: What if you think about this? Or what if you do that?" [00:06:00] And so it is really using that natural language processing, how somebody would understand words. It's not cybernetic; it's not like you're talking to a robot. But Khan Academy is so pervasive in so many households now that to have an AI tool you can access from your home or your classroom, I think, really shows that AI is emerging, but it's emerging here to stay.

Yuna: And when we think about where else we use AI, like chatbots, I mean, we use it every day in our everyday world. We use facial recognition to unlock our phones and things. So I love that it's coming now into education, and with Khanmigo, and I'm a big fan of Khan Academy, just to see that shaping up,

Yuna: I think it's really exciting. But also, I do want to point out that the term AI in the classroom has been around for [00:07:00] a while. If you look at some of the research, there have been papers written as far back as the 1990s about using different forms of AI as maybe a tutor or that guide on the side.

Yuna: And so I think with things like ChatGPT and other high-profile tools, we're now maybe sitting up and taking more notice, but AI has been growing and growing, and it's been around. There are lots of tools that claim they can help teachers grade papers, for example, or maybe build your rubric and then make sure that students are applying all the points of the rubric to the homework.

Yuna: So I think it's here. It's probably here to stay, and it's probably in more places than people have recognized, because it's been around for that long and they don't notice it.

Georgia Terlaje: And I think with ChatGPT's big [00:08:00] coming-out party a few months ago, these things are much more on teachers' radar than they were in the past.

Georgia Terlaje: So with that in mind, what are the ethical considerations when implementing AI in education and in the classroom, particularly around data privacy and bias? What are things that educators need to keep in mind when choosing some of these tools?

Sinem Aslan: I think it's not only about education; whenever we talk about AI, these types of ethical considerations come into play. But it's especially sensitive in education because of the younger population and the potential risks that can affect their growth. So when we look at some of these considerations, we see issues around privacy, like you mentioned, right?

Sinem Aslan: Especially with [00:09:00] face recognition or recommendation systems, that can cause some privacy-related issues. There are also other issues around bias and potential discrimination, right? Especially gender-related or racial biases that can happen through, let's say, automated scoring systems.

Sinem Aslan: And one of the other, maybe bigger umbrellas is surveillance, right? The sense of being watched all the time, for students, and this applies to teachers as well, because we see in the research that there are many studies focusing on teacher behaviors in addition to student behaviors.

Sinem Aslan: So in order to create those personalized learning experiences, these systems need to monitor students in real time. So surveillance is becoming an [00:10:00] issue, along with the sense of privacy. And there is also the autonomy aspect, right? Through these predictive systems generated by AI, we might be jeopardizing students' autonomy to govern their own lives going forward. So there are, I think, these ethical considerations in place. And at Intel, I would like to highlight that, to address these ethical considerations,

Sinem Aslan: we have an ethical AI council. And in this council, a group of people, from ethicists to AI research scientists to program managers to social researchers, so a wide range of people, are evaluating every project that we do at Intel to make sure we put enough guardrails in place to address these ethical considerations.[00:11:00]

Sinem Aslan: And I think these types of considerations should not come after the fact, right? They need to be considered and addressed at the start of the project. So that's what we are doing at Intel. Would you like to add more from your perspective?

Yuna: Sure. And I'm glad that you brought up this concept of the council.

Yuna: I'm familiar with it also. I think we have lots of publicly available information on how we are tracking our ethical use of AI. In general, I would say that there are two big buckets we have to keep in mind when we're talking about AI, and I would say those are stewardship and oversight, because AI is so very broad. I love that term you used, Georgia, the coming-out party for ChatGPT.

Yuna: And what ChatGPT is, is that generative AI that can scrub information, then [00:12:00] learn, and then spit something out. But there's also the AI where you put together a model and you're supplying the data to build the model. And this is where I think a lot of stewardship and oversight is needed, because as you build those models, as you build those algorithms, as Sinem said, you want to have a diversity of voices. And I believe it's the Department of Education that's been saying that one of the main considerations now is making sure the human stays in the loop.

Yuna: So as we're building these models, as we're training our AI, we have to have that human voice, which is that council concept that you were talking about, Sinem. There has to be somebody who's looking at where that data comes from. It's not coming out of thin air. It's being piped into those intelligent systems somehow.

Yuna: And so the stewardship is to make sure that we have the diverse voices, and then I would say the oversight is to make sure that the data that is collected isn't used to the student's detriment. [00:13:00] So for example, say you are using something like facial recognition and you're checking how many times a student goes to the library.

Yuna: Well, could that data now be used to determine whether or not you get a scholarship? Because, you know, if you had known that, maybe you would just keep going in and out of the library door. You don't know, right? So once this data is captured, how is it being used?

Yuna: And do you even know it's being captured? It might be super easy for you to not have to carry around your ID to get into some of your university buildings. But is that then giving someone access to what your movements are, what your study habits are, and those kinds of things? And then we want to be very careful also about our underrepresented communities, where students might not have such ready access to that same library, right?

Yuna: If somebody is looking at the data, and student A doesn't have constant access to the library while student B does, and then they're being [00:14:00] evaluated for the same opportunities, that inequity right there, if it's not taken into account in the model, is not really going to help our students at all.

Yuna: It's just not taking into account the broad student base. So I think when we're thinking about AI and policies and privacy, just like we have FERPA, and we talk about FERPA and our digital privacy and we're very careful about who gets to see grades and those kinds of things, we're going to have to build a model like FERPA that will include AI.

Yuna: Where is that data coming from? Who gets to use it? Who gets to approve that it's even used? If I don't want to opt in, am I now not going to be counted for all these other opportunities and things? And so, yeah, there are a lot of considerations, but it does start with people asking the questions:

Yuna: Who's got my data? What are you doing with it?

Sinem Aslan: Yeah, and it also implies some sort of training as well, right? In that [00:15:00] sense, from the user's perspective, the educator's or student's perspective, right? Even on a day-to-day basis, we are just giving our consent for so many different things when browsing online, but it's really hard to track, right?

Sinem Aslan: So there needs to be more awareness around these ethical considerations from the user's perspective, in addition to the companies who are really building these systems. So how do we increase that awareness in society? That's a good question, I guess, right?

Jessica Pack: So ISTE is a community full of early adopters and people who are really excited to jump into the deep end of the pond. I know Georgia and I are both those types of people where we just jump in first and sometimes ask questions later, which can be really positive. And [00:16:00] also, you know, our conversation is making me think about that.

Jessica Pack: So for the average classroom teacher who maybe is really excited about bringing AI into their classroom and leveraging a lot of those tools, how can they vet the appropriateness of the tool? And also, how can they, you know, benefit from it?

Yuna: I can jump in here. I did quite a bit of that in my previous role, not for AI, but for digital tools in general. I would think that asking very basic questions, like I said, is the training. If there's a cookie on your browser, why do they need that data? Why do they need to follow where you're going?

Yuna: Why does anybody need to be able to sell you something afterwards? It takes work, and that's the thing: it is so very easy when you get to that website and it says accept all cookies. Sure, you can. It takes less than a second. But my kids always tease me because I'll go into the [00:17:00] cookie list and turn off the ones that I don't want.

Yuna: They always have very funny things to say about it. But, you know, it takes a moment to say, you know what, I will give you this part of my life, but I will not give you this part of my life. And so when you're a student in the classroom, for example, when I was working with a lot of instructors at the university, I would ask them: when your students go to use a free tool that's online, free tools are great.

Yuna: We love to be able to give students resources without emptying their pockets. But when they have to make an account to use that free resource, yes, you probably need an .edu extension on your email, but do you need to give them the same password you use to access your schoolwork? Because now you're exposing your very personal credentials to a website where you don't necessarily know what kind of security features they have.

Yuna: So a very simple thing to do is, yeah, use your .edu email, but use a very different password. And again, I go back to FERPA; it's a [00:18:00] good set of rules. Ask who's getting this data when you're working with the younger students. Obviously, if there's any kind of vision, any kind of recording, you want to ask, again, does that have to happen?

Yuna: Can we do something like use a virtual background? It does take time and it does take effort. But I have found that when you work with vendors and say, you know what, I really would like you to think about these things, they're not opposed to changing some features, because they might not have thought about it from a security point of view for your student.

Yuna: They're thinking about it as, what's the quickest way to get you in and get you rolling and get you learning? And so I love that you and Georgia are early adopters. I love that innovation curve. We need those early adopters, right, to make innovation go. But just take those five seconds before you log in or say yes, accept all the cookies, and ask, why do you have cookies here?

Yuna: Do I really need to be [00:19:00] tracked and have things open up on the same page when I come back here? So just asking those questions, taking that one extra step to ask where your data is going to go, and then, yeah, innovate, because that's what makes it happen. If we're all too scared to click go, it's never going to go anywhere.

Yuna: So I love that you're doing that. And I love that you're giving everyone the opportunity to hear more about it as well.

Georgia Terlaje: And I take issue with the fact that they call them cookies, because who doesn't want cookies, right? Call it something that I want to think twice about. That's all I'm saying. What advice would you have?

Georgia Terlaje: So like we had talked about, you know, Jessica and I do like to jump into the deep end right away and then figure it out as we go along. But maybe for other educators who aren't quite as brave or reckless as we are, what advice would you give to them as they maybe start to implement some AI in their classroom?

Georgia Terlaje: What are some questions you think they should be asking themselves, [00:20:00] or lenses they should be looking through, as they start to try some new things?

Yuna: Sinem, can I throw that to you? Because you actually did work with instructors on implementing AI tools.

Sinem Aslan: Sure, yeah. So we previously implemented AI tools together with teachers in various research projects.

Sinem Aslan: And of course, since these technologies are relatively new, some of the time they don't have enough information about these technologies. They don't know how they work and all of that. So for me as a researcher, the critical part is to really first train the teachers that I'm working with.

Sinem Aslan: What are those technologies? What data do we collect? What data do we process, and how are we using that data, from a research project perspective? I'm not talking about a product perspective, but how are we going to use that data, and how are we going to delete that data after we are done, and [00:21:00] all of that?

Sinem Aslan: So it's kind of a training process, and it's more participatory design together with the teachers. And along the way, when they have any questions, we address them, and we also create opportunities for them to experience these new technologies firsthand.

Sinem Aslan: So we do demos with them. And again, new questions can pop up, and we would address those questions along the way. And usually, in addition to teachers, students have the same curiosity as well, right? For them, these are really very cool technologies.

Sinem Aslan: They are interested in learning more about them, and they also ask really good questions. So I think we need to provide platforms where both students and educators can pose these questions and have them addressed. To be honest, from my own experience, I see that teachers are always thinking [00:22:00] about their students: how can I make the best of my time to help them learn and to see them grow? And after they see that, hey, this technology can really help me support this or enable this,

Sinem Aslan: then they open up as well, because if you look at the literature, you see that there are also some barriers, where teachers sometimes do not want to incorporate new technologies. But I think the critical aspect is, do they really see the need, right? Do they really see the value of this new technology?

Sinem Aslan: Once they see it, from my experience, they always open up, and they're very much like, okay, let me try and see how it's going to go. So I think those are some of my experiential observations on this topic.

Jessica Pack: Do you have any final thoughts that you would like to share with our listeners regarding AI and the [00:23:00] classroom? 

Sinem Aslan: I think one thing that I see, as with all technologies, right, is that we should not incorporate technology for the sake of technology. And if you look at how current educational systems work, we utilize a one-size-fits-all paradigm where we keep time constant and expect all students to meet the same learning outcomes at the same time.

Sinem Aslan: But consider the foundational differences among learners. The students who are performing really well will need to wait for others to catch up, and the students who are lagging behind will need to do a lot to get there; they will just stay behind. Therefore, I think personalized learning is very important to address the current major issues in education, and personalized learning at scale is impossible without the use of technology and without the use of AI [00:24:00] technologies nowadays.

Sinem Aslan: So therefore, I feel like there's huge potential in AI technologies that can help with personalized learning and make sure we provide the right conditions for all students to learn. I believe that as we progress and as these technologies get more mature, we will see this come to fruition on the educational outcomes side as well.

Sinem Aslan: But we always need to keep in mind that, as technologists, or as researchers or educators, we need to use these new technologies in a responsible way, through responsible AI. So these are my final comments. And yeah, thank you.

Yuna: I just want to say a hundred percent to everything that Sinem said. I love that concept of personalization at scale. If we want to educate the world, we've got to [00:25:00] figure out how to do that, how to do it quickly and inexpensively. The other thing, just my closing observation, would be that many years ago, when e-learning, distance learning, whatever you want to call it, became a thing,

Yuna: there were many similar conversations, right? I think it was even Garrison et al. who went so far as to make a model to prove that you could, through computer technology and communication, communicate in a way equal to somebody talking to you directly, to make sure that online learning actually worked. And look at us today.

Yuna: We do so much online learning. There have been so many people who have been educated because of this phenomenon of learning at a distance, and I think what that shows is that there's an evolution that can happen, that did happen. And I think there's an evolution that's going to happen with AI as well.

Yuna: It's the new tool of the [00:26:00] day. It's not well understood, but I think lots of people can see the potential, lots of people who jump in recklessly. I won't name names, but you know, there is this hunger to be more efficient. Like you said, Sinem, there's only a certain amount of time, and I have so much to teach you.

Yuna: How do I do that? You're going to need a tool, and technology is just a tool, and AI is just a tool. And I'm so interested to see this evolution of the discussions and how it's being used, and even thinking ahead, like, will this change learning objectives in the future? Will an essay class now have a different perspective?

Yuna: Will it be, how can you research, instead of just writing an argumentative paper or a straight-up research paper? With this tool in your pocket, student, what else can you do? And I'm excited to see this evolution, because I do think we're now in this [00:27:00] revolution of knowledge, this 4.0 kind of world.

Yuna: And you know, I think it starts with asking how we're going to use it and how we keep everybody safe, but also what else it can do for us. So I'm excited to see where it goes, and thank you for allowing us to talk about it for a while. It's great.

Jessica Pack: Yeah, I think you both are just right on target with the idea that time is a teacher's, or any educator's, most valuable commodity, and anything that we can leverage to really

Jessica Pack: make the most of the time that we have within the workday is fantastic. This has been such an insightful conversation. Thank you both again for being here. Before we let you go, where can listeners connect with you to continue the conversation or maybe ask follow-up questions?

Yuna: I'm on LinkedIn. You can look me up.

Yuna: Yuna Buhrman, B-U-H-R-M-A-N. I don't think there are a lot of Buhrmans out there on LinkedIn, so I'd be happy to hear from everyone.

Sinem Aslan: It's the same here [00:28:00] as well. LinkedIn is my professional account. I'm also very active on Instagram as well.

Jessica Pack: That's perfect. Well, thank you so much again, and that just about wraps up this episode of The Edge podcast. We hope you had a great time. My name is Jessica, and you can find me at packwoman208 on Twitter, Threads, and Instagram.

Georgia Terlaje: And I'm Georgia Terlaje, and you can find me at Georgia Terlaje on Twitter, X, whatever you call it. And you can find both of us at StorytellingSavesTheWorld.com.

Jessica Pack: On behalf of everyone at ISTE's The Edge podcast, remember to keep exploring your passion, fostering your creativity, and continuing to take risks.

Jessica Pack: All things that can bring you to the edge.