Thank you, and good morning. These days you have to come to Madrid to see the future of education. I think we have just seen the future of learning: students who are not just receiving wisdom from the past but who are creating their own projects, doing things, mobilizing their own ideas and realizing them. I think this is an amazing example. And outside this room, I have been fascinated by the projects and ideas that young people develop. What can I add? This is the future of learning. This is the future of education. And technology can make it happen.

Education will always be a social, relational experience. That is why I am never worried that technology will replace teachers. But technology will change the work of teachers, and it will change the nature of learning. As a teacher, the transmission of knowledge, the instruction, may become less important; technology will take over some of those functions. But what will become even more important is teachers who have a real passion for the ideas of their students, who help their students understand who they are and who they can become, and who accompany them on their journey as good coaches, mentors, facilitators, evaluators, and designers of innovative learning environments such as the ones you can see outside. That is really what I will talk about.

We are still coming out of the most disruptive period in education, the pandemic. Again, Madrid has managed that period really well, and many countries have looked to Madrid's experience of keeping education moving forward even in the most difficult times. But this time has also been a period of extraordinary technological and social innovation in education. Schools have woken up to the digital world that is radically transforming learning. And the future will always surprise us.
Climate change is going to disrupt our lives a lot more than the pandemic. And artificial intelligence is going to put to the test a lot of the things we take for granted. It is easy to educate second-class robots, people who just repeat what you tell them. But what is going to make us human in a world in which the things that are easy to teach and easy to test have also become easy to digitize and automate? Technology pushes us to think harder about what makes us human. And this is not about technological competencies; it is much more about the social and emotional skills and competencies we need to be successful in this world. And there are plenty of other forces that disrupt our ways of working in education every day.

I was interviewed by a great group of students earlier, and the challenge today is that we need to educate young people for jobs that have not been created, to use technologies that have not been invented, to solve social problems we cannot yet imagine. Those are the challenges ahead of us. Once again, the kinds of things that are easy to teach, memorizing something and repeating it, are less important. This world no longer rewards you just for what you know. Google knows everything; GPT can answer every question. The world rewards you for what you can do with what you know. Technology-intensive tasks are on the rise, and when you put those two things together, that is the picture of the future of work. Some people say technology will destroy jobs faster than it creates them; other people say the opposite. The answer is not so clear, but the direction of travel is very clear. What counts today is the capacity to imagine, to live with yourself, to live with people who are different from you, to live with our planet. Those kinds of skills, the imagination, the creativity, the problem-solving skills, are the ones that will actually create our future.

You could say that education has always kept ahead of technological capacities, but there is no guarantee that it will continue to do so in the future.
Before the first industrial revolution, neither technology nor education made a big difference to the lives of most people. We lived on our farms, self-sufficient. But then the industrial revolution suddenly moved technology way ahead of people and created huge amounts of social pain. People were not ready for it. Then we created schools, making people compliant with the ideas and norms of the industrial age, and that actually created generations of prosperity. But nowadays that is no longer good enough. You can see the digital revolution once again moving technology ahead of the skills of people, and we see the same kind of social pain: some people leave Spanish universities with a degree and find it difficult to get a good job, while at the same time employers say they cannot find people with the skills they need. That mismatch between what we do in education and what the world is looking for is becoming serious, and we need to turn things around and bring people once again ahead of the technologies of our times, with technology.

When you look at artificial intelligence versus humans, computers are still not so great at talking with us. They have a hard time with something that is very easy for humans: chatting, talking. But when it comes to answering questions, ChatGPT is pretty much as good as humans, and a lot faster. That is a really important point: if we teach students to remember answers, we will not be very successful, because that is something computers can do better. The bigger challenge today is teaching students to ask the right questions, to think about what is new. And when it comes to processing vast amounts of information, technology is way ahead of the skills of people. And everything keeps moving; that, I think, is the dramatic part.

One of the things we did recently: you are familiar with the PISA test, where we assess students, and we have done a similar test where we assess the skills of adults. When you look at literacy, most adults, around 90%, can solve very easy tasks, and about 70% can solve everyday tasks. But at level four, complex information, it is only a minority of people who can process it.
So in 2016 we asked the question: what if we did not give these tests to workers? What if we gave them to artificial intelligence? And the computers were able to do pretty much as well as people, and on some tasks, particularly the more complex ones, they even did better. Then we did the same thing in 2021, two years ago, and we could see how dramatically the competencies of artificial intelligence had improved, surpassing basic human skills like literacy. Once again, that pushes us to think harder: what makes us human? How can we complement, not substitute, the artificial intelligence we have created in our computers?

If you look at these numbers, you can see that the digital world is now the real world for most young people. Fifteen-year-olds in Spain spend 30 hours per week online. That is their real world, the digital world. And a lot of that actually happens in school. The yellow bar for Madrid would be larger than the bar for Spain, because Madrid has gone very far in bringing digital experiences into classrooms. So keep that in mind: the digital world is the real world. We are no longer talking about the real world and the digital world; they have become completely integrated.

But are we ready for the digital world? That is another question we asked ourselves in the last PISA assessment, and the numbers are quite scary. You can see a few countries on the left, such as Singapore, Korea and parts of China, where a good half of the student population at age 15 can process complex digital information: distinguishing fact from opinion, contrasting and critiquing information, asking good questions. But look at many of the other countries and you can see it is a minority. Most students are not ready for the digital world. We have taught them how to use the technology, but we have not been good enough at giving them the cognitive, social and emotional skills to actually use it well.
Just a minority of 15-year-olds are really good at working with complex information, particularly at distinguishing fact from opinion. Twenty years ago, when I was in school, that was not so important. When I did not know the answer to a question, I could look it up in an encyclopedia and I could trust the answer to be true. The information I would read and process was given, authorized, curated. Today, when you look things up on Google, you get 10,000 answers to your question, and nobody will tell you what is right and what is wrong, what is true and what is not. You as teachers have an incredibly important role: to help people navigate ambiguity, find out what the right answer is, and contrast different perspectives and opinions. That is the literacy of the 21st century, and you can see we are not yet good enough at it.

So what can the digital world offer us here? I think there is actually a lot. In the classroom, for example, technology can make learning so much more adaptive, more personal, more interactive. While you study mathematics on a computer, the computer can figure out how you learn mathematics and then adapt your learning experiences. There are amazing technologies that personalize learning experiences. They make learning accessible to students with special needs who never had that kind of chance in a standardized classroom setting. That is a really amazing development. But also for teachers: we can use technology to diagnose, to detect different learning patterns.

One of the most amazing experiences I had recently was when I visited the region of Shanghai in China and wanted to see a primary school where students learn calligraphy. That is one of the biggest headaches for teachers in China: in Europe we struggle with 30 characters, and they have to learn 4,000. It is a big task for teachers to teach students calligraphy, and it is not just a technique, it is an art for them; that is very important for teachers there. And while the students were drawing their characters very carefully, their tables had integrated scanners.
On their desks the students had their mobile phones, and the phones were giving them real-time feedback on the quality of their drawings. Students did not have to wait a week for the teacher to mark all the homework; they got that feedback immediately. But interestingly, the teacher in the classroom could see in real time how different students learn differently: where students have certain misconceptions, where they have good, creative ideas, where students struggle with similar problems, so that the teacher can group them together and help them advance. And what you could see there was that the teachers were great instructors, but they were also great data scientists. They were able to read the minds of their students in real time.

This is what technology can really help us do. At the task level, you will be familiar with the many software programs that now help students interact with tasks and give them that kind of feedback. We can also see it at the curriculum level, where programs scaffold learning experiences in new and novel ways. This is becoming the new reality. It is no longer one single textbook for everyone; every student now gets their own version of a textbook, a textbook that adapts not just to where they are, but also to how they learn.
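To make that idea concrete, here is a minimal sketch, in Python, of the kind of mastery-based logic an adaptive exercise program might use. The skill names, difficulty values, threshold and update rule are illustrative assumptions for this talk, not the algorithm of any particular product.

```python
# A minimal, illustrative sketch of mastery-based adaptive task selection.
# The tasks, difficulties and the simple moving-average update rule are
# assumptions for illustration, not any specific product's algorithm.

TASKS = {
    "fractions_basic":    {"skill": "fractions", "difficulty": 0.3},
    "fractions_applied":  {"skill": "fractions", "difficulty": 0.6},
    "fractions_transfer": {"skill": "fractions", "difficulty": 0.9},
}

def update_mastery(mastery: float, correct: bool, weight: float = 0.3) -> float:
    """Blend the latest answer into the running mastery estimate (0..1)."""
    return (1 - weight) * mastery + weight * (1.0 if correct else 0.0)

def next_task(mastery: float) -> str:
    """Pick the next exercise just above the student's current level."""
    candidates = sorted(TASKS.items(), key=lambda kv: kv[1]["difficulty"])
    for name, task in candidates:
        if task["difficulty"] > mastery:
            return name          # first task that still stretches the student
    return candidates[-1][0]     # already at the top: keep practising transfer

# Example: a student starts at 0.4, answers correctly, and gets a harder task.
mastery = update_mastery(0.4, correct=True)
print(next_task(mastery))
```

The point of the sketch is only the loop itself: observe an answer, update the estimate of where the student is, and choose the next task accordingly, instead of giving everyone the same page of the same textbook.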
That is the time in which we live, and I think that is the great opportunity teachers have these days. Technology has become an incredibly powerful tool to help students with learning difficulties. Dysgraphia is a great example: students who have difficulty learning to write in the way we used to teach it can be taught by the computer in other ways. For students with autism of different degrees, that interaction with technology can sometimes open up new worlds of learning that are very difficult to experience in the traditional classroom. For auditory or visual impairments, technology can give you alternative access to knowledge and information. And again, there is a risk that technology becomes a big social divide: some students have access to this, some classrooms do really well, and others are left behind. That is one of the risks. But there is also the opportunity that technology will help us close those gaps and give students very important, very real opportunities.

And then, clearly, technology can help us make learning so much more interesting and engaging. Why would you listen to a teacher explaining the result of a scientific experiment when you can do that experiment yourself in a virtual laboratory? Technology can help us put student agency at the center. Students can really do things, try things out, and you can see so many amazing examples outside. This is happening right here in Madrid, where students create their own projects and use technology-enabled environments to realize ideas they could never have dreamed of when I was a student. And this is, I think, so important. The biggest threat to education today is not that education is ineffective or expensive. No, the biggest threat is that education is losing its relevance to young people, that young people no longer see their own future in it. And I think technology can really be a great help here.
I have already talked about classroom analytics: using the data that comes from students' learning experiences to analyze how different students learn differently, and to embrace that diversity with more differentiated pedagogical practice. Again, there is huge potential here. Of course, that requires a systemic approach to technology, and we are still quite far from that in most European countries. The best place to visit is Estonia, which is very far advanced in providing that systemic architecture, so that the technology is compatible and integrated and you can actually use and analyze those kinds of data.

So you can see classrooms as digital systems. You use the patterns you observe, sometimes what students do, sometimes their expressions; all of that is now available. You can then create information that helps teachers see whether their students are really interested, whether they get bored, whether they advance, whether they fall behind, and so on. Teachers can use that for monitoring learning progress, for intervention, for sharing information, but also for grouping students and building teams around common challenges and common interests.

I will just show you a couple of examples that I found really interesting. One of the greatest challenges every teacher is familiar with is: how fast should I progress in my lesson? If I go too fast, I leave some students behind. If I go too slowly, many get bored. You all know that; it is the common experience. So how do you manage it? Data systems can become very good support for it. Here you can see an example: the technology looks at where different students are. It can tell the teacher that, at this point, about 60% of your students have mastered this level and 40% have not, and that if you continue for another ten minutes, you will get maybe 90% of your students on board.
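Here is a minimal sketch, in Python, of the kind of pacing signal such a classroom data system could compute for the teacher. The mastery threshold and the projection of how quickly more students catch up are illustrative assumptions, not a description of any real dashboard.

```python
# A minimal sketch of a pacing signal built from per-student mastery flags.
# The 80% target and the assumed gain per extra minute are illustrative only.

from dataclasses import dataclass

@dataclass
class StudentState:
    name: str
    mastered: bool          # has this student mastered the current level?
    progress: float         # 0..1 estimate of how close they are

def pacing_advice(students: list[StudentState],
                  target_share: float = 0.8,
                  gain_per_minute: float = 0.03) -> str:
    """Tell the teacher whether to move on or give the class a few more minutes."""
    share = sum(s.mastered for s in students) / len(students)
    if share >= target_share:
        return f"{share:.0%} have mastered this level: you can move on."
    # Roughly project how long until enough students catch up.
    minutes = (target_share - share) / gain_per_minute
    return (f"Only {share:.0%} have mastered this level; "
            f"about {minutes:.0f} more minutes should bring ~{target_share:.0%} on board.")

# Example with a small class: 6 of 10 students (60%) have mastered the level.
demo = [StudentState(f"s{i}", mastered=(i < 6), progress=0.5) for i in range(10)]
print(pacing_advice(demo))
```

The value is not in the arithmetic, which is trivial, but in the fact that the teacher sees it live, while there is still time to adjust the lesson.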
So as a teacher, you get really good feedback on where your students are and how you can pace your classroom. For experienced teachers that is not such a challenge; it is what you become very good at after a long time in teaching. But for novice teachers it is a big challenge, and technology can really help with that.

Another thing is strengthening teachers' self-awareness. Technology can show you which students you spent a lot of time with and where you spent very little time. Maybe you did that intentionally, because you wanted to help a specific group of students. But sometimes it is unintentional. We all have a lot of unconscious biases in how we move around and interact with people, and technology can make you more aware of some of them: where you spend your time, and whether you distribute it in the way you want.

The most controversial topic everywhere is: should robots enter the classroom? Some people say, yes, that is an amazing opportunity. Other people say, no, this is the end of a good education, which is a social experience. And I am skeptical too. I really believe that learning is a social, relational experience; the heart of learning is deeply personal. We saw that during the pandemic: what kept students moving was the interaction with their teacher, that phone call, that video link that kept people together. But despite that skepticism, we should look at some of the opportunities.

Language learning and robot tutors. You are champions of language learning here in Madrid; it is something this city does better than most places in the world, and you have great experience with it. One of the things you know very well, and that most researchers have figured out, is that the one way you do not learn a language is by having a teacher in front of you talking about grammar and vocabulary. Language is something we learn through interaction, by talking, by experiencing, by communicating. There is a lot of data showing that. But it is very hard to do in a big classroom setting.
You have 30 students in front of you. Well, that is where robot tutors do amazing work. This is an example from the Netherlands where robot tutors let students interact with them to learn languages, and the results are very good. That can be a great complement to the classroom experience; we must concede that this is something students like, where they can express themselves and become really good.

Here is an interesting example from Japan. One of the things we have found is that sometimes you learn faster when you explain something to somebody else than when somebody else explains it to you. That is very difficult to do in a classroom setting, because you have 30 kids in front of you. But once again, robots are quite good at that. They can be good listeners, they can respond, and students can explain themselves, and by explaining themselves they gain that deep conceptual understanding. Because that is the difference: if someone explains something to you, you can memorize it and keep it in your mind, but you will have forgotten it as quickly as you learned it. When you explain something to someone else, that is when you internalize it, when you build that conceptual understanding. Once again, this example from Japan shows that technology, even if controversial, can help us facilitate some of those processes.

Telepresence is something that became very popular in the pandemic, when students and teachers could not interact. You can use the cheap version, Zoom links or Teams links, and that is not such a great thing; but you can also use real telepresence, bringing people into classrooms or bringing teachers from other places into other locations. I think this is, again, a future that we discovered during the pandemic and that is suddenly there for us to explore.
Big data. In Spain you still have a big issue with students dropping out of school, and that is true for many countries. But students never drop out of school suddenly. Most dropout has a long history, and we often do not understand the beginnings of that history: when students lose their interest, when they have difficulties at home, when they have social problems at school. Those things are very difficult for teachers to detect early on, because you teach mathematics or history, but you do not see that student's experience as a whole. Well, that is where technology has become incredibly powerful. If you can link up those data, you get a much better picture of students' experiences. And those early warning systems have become highly accurate. They can predict dropout, sometimes many years ahead, and then you can intervene. That is when the student needs your help, not once they have dropped out; by then it is very late and very hard to correct the course. But if you catch it three or four years earlier, the learning deficits, the social difficulties, where the student needs welfare support or better learning opportunities or special tutoring, that is where you can address it. So these early warning systems using artificial intelligence have become really good and very powerful.
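As an illustration only, here is a minimal sketch, in Python, of how an early-warning score could combine linked indicators into a single risk estimate. The indicators, weights and threshold are invented for the example; a real system would learn them from historical data and would be far richer than this.

```python
# A minimal sketch of an early-warning risk score built from linked indicators.
# Indicators, weights and threshold are illustrative assumptions; a real system
# would estimate them from historical student data rather than hard-coding them.

from math import exp

def dropout_risk(absence_rate: float,       # share of lessons missed, 0..1
                 grade_trend: float,         # change in average grade, e.g. -1.5
                 welfare_referrals: int,     # referrals to student welfare this year
                 years_behind: float) -> float:
    """Return a 0..1 risk estimate via a simple logistic combination."""
    score = (3.0 * absence_rate
             - 0.8 * grade_trend
             + 0.5 * welfare_referrals
             + 1.0 * years_behind
             - 2.0)                          # intercept keeps low-signal students low
    return 1 / (1 + exp(-score))

def flag_for_support(risk: float, threshold: float = 0.6) -> bool:
    """Flag the student for early intervention, years before any dropout."""
    return risk >= threshold

# Example: high absence, falling grades, one welfare referral, one year behind.
risk = dropout_risk(absence_rate=0.25, grade_trend=-1.0,
                    welfare_referrals=1, years_behind=1.0)
print(f"risk={risk:.2f}, flag={flag_for_support(risk)}")
```

The essential point is the linking of data sources: attendance, grades and welfare information seen together, early enough for someone to act on the flag.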
And then, assessment. That is a field very close to my heart; I have spent a big part of my life designing modern assessments for students. I would say that one of the biggest mistakes we made in the history of education over the last 300 or 400 years was to divorce learning from assessment. If you go back to the origins of learning, all learning was about apprenticeship. You always learned with people, and they gave you feedback; learning and assessment were completely integrated, and nobody gave you a test. Then we industrialized teaching. We had a lot of people to educate, in a mechanical way, so we made them pile up lots and lots of learning, and then at the end, many years later, we said: come back and tell me everything you learned, in a very contrived, constrained, artificial setting. We call that an exam.

That divorce of learning from assessment has had many negative side effects. It has made learning very superficial. It has focused us on memorization, on what you need to remember for a test. It has often made teaching very shallow. Well, with technology we can bring the world of learning and the world of assessment back together: while you study, you get appraisal and feedback. And I can tell you something: in 2025, the next PISA assessment will have a component that does exactly that. We will not give students a test; we will give them a true learning experience in which they study. One of the things you are never allowed to do in a test at school is look at your neighbor, or look things up on Google or in a book; people will say that is cheating. Well, in the 2025 PISA assessment you are welcome to do that. In fact, you are going to get a digital tutor in your assessment, because we do not want to test whether students can retrieve information. We want to see whether they can use, apply and extrapolate from what they know. So we will design assessment tasks that give students amazing tools to solve problems, and we will study how they approach those problems: what their learning strategies and problem-solving strategies are, and how they advance. Integrating the world of learning and the world of assessment is one of the biggest promises I see in technology these days.
It will revolutionize the way we look at student learning experiences.

So, in conclusion: you have seen what smart technologies can offer and how they can make learning a lot more effective. They can also enhance equity. Here it is not so clear how things will play out. There is a great risk that technology becomes the great divider: some schools, teachers, regions and countries are very quick to capitalize on those opportunities, while other teachers and schools are not so ready, and then the gap becomes wider. That is one of the risks of technology, along with not even having access to the devices. Those are real risks, but I do believe the promise of technology, by making learning more personal, more interactive, more equitable, is that it can actually help us close some of the gaps. The future is difficult to predict here; we have seen both effects, equity-enhancing and dividing. And then, last but not least, there is simply making learning more efficient, offering you more with less.

But that is the promise. I have talked a lot about what we could be doing, but I do not want to hide part of the reality. When we looked at learning outcomes in our last PISA assessment, we saw that technology intensity was often negatively related to learning outcomes. Students who used more technology in the classroom actually had lower levels of digital literacy, of problem-solving skills, of many of the skills we value. Some people are quick to say: you see, I told you, technology does more harm than good. But it is probably much more a question of how we make use of technology. The best technology is present but not visible. It does not detract and distract from the students' work; it sits in the background, giving them additional insights and additional information. It gives teachers more eyes and ears to see the reality in the classroom. Remember that example from Shanghai.
The students were still drawing the characters in very traditional ways, while the technology was everywhere to help them see where they were advancing and where they were struggling, and to help the teachers see how different students learn differently. So again, technology is not necessarily about students typing on a tablet or a laptop. It is really about understanding how different students learn differently. We should take that picture to heart. It is the right use, the intelligent use, the smart use of technology that we need to foster, and we need to avoid technology simply stealing time or making learning more scripted, more superficial, more reactive. You want to strengthen creativity through technology, not push students toward answers through very mechanical question types. That, I think, is very important.

So what can we do? It has a lot to do with what we do in public policy. We need to govern the design of smart data systems by ethics. Artificial intelligence is not a magic power. Artificial intelligence is just an amazing amplifier and an incredible accelerator. But it amplifies good teaching practice and good ideas in the same way it amplifies bad teaching practice and bad ideas. Artificial intelligence can help you understand how your different students learn differently and embrace that diversity with differentiated pedagogical practice; that is the power. But it can also amplify stereotypes. You may end up judging your students on the basis of the experience of other students from the past, projected onto them, rather than seeing the unique potential every learner brings to you; that is the risk. So again, that is why artificial intelligence does not replace teaching. It can only amplify and accelerate really good teaching practices. But we need to be aware of this, and that is why the ethical dimension is very important.
If you do not understand the nature of an algorithm, you will very quickly become the slave of that algorithm. That is the bottom line.

We need to improve learning by moving away from a culture of public versus private. That is the past: public schools and private schools, public experiences and public textbooks, private laptops and so on. We need to bring those worlds closer together, to bring the public and the private world into the service of the public good of education. Technology is a great example here. Many technologies are not designed with an educational purpose in mind. Why? Because they are not designed by educators, by teachers. That is one of the things we can learn from countries like Denmark and Estonia, where teachers are the designers of those technologies. They are developed by companies, but at the heart of the design are people who are in the classroom, who know how different students learn. I think that is a really important factor: it is not just about buying nice products, it is about teachers becoming data scientists, becoming designers of innovative learning environments, and contributing to the development of new technologies, the R&D function. In the medical sector, the medical profession spends about 17 times as much money and time on innovation as we do in education. We need to change that. Teachers should not just be transmitting what we know; they should have much more time and resources to design those kinds of innovative learning environments.

Then there is the integration of pedagogical approaches. Technology, again, is a tool. Technology is not the end; it is the pedagogy that is the end. We also need to do a lot better at strengthening compatibility and interoperability in technology.
As a teacher, you know that sometimes you have to work with five, six, seven, eight, ten different software solutions every day. They are not compatible, they cannot share their data, and that is not good for anyone. So we need to move towards closer integration, without that creating a very monopolistic environment. One of the amazing things in countries like Korea, Singapore and Estonia is that they have a standardized platform throughout the system, and everybody can innovate and build on it. Even students can build their own learning technologies, and groups of teachers can, because there is a standardized architecture that allows for that kind of innovation. I think that is really important: interoperability and compatibility. And then, again, paying good attention to the learning activity rather than the learning technology. I must also say that many of our children would not voluntarily play with the kind of software they are sometimes given by technology companies, because it is not designed around learning needs. So there is a lot of work for all of us to do.

But I want to conclude by saying that the future is here. The future is not something we need to wait another few years for. We have the possibilities, with the advent of artificial intelligence, of learning analytics, of the power of big data, to really transform the way students learn, teachers teach, and school systems operate. And it will certainly transform the role of educators, of teachers. Once again, traditional knowledge transmission will probably become less central to teaching, because that is where technology is very good. But as a teacher, you simply need to be a very good coach, a very good mentor; you need to have a deep understanding of who your students are, who they want to become, and how you can help them on their journey. Thank you very much.