
About the role of Learning Analytics in the COVID-19 pandemic


2020 has seen an increase in digital platform usage across all areas of knowledge, and English learning has been no exception. The reasons for this change are plain to see, but many do not yet grasp that these changes are here to stay and will form a new education paradigm that will impact teachers, students, and parents.

What is Learning Analytics?

The term Learning Analytics, already familiar to many, is central to this article. “Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (definition taken from the 1st International Conference on Learning Analytics and Knowledge, 2011). The conference was indeed held in 2011, but the concept has its origins in the late 1980s and early 1990s. Why is this relevant? Because the spread of the personal computer in those years opened the door to using computers for teaching, with devices that had previously been confined to the corporate world.

At first, the idea of gathering information about students’ performance was precarious and remained mainly at a conceptual level. At that time, the real challenge was the cost of storing and transferring the data. To give readers an idea of the problem, internet penetration in the 1990s was low (few people had access to the internet), and transferring large amounts of data was slow and costly, even from a school’s or university’s media lab. Likewise, storage on a local server was expensive. Because of these obstacles, information about a student’s process, when it was gathered at all, was limited to the number of attempts, the results, and the time devoted to the activity.

The start of the millennium brought an explosive increase in internet penetration; the internet quickly became part of every home, and education began adopting standards to gather more information across different categories. Since then, the supply of digital products has skyrocketed.

Despite this growth, it has been difficult to determine statistically and accurately which learning product is better than another, or which one yields better results. Internet-based learning platforms are often judged by interactivity (how much the student interacts with them) and animation (how animated the platform’s graphics are), two aspects as flashy as they are irrelevant when evaluating a platform against the goal for which it was created: helping the student learn better.

This is precisely where the relevance of Learning Analytics lies. I like to analyze it from four perspectives, or dimensions if you will.

Why is Learning Analytics so relevant?

A few salient aspects can help us understand the relevance of Learning Analytics in the context of education and how it can improve students’ educational outcomes:

1. Usage, appropriation and digital vitality. As I said before, the range of educational apps and systems on offer is wide, varied, and present on every device. Yet it is seldom used frequently by students and teachers.

I see this often with my three-year-old child: I have more than eighteen educational apps installed on my devices. My daughter has seen them and, after a brief tour, uses only four. The one she uses the most (Pupitre, by Santillana) has not managed to keep her engaged for more than ten minutes. People would say ten minutes is a lot for a kid her age, but when we look at the average time an older student spends on a learning platform, we notice that it is about thirteen minutes. Not so different, then. Of course, that was before COVID-19. Now a full session is longer, and students use the platforms more often during the day.

Have these platforms been designed for these attention spans? What would it take for the platforms to help students reach their full potential? Are platforms being underused? How do we unleash the full educational force with which platforms were created? If educational platforms are used more often now that kids are at home, does this mean a platform needs a human factor to ensure its usage? And, if so, shouldn’t the platform fix its own problem? In other words, shouldn’t it be self-reliant? As of today, I don’t have a specific answer to any of these questions, but Learning Analytics can help us solve this and other paradoxes.

And this concerns only the students. Historically, many busy teachers (a redundancy if there ever was one) saw these digital learning platforms as “extra work”, and they were not wrong: in most cases, the information the platforms provided was insufficient, so the teacher could not decide whether a given learning objective called for an individual intervention or group reteaching, and teachers also found themselves acting as the first line of technical support. This leads to a clear conclusion: if the teacher is not using the platform (whatever the reason), it is highly likely that the students are not using it either.

Confinement during the pandemic made teachers dust off barely used platform subscriptions and licenses to manage their teaching. Education companies saw platform usage grow exponentially, and this unexpected surge strained the platforms’ digital infrastructure. Despite the unfortunate circumstances, what happened has uncovered a thrilling reality: for the first time in years, educators and learners began using technology intensively.

And for the first time, when it comes to statistical analysis, we have a potentially significant sample with which to measure the real impact of digital products on education. I am convinced that technology is an excellent ally of education and a powerful tool to enrich a teacher’s work, but only if it delivers relevant information in sufficient quantities.

2. Context. With current technology, it is possible to gather a great deal of information about how a student’s learning and performance are taking place. We now know when students connect, from which devices, and whether they use the same device every time. We also know their screen resolution, operating system, hours on the platform, connectivity, and so on.
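
To make this concrete, here is a minimal sketch, in Python, of the kind of context record a platform might log per session; every field name here is an illustrative assumption, not taken from any particular product:

from dataclasses import dataclass
from datetime import datetime

@dataclass
class SessionContext:
    # Illustrative per-session context record; all field names are hypothetical.
    student_id: str
    connected_at: datetime      # when the student connects
    device_type: str            # "tablet", "laptop", "phone", ...
    same_device_as_last: bool   # does the student always use the same device?
    screen_resolution: str      # e.g. "1366x768"
    operating_system: str       # e.g. "Android 10"
    minutes_on_platform: float  # time spent in this session
    avg_latency_ms: float       # rough proxy for connectivity quality

# Example: a short session from a shared tablet on a slow connection.
session = SessionContext(
    student_id="s-042",
    connected_at=datetime(2020, 9, 14, 16, 30),
    device_type="tablet",
    same_device_as_last=False,
    screen_resolution="1280x800",
    operating_system="Android 9",
    minutes_on_platform=13.0,
    avg_latency_ms=850.0,
)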

This data allows us to infer a lot about students. Years ago, some said that internet access would be the great equalizer, that it would eliminate socioeconomic stratification and create some sort of global democratization. I always disagreed with this prediction; to see why, think of a slow internet connection and the frustration it produces. If anything, internet connectivity conditions have widened the socioeconomic gap. As Will Ferrell once tweeted, “Before you marry a person, you should first make them use a computer with slow Internet to see who they are.” This applies not only to newly engaged couples but also to students (do not underestimate comedy’s ability to portray human reality).


If a student interacts with a slow platform, it does not matter whether the platform, the internet connection, or the device is at fault: it has an emotional impact on the student. Could this in turn influence the student’s learning process? I would answer with a resolute yes: emotion is a major catalyst of, or obstacle to, learning.

To take this into account, some platforms have optimized their content to measure aspects such as mouse movement or the pressure applied on the screen when selecting an answer. All of this adds to the environmental data we have about the student, how they are learning, and under what conditions. Ultimately, it is relevant data that can support the optimization of digital platforms.
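
As a sketch of what such fine-grained signals could look like, assuming a hypothetical event stream (none of these field names come from a real platform), one might log interaction events like this:

from dataclasses import dataclass

@dataclass
class InteractionEvent:
    # Hypothetical fine-grained interaction signal attached to one answer.
    item_id: str
    answer_selected: str
    time_to_answer_s: float   # seconds between showing the item and the tap/click
    pointer_distance_px: int  # total mouse/finger travel before selecting
    touch_pressure: float     # 0.0 to 1.0, where the hardware reports it

events = [
    InteractionEvent("item-07", "B", 4.2, 310, 0.35),
    InteractionEvent("item-08", "A", 21.8, 1450, 0.80),  # long hesitation, heavy press
]

# A slow, wandering answer may signal doubt or frustration rather than knowledge.
hesitant = [e.item_id for e in events if e.time_to_answer_s > 15 or e.pointer_distance_px > 1000]
print(hesitant)  # ['item-08']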

3. Performance. Possibly the easiest dimension to discuss. It is about statistics on a student’s correct answers per learning goal or skill being learned. There is plenty of information on the subject, because this was the first data to be collected, back in the 1980s. Having said that, collecting data does not guarantee learning per se; it only gives everyone a snapshot of where the student is in a time-bound process. So it is crucial that teachers use the data to assess their own teaching procedures and make the necessary adjustments for further intervention or reteaching. This is the step that marks the transition from assessment of learning to assessment for learning. There is much to do in this area, but the data is there not just to be collected but to be studied and used in a variety of ways. Using data to figure out how to give students tools that support understanding, and designing adaptive practice tools and incorporating them into the digital content we create, will also have an immediate, and undoubtedly measurable, impact on student performance.
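
A minimal sketch of what that move toward assessment for learning can look like in code, assuming a simple list of attempts tagged by learning objective (the objective names and the reteaching threshold are illustrative):

from collections import defaultdict

# (learning_objective, answered_correctly) pairs collected by the platform.
attempts = [
    ("past_simple", True), ("past_simple", False), ("past_simple", False),
    ("present_perfect", True), ("present_perfect", True), ("present_perfect", False),
]

def accuracy_by_objective(attempts):
    # Aggregate correct-answer rates per learning objective.
    totals = defaultdict(lambda: [0, 0])  # objective -> [correct, total]
    for objective, correct in attempts:
        totals[objective][0] += int(correct)
        totals[objective][1] += 1
    return {obj: correct / total for obj, (correct, total) in totals.items()}

for objective, rate in accuracy_by_objective(attempts).items():
    if rate < 0.6:  # illustrative threshold for group reteaching
        print(f"Reteach '{objective}' to the group (accuracy {rate:.0%})")
    else:
        print(f"'{objective}' looks solid; consider individual follow-up only ({rate:.0%})")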

4. Learning and language learning evaluation. My favorite dimension, because it is where the magic happens. It has been demonstrated that the quality of our language limits the quality of our thoughts, and the devices we use alter our language.

Some years ago, I had a conversation with two language experts. We were discussing whether it was correct to leave a space between a word and a question mark (“Ask?” versus “Ask ?”). When we write with a pencil or a standard keyboard, we usually do not leave a space before the question mark. However, on tablets and cellphones with the autocomplete feature enabled, this space is added automatically. Because of that, many students answering a test item left that space and were marked wrong. It may seem a minor detail, but when a student takes a test online, the results matter more than we think: college admission may depend on an online test result, for example.
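
A minimal sketch of how a scoring routine could tolerate that device-introduced space, normalizing answers before comparison (the function name and test strings are illustrative, not from any real scoring engine):

import re

def normalize_answer(text: str) -> str:
    # Trim the answer and remove spaces inserted before closing punctuation,
    # so that "Ask ?" and "Ask?" are treated as the same response.
    return re.sub(r"\s+([?!.,;:])", r"\1", text.strip())

def score(student_answer: str, expected: str) -> bool:
    return normalize_answer(student_answer) == normalize_answer(expected)

print(score("Ask ?", "Ask?"))  # True: the autocomplete space no longer costs the point
print(score("Asks?", "Ask?"))  # False: genuine differences are still penalized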

The outcome of the debate about how content design should accommodate the language features associated with different types of devices is not intrinsically important, but the possibility of high student interactivity with the content gives us a broader scenario for improving the content we offer, and that should be reflected in higher-quality interactions between students and digital content. This scenario challenges those of us involved in creating the platforms, but more importantly it challenges those who create the content and must update it for new paradigms, so that technology-infused user experiences move closer to the human component of the equation. This is where the magic I mentioned before takes place.

Conclusion

The COVID-19 pandemic, terrible as it undoubtedly is, has done more for the Learning Analytics field than the advent of the personal computer did in the 1980s. Access to platforms has grown exponentially, and for some users this was even their first time using a product they had already paid for. The impact of this is by no means minor: for someone to perceive the value of a digital resource, they must use it.

As a result, we have now collected an enormous amount of data about students’ learning, their performance, and their context. But what is coming is even more relevant, and it is connected to a myriad of open questions: How will data be used to enhance current systems and make them truly the teacher’s right hand in keeping students engaged and motivated? How can we help teachers move from seeing technology as time-consuming to seeing it as a time-saving tool for providing just-in-time support to every learner? How will we allay fears about the potential misuse of data, at a time when the behemoths of technology and social networking are being questioned about their motives and their means of using their users’ data? After the Cambridge Analytica scandal, this should surprise no one, and sticking to well-recognized companies and brands should be urgent advice for anyone engaging in extended use of technology, educational or recreational. We need to provide ever more refined frameworks for assessing digital products, to help teachers and parents look objectively at platforms not only in terms of student engagement but also in terms of ease of use and impact on learning. The debate is open, the subject fascinating (at least for me), and the room for analysis is endless. RM


Gonzálo Pastor

Electrical engineer and specialist in Computer Science. His work has focused on Rapid Prototyping and product life cycle, as well as on the implementation of innovation processes.
