When a student eagerly comes up to the teacher and asks if they can use a new digital learning tool they have found, a dilemma often arises. The teacher sees the pedagogical potential and the student’s engagement, but uncertainty follows close behind. Is this tool safe? Will the students’ personal data be sold? Does it work for students with special needs?
In many classrooms, it has become an unofficial task for the teacher to guess whether a digital learning tool is safe to use. But the truth is that a teacher neither should nor can be left with this responsibility alone. It is the school and the municipality that carry the legal responsibility for ensuring that the digital environment is safe, lawful and accessible for everyone.
A teacher is trained to deliver good teaching, create a sense of mastery and see each individual student. That is the core task. Expecting the same teacher to also read through long terms of use, assess technical information security or check whether an AI tool complies with European privacy regulations is not reasonable.
When the responsibility for these assessments ends up with the individual teacher, it creates unnecessary uncertainty. Some may choose not to use useful technology because they are afraid of making mistakes, while others take chances without knowing what kind of risk they are exposing students to. To build a good digital school, we need to distinguish between the pedagogical choice and the technical and legal approval. The teacher should know which digital learning tools are available, while the school should ensure that everything in the shared toolbox has been properly assessed and is safe to use.
It is easy to understand why responsibility for digital assessments often ends up with the individual teacher. Technological development in schools has accelerated rapidly, and it is demanding for any municipality or school management to keep up with hundreds of new digital tools and services. When resources and time are limited, it is natural that some tasks occasionally fall through the cracks. This creates a situation where both teachers and leaders are left with a sense of uncertainty.
For digital learning to work in practice, schools need a system that makes it easy to collect and share this information. It does little good if the IT department has carried out thorough assessments if the results end up in a drawer or in a complicated spreadsheet that teachers cannot access. This is where Edudata comes in as a practical tool to create a simple, shared overview.
Instead of teachers having to spend time sending emails or waiting for answers from school leaders every time a question comes up, Edudata provides access to a library of more than 5,000 pre-analysed digital learning tools. Each entry details the tool, with assessments based on legal analyses developed by CIPP/E-certified privacy advisors. You will also get ready-made question sets for assessing risks related to privacy, security, artificial intelligence and accessibility. This means that the school does not have to start from scratch with every single assessment. Some teachers may have an extended responsibility to report needs for new solutions on behalf of their grade level, while all other teachers can simply open the overview and see what has been approved and how the tools are to be used. When the information is open and up to date, it becomes both easier and clearer to identify which digital learning tools are actually safe for students. The teacher can quickly check whether the tool the student asked about is on the list, and when the answer is a clear “yes”, teaching can continue.
When a digital learning tool appears as approved in Edudata, it means that four important pillars are in place. All four are required before a tool can be called “school ready”:
Privacy and GDPR are the foundation. It is about making sure that children’s names, faces and work do not end up in the wrong hands or get used for commercial purposes.
Security must also be in place so that unauthorised persons cannot gain access to the classroom digitally. The tool must handle login and data storage in a responsible manner, so that students’ information is not exposed to unauthorised parties.
In addition, artificial intelligence (AI) has become a natural part of many tools. Here it is important to assess how the technology affects students. Is their data being used to train new models? Are the generated results reliable? A thorough assessment of AI ensures that the technology supports learning without the school losing control of the information.
Finally, there is universal design. A digital learning tool is not fully assessed until we are certain that it works for everyone, meeting the WCAG 2.1 Level AA standard required by the EU Web Accessibility Directive. If a tool is fantastic for nine students but unusable for the tenth due to poor contrast or missing captions, it is not approved for school use. Inclusion is not optional, and in Edudata, verifying accessibility through documentation such as a VPAT is a natural part of the checklist.
When the framework is clear and the responsibility is placed where it belongs, teachers get room to be creative. It is about building a culture where new technology can be taken into use without teachers first having to spend time on legal and technical questions.
When the school uses a system like Edudata to keep track of its digital learning tools, the path from a good idea to practical use in the classroom becomes much shorter. This gives the teacher the freedom to focus on the students, while the system takes care of security.
Does your school need a better overview of which digital learning tools are safe to use? Edudata has already assessed more than 5,000 tools based on privacy, security, artificial intelligence and accessibility. This makes it easier for the municipality to give teachers the answers they need. Read more about how you can create a safer digital school here.