New standards from the Department for Education (DfE) for digital filtering and monitoring will mean a change of emphasis for academy trusts. Here’s what you need to know.
On 29 March 2023, the DfE published its new filtering and monitoring standards for academy trusts, schools and colleges in England.
Hopefully, your internet connections already have filtering and monitoring in place – in fact, the DfE says you should already be meeting its new standards. But there’s a clear change of emphasis in those standards, from passive to proactive. And because many academy trusts still treat filtering and monitoring as something to “set and forget”, it’s important to understand how to stay in the DfE’s good books.
This blog runs through four steps you can take to help your trust meet the new standards, in a way that’s appropriate and proportionate for you.
Four steps to filtering and monitoring compliance
Step 1: Ensure your lines of responsibility are clear
The DfE states you should “identify and assign roles and responsibilities to manage your filtering and monitoring systems”.
Ideally, this means assigning a senior academy leader and the trustee with safeguarding oversight to work together to see that standards are met and that all staff are trained to follow filtering and monitoring procedures. This pairing should also work closely with IT staff and designated safeguarding leads to:
- Procure, manage, and review your technology
- Respond to any safeguarding concerns
- Document and report decision making
What these roles look like in practice will depend on the size and type of your trust and the staff resources available (variables that the DfE acknowledges). Regardless, you must make clear to everyone who’s responsible – at every level – for procuring, managing, and reviewing your filtering and monitoring systems.
Step 2: Schedule regular filtering and monitoring reviews
The DfE says you should review your filtering and monitoring provision at least annually. Realistically, you want to make this a much more regular habit.
As well as ensuring your systems meet your academy’s safeguarding requirements, you’ll want to confirm everything’s working as it should – seeing what network activity is happening, and looking for patterns in the internet traffic you’ve blocked.
To get this right at a trust level, we highly recommend scheduling regular meetings – ideally including relevant member(s) of the senior leadership team, the IT service provider, and the trustee with oversight of safeguarding – to discuss and review:
- Whether your arrangements still meet your academy’s changing risk profile
- Any recent safeguarding reports flagged by filtering and monitoring
- What your system blocks and allows – and whether it’s still fit for purpose
- Whether students (or staff) have found new ways to circumvent the system
- New monitoring strategies to proactively prevent harmful exposure
Using cloud-based software to manage filtering and monitoring can really help here, by facilitating consistent reviews and informing decision making. You can gain a shared view of online activity and the granular insights needed to adapt filtering rules to new educational trends, identify at-risk individuals, and proactively protect at scale.
We also recommend that your safeguarding policies and procedures are clear on what relevant staff must do if they identify problematic network behaviour, and that you review staff awareness as part of your regular meetings.
Step 3: Use adaptable filtering to match restrictions to the individual
It can be easy to get overzealous with your filtering and monitoring and have your system block anything that could be considered – or associated with – harmful content. But your students and staff will sometimes need to access learning materials that might otherwise be regarded as mature, controversial, or offensive to some audiences.
The DfE states that your filtering system should block harmful and inappropriate content without negatively impacting teaching and learning. And as the government’s Keeping Children Safe in Education (KCSiE) guidance makes clear, harm can take the form of content, contact, conduct, and commerce.
To achieve the ideal balance of protection and accessibility, we recommend using an adaptable safeguarding platform with personalised access for different users. This lets you relax or tighten rules based on school year, class, key stage, and time. You may also want to involve curriculum leads in your system reviews on an ad hoc basis to ensure that the filter accounts for the latest curriculum themes and content.
Your platform must also be able to identify user intent and catch attempts to bypass filtering through misspellings, abbreviations, and VPNs. Plus, as part of your regular meetings, check that it remains integrated with the latest threat lists – such as those from the Internet Watch Foundation, the UK Safer Internet Centre, and the Counter-Terrorism Internet Referral Unit (CTIRU) – and that it keeps pace with emerging trends in extremist content that may affect student mental health or wellbeing.
Step 4: Make it easy for staff to spot concerns – and respond appropriately
In addition to filtering, the DfE says you should have effective monitoring strategies that meet your school or college's safeguarding needs.
There are a variety of tools and approaches that can help. While physically supervising students as they work online is an option in some circumstances, a digital solution might make it easier to monitor behaviour remotely, in the right place at the right time, and give teachers a context-ready way to capture, report, and escalate observed incidents quickly. It’s a question of what’s right for your context.
Whichever approaches you use, your staff must understand what they’re looking for when monitoring online activity, and how to respond appropriately. That means building digital monitoring into your safeguarding procedures and ensuring staff are properly trained.
Training responsibility will depend on each school’s or trust’s resources, but we recommend tying this into your system reviews to help keep everyone updated on monitoring best practices.
KCSiE updates its filtering and monitoring guidance
The UK government has made some light updates to its Keeping Children Safe in Education (KCSiE) guidance for 2023, including a reference to the DfE’s filtering and monitoring standards.
KCSiE states that your commitment to filtering and monitoring should be reflected in your child protection policy, and that you must consider the number and age range of your students – and which of them are at greater risk of harm. It also updates the Designated Safeguarding Lead (DSL) role description to say that the DSL’s responsibilities include understanding the filtering and monitoring systems and processes in place.
Need more support meeting this standard? RM can help.
We have protected more than one million children with our cloud-based web content filtering platform for schools and trusts: RM SafetyNet.
Automatic updates, simple user-based controls, and full KCSiE compliance deliver total peace of mind. And because its cloud delivery removes the need to perform manual maintenance and upgrades, you can spend more time building a safe and productive learning environment.
By default, we block sites included on the Internet Watch Foundation, Home Office, and counter-terrorism lists, along with other sources of extremist content flagged by security intelligence. RM SafetyNet is also adaptable and highly scalable, so you can deploy and personalise filters on all network devices when and where you most need them. We can also work with proven partners to meet your content monitoring needs.
Please get in touch to learn more about how RM can protect your staff and students online.