Content filtering is always an interesting topic for discussion because it is so multifaceted. In an earlier post I listed the points below as part of the discussion. In my context, the framework is K-12 education.
- filter or not?
- if you do, how much?
- if you do, is it done centrally or at the school level?
- how do you align content filtering with educational resource selection processes for print, video etc.?
- block or allow social networking?
- keep students safe
- sufficient bandwidth
- how do you define ‘educational content’ in a way that makes sense in a K-12 context?
- should content filtering be more age or grade appropriate?
There are no easy answers. For each point, it is easy to find valid reasons to sit on either side of the fence. More importantly, there is a growing need to keep learning about what is right and what is doable, and to keep the agenda moving forward in a way that is appropriate for K-12 education. On further reflection, two aspects of content filtering have been churning around in my mind.
1. Copyright: Content filtering must respect copyright and your country's or jurisdiction's laws and regulations. This whole aspect of internet use is blurry in the global community. At face value, what you see in your browser looks much the same from any vantage point on the globe; the internet seems like ‘one place’. The reality is that the servers, and therefore the content, sit in different countries. What you are able to do with content (copy, download, redistribute, use in a school classroom setting, etc.) will likely vary depending on your regulations, and these need to be respected.
2. Equitable access: A couple of weeks ago, I met with @socmediatrust (Twitter) to discuss Digital Citizenship and his work presenting Internet safety sessions to students and parent groups at schools. At some point in the conversation, we landed on content filtering. As mentioned above, there are many approaches to dealing with this. The focus of our discussion was bullet #3 – filtering managed centrally or at the school level. This led to an interesting talk framed around consistency and equity of access.
Providing content filtering from a centrally run system gives equitable access to resources deemed suitable for all students, staff and sites within a system. To me, this makes the most sense. Decide what content is acceptable for a particular system through a fair mechanism that selects and aligns content with educational needs; then you can fine-tune strategically over time. I can see value in a ‘sliding scale’ effect for content filtering, so it is adjusted for age levels – perhaps tightly controlled for young students, medium access for grades six to eight, and more open access for high school. Validate readiness for each level with a strong Digital Citizenship program that teaches ethical, responsible, safe use and digital literacies.
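To make the ‘sliding scale’ idea concrete, here is a minimal sketch of how a centrally run system might express age-tiered filtering as one shared policy. The tier names, grade cut-offs, and content categories are all hypothetical illustrations, not taken from any real filtering product or from a specific district policy.

```python
# Hypothetical "sliding scale" policy: one central table maps grade bands
# to filtering tiers, so every school in the system applies the same rules.
# Tier names, grade boundaries, and category labels are illustrative only.

POLICY_TIERS = {
    "elementary": {"max_grade": 5,  "blocked": {"social", "games", "video", "adult"}},
    "middle":     {"max_grade": 8,  "blocked": {"social", "games", "adult"}},
    "high":       {"max_grade": 12, "blocked": {"adult"}},
}

def tier_for_grade(grade: int) -> str:
    """Return the filtering tier for a student's grade level (K = 0)."""
    for name, tier in POLICY_TIERS.items():  # dicts keep insertion order
        if grade <= tier["max_grade"]:
            return name
    raise ValueError(f"grade {grade} is outside the K-12 range")

def is_blocked(grade: int, category: str) -> bool:
    """Centrally applied rule: is this content category blocked for this grade?"""
    return category in POLICY_TIERS[tier_for_grade(grade)]["blocked"]
```

The point of the sketch is the design choice, not the details: because the tiers live in one central table, two schools serving the same grade always apply the same access rules, which is exactly the consistency the distributed model struggles to guarantee.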
Now, imagine a system where access is controlled at the school level. This could potentially be a dog's breakfast, so to speak. Two (or more) schools serving the same age group of students may be serving up completely different content and access to web tools. This raises many questions for me: inequity of access, a lack of consistent approach within a large system, inconsistent expectations and use by staff and students, and awkward conversations with parents when the rules (access) vary from site to site. As curriculum leaders, do school administrators bring their own ‘rules of access’ with them as they move from site to site over their careers? Hmmm.
My View: It seems to me, at least at this point in my thinking, that the distributed model leaves more questions than answers. I would cast my vote for a centrally run system that allows for the ‘sliding scale’ fine-tuning approach, well aligned with curriculum needs and resource selection processes.