There has been a lot of attention around the Australian Government's mandatory ISP-level filtering proposal. Google--and many of you--have argued that the proposal goes too far, with an overly broad filter and a regime that takes the focus off more important areas such as online safety education and better support for policing efforts.

In December we expressed our concerns about the Government's filtering proposal on this blog. Today we join the Australian Library and Information Association (ALIA), which represents 12 million library users around Australia, Yahoo! and the Inspire Foundation in proposing some core principles for a Safer Internet. We also expand on our views in a submission to the Department of Broadband, Communications and the Digital Economy.

Here are the highlights from our submission:
It would block some important content. The scope of content to be filtered ("Refused Classification" or "RC") is very wide. The report Untangling The Net: The Scope of Content Caught By Mandatory Internet Filtering found that a broad range of content could be prohibited, including not just child pornography but also socially and politically controversial material. This raises genuine questions about restrictions on access to information, which is vital in a democracy.
It removes choices. The Government's proposal removes choices for parents as to what they and their children can access online. Moreover, a filter may give a false sense of security and create a climate of complacency, the assumption that someone else is managing your (or your children's) online experience.
It isn't effective in protecting kids. A large proportion of child sexual abuse content is not found on public websites, but in chat-rooms or peer-to-peer networks. The proposed filtering regime will not effectively protect children from this objectionable material.
Moreover, the filter appears unworkable for high-volume sites such as Wikipedia, YouTube, Facebook and Twitter, as its impact on Internet access speeds would be too great.

YouTube is a platform for free expression. We have clear policies about what is and is not allowed on the site. For example, we do not permit hate speech or sexually explicit material, and all videos uploaded must comply with our Community Guidelines. Like all law-abiding companies, YouTube complies with the laws in the countries in which we operate. When we receive a valid legal request, such as a court order, to remove content alleged to violate local laws, we first check that the request complies with the law, and we seek to narrow it if it is overly broad. Beyond these clearly defined parameters, we will not remove material from YouTube.

Our view is that online safety efforts should focus on user education, individual empowerment through technology tools (such as SafeSearch Lock and Safety Mode on YouTube), and cooperation between law enforcement and industry. We're partnering with some tremendous organisations in Australia towards this goal.