[AusNOG] Happy new year / New rules for age-restricted internet and mobile content after the 20th of January 2008
lathiat at bur.st
Wed Jan 2 11:15:58 EST 2008
On 02/01/2008, at 10:02 AM, Bevan Slattery wrote:
> Hi 'nogers.
> Sorry if I offend people with my contrary-to-popular views on the
> matter, but in the interest of providing discussion points, here we
> go :)
> Technically providing a clean feed isn't as hard as everyone makes it
> out to be. We know of numerous ISPs pushing serious amounts of data
> and intercepting traffic (e.g. HTTP, P2P) and pushing cached copies
> out. We know a number of P2P caches now have 10GbE interfaces, and I
> would have thought injecting P2P cache files would be a more
> challenging interception problem than stopping Johnny from going to
> whitehouse.com. I know such a solution would cost more, but that's
> another argument altogether. But fundamentally, arguing that
> intercepting large amounts of internet bandwidth (n x 10G) is too
> costly or technically impossible, and, probably more importantly,
> that it causes significant latency, is not entirely accurate and
> possibly misleading. I think the latency argument is simply setting
> yourself up for an industry expert to come and prove you wrong, and
> thus give the freshly minted Senator ammunition to label you as
> 'creating arguments to remove your responsibilities to protect
> Australia's children'.
> The fundamental issue here is that the Government is completely
> clueless about the problems it is creating. It is PRACTICALLY
> impossible to guarantee a clean feed. If you can't guarantee a clean
> feed, then you are providing parents with a false sense of security.
> Ultimately, parents will still have to install filtering software on
> their computers and supervise their kids.
> The Netalert filter system implemented by DCITA back in the 1999 days
> was an absolute disgrace. Most people don't know my history, but we
> (iSeek) licenced the N2H2 filtering system for Australia, New Zealand
> and parts of Asia. Our system is what filters about 50% of Australian
> schools today (well at least two years ago it was - don't know what
> happened since). We were part of the original approved filtering
> systems under Netalert.
> At the time (1999), our system had 5.5 million URLs in its list. As
> part of the approval to be on the list, we had to add the
> Government's list of sites (as updated from time to time) to our
> list of potentially offensive sites. Over 12 months and many
> millions of dollars, the Government list had approx 220 URLs. Yes,
> that's right, 220 compared to the then 9 million URLs we had. And
> all 220 of the Government's were already on our list. Probably
> worked out to about $50,000 per URL or more.
> N2H2 had a permanent team of reviewers being fed URLs that were
> referred by existing users (users could report sites) or found via
> an intelligent spider that had access to the direct search engine
> feeds it got from Inktomi and others. The reviewers were about 30
> people at a time working in 4-hour shifts. This was based in Seattle
> and included numerous multi-language students who could assist with
> the then-massive growth of pr0n coming from Asia (remember, this was
> 1999). So I'd say there were around 200+ reviewers on the books
> (casual, part-time and permanent).
> Like most of the new government Internet policies, this one is big
> on ideas, even bigger on hype, and will be huge on collateral
> damage. It will be an absolute failure, and I would feel that any
> ISP that is required to provide a 'clean feed' should be afforded
> indemnity by the Federal Government for the class action that will
> follow when parents sue for false and misleading advertising when
> they realize a clean feed is not actually a clean feed.
> It is the government's responsibility to provide the list and an
> indemnity from legal prosecution. Like everything else the
> government is proposing, we need to get them to push out the details
> on how it is going to be implemented and the framework associated
> with it. We should hit them with a list of questions in the absence
> of the framework.
Thanks for the rather insightful and interesting email.
I am personally of the opinion that a manual blacklist (or even a
semi-automated one) will absolutely never be a viable solution;
especially as we come into the age of ever more data on the internet,
it is basically impossible to sift through it all and provide a
high-quality filter.
IMHO the only reasonable solution would be some kind of intelligent
filter that did actual content analysis, and that would be very, very
resource intensive.
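To give a toy sense of what "content analysis" means here, a crude
keyword-weight scorer might look like the sketch below. The terms,
weights and threshold are purely illustrative assumptions, and even
this trivial version has to scan every byte of every page, which is
exactly why doing it in-line at 10GbE rates gets so expensive.

```python
import re

# Illustrative term weights and threshold -- assumptions for this
# sketch, not a real filter list.
TERM_WEIGHTS = {"xxx": 3, "adult": 1, "casino": 2}
THRESHOLD = 4  # pages scoring at or above this get flagged

def score_page(text: str) -> int:
    """Sum the weights of listed terms appearing in the page text."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(TERM_WEIGHTS.get(w, 0) for w in words)

def is_flagged(text: str) -> bool:
    """Flag a page whose score reaches the threshold."""
    return score_page(text) >= THRESHOLD
```

A real filter would need far richer signals (image analysis, link
context, language detection), which only makes the resource problem
worse.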
Perhaps Google could provide a little URL-checking service indicating
whether they thought a page was adult-safe or not, because they
already have a lot of crazy scalable infrastructure - but I'm waiting
for the lawsuits over mis-listing :)
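If such a rating service existed (to be clear, it is entirely
hypothetical), the ISP-side check could be as simple as a per-host
verdict lookup. In this sketch a local table stands in for the remote
service so it runs offline; the hostnames and verdict labels are made
up.

```python
# Entirely hypothetical service: the ratings table below stands in
# for the remote lookup, and the verdict labels are assumptions.
from urllib.parse import urlsplit

RATINGS = {
    "example.com": "safe",
    "adult.example": "adult",
}

def rate_url(url: str, ratings=RATINGS) -> str:
    """Return the service's verdict for a URL's host: 'safe',
    'adult', or 'unknown' for hosts never reviewed."""
    host = urlsplit(url).hostname or ""
    return ratings.get(host.lower(), "unknown")
```

The "unknown" case is where the mis-listing lawsuits would live: any
real service has to decide whether unreviewed hosts are blocked or
passed by default.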
Anyone else got some potentially more informed thoughts on the matter?