We offer an optional internet content filtering service to our residential
and business customers using M86's appliance
(http://www.m86security.com/products/web_security/m86-web-filtering-reporting-suite.asp).
I've been in conversation with them since Q1 regarding IPv6 support, but the
update I received today was that IPv6 support won't be available until the
middle to late part of next year. That's not ideal, because the local college
is a significant user and they started with IPv6 this summer. College students
can easily bypass content filtering by using the IPv6 version of a site
(e.g. http://www.playboy.com.sixxs.org).
I'm wondering if anyone can point me to a similar appliance. I know that
Barracuda has such an appliance, but it has one limitation I don't like, and
Fortinet and Ironport have more expensive products.
It must be able to operate in pass-by/SPAN mode, not inline, and handle
traffic rates up to 1 Gbps. We currently move 400+ Mbps "by" it, and
internet usage only goes up.
Thanks in advance for any suggestions you have.
Kind regards,
Frank
Emmm.. if they can use that to circumvent your filter, don't you think
those same people will be able to find out about other proxy servers?
It's not like the internet isn't filled with them or anything.
Please note that you are fighting a lost cause: there are more locations
on the Internet that are annoying for the policy than you can list. Thus
one of the very few ways to make circumvention very hard is to only allow
approved sites, and by 'approve' I mean fetch the URL on a controlled
machine, scrub it, and pass it back. The moment somebody can reach a host
on the outside, send a few bits to it, and get an answer back, they are
outside the filter, whether you like it or not.
That said, there are loads of free HTTP proxies, anonymizers, and other
such tools, and most of them are not caught by your filtering toy anyway.
But indeed, it is a bad thing that they are unable to update their
little box to do IPv6; there really is not that much that is different there.
Greets,
Jeroen
(Who could actually block stuff on the above URL, but except for
silly people trying to run torrents over it, which does not work but
does hammer those boxes, nothing gets blocked [CP is the exception])
(Excuse me if I missed part of the email chain. This may have already been mentioned)
It could be a bit of an annoyance to configure, but one method you could use is to force a proxy internally.
I am a bit unsure why most don't do this already, but it has its flaws:
1) Lack of static/dynamic IPs
2) More work for tracking
3) Management of additional infrastructure
However, you could force a proxy and run it in-house, like Squid or some other type.
This would give you some advantages:
1) Content caching, increasing speeds for users while decreasing your overall bandwidth utilization.
2) Increased security by filtering out malware/viruses.
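For what it's worth, a forced in-house Squid along those lines needs only a few directives to get started. This is just a minimal sketch; the cache size, port, and the 10.0.0.0/8 client range are placeholders you'd replace with your own values:

```
# /etc/squid/squid.conf  (minimal sketch; adjust for your network)
http_port 3128                          # clients are forced to this proxy port
cache_dir ufs /var/spool/squid 10240 16 256   # 10 GB on-disk cache
acl localnet src 10.0.0.0/8             # placeholder: your customer ranges
http_access allow localnet
http_access deny all                    # deny everything not explicitly allowed
```

You would then either push the proxy via DHCP/WPAD or block outbound port 80 at the edge so clients have no choice but to use it.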
Does anybody have any real-world stats on what size local Squid/whatever cache
they're using and what % of bandwidth savings they're seeing? (Bonus points if
you've identified specific things it helps with, like Patch Tuesday or whatever.)
Might be interesting to see what impact a local Akamai node has on your own
caching savings too...
Jeroen:
Their filtering appliance also filters out free HTTP proxies and anonymizers, some because they're known, others because of signatures. It's not perfect, but it catches a lot more than you might think. We don't market it as a silver bullet; we let our customers know that this is not the be-all and end-all of content filtering, but something that catches the vast majority of accidental site visits. If someone wants to work around it they can run a VPN, but for 99.99% of the subscribers of this service, it's a lot better than nothing, or than running software on each PC (which doesn't help for Xbox, etc.).
If you have a URL you want me to try, let me know and I'll be able to tell you what the appliance thinks.
Regards,
Frank
I have seen 30-50% savings on some networks when Patch Tuesday hits. It's not achievable with a vanilla Squid, though, and needs some code magic.
With general traffic the savings tend to be around the 10-20% mark. Unfortunately, much of the stuff you really want to cache, like YouTube videos, is intentionally served with cookies and unique URLs that make it un-cacheable. This is done deliberately for copyright compliance and various other reasons.
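Part of the "code magic" for Patch Tuesday is just tuning Squid to hold the large, immutable update payloads longer than its defaults allow. A sketch of the relevant directives (the exact patterns and object-size limit are assumptions you'd tune for your own traffic):

```
# Allow large update payloads into the cache at all
maximum_object_size 512 MB

# Keep Windows Update files (immutable once published) for a long time,
# revalidating with If-Modified-Since instead of refetching on reload.
refresh_pattern -i windowsupdate\.com/.*\.(cab|exe|msi|psf) 43200 100% 129600 reload-into-ims
refresh_pattern -i microsoft\.com/.*\.(cab|exe|msi)         43200 100% 129600 reload-into-ims
```

That alone won't reach 30-50%; the bigger wins come from custom store-URL rewriting so that the same object fetched under different query strings hashes to one cache entry, which is where the real code work is.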