In your User Defined Filter, click the Advanced button and use a regular expression in the URL patterns field. The pattern (newstudyhallx1) should suffice, I believe.
Useful page for using regex: regexpal.com
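To illustrate the idea, here is a minimal sketch (in Python, with hypothetical example URLs) of what a keyword-style regex match against a URL looks like. The pattern is the one suggested above; the URLs are made-up stand-ins for what a filter might see:

```python
import re

# Hypothetical URLs a filter might be asked to evaluate.
urls = [
    "http://sites.google.com/site/newstudyhallx1/home",
    "http://sites.google.com/site/some-other-site/home",
]

# The keyword pattern suggested above: block any URL containing it.
pattern = re.compile(r"newstudyhallx1")

# Keep only the URLs the pattern matches (i.e. the ones to block).
blocked = [u for u in urls if pattern.search(u)]
print(blocked)
```

Note that `search` (match anywhere in the string) rather than `match` (anchored at the start) is what gives the "keyword anywhere in the URL" behaviour.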
I'm trying to block a site within sites.google.com (sites.google.com/site/newstudyhallx1), but it's just not working. Is there any way to filter URLs by keyword, as MS ISA does? It feels like this should already be in Smoothwall, but I can't find it. I'm looking for a way to enter, say, 'newstudyhallx1' as a keyword so that any URL containing it is automatically blocked.
Thanks for any advice!
zachariah (14th May 2013)
Caveat: URL filtering is NOT effective against SSL requests unless you are MITM inspecting (this is true for all filter products), as you cannot see the URL path without decrypting the traffic.
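A quick sketch of why this is so, using simplified (assumed) request forms: for plain HTTP a proxy sees the full URL, but for HTTPS it only sees a CONNECT to host:port, so the path containing the keyword never appears:

```python
# Simplified stand-ins for what a non-decrypting proxy observes.
http_request = "GET http://sites.google.com/site/newstudyhallx1/home HTTP/1.1"
https_request = "CONNECT sites.google.com:443 HTTP/1.1"  # path is never sent in clear

keyword = "newstudyhallx1"
print(keyword in http_request)   # True  -- the filter can match and block
print(keyword in https_request)  # False -- the keyword is invisible to the filter
```

This is why a hostname-only block (sites.google.com) still works over SSL, but a path-based keyword block does not.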
Just an update to say this is still not working, and URL filtering on Smoothwall appears to be ineffective.
It certainly does work - but there are two reasons it wouldn't apply:
1. The site is using SSL encryption and you aren't MITM inspecting it, so the filter has no access to the URL.
2. There's another rule above the one you're writing which applies an allow or whitelist action to the site in question.
zachariah (9th October 2013)
Thanks - No. 1 is probably it, though with 'sites.' being a subdomain of Google, we are loath to put any filtering on it that might cause a slowdown. Any workarounds for this?
Since 'sites.' is its own subdomain, you can differentiate it from the rest of Google and MITM just that subdomain.
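The subdomain approach above can be sketched with an anchored hostname pattern (a hypothetical example, not Smoothwall's actual rule syntax): only connections whose hostname is exactly sites.google.com are selected for decryption, leaving other Google services untouched:

```python
import re

# Hypothetical selector: decrypt only the 'sites.' subdomain.
# The anchors and escaped dots ensure an exact hostname match.
mitm_host = re.compile(r"^sites\.google\.com$")

hosts = ["www.google.com", "mail.google.com", "sites.google.com"]
to_decrypt = [h for h in hosts if mitm_host.match(h)]
print(to_decrypt)
```

Once that subdomain is being decrypted, the keyword rule from earlier in the thread can see the full URL path again.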