We use Squid here, and needed a way to view which sites were being loaded from the cache, and which weren't and were therefore downloaded afresh.
Is there any way to view this info? And to compare cached/new sites?
I know this can be done but I'm not sure how. The obvious one is that some sites are told not to use the proxy (defined in your browser/OS settings) so are not cached, but some are sent through the proxy but not cached there. It may help to post a note on the Squid-users mailing list; Amos Jeffries is usually a fountain of all knowledge when it comes to Squid.
Would webalizer or some other log analyser provide what you want? Then based on your findings you could create an ACL to deny caching of a specified list of sites.
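Before reaching for a full analyser, you can get a quick hit/miss breakdown straight from the access log. A minimal sketch, assuming Squid's default native log format (field 4 is the result code like TCP_HIT/200, field 7 is the URL) and the usual log path; adjust for your install:

```shell
# summarize_cache: print per-URL HIT/MISS counts from a Squid access.log
# passed as the first argument. Assumes the default native log format.
summarize_cache() {
    awk '{
        split($4, code, "/")            # e.g. TCP_HIT/200 -> TCP_HIT
        if (code[1] ~ /HIT/) hits[$7]++; else misses[$7]++
    }
    END {
        for (u in hits)   printf "HIT  %d %s\n", hits[u],   u
        for (u in misses) printf "MISS %d %s\n", misses[u], u
    }' "$1"
}

# Typical use (path is the common default; adjust for your distro):
# summarize_cache /var/log/squid/access.log
```

Note this counts TCP_MEM_HIT, TCP_IMS_HIT and friends as hits too, since it just matches "HIT" in the result code.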
Basically, the reason I ask is that our website is now our homepage, and bandwidth is going through the roof because it loads for every user when they open IE.
I wanted to make certain our website was being served from the cache rather than downloading a fresh copy each time, thus reducing bandwidth.
I see. Is your site hosted externally?
The caching features aren't something I ever properly got my head round tbh, wish I could be more help. I only ever managed to set a list of sites for squid to not cache. My primary use of squid is for the filtering.
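For reference, the "list of sites for squid to not cache" bit is only a couple of lines in squid.conf. A sketch, with made-up domain names and file-free syntax (note that on Squid 2.x the directive is no_cache deny rather than cache deny):

```
# Never cache replies for the listed domains
acl uncachable dstdomain .example.com .example.org
cache deny uncachable
```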
The command below will give you some of the cache manager info; it will tell you a little bit about what's going on.
You need squidclient installed for it; you can apt-get squidclient too if you're on Debian!
Code:
squidclient mgr:info
Squid: The Definitive Guide - Pretty good manual for squid there if you have the time to trawl through it.
If your website generates dynamic content, for example if it's written in PHP, then it will typically send a no-cache Cache-Control header all the time, and additionally set pages to expire in the past. Thus Squid will not cache the site at all. Hence your bandwidth issue.
Although slightly evil, you can override this behaviour. The magic Squid configuration directive is 'refresh_pattern'.
squid : refresh_pattern configuration directive
I would need to know what version of Squid you are running to tell you exactly what to use, as there were a lot of improvements made to the refresh_pattern configuration directive between 2.STABLE and 3.1!
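As a rough illustration only (the hostname, lifetimes and option set here are assumptions; check them against the refresh_pattern documentation for your version), forcing a site to be treated as cacheable despite its headers looks something like:

```
# Treat pages from the site as fresh for up to 60 minutes, overriding
# the Expires and Cache-Control headers it sends.
# ignore-no-cache existed in 2.6/3.0 but was dropped in later 3.x releases.
refresh_pattern -i ^http://www\.example\.com/ 60 100% 60 override-expire ignore-no-cache ignore-private
```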
Different topic but still related to Squid....
Our net is whitelisted for pupils, so we already had an ACL for 'allowedsites' and I've just added an ACL for 'deniedsites'.
We have mtv.com in our allowed sites, but when I added this site to denied sites it blocked the site - ideal. This is what I wanted it to do... kind of.
mtv.com is in our allowed sites, but I want to block mtv.com/games so all of MTV is accessible apart from the games. I've tested that the denied file works (i.e. when I add a site, that site is blocked), but when I add this domain with the /games it doesn't block it.
I may have overcomplicated this actually - just trying to give you an idea of my setup. So basically I want to allow MTV apart from /games.
Says 'use this file as input, it is a list of regular expressions'
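In squid.conf that might look like the sketch below (the file path is made up). The key point is that a dstdomain ACL only ever sees the hostname, which is why adding mtv.com/games to a domain list does nothing; blocking a path needs a url_regex ACL, which matches against the whole URL:

```
# deniedpaths matches anywhere in the full URL, case-insensitively
acl deniedpaths url_regex -i "/etc/squid/denied.regex"
# This deny must appear before the allow rule for allowedsites
http_access deny deniedpaths
```

where /etc/squid/denied.regex contains lines such as ^http://www\.mtv\.com/games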
nice find. very helpful.