*nix Thread, Squid Cache in Technical
  1. #1

    Hightower's Avatar
    Join Date
    Jun 2008
    Location
    Cloud 9
    Posts
    4,920
    Thank Post
    494
    Thanked 690 Times in 444 Posts
    Rep Power
    241

    Squid Cache

    We use Squid here, and needed a way to view which sites were being loaded from the cache, and which sites weren't and were therefore downloaded afresh.

    Is there any way to view this info? And to compare cached/new sites?

    Cheers,

  2. #2

    Hightower
    No ideas?

  3. #3
    mjs_mjs's Avatar
    Join Date
    Jan 2009
    Location
    bexleyheath, london
    Posts
    1,020
    Thank Post
    37
    Thanked 111 Times in 95 Posts
    Rep Power
    37
    I know this can be done but I'm not sure how. The obvious one is that some sites are told not to use the proxy (defined in your browser/OS settings) so are not cached, but some are sent through the proxy yet still not cached there. It may help to post a note on the Squid-users mailing list; Amos Jeffries is usually a fountain of all knowledge when it comes to Squid.
    Last edited by mjs_mjs; 21st October 2009 at 09:40 AM. Reason: spelling mistakes

  4. #4
    ind1ekid's Avatar
    Join Date
    Jul 2008
    Location
    Nottinghamshire
    Posts
    82
    Thank Post
    6
    Thanked 16 Times in 13 Posts
    Rep Power
    15
    Would webalizer or some other log analyzer provide what you want? Then based on your findings you could create an ACL to deny caching of a specified list of sites?
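
    If you go that route, the stop-Squid-caching-these-sites part is a cache deny rule. A minimal squid.conf sketch, with a made-up domain:

    Code:
    # Never cache anything from these domains
    acl nocache dstdomain .example.com
    cache deny nocache

    Requests matching the ACL still pass through the proxy; Squid just won't store the responses.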

  5. #5

    Hightower
    Basically, the reason I ask is that our website is now our homepage, and bandwidth is going through the roof because it loads for every user when they open IE.

    I want to make sure our website is being served from the cache rather than downloaded fresh each time, to reduce the bandwidth.

  6. #6

    ind1ekid
    I see. Is your site hosted externally?
    The caching features aren't something I ever properly got my head round, to be honest; I wish I could be more help. I only ever managed to set a list of sites for Squid not to cache. My primary use of Squid is for the filtering.

    The command below will give you some of the cache manager info; it will tell you a little about what's going on.
    Code:
    squidclient mgr:info
    You need squidclient installed for it; on Debian you can install it with apt-get.

    Squid: The Definitive Guide - Pretty good manual for squid there if you have the time to trawl through it.
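
The mgr:info counters only give you totals; to see which individual requests were cache hits, the access log is the place to look. A small sketch, assuming Squid's default native log format (field 4 is RESULT_CODE/STATUS, e.g. TCP_HIT/200) and feeding log lines on stdin:

```shell
# Summarise Squid result codes (TCP_HIT, TCP_MISS, ...) from access-log
# lines supplied on stdin, most frequent first.
squid_result_summary() {
  awk '{split($4, a, "/"); count[a[1]]++}
       END {for (code in count) print count[code], code}' | sort -rn
}

# Usage, e.g. (path is the Debian default; adjust for your install):
#   squid_result_summary < /var/log/squid/access.log
```

TCP_HIT lines were served from the cache; TCP_MISS lines were fetched from the origin server.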

  7. #7

    Geoff's Avatar
    Join Date
    Jun 2005
    Location
    Fylde, Lancs, UK.
    Posts
    11,804
    Thank Post
    110
    Thanked 583 Times in 504 Posts
    Blog Entries
    1
    Rep Power
    224
    If your website is generating dynamic content, for example it's written in PHP, then it will send the no-cache Cache-Control header all the time. Additionally it will set pages to expire in the past. Thus Squid will not cache the site at all. Hence your bandwidth issue.

    Although slightly evil, you can override this behavior. The magic Squid configuration directive is 'refresh_pattern'.

    squid : refresh_pattern configuration directive

    I would need to know what version of Squid you are running to tell you exactly what to use, as there were a lot of improvements made to the refresh_pattern configuration directive between 2.STABLE and 3.1!
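
    For illustration, an override along these lines would do it on Squid 2.7 (the domain is made up, and note that ignore-no-cache was dropped in later 3.x releases, so check the docs for your version):

    Code:
    # Cache pages from the school site for up to 60 minutes, even if the
    # server sends no-cache or an Expires date in the past (2.7 options)
    refresh_pattern -i ^http://www\.example\.sch\.uk/ 60 100% 60 override-expire ignore-no-cache ignore-reload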

  8. #8

    Hightower
    Different topic but still related to Squid....

    Our net is whitelisted for pupils, so we already had an ACL for 'allowedsites', and I've just added an ACL for 'deniedsites'.

    We have MTV . COM in our allowed sites, and when I added that site to denied sites it blocked the site - ideal. This is what I wanted it to do... kind of.

    MTV . COM is in our allowed sites, but I want to block MTV . COM / GAMES so that all of MTV is accessible apart from the games. I've tested that the denied file works (i.e. when I add a site, that site gets blocked), but when I add the domain with the /GAMES it doesn't block it.

    I may have overcomplicated this actually - just trying to give you an idea of my setup. So basically I want to allow MTV apart from /GAMES.

    Any help?

    Any help?

  9. #9

    ind1ekid
    Quote Originally Posted by Hightower View Post
    [...] So basically I want to allow MTV apart from /GAMES. Any help?
    A url_regex acl will do the job for you.

    Code:
    acl games1 url_regex -i "/etc/squid/acl/gameurls.list"
    
    # place this deny before your http_access allow rules
    http_access deny games1
    Add your sites to the list and you're sorted
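
    Worth noting why the denied-sites file didn't work: dstdomain ACLs match only the hostname, while url_regex matches the whole URL including the path. So for the MTV case, gameurls.list would contain something like (pattern illustrative):

    Code:
    # /etc/squid/acl/gameurls.list - one regular expression per line
    ^http://www\.mtv\.com/games/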

  10. Thanks to ind1ekid from:

    Hightower (23rd October 2009)

  11. #10

    Hightower
    Quote Originally Posted by ind1ekid View Post
    A url_regex acl will do the job for you. [...]
    Care to explain what url_regex -i does?

  12. #11

    powdarrmonkey's Avatar
    Join Date
    Feb 2008
    Location
    Alcester, Warwickshire
    Posts
    4,859
    Thank Post
    412
    Thanked 777 Times in 650 Posts
    Rep Power
    182
    The -i makes the matching case-insensitive; the quoted path says 'use this file as input, it is a list of regular expressions'.

  13. #12

    ind1ekid
    My explanations are poor, so have a read here

    About a third of the way down:
    ViSolve - Squid Configuration Manual

    Has a nice example too!

  14. #13


    tom_newton's Avatar
    Join Date
    Sep 2006
    Location
    Leeds
    Posts
    4,475
    Thank Post
    866
    Thanked 849 Times in 671 Posts
    Rep Power
    196
    Sorry to weigh in late-doors on the cacheability issue - but this might help:

    Cacheability Query

  15. Thanks to tom_newton from:

    Geoff (23rd October 2009)

  16. #14

    Geoff
    Nice find. Very helpful.

  17. #15

    Hightower
    Quote Originally Posted by ind1ekid View Post
    A url_regex acl will do the job for you. [...]
    Awesome - that worked a treat. Many thanks!
