On 02/04/11 02:22, Daniel Plappert wrote:
> Hi all,
>
> I am new to squid, so I hope you don't feel offended if this is a beginner's question. ;-) I am trying to replace an Apache httpd server, which works as a delegating proxy. Let me explain the scenario briefly:
>
> internet -> Apache httpd delegator -> server[1-3]
>
> Because, to the outside, we have just one ip-address, the httpd delegator forwards the request according to the URL to one of the internal servers, i.e. wiki.example.com is forwarded to server1, dms.example.com is forwarded to server2. This is done with virtual-hosts and rewrite rules, i.e. for server1:
>
> RewriteRule ^(.*)$ http://wiki/$1 [L,P]
>
> As you can see here, the request is delegated to an internal server called wiki.
>
> What I am trying to do now is to replace the Apache httpd delegator with squid. What I've done so far is to configure squid as an accelerator and declared the corresponding nodes:
>
> acl wiki_sites dstdomain wiki.example.com
> http_port 80 accel defaultsite=example.com vhost
> http_access allow wiki_sites
So far so good.
Note:
using "defaultsite=example.com" makes the 'broken' clients
which do not send the hostname properly use "example.com", and that
value does not match your domain ACL "wiki.example.com".
Result: clients which do not send "wiki.example.com" properly as the
virtual domain name will not get to the wiki server.
Whether this is good behaviour is up to you. Just be aware of it.
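If you want such Host-less clients to land on the wiki anyway, one
option (an untested sketch, reusing your names) is to make the default
site a name that your ACL actually matches:

```
# Sketch: Host-less requests default to the wiki's public name,
# which the wiki_sites ACL then matches
http_port 80 accel defaultsite=wiki.example.com vhost
acl wiki_sites dstdomain wiki.example.com
http_access allow wiki_sites
```

Whether that default is right depends on which site should catch the
broken clients; pick the domain you want them to end up on.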
> cache_peer wiki parent 80 0 no-query originserver forcedomain=wiki name=wiki
Mostly good.
Use "forcedomain=" only if the peer is slightly broken and requires all
traffic to arrive with that value as its public domain/host name.
Squid prefers to send the public domain FQDN (in this case
"wiki.example.com") to the peer so that it can easily and properly
generate public redirects, cookies, page content URLs etc.
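So unless the wiki genuinely needs the internal name, a sketch of the
peer lines without "forcedomain=" (same names as your config) would be:

```
# Sketch: let Squid pass the public FQDN (wiki.example.com)
# through to the peer in the Host: header
cache_peer wiki parent 80 0 no-query originserver name=wiki
cache_peer_access wiki allow wiki_sites
```

With this the wiki sees "Host: wiki.example.com" and should generate
public URLs on its own, which addresses the base-URL problem below.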
> forwarded_for on
"forwarded_for" is not strictly relevant, but fine.
> cache_peer_access wiki allow wiki_sites
Okay good.
>
> Forwarding the request works as expected, but there is one problem: server1 (the (t)wiki server) now adds a wrong base URL in the html header:
>
> <base href="http://wiki" />
Bingo. The wiki server is using what it sees as the public host/domain
name (the Host: header) to generate URLs. See above.
>
> This doesn't happen with the apache delegator.
Apache is sending rather broken headers to the wiki server.
They look like this:
GET http://wiki/foo.html HTTP/1.1
Host: wiki.example.com
...
Whereas Squid is sending proper HTTP headers based on the URL (as
altered by forcedomain):
GET /foo.html HTTP/1.1
Host: wiki
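For illustration only (this is not TWiki's actual code), a naive web
app that builds its base URL from the Host header behaves like this:

```python
# Hypothetical sketch of how many web apps derive their <base href>
# from the Host: header of the incoming request.
def base_href(request_headers):
    """Return the <base href> a naive app would emit from the Host header."""
    host = request_headers.get("Host", "localhost")
    return f'<base href="http://{host}" />'

# Apache's headers carry the public name in Host:, so the app emits it:
print(base_href({"Host": "wiki.example.com"}))  # <base href="http://wiki.example.com" />

# Squid honouring forcedomain=wiki sends the internal name instead:
print(base_href({"Host": "wiki"}))              # <base href="http://wiki" />
```

Which is exactly why dropping "forcedomain=" (so Squid passes the
public FQDN through) makes the wiki emit the public base URL again.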
>
> So, finally my question: how is it possible to configure squid in a way that the base url is as it was before: <base href="http://wiki.example.com" /> I need the URL from the outside (internet), not from the internal (intranet).
>
With Squid you will get the same URLs publicly and internally. So
traffic will hopefully all go through Squid, where you can centralize
a set of ACLs for internal/external access if that actually matters.
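For example, a rough sketch of such centralized ACLs (the "intranet"
name and address range here are invented for illustration):

```
# Sketch: allow the public wiki domain for everyone,
# everything else only for internal clients
acl intranet src 192.168.0.0/16
http_access allow wiki_sites
http_access allow intranet
http_access deny all
```

Adjust the source range and ordering to your own network layout.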
Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.11
  Beta testers wanted for 3.2.0.5

Received on Sun Apr 03 2011 - 06:39:39 MDT
This archive was generated by hypermail 2.2.0 : Wed Apr 06 2011 - 12:00:03 MDT