My company has enforced an HTTP proxy requirement on my work computer. That effectively makes the computer unusable for web-related work unless it is connected to the corporate network over VPN. So: a local Squid proxy to the rescue!
I created a local domain using the same naming scheme as my work, added a DNS entry mirroring the proxy host, and set up a basic caching Squid proxy.
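For illustration, the DNS trick can be as simple as a hosts-file entry on the client. The names and address below are placeholders; the real entries mirror the corporate naming scheme, which I won't reproduce here:

```
# /etc/hosts on the client: point the corporate proxy hostname
# at the machine running the local Squid instance (placeholder values)
192.168.1.42   proxy.corp.example.com
```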
I chose this setup: https://github.com/sameersbn/docker-squid and forked the repo to my own GitHub account (https://github.com/mry/docker-squid), adding my own customizations.
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/16379f67-8add-4863-b9ba-b80e7bcbe375.png)
Create the Docker image
```
docker build -t emryl/squid github.com/mry/docker-squid
```
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/create-the-docker-image.png)
Login to DockerHub
```
docker login
```
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/login-to-dockerhub.png)
Push image to Docker Hub
```
docker push emryl/squid
```
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/push-image-to-docker-hub.png)
Running with docker-compose
I set it up to run on my Synology NAS. I created the necessary folder structure according to the volume listing in the docker-compose.yml file.
On the first run, docker-compose will pull the image I pushed in the previous step. Omit the -d (detached) flag the first time so you can watch the container start up and confirm there are no issues.
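The compose file itself is short. A minimal sketch of what it might look like follows; the image name matches the build step above, but the port mapping and volume path are assumptions, so check the docker-compose.yml in the repo for the actual values:

```yaml
version: '2'

services:
  squid:
    image: emryl/squid
    restart: always
    ports:
      # expose Squid's default port 3128 as 8080 on the host (assumed mapping)
      - "8080:3128"
    volumes:
      # persist the cache across container restarts (assumed host path)
      - /volume1/docker/squid/cache:/var/spool/squid
```

Then `docker-compose up` for the first foreground run, and `docker-compose up -d` afterwards.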
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/running-using-docker-compose.png)
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/5ac49a57-e844-4d1d-801d-cb954a7b9c20.png)
Check the mapped folders are used
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/check-the-mapped-folders-are-used.png)
Check if Squid is reachable
Then configure the HTTP proxy setting on the client to point at the NAS on port 8080.
![](/2017/02/16/setting-up-a-web-proxy-using-squid-and-docker/check-if-squid-is-reachable.png)
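A quick way to verify reachability from a client is curl with an explicit proxy; the hostname here is a placeholder for the NAS:

```shell
# Request headers only, routed through the local Squid proxy.
# A working proxy returns the origin's response, typically with an
# added Via header identifying Squid.
curl -I -x http://nas.local:8080 http://example.com/
```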