For dynamic web projects, complex layouts and rising user numbers significantly drag down performance. Reverse proxies make it possible to curtail these losses and relieve web servers by answering client requests on their behalf. Reverse proxies work by storing requested material (static content, like pictures, as well as frequently requested dynamic pages) in their cache. One very popular caching software is Varnish. Unlike many of its competitors, Varnish was designed from the ground up to be a web accelerator. Installing and configuring a Varnish Cache requires root rights on the web server, which must run a Unix-like operating system.

How Varnish Cache works

In the chain of processes that occur during a data request, Varnish is positioned directly upstream from the web server where the desired content is located. While a page request is still initially processed by the origin server, it's the Varnish proxy that saves the request and the required content. Further requests of this kind are then answered by loading the desired data directly from the Varnish Cache. The software caches all data in working memory (RAM) and lets the operating system decide what gets swapped out to the server's hard drive. This helps users avoid saving the same data both on the hard drive and in the cache.

Varnish also functions as a load balancer. Using the round-robin procedure, incoming client requests are assigned to separate worker threads that are dealt with sequentially by the Varnish Cache. A fixed limit determines how many threads can be active simultaneously. Once this threshold is reached, all further requests end up in a queue, where they wait to be processed. Incoming connections are only blocked once the queue's limit is also reached.

Configuring Varnish reverse proxies is mostly done via the Varnish Configuration Language (VCL). This makes it possible to write hooks (here: a technique that allows users to integrate their own code into the application). Once a VCL script is loaded, it is translated into the programming language C and compiled into a program library; the VCL instructions are then linked into the Varnish Cache.

If the CMS, e-commerce software, or web application in use supports the markup language ESI (Edge Side Includes), Varnish is also able to deliver almost entirely cached pages. ESI tags are placed in the HTML files to label dynamic content. During client requests, Varnish Cache recognises these tags and reloads their corresponding content separately.
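As a sketch of how such a hook looks in practice (the Content-Type check and the fragment path below are illustrative assumptions, not taken from the text), a few lines of VCL can switch on ESI processing for HTML responses:

```vcl
vcl 4.0;

# Hypothetical backend; adjust host/port to match your own web server
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # Have Varnish parse ESI tags in HTML pages fetched from the backend
    if (beresp.http.Content-Type ~ "text/html") {
        set beresp.do_esi = true;
    }
}
```

In the delivered HTML, the application then marks dynamic fragments with tags such as `<esi:include src="/cart" />` (a hypothetical path), which Varnish resolves separately on each request while the rest of the page is served from the cache.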

The pros and cons of Varnish hosting

In many cases, optimising a customised hosting solution with a Varnish Cache can be the answer to challenges brought about by the growing complexity and rising user numbers of your web project. This doesn't mean that the software is best suited to all web presences, though. Take a look at the pros and cons of Varnish hosting in the overview below:

Advantages:
- Faster loading times thanks to caching in the RAM
- Web server relief
- Supports ESI
- Operating system exports content to the server's hard drive
- Load distribution based on the round-robin procedure
- Flexible configuration possibilities with VCL

Disadvantages:
- No substantial optimisation for systems that don't support ESI
- Increased complexity and error rate
- Doesn't support TLS/SSL (HTTPS)
- Demanding set-up and configuration
- Only for Unix systems

The comparison above illustrates once more that Varnish hosting is only a viable alternative to caching on the client or web server when working with web applications that support ESI. Additionally, setting up and configuring the Varnish Cache with ESI tags can prove taxing. And given that Varnish doesn't support TLS/SSL connections, you'll need an additional proxy server for secure transfers.
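One common way to handle the missing TLS/SSL support (a sketch; the server software, domain, and certificate paths are assumptions, since the text doesn't prescribe a setup) is to place a TLS-terminating proxy such as nginx in front of Varnish:

```nginx
# nginx as TLS terminator in front of Varnish (sketch)
server {
    listen 443 ssl;
    server_name example.com;                        # hypothetical domain

    ssl_certificate     /etc/ssl/certs/example.crt; # hypothetical paths
    ssl_certificate_key /etc/ssl/private/example.key;

    location / {
        # Forward the decrypted traffic to Varnish listening on port 80
        proxy_pass http://127.0.0.1:80;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

With this arrangement, nginx handles the encrypted connection and Varnish continues to work with plain HTTP internally.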

Off-putting as some of these points may seem, a properly configured Varnish Cache with ESI tags can speed up your web projects in a way that conventional caching methods can't. This greatly decreases loading times for visitors, helping you achieve a greater overall conversion rate in the long run. These efforts are also rewarded with a better search engine ranking, and they significantly relieve the web server, which is no longer responsible for processing all incoming connections. Varnish hosting is especially popular with operators of online stores and websites with a wide variety of content.

Installing Varnish Cache

Administrative (root) rights on the Unix system in use are required in order to install the Varnish Cache. Additionally, the web server located downstream from the Varnish Cache needs to be installed before you can begin. The following instructions lay out the necessary steps for installing and configuring Varnish. The example below uses an Ubuntu operating system and an Apache web server:

1. First step:

By default, Varnish is included in Ubuntu's software package management, but it's not always the latest version. For this reason, the Varnish project also provides its own online repository that users can access when installing the software. Here's how to add the repository as a package source:

sudo apt-get install apt-transport-https
curl https://repo.varnish-cache.org/GPG-key.txt | sudo apt-key add -
echo "deb https://repo.varnish-cache.org/ubuntu/ trusty varnish-4.1" | sudo tee -a /etc/apt/sources.list.d/varnish-cache.list

2. Second step:

For the next step, reread the package list and install Varnish:

sudo apt-get update
sudo apt-get install varnish

3. Third step:

At this point, the Varnish default file should be configured so that the software 'knows' where it can find the web content:

sudo nano /etc/default/varnish

Change entries under ‘DAEMON_OPTS’ as follows:

DAEMON_OPTS="-a :80 \
-T localhost:6082 \
-f /etc/varnish/default.vcl \
-S /etc/varnish/secret \
-s malloc,256m"
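For orientation, the options in this listing can be read as follows (a brief gloss of the flags shown above):

```shell
#   -a :80                       port Varnish listens on for incoming HTTP requests
#   -T localhost:6082            address/port of the management (admin) interface
#   -f /etc/varnish/default.vcl  VCL configuration file loaded at start-up
#   -S /etc/varnish/secret       secret file for authenticating admin access
#   -s malloc,256m               cache storage: 256 MB held in RAM
```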

4. Fourth step:

Save the changes and open the default.vcl file:

sudo nano /etc/varnish/default.vcl

Enter port 8080 as the backend from which Varnish fetches its content:

backend default {
.host = "127.0.0.1";
.port = "8080";
}

5. Fifth step:

Finally, also set port 8080 (default is 80) for Apache. Open the corresponding Apache port configuration file:

sudo nano /etc/apache2/ports.conf

Change the port numbers for the entries 'NameVirtualHost' and 'Listen' as follows:

NameVirtualHost 127.0.0.1:8080
Listen 127.0.0.1:8080

6. Sixth step:

To finish the configuration, adjust the VirtualHost entry in the default file (/etc/apache2/sites-available/default) to port 8080, using the same method as in step 5.

7. Seventh step:

Restart Apache and Varnish to finish the installation:

sudo service apache2 restart
sudo service varnish restart
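To check that the cache is actually answering requests, you can inspect the response headers of the local site (a sketch; the exact header values depend on your setup):

```shell
# Request only the headers; a response served through Varnish typically
# includes headers such as "Via: ... varnish" and "X-Varnish: ..."
curl -I http://localhost/
```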

To find additional instructions on how to install Varnish on other Unix-based operating systems, as well as the software's source code, head to the download section on Varnish's official website.
