If your online project is growing and beginning to attract international customers, your server’s workload grows with it. High user numbers and a wide geographical distribution of clients result in ever-increasing loading times and slow transfer rates. By using a CDN (content delivery network), you’ll be able to react to increasing demands and optimise your data delivery.
The goal of search engine optimisation is to gain the best possible placement in Google’s organic search results. Achieving this requires strong content, a solid backlink profile, and a consistent social media presence, but technical factors are just as important. Google expects website operators to design their sites to be as user- and search-engine-friendly as possible. A content delivery network (CDN) can help with technical optimisation, particularly when it comes to performance and loading speed. Here, we outline how you can gain an SEO advantage with the help of a CDN.
What is a CDN (Content Delivery Network)?
A CDN is a network of servers spread across data centres around the world and connected with one another. An origin server keeps a website’s original, up-to-date content accessible, while the so-called replica servers handle the flexible distribution of that content. To do this, files are sourced from the origin server, ‘mirrored’ by the replica servers, and then regularly checked to keep them up to date.
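The origin/replica pattern described above can be sketched in a few lines of Python. This is purely illustrative – the class and method names are made up for this example, not a real CDN API – but it shows the core idea: each edge (replica) server answers from its local cache and only pulls from the origin on a cache miss.

```python
class OriginServer:
    """Holds the authoritative, up-to-date copies of a site's files."""

    def __init__(self, content):
        self.content = dict(content)

    def fetch(self, path):
        return self.content[path]


class EdgeServer:
    """A replica server: mirrors files from the origin on demand."""

    def __init__(self, origin):
        self.origin = origin
        self.cache = {}  # locally mirrored copies

    def get(self, path):
        if path not in self.cache:              # cache miss: pull from origin
            self.cache[path] = self.origin.fetch(path)
        return self.cache[path]                 # cache hit: serve locally


origin = OriginServer({"/logo.png": b"...image bytes..."})
edge = EdgeServer(origin)
edge.get("/logo.png")   # first request is fetched from the origin
edge.get("/logo.png")   # repeat requests are served from the edge cache
```

A real CDN adds the part this sketch omits: expiring cached copies and re-checking them against the origin so mirrored content stays current.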
More information on this can be found in our digital guide article ‘What is a content delivery network?’.
How does a CDN help with search engine optimisation?
The term SEO often comes up in connection with content delivery networks, because a CDN offers a clear advantage for search engine optimisation: it significantly improves a page’s loading speed. And this is undoubtedly an asset. The likes of Google, Amazon, and Yahoo have all undertaken studies to see what effect page speed has on consumer and visitor behaviour. Amazon found that an extra load time of just 100 ms (milliseconds) caused a 1 per cent drop in sales, while Google discovered that a 500 ms increase in load time resulted in 20 per cent fewer searches. What these figures reveal is that longer loading times lead to unhappy users, and in the worst case to no users at all – something that should be avoided at all costs.
A website’s loading speed is hugely important for SEO success, as it influences many different metrics, such as the bounce rate. If a page loads slowly or not at all, many visitors will simply leave the website, increasing the bounce rate – something Google registers negatively. Even if the bounce rate isn’t an official ranking factor, it still influences other evaluation metrics, such as the number of returning visitors, and signals to Google what kind of website it is.
Bounce rate aside, long loading times can also have a negative effect on the length of stay and the conversion rate – in short, they detract from the whole user experience. Google has compiled a list of over 100 different ranking factors with the aim of rewarding the ideal user experience.
Myths about CDNs
The fact that a CDN can bring about significant benefits regarding SEO does not mean that the whole subject is immune to certain myths and falsehoods.
Duplicate content becomes a problem
One common falsehood is that using a CDN leads to the creation of duplicate content, which Google does not view favourably. Search engines rate duplicate content negatively because it offers no additional value to users. A CDN can indeed cause duplicate content, but by following a few rules this can be easily avoided.
- Canonical header: Every CDN user should incorporate a so-called canonical header. This particular HTTP header lets Google know that the content served from the CDN is just a copy of the original. Most CDN providers offer a corresponding feature that lets you integrate such a header in just a few clicks.
- Robots.txt file: When the Google bot scans a website, it looks for a robots.txt file. This allows website operators to set guidelines for which content the bot should crawl – and which of it should be entered into the index – and so avoid duplicate content. If no file exists, the bot simply crawls all content. CDN providers usually do not activate a robots.txt file by default, so everything will be crawled, but the canonical header is usually enough to prevent duplicate content being created.
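To make the two rules above concrete, here is a small sketch in Python. The canonical header is a standard HTTP `Link` header with `rel="canonical"`; the robots.txt directives shown are example content for blocking crawlers entirely on a CDN host. The URL and hostname are made-up examples, and real CDN providers set these for you through their own configuration panels.

```python
def canonical_link_header(original_url):
    """Build a Link header pointing search engines at the original URL."""
    return ("Link", f'<{original_url}>; rel="canonical"')


# A CDN edge would attach this header to each mirrored asset it serves:
header = canonical_link_header("https://www.example.com/page.html")

# Alternatively, a robots.txt served from the CDN hostname could block
# crawling of the mirrored copies outright (example directives only):
robots_txt = "User-agent: *\nDisallow: /\n"
```

In practice the canonical header is the safer choice, because it lets the mirrored files keep working for visitors while telling Google where the original lives.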
Using a CDN is expensive
There are numerous providers offering content delivery networks for a whole spectrum of uses, from expensive enterprise packages to much cheaper solutions for small to medium web projects. At the end of the day, the idea that a CDN solution has to be expensive is simply not true.
Some packages are even available completely free of charge, e.g. the entry-level offer from Cloudflare. Others, such as Amazon CloudFront or Akamai, are a bit more expensive but well tried and tested. Amazon charges per gigabyte, while Akamai provides a price quote on request for individual projects. There are even cheap CDN solutions for WordPress projects.
Help setting up a CDN
From a layman’s point of view, the principle of a CDN may not always be immediately clear – but at the same time the setup is by no means rocket science. Depending on the provider and the type of package, support is offered throughout the implementation process, and there is also a lot of information online. The most important task for the website operator is to decide which files the CDN should serve. The appropriate configuration then follows, determining which file requests are routed to the CDN. Nowadays there are also plugins that assist with the installation, aimed mainly at larger content management and shop systems.
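The configuration step described above – deciding which files the CDN serves and routing requests for them to the CDN host – can be sketched as a simple URL rewrite. This is an illustrative example, not any provider’s actual mechanism: `cdn.example.net` is a made-up hostname, and the list of file extensions is an assumption about which assets you would offload.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical CDN hostname and the static-asset types we offload to it.
CDN_HOST = "cdn.example.net"
CDN_SUFFIXES = (".css", ".js", ".png", ".jpg", ".woff2")


def to_cdn_url(url):
    """Point static-asset URLs at the CDN; leave dynamic pages untouched."""
    parts = urlsplit(url)
    if parts.path.endswith(CDN_SUFFIXES):
        return urlunsplit((parts.scheme, CDN_HOST, parts.path,
                           parts.query, parts.fragment))
    return url


to_cdn_url("https://www.example.com/assets/style.css")
# → "https://cdn.example.net/assets/style.css"
to_cdn_url("https://www.example.com/checkout")  # dynamic page: unchanged
```

This is essentially what the WordPress and shop-system plugins mentioned above do for you: they rewrite asset URLs in the generated pages so that browsers fetch those files from the CDN instead of the origin server.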