Searchmetrics has been providing an annual, in-depth analysis of search engine rankings for the US since 2012. However, the ranking factors specifically tailored to the market leader, Google, appeared in this form for the last time at the end of 2016. Searchmetrics explains that the era of general SEO checklists is over and has begun to take a more sophisticated approach to search engine optimisation: in the future, the company's annual SEO report will include sector-specific analyses.

One of the reasons for this is Google's AI system, RankBrain, which is much more flexible than traditional search algorithms thanks to machine learning. In June 2016, Google engineer Jeff Dean disclosed to technology journalist Steven Levy that RankBrain is involved in processing every search query. According to Levy's report on Backchannel, the AI system is one of the three most important factors in search engine rankings.

With the current study 'Ranking factors – rebooting for relevance', Searchmetrics presents a catalogue of factors that decide where a website places in the search engine results. The results of the investigation are primarily intended as guidelines for individual, sector-specific analyses. As in previous years, the study is based on a set of 10,000 relevant search keywords, and the current findings are interpreted with a view to previous investigations. We have summarised the most important results of the Searchmetrics study below.

Content factors

Reinforced by the recent changes to the Google core algorithm, one thing is certain for Searchmetrics: a website's content has to be the focus when optimising for the search engine. One main factor here is content relevance, which was only included as a ranking factor in 2016. Optimising for individual keywords, on the other hand, is losing significance in favour of holistic text design.

Content relevance is becoming the main ranking factor

Good SEO content is characterised by how closely it corresponds to what the user is looking for. However, this differs from search query to search query. The challenge content marketers face is that they have to answer as many questions as possible with just one text. To achieve this, content is created as holistic texts that take into account different aspects of a topic and are optimised for several keywords within a semantically related subject area. Holistic content therefore aims to achieve good search results for several relevant keywords, while the individual keyword fades into the background.

The Searchmetrics study analysed the content relevance of texts in relation to the search term used. This was carried out on the basis of linguistic corpora and the concept of semantic relationships. The result isn't surprising:

                'The URLs with the highest content relevance are those in positions 3 to 6'

The relevance score decreases steadily for subsequent search results. Positions 1 and 2 need to be considered separately in this study: according to Searchmetrics, these are generally websites of well-known brands, which benefit from factors such as recognisability, user trust, and brand image when it comes to the Google ranking, so their position isn't necessarily due to content relevance.
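
How such a relevance score can be approximated is easiest to see in code. The following is a minimal sketch that scores a page text against a search query using TF-IDF cosine similarity; the query, page snippets, and function name are made up for illustration, and Searchmetrics' actual corpus-based relevance metric is considerably more sophisticated.

```python
# A rough stand-in for a content relevance score: cosine similarity between
# a search query and a page text in TF-IDF space. Searchmetrics' actual
# corpus-based relevance metric is proprietary and far more sophisticated.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def relevance_score(query: str, page_text: str) -> float:
    """Return a crude 0-1 similarity score between a query and a page text."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform([query, page_text])
    return float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])

# Hypothetical example: two candidate landing pages for the same query.
query = "how to make cold brew coffee"
page_a = "Cold brew coffee is made by steeping coarsely ground beans in cold water ..."
page_b = "Our company was founded in 1987 and now has offices in twelve countries ..."
print(relevance_score(query, page_a))  # noticeably higher
print(relevance_score(query, page_b))  # close to zero
```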

The average number of words is increasing

The average word count of well-ranking landing pages has been steadily increasing for years. According to Searchmetrics, this reflects a more intensive analysis of the respective topic areas in terms of holistic content creation.

                'The average number of words increased by 50 percent in 2016'

A comparison of desktop content and mobile landing pages reveals significant differences in word count: according to the analysis, desktop versions of a website are on average 30 percent longer than their mobile counterparts.
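
If you want to check this difference for your own pages, a simple approach is to fetch the same URL with a desktop and a mobile User-Agent and compare the visible word counts. The sketch below assumes the requests and BeautifulSoup libraries; the URL and User-Agent strings are placeholders, and sites built with responsive design will simply return the same figure twice.

```python
# Compare the visible word count a site serves to desktop vs. mobile clients.
# The URL and User-Agent strings below are placeholders for illustration.
import requests
from bs4 import BeautifulSoup

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X)"

def visible_word_count(url: str, user_agent: str) -> int:
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # ignore scripts and styling, count visible text only
    return len(soup.get_text(separator=" ").split())

url = "https://www.example.com/"
print("desktop:", visible_word_count(url, DESKTOP_UA))
print("mobile: ", visible_word_count(url, MOBILE_UA))
```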

The keyword in the title is losing relevance

This motto is still widespread in the world of SEO: the keyword needs to be in the title, placed as far forward as possible. Whether the search engine still agrees with this assumption is shown by the findings of the Searchmetrics study:

                'In 2016 only 45 percent of the top 20 URLs had the keyword in the title'

This development can also be explained by the holistic approach to content creation, in which texts are optimised for topics rather than individual keywords. Google's AI system is now capable of analysing semantic relationships without needing an exact keyword match.

A similar development can be seen by looking at the headings and descriptions: according to Searchmetrics, only one-third of all top 20 landing pages have the keyword in the main heading (H1).
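
Whether a given page still carries a keyword in its title and main heading is easy to audit. The following is a minimal sketch assuming the requests and BeautifulSoup libraries; the URL, keyword, and function name are placeholders.

```python
# Check whether a keyword appears in a page's <title> and main heading (H1).
# URL and keyword are placeholders for illustration.
import requests
from bs4 import BeautifulSoup

def keyword_in_title_and_h1(url: str, keyword: str) -> dict:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True).lower() if soup.title else ""
    h1 = soup.find("h1")
    h1_text = h1.get_text(strip=True).lower() if h1 else ""
    return {
        "in_title": keyword.lower() in title,
        "in_h1": keyword.lower() in h1_text,
    }

print(keyword_in_title_and_h1("https://www.example.com/", "cold brew coffee"))
```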

User signals

In order to determine whether Google users are satisfied with the proposed URLs, the search engine doesn't have to rely solely on indirect factors such as the semantic analysis of website content. In addition to the search engine itself, Google products such as the Chrome web browser, the web analytics service Google Analytics, the advertising system AdWords, and the mobile operating system Android provide detailed information on users' online behaviour.

By combining detailed user signals such as clicks, bounce rates, and the average time on site with its huge database, Google can find out whether a website delivers what it promises. The Searchmetrics study provides helpful input on these signals for website operators and SEO consultants.

The average click-through rate of positions 1 to 3 is 36 percent

Users put a lot of trust in the search engine's relevance analysis. This was already indicated in the 2014 Searchmetrics study, when the average click-through rate (CTR) of the analysed top URLs was determined.

Consequently, websites in position 1 receive the most clicks. For pole position, the Searchmetrics team calculated an average CTR of 44%. The percentage decreases the further down the page a website appears; at position 3 the CTR is already down to 29%.

                'The average click-through rate of position 1 to 3 is 36 percent.'

There's a clear increase in the CTR at position 11: the first website on the second search results page actually receives more clicks than the websites positioned at the bottom of the first search results page.

Compared to 2014, the average CTR of the top 10 URLs in the Google ranking has risen significantly.
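
The arithmetic behind these figures is straightforward: CTR is simply clicks divided by impressions for a given position. The short sketch below uses invented impression and click counts, chosen only so that the per-position rates echo the 44%, 29%, and 36% figures quoted above; the position 2 value is an assumption, not a figure from the study.

```python
# Click-through rate per ranking position: clicks divided by impressions.
# The counts below are invented for illustration, not data from the study.
impressions = {1: 12000, 2: 11800, 3: 11500}
clicks = {1: 5280, 2: 4130, 3: 3335}

for position in sorted(impressions):
    ctr = clicks[position] / impressions[position]
    print(f"position {position}: CTR = {ctr:.1%}")

# Average CTR over positions 1-3 (unweighted mean of the per-position rates)
average = sum(clicks[p] / impressions[p] for p in impressions) / len(impressions)
print(f"average CTR for positions 1 to 3: {average:.1%}")
```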

The bounce rate on the first search results page has increased to about 40 percent

In addition to the CTR, the so-called bounce rate is also taken into account when assessing a website's relevance. It shows how many users return to Google.com after clicking on a search result without looking at any other URL on the domain they're visiting.
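
To make that definition concrete, here is a minimal sketch that computes a bounce rate from a handful of sessions; the session data is invented for illustration and happens to produce the roughly 40 percent figure discussed below.

```python
# Bounce rate: the share of sessions in which the visitor viewed only one
# page of the domain before returning to the search results.
# The session data below is invented for illustration.
sessions = [
    {"pages_viewed": 1},   # bounced
    {"pages_viewed": 4},
    {"pages_viewed": 1},   # bounced
    {"pages_viewed": 2},
    {"pages_viewed": 3},
]

bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")  # 2 of 5 sessions -> 40%
```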

For the investigated URLs, the Searchmetrics team found an average bounce rate of about 40% for the top 10 positions, an increase compared to 2014. The top two positions deviate clearly from this value.

According to analysts, however, a website's relevance can only be partly determined by the bounce rate. Users have different reasons for leaving a website, and they may leave quickly even after finding exactly what they were looking for. This is possible, for example, with quick search queries that can easily be answered by looking at a glossary or dictionary.

One explanation for the detected increase in the average bounce rate could be the high quality of Google's algorithm, according to the Searchmetrics team. Thanks to the new AI system, the search engine is now able to display the URL that best fits the user's needs. This means that fewer URLs need to be clicked on to find the desired information.

The average time on site is increasing

What's especially important for assessing a website's relevance is the time between clicking on a result and leaving the site. Web analytics tools can accurately measure the time a user spends on the website (time on site). Here, Searchmetrics provides us with an average value for which there could be different reasons.

                'The time on site for the top 10 URLs is 3 minutes and 43 seconds.'

Comparing the value of the current study with that of 2014, a significant increase in the time on site can be seen for the top 10 URLs. This development can be attributed to the fact that website operators have started providing visitors with better quality content in order to rank well with Google.

Although a long time on site can be due to high-quality content, a short visit doesn't necessarily mean that your website is of low quality. For example, a user searching for a weather report or the results of last weekend's soccer match won't need to spend long on the website.
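
For reference, this is how the metric itself is calculated from session data. The timestamps below are invented and merely chosen so that the average comes out at the 3 minutes and 43 seconds quoted above.

```python
# Average time on site, calculated from session start and end timestamps.
# The timestamps are invented for illustration.
from datetime import datetime

sessions = [
    ("2017-03-01 10:00:00", "2017-03-01 10:04:10"),
    ("2017-03-01 11:30:00", "2017-03-01 11:33:20"),
    ("2017-03-01 12:15:00", "2017-03-01 12:18:39"),
]

fmt = "%Y-%m-%d %H:%M:%S"
durations = [
    (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()
    for start, end in sessions
]
average = sum(durations) / len(durations)
print(f"average time on site: {int(average // 60)} min {int(average % 60)} s")
```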

Technical factors and usability

Creating high-quality, valuable content is just one part of search engine optimisation. Even the best content will not reach the top positions in the search results if the basic technical requirements are missing. Websites must therefore also be analysed for factors such as performance, security, and usability. When it comes to the technical provision of web content, two main SEO trends stood out in 2016. On top of general factors such as page loading time and file size, more and more website operators are striving to make mobile-friendly content available to their visitors. In addition, Searchmetrics stresses the importance of transport encryption via HTTPS for the search engine ranking.
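
Two of the general factors mentioned, page loading time and file size, can at least be roughly checked from a script. The sketch below assumes the requests library and a placeholder URL; it measures a single HTML response rather than a full browser page load, so treat the numbers as a rough indicator only.

```python
# Rough check of two general technical factors: server response time and
# transferred file size. The URL is a placeholder; this measures one HTML
# response, not a complete page load with images, scripts, and styles.
import requests

url = "https://www.example.com/"
response = requests.get(url, timeout=10)

print(f"status code:    {response.status_code}")
print(f"response time:  {response.elapsed.total_seconds():.2f} s")
print(f"HTML file size: {len(response.content) / 1024:.1f} KB")
```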

HTTPS has become a necessity in 2017

Webmasters planning to forego HTTPS transport encryption in 2017 will find it hard to improve their position in the search results. As early as September 2016, Google announced that it planned to classify HTTP sites as 'unsafe' from 2017 onwards if sensitive data is processed on them. You don't have to be an SEO expert to figure out how much this affects a user's willingness to click on a website. Not least for this reason, the number of HTTPS-encrypted websites is growing steadily. While only 14% of the top 10 websites were based on HTTPS the previous year, Searchmetrics recorded a significant increase to 33% in its current study.

                'One third of websites in the top 10 are now based on HTTPS encryption.'
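
Whether a site already enforces HTTPS can be verified with a quick request. The sketch below assumes the requests library and a placeholder domain; requests validates TLS certificates by default, so a broken certificate would surface as an SSLError.

```python
# Check whether a site redirects plain HTTP to HTTPS.
# The domain is a placeholder for illustration.
import requests

domain = "www.example.com"
response = requests.get(f"http://{domain}/", timeout=10, allow_redirects=True)

redirected_to_https = response.url.startswith("https://")
print(f"final URL:             {response.url}")
print(f"redirects to HTTPS:    {redirected_to_https}")
print(f"redirect chain length: {len(response.history)}")
```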

Mobile-friendliness is a prerequisite for a good ranking

Search queries from mobile devices are constantly increasing. According to a Google report from 2015, the search engine market leader already receives more search queries from mobile devices than from desktop computers in the USA and Japan. Google's mobile-first strategy shows what a difference mobile-friendliness makes to the search engine ranking: in October 2016, the company announced that it would use its mobile index as the main index for web search in the future. It is therefore assumed that this will also affect the ranking for desktop web search. Webmasters who are still neglecting their mobile users should consider redesigning their website for 2017. Mobile subdomains, responsive web design, and dynamic serving are all good ways of making a site more mobile-friendly.
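
A very rough first check of mobile-friendliness is whether a page declares a responsive viewport at all. The sketch below assumes the requests and BeautifulSoup libraries and a placeholder URL; it is no substitute for Google's own mobile-friendly test, which evaluates far more signals.

```python
# A rough mobile-friendliness indicator: does the page declare a responsive
# viewport meta tag? This is only one of several signals. URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def has_responsive_viewport(url: str) -> bool:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    return bool(viewport and "width=device-width" in viewport.get("content", ""))

print(has_responsive_viewport("https://www.example.com/"))
```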

Social signals

As in 2014, the 2016 Searchmetrics study continues to show a strong correlation between a top position in the search engine and so-called social signals. This refers to communicative signals such as shares, likes, and comments on social networks like Facebook, Twitter, Pinterest, and Google+. However, analysts still find it difficult to establish a causal link between a good Google ranking and a strong social media presence.

Backlinks

The number of backlinks pointing to a website still correlates somewhat with its placement in the search results. However, link building is fading into the background as a result of new developments in search engine optimisation. According to Searchmetrics, a website's backlink profile is no longer the decisive factor for impressing Google, but rather one of many.

Today, Google is able to assess a website's value through semantic context and direct user signals. The backlink profile has been replaced by content relevance as the central quality feature. Rather than asking whether there are numerous inbound links from reputable websites, webmasters need to ask themselves whether users' expectations are met by the content their own website offers.

That said, there is no need to fear that backlinks will completely disappear from the Google algorithm in the future; the Searchmetrics study confirms this as well. Purely backlink-oriented search engine optimisation, however, is now outdated.

                'The correlations for backlinks are still high, but their importance for the ranking will continue to decrease.'

Ranking factors in correlation with the search engine position

The following graphic shows the rank correlation of general ranking factors for the top 20 URLs analysed by Searchmetrics, as well as their development compared to the previous year. Ranking factors that were collected for the first time in 2016, as well as recalculated factors, are marked with an asterisk.
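
For readers who want to reproduce this kind of figure for their own keyword sets, the sketch below computes a rank correlation between one factor and the positions of the top 20 URLs for a single keyword. It uses Spearman's rho as an assumption about the exact measure; the word counts are invented for illustration, and Searchmetrics aggregates such correlations across its full 10,000-keyword set.

```python
# Rank correlation between a single factor (here: word count) and the Google
# position of the top 20 URLs for one keyword. The values are invented.
from scipy.stats import spearmanr

positions = list(range(1, 21))                  # Google positions 1-20
word_counts = [1450, 1380, 1500, 1300, 1250,    # invented word counts per URL
               1200, 1100, 1150, 1000, 980,
               950, 900, 870, 860, 840,
               820, 800, 790, 760, 750]

rho, p_value = spearmanr(positions, word_counts)
# A negative rho here means longer texts tend to sit at better
# (lower-numbered) positions.
print(f"Spearman correlation: {rho:.2f} (p = {p_value:.3f})")
```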

Click here to download the infographic of the Top 20 Rank Correlations from Google.com

Conclusion: the future of search engine optimisation

The Searchmetrics study 'Ranking factors – rebooting for relevance' shows that there are still factors that correlate with a good Google ranking. However, these factors can no longer be applied indiscriminately to virtually every website. The different demands that humans and machines place on websites from different sectors can't be adequately captured by general ranking factors.

At the time the current analysis was published, the Searchmetrics team announced more sophisticated follow-up studies that examine the needs of individual sectors separately. The annual study on ranking factors and rank correlations will in future also be based on sector-specific keyword sets (e.g. for e-commerce, health, or finance). Webmasters can therefore use the 2016 Searchmetrics study less as a simple SEO checklist and more as a guide for interpreting trends and predictions in the field of search engine optimisation on a sector-specific basis.
