How Google Ranks Websites - Website Authority
A few years ago I tried to answer in an article “The Question” that we get asked by nearly every prospect. To paraphrase: “If I sign up for an SEO service from Nettonic, how much will it cost and how long will it take?”
In this article I will go a bit deeper and explain how search engines rank sites according to authority.
When it comes to ranking websites, Google uses more than 200 signals to decide which pages to display in the search results for any given query. While no one outside of Google knows exactly how each signal influences rank, some have been widely tested and agreed upon while others remain more divisive.
It is also important to note that these signals are not permanent and may change over time as Google updates its search algorithm.
To understand it properly, we need to dive a little deeper into how the ranking system works, how these signals are measured by tools, and what determines a good Website Authority.
The Google Ranking System – How it Works?
There are two parts to how Google determines the Website Authority of a page. On one hand, Google is constantly updating its index with billions of web pages as new ones are created each day. The information on those pages is gathered by the search engine’s spiders, and a ranking is then calculated for each page.
Then, when a user searches using certain keywords, Google pulls a list of pages that best answer the query based on its ranking parameters. The order of the results for any given query is a direct consequence of Google’s assessment of the authority of each of those pages.
Therefore, being at the top of the SERP communicates that your page has the highest Website Authority and “relevance” and is likely most reflective of the user’s search intent.
The PageRank Algorithm – Its History and Early Influence in Determining Authority
PageRank was the algorithm Google used in its earlier days to determine which websites to show in the search results. It did so by counting how many links led to a particular page for given keywords. The more links there were, the higher the visibility of that page. To prevent content creators from spamming their pages with links, Google also looked at the source of each link to determine its validity. So if a page had many links leading to it from other pages, then any link from that page would be given more weight when it came to displaying results. However, that system was later abused by many online content creators, which led to Google changing the way it ranked websites.
Google Ranking System – How it Works Now?
As mentioned earlier, today Google utilises more than 200 signals or metrics to decide its top results. This approach matured with the RankBrain update in 2015. Some of these signals carry more weight than others but, unlike before, the ranking is based on a variety of factors, making it more opaque but fairer too. In earlier incarnations of Google’s ranking system, websites could be stuffed with keywords just to rank at the top of search results. Fortunately, that no longer works thanks to RankBrain: Google now also utilises machine learning to determine the relevancy and priority (Website Authority) of pages in search results.
Apart from that, Google also has a team of more than 10,000 quality raters whose job it is to gauge the quality of websites by following a guideline document of roughly 200 pages. Any website that doesn’t meet the required standards will be flagged as such. It is important to note that Google determines authority on a per-page basis. This allows search results from a website to remain relevant to the user’s query even if some of its other pages aren’t as well optimised. Another established aspect of the ranking system is that the proportion of original versus duplicate content also plays an important role in determining the visibility of webpages.
How Can These Signals be Measured?
There are many SEO tools available today that can help you understand Google’s ranking system and Website Authority. While using these tools cannot guarantee results, the ones listed in this article are among the most popular with companies all over the world because of their track record in helping websites improve their visibility. All of these tools are paid products, and each offers unique features for predicting how strongly a website is likely to rank in search results. Some of them are:
Moz – Domain Authority
Moz developed Domain Authority, a search engine ranking score, to predict the likelihood of a website ranking in the search results for a relevant query. The Domain Authority score is on a scale of 0 to 100: the higher the number, the more likely the website is to rank well.
The score is calculated by sifting through various factors such as linking root domains, the number of links to the pages and more. This score, however, is not something Google considers in its ranking system and has no direct bearing on the search engine results page. Do note that Domain Authority (DA) is not the same as Page Authority: the former predicts the ranking strength of a whole domain, whereas the latter estimates the strength of individual webpages.
Checking Domain Authority and Its Scoring
The Domain Authority (DA) of a website can be checked through Moz’s Link Explorer, the Moz SEO toolbar or Keyword Explorer. A website’s DA score may change from time to time, as different data points may be used each time the score is calculated. Therefore, the main use of the score is for comparing one website’s link profile against another’s.
DA is calculated from multiple factors based on the data in Moz’s Link Explorer web index, so you can’t directly influence the score by doing one specific thing. The only way to earn a better score is to improve your SEO as a whole. Because DA uses a logarithmic scale, it is much easier to gain points at the lower end of the scale than at the higher end: a website may be able to go from a score of 10 to 20 far more easily than from 80 to 90.
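To see why a logarithmic scale behaves this way, here is a minimal sketch in Python. The formula and the `max_strength` ceiling are entirely hypothetical and for illustration only; Moz’s real DA calculation uses many undisclosed inputs.

```python
import math

def toy_authority_score(link_strength, max_strength=10**8):
    """Map a raw 'link strength' value onto a 0-100 logarithmic scale.

    Hypothetical illustration only -- this is NOT Moz's actual DA formula.
    """
    if link_strength < 1:
        return 0
    score = 100 * math.log10(link_strength) / math.log10(max_strength)
    return min(100, round(score))

# Each equal step up the 0-100 scale needs roughly ten-fold more raw
# link strength, so moving from 10 to 20 is far easier than 80 to 90.
print(toy_authority_score(100))      # a modest link profile
print(toy_authority_score(10**6))    # a very strong one
```

The point of the sketch is simply that equal jumps in score require exponentially more underlying growth, which is why progress stalls near the top of the scale.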
How Does the Domain Authority Score Work?
The Domain Authority score always starts at 1 for brand new websites. As the site is built up and optimised for SEO, the score will change. To use the DA score to its fullest potential, compare your website with higher-ranked competitors to see the difference between them. Because of this comparative nature, there is no absolute good or bad score. A high-quality link profile will earn you a higher score. Sometimes, however, you might notice that the score doesn’t change despite relevant new links being added. While one can’t always pinpoint the specific reason, it may be due to:
- Websites with a higher score had substantial changes to their link growth which impacts the scaling process.
- Your website may have gotten new links from sources that don’t contribute to Google ranking.
- Fewer linking domains were crawled in a recent update of the index.
- An improvement in link growth hasn’t yet been reflected in Moz’s web index.
The above are just a few reasons why the score might change. To maintain a high score, it is essential to keep improving the SEO of the website.
Moz Ranking Tool
BBC with a Domain Authority of 96 is not bad….Nettonic.co.uk at 32 still has a way to go 🙂
Ahrefs – Domain Rating
Ahrefs, on the other hand, uses its proprietary Domain Rating (DR) metric to calculate the strength of a website’s backlink profile. Both the size and the quality of the backlink profile are measured. Similar to DA, DR uses a logarithmic scale of 0 to 100, so the higher the score, the more difficult it becomes to go even higher.
DR and Ahrefs Rank (AR) are quite similar to one another, as AR can be considered to be a more granular form of DR. The advantage of DR is that it exposes the gaps between the backlink profiles of different websites. So while AR orders websites by the strength of their backlink profiles, DR tells you how far apart those websites are from one another. This makes it a very useful tool for websites that want to stay competitive in Google’s SERPs.
Calculating Domain Rating
The Domain Rating is calculated by:
- Looking at the number of unique domains that have at least one link to the target website
- Measuring the DR value of linking domains
- Looking at the number of unique domains those websites link to
- Computing a rough DR score using Ahrefs’ own code and maths
- Assigning the score on the 0 to 100 scale to keep track of future fluctuations
When going through the above, it is important to note that any subsequent links from the same source won’t change the DR. If a website links to yours with nofollow links, it won’t have any bearing on your score. Linking websites that themselves earn more dofollow backlinks will have a higher DR, and this in turn affects the score of each website they link to.
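The steps above can be sketched roughly as follows. This is a hypothetical approximation written for illustration; Ahrefs’ real formula and scaling are not public, and the weighting and log squash below are my own assumptions.

```python
import math

def toy_domain_rating(linking_domains):
    """Sketch of a DR-style score (NOT Ahrefs' real formula).

    linking_domains: one (dr, outgoing_domains) pair per *unique*
    referring domain -- repeat links from the same source are ignored,
    matching the behaviour described above.
    """
    # Each referring domain passes on a share of its own rating,
    # split across every unique domain it links out to.
    raw = sum(dr / max(outgoing, 1) for dr, outgoing in linking_domains)
    # Squash the raw value onto a 0-100 logarithmic scale.
    return min(100, round(20 * math.log10(1 + raw)))

# A strong site that links out sparingly (DR 80, 2 outgoing domains)
# contributes far more than a weaker, link-happy one (DR 50, 10 outgoing).
print(toy_domain_rating([(50, 10), (80, 2)]))
```

The design choice worth noticing is the division by outgoing domains: a link from a site that links out sparingly carries more weight than one from a site linking to thousands of domains, which is the same intuition behind the original PageRank.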
How Does the Domain Rating Score Work?
When it comes to the impact of bad links, opinion is divided amongst those who specialise in SEO. Some believe questionable links won’t be penalised because they are simply ignored, while others believe one should get rid of any suspicious backlinks. As no one has a definitive answer for now, the wise thing to do is to always maintain high quality on your webpages.
Domain Rating cannot be used as a definitive metric for gauging the quality of a website, because it wasn’t designed to do that. A newly created website with only a few links pointing to it may have a very low score, but that may not be because of low-quality content or spammy links; the low score could simply be due to the website being new and not yet having enough backlinks. Therefore, the only reliable way to check the reason for a low score is to go through the links manually and see whether they are good or just spam.
Ahrefs Ranking Tool
SEMrush – Authority Score
The Authority Score is SEMrush’s metric, based on a complex set of parameters, for representing the overall SEO quality of a domain. The score is calculated using both a neural network and machine learning to improve accuracy. On a scale of 0 to 100, a higher score means the website has good-quality SEO along with better backlinks.
Using the score along with the number of backlinks, one can compare two websites to gauge the quality of the links being used. For instance, if website A has 1 million backlinks and an authority score of 60 whereas website B has only 200k backlinks but an authority score of 70, then B is clearly achieving more with far fewer, higher-quality links.
Calculating Authority Score
The Authority Score is calculated in two unique steps.
Step One – The machine learning algorithm uses various data sets covering searches, web traffic and backlinks to approximate how the top websites stay at the top of the rankings. The exact reasons cannot be measured, of course, but a general idea can be gained by comparing the data sets.
Step Two – The neural network then uses backlink data to estimate how additional links impact a domain’s authority. It draws on aspects such as the number of domains referring to the target site, the authority of those referring domains, the number of dofollow and nofollow links, the total number of backlinks and more.
How Does the Authority Score Work?
As mentioned above, the Authority Score is used to determine the overall quality of a domain. Like the other tools, it is meant to be used to compare websites with their competitors to figure out ways to improve ranking. One can use the score for the following purposes:
- Analyse the different metrics with the competition
- Check out the various link building prospects
- Check out potential domains to buy
- Analyse and track the consequences of the SEO campaign
- Look out for negative SEO
By looking at the trend, you can see how the website has been faring over time and figure out what worked and what didn’t. As the score may fluctuate from time to time, there is no need to change anything every time it moves. As a general rule of thumb, wait for the score to settle consistently in a particular range before rethinking your SEO strategy. You can also use various other SEMrush tools to find domains in a similar niche that you can get links from.
Semrush Authority Score Tool
Majestic – Trust Flow and Citation Flow
Majestic uses metrics grouped into two categories: Trust Flow and Citation Flow. Trust Flow is a score used to see how trustworthy a page is, calculated by looking at whether the sites linking to it are trustworthy themselves. Citation Flow, on the other hand, scores how influential a URL is by looking at how many other sites link to it.
Therefore, to simplify: Trust Flow shows the quality of the links, whereas Citation Flow reflects their volume. Both metrics are on a scale of 0 to 100, and to make the best use of the scores one can chart them to see the overall trend.
Calculating Trust Flow and Citation Flow
Citation Flow – Citation Flow measures the importance of a URL by considering all the links pointing to it. If a lot of websites link to the same URL, that URL is deemed influential. It is important, though, to understand what influential means in this context: if a website has had an impact on readers or is highly relevant to a niche, other websites will link to it, and as the number of linking websites increases, the more influential the URL becomes. One interesting quirk is that when Trust Flow increases, Citation Flow usually increases too, but not the other way around.
Trust Flow – A website’s Trust Flow is almost always lower than its Citation Flow. This is because Trust Flow gauges the quality of links, not the quantity. It is never possible to have only high-quality links pointing at a website, as low-quality ones will sometimes be created automatically through various types of backlinks. A website with high Trust Flow tends to get more web traffic thanks to its high-quality backlink profile, making this a valuable metric for understanding how the website might rank.
How to Use Trust Flow and Citation Flow?
Both of these metrics can be used separately to gain valuable information about a website. However, they shine when used together, and this is their biggest strength compared to other tools. By using both, one can calculate the trust ratio of a website: Trust Flow divided by Citation Flow. If the ratio is near 1, the website is in good shape with solid potential; average websites score close to 0.5, whereas untrustworthy sites fall below 0.5. So if website A has a Citation Flow of 54 and a Trust Flow of 68, its trust ratio is roughly 1.26, which is considered very good.
On the other hand, a trust ratio of 0.3 or 0.2 indicates that the website has few quality links compared to its total number of links. The lower the trust ratio, the more urgent the need to optimise the site. So by working on the factors behind both Trust Flow and Citation Flow, one can bring a site’s SEO up to a higher standard for better visibility.
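As a quick illustration, the trust ratio and the rough bands described above can be computed like this. The band names and cut-offs are my own shorthand for the guide values in the text, not Majestic terminology:

```python
def trust_ratio(trust_flow, citation_flow):
    """Trust ratio as described above: Trust Flow / Citation Flow."""
    if citation_flow == 0:
        return 0.0
    return trust_flow / citation_flow

def classify(ratio):
    """Rough bands from the text: ~1 is strong, ~0.5 average, below that weak."""
    if ratio >= 1.0:
        return "strong"
    if ratio >= 0.5:
        return "average"
    return "needs work"

# Website A from the example above: Citation Flow 54, Trust Flow 68.
ratio = trust_ratio(68, 54)
print(round(ratio, 2), classify(ratio))
```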
Neither metric can be used as an absolute measure of a domain’s quality. Sometimes a good-quality site may have a noticeably low Trust Flow and a high Citation Flow. As with all the other tools, one must always exercise common sense to get a better understanding of the situation at hand. You can also improve the usefulness of these metrics by looking at anchor text ratios and comparing trust ratios within niche-specific websites.
Majestic Trust & Citation Flow Tool
2021 Google Page Experience Update and its Impact on Ranking
If you have managed to read this far, then I am hoping you now understand Website Authority, and that will not change… but good old Google has decided to add another tier, which kicks in come 2021.
The new metrics, called Core Web Vitals, will be used to rank webpages partly on the experience users have on them. Therefore, if Google judges that users will have a poor experience on a particular page, it may lower its rank.
Detailed information about the new Page Experience update is available online for everyone to see. In short, with Core Web Vitals, Google will now consider a whole host of factors, such as page loading speed, mobile-friendliness and the presence of intrusive ads, to gauge the general experience users are likely to have on a webpage.
Page experience will also play an important role in determining the “Top Stories” section on mobiles. As more people start using smartphones with every passing day, this metric will undoubtedly help to increase visibility in more ways than one. Currently, the “Top Stories” section displays websites based on Google AMP (Accelerated Mobile Pages). Many content creators, however, don’t use AMP as it requires a restrictive HTML framework. With the new update, the page experience metric will also help determine the sites that can be shown in the “Top Stories” section.
Core Web Vitals – What Are They?
Core Web Vitals are new user-centred metrics that score the experience a visitor is likely to have on a webpage. They are tracked and analysed through a few primary measurements:
- LCP (Largest Contentful Paint) – This measures the loading performance of the web page. A page can be considered to provide a good experience if its largest content element loads within 2.5 seconds.
- FID (First Input Delay) – This measures how quickly the page responds to the user’s first interaction; any page with an FID of less than 100 milliseconds is considered good in this regard.
- CLS (Cumulative Layout Shift) – A CLS score of less than 0.1 is considered ideal. This metric looks at how visually stable a webpage is: while the user is browsing, buttons and other interactive or non-interactive elements should not shift around unexpectedly.
The above is what constitutes Core Web Vitals. All of these metrics are considered together, and if the website meets or exceeds the thresholds above, its Core Web Vitals are rated good. Therefore, the faster your website loads and the fewer layout problems it has, the higher its chances of ranking well in search results.
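The three thresholds can be checked together with a simple helper. The thresholds are the ones listed above; the function itself is just an illustrative sketch, not part of any Google tool:

```python
def core_web_vitals_pass(lcp_seconds, fid_ms, cls):
    """Check field measurements against the 'good' thresholds above:
    LCP within 2.5 s, FID within 100 ms, CLS within 0.1.
    All three must pass for the page to be rated good overall."""
    checks = {
        "LCP": lcp_seconds <= 2.5,
        "FID": fid_ms <= 100,
        "CLS": cls <= 0.1,
    }
    return all(checks.values()), checks

# A page that loads in 2.1 s, responds in 80 ms and shifts by 0.05:
ok, detail = core_web_vitals_pass(2.1, 80, 0.05)
print(ok, detail)
```

The key point it encodes is that the rating is all-or-nothing: one failing metric is enough to lose the “good” rating, so the returned per-metric breakdown shows where to focus.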
Apart from the above, other factors are also used in determining page experience, such as:
- Mobile Friendliness – Here Google will look at how mobile-friendly a website is to determine whether it can offer a good browsing experience to its users.
- Safety – Whether a website is safe for browsing or not is extremely important in determining the experience it can provide. Any webpage that has malicious content such as malware or utilises deceptive tactics such as social engineering will be flagged by Google.
- HTTPS Connectivity – The webpage needs to be served over a secure HTTPS connection.
- Intrusive Ads – Webpages need to have their content easily accessible to the user and not locked under intrusive ads.
Tools to Measure Core Web Vitals
There are a few tools you can use to gauge page performance on your website. The tools below all help to measure the three Core Web Vitals. They are:
- PageSpeed Insights – PageSpeed Insights provides an overview of all three metrics as well as recommendations on how they can be improved. It can also help to identify the issues on a per-page basis to see how your website might potentially score as far as user experience is concerned.
- Chrome UX Report – This is a public dataset of user experience data collected from millions of websites. By exploring the data, you can look at competitor websites, see where they are ahead of or behind yours, and work on specific improvements. With the help of the CrUX API, you can also query the Core Web Vitals programmatically.
- Search Console – With the Google Search Console, you can request a report by entering your URL and see how your pages perform based on field data. With it, you can identify groups of pages on your website that might need improvements and get to work on them accordingly.
- Web Vitals Extension – This extension can be installed from the Chrome Store and can be used to figure out potential issues during the development stage of the website. It does so by measuring the Core Web Vitals and displaying the score in real-time.
- Mobile-Friendly Test – The mobile-friendly test allows you to input your URL and see if your webpages are mobile-friendly. Within a few seconds, you will get a report that will let you know if that particular page is mobile-friendly or not as well as any minor issues that may exist such as loading issues.
- Security Issues Report – The security issues report can be accessed from the Google Search Console and determines whether your website has any security issues, including malware, hacked content and social engineering.
- Site Connection Security – You can check for site connection security by heading over to Chrome and looking at the security status found to the left of the web address bar. If your site is shown to be not secure or dangerous, then you can secure the site with HTTPS to solve the issue.
Interstitials and Content Accessibility
With this new signal, some sites may face issues with the readability of content on mobile devices, as the screens are significantly smaller than desktops. Here are some examples of content being difficult to access:
- A popup covering the main content right after the user enters the page.
- Standalone interstitials where the user needs to close it to access the content on a page.
- Layout shifting on mobile due to which the content gets hidden beneath a standalone interstitial.
On the other hand, here are a few examples where interstitials can be used without affecting the new metric.
- Interstitials that appear due to a legal obligation such as age verification or permissions.
- Login dialogs for content that sits behind a paywall.
- Easily dismissible banners that take up only a small part of the screen. A good example can be seen in most app stores, where a small banner is visible at the top or bottom.
How to Use the New Google Page Experience Metric to Improve Ranking?
While the new update adds new metrics to the ranking system, it is important to note that these alone won’t make your website more visible in search results. They are simply the latest signals added to the existing system and will play a part in determining site ranking. As with other signals, the best way to use the new metrics is to compare your website against your competitors, and there are many online tools that can help you gather this data.
Quality of content will always reign supreme, and high-quality content that answers users’ questions is certainly the way to go.
Final Words – Our Recommendations
One advantage we have as an agency is experience. Nettonic has access to all the tools mentioned in this article, and I have spent hours analysing the metrics and data of clients’ websites, comparing them against their competition.
Knowledge and experience remove guesswork, so we are better placed to recommend to our clients the most suitable marketing strategies to improve their site AUTHORITY.
If you have managed to get all the way to the end of this read and don’t want to purchase these tools but are curious to know what your website’s authority is, then we are happy to provide a Free Consultation.