Some case studies suggest what webmasters can do to recover lost rankings after a Google core update.
Google's March core update has led to ranking losses for some websites. Others, on the other hand, benefited, including some that had lost rankings in last August's core update.
Meanwhile, there are findings on possible factors that may have played a role in the updates. In summary, there appears to be a connection with Google's quality guidelines and especially with EAT, which stands for “Expertise, Authoritativeness, Trustworthiness”.
The quality guidelines serve as Google's basis for evaluating websites in manual tests. The results of these tests do not directly affect rankings but are used to refine the algorithms. The guidelines give examples of good and bad quality.
EAT plays an important role here: it covers expertise and knowledge, authority as a source of information, and trustworthiness. EAT is particularly important to Google for so-called YMYL pages (“Your Money or Your Life”), i.e. websites on sensitive topics such as medicine, finance, or law.
How Denver websites manage a trend reversal
Current case studies by an SEO company in Denver and Colorado Springs show that there can indeed be a connection between EAT and both ranking gains and ranking losses caused by the current core update. A report on Search Engine Land followed the development of three different websites as examples. In each case, the measures implemented in the run-up to the last core update of 12 March were described. There does not have to be a causal link between these measures and the ranking changes. Viewed against the background of Google's quality guidelines, however, a connection appears at least plausible.
The most important findings are summarised below:
No or bad reviews available: If there are no reviews of a website on the web, or predominantly negative ones, this could negatively affect Google's assessment of the site. For the pages examined, efforts to address the reviews and respond to them appropriately seem to have paid off.
No reference links: Particularly for scientific topics, theories should not be published without evidence. This applies especially to websites with a medical background. Care should always be taken to substantiate claims with reputable reference links.
Too little information about the authors: Even a highly competent author does not benefit a website if too little information about them can be found there. A few sentences about their experience, or better still a complete page listing their previous publications, can help Google recognize and appreciate an author's value. It is also quite possible that Google uses information about an author's publications on other sites, as well as awards they have won, for rating purposes.
Minor changes such as improving page speed, fixing mixed-content problems, and devaluing dubious links can also help improve rankings.
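To illustrate the mixed-content point: on an HTTPS page, any resource still loaded over plain HTTP triggers browser warnings or gets blocked. A minimal sketch (the URLs are placeholders):

```html
<!-- Mixed content: an HTTPS page embedding a resource over plain HTTP -->
<img src="http://example.com/images/logo.png" alt="Logo">

<!-- Fixed: the same resource loaded over HTTPS -->
<img src="https://example.com/images/logo.png" alt="Logo">
```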
On e-commerce sites it can help to improve product pages: avoid simply copying the manufacturers' product descriptions. It is better to write your own texts and enrich them with additional information such as videos, FAQs, and instructions.
Thin content should be reduced as much as possible. Only content that adds real value for users should be published on a Denver website.
And finally: relevant links are important! Google evaluates a website's relevance not only by the number of backlinks but also by their quality. Backlinks from sites that Google considers topically relevant carry more weight and help make the linked site appear more relevant.
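Regarding the point about author information above: one way to make author details explicit to search engines is schema.org structured data. Whether Google uses such markup for EAT is not confirmed, so this is only a sketch with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "jobTitle": "Medical Editor",
  "url": "https://example.com/authors/jane-doe",
  "sameAs": ["https://example.com/some-external-profile"]
}
</script>
```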
A study shows the importance of quality content
The importance of content quality can be seen by looking at a website that has been and is still being optimized by the author of this article. In the initial situation, the website consisted of a number of informational texts written by a copywriting agency. There were around a dozen texts, none of which offered significant informational added value. The texts seemed to have been written only to fill the page with content.
An example: for the product the website is about, there was a post on the topic “Buy online”. But anyone expecting to be able to buy the product online on the site was sorely disappointed. There were only general descriptions of online shopping, and not even links to suitable shops.
In addition, the website contained hundreds of Denver city pages with automatically generated content, i.e. a lot of thin content. So it is no wonder that Google penalized the site in the August core update.
To improve the situation, a dual strategy was chosen: the city pages were set to “noindex”, and the information pages are gradually being filled with well-researched content that brings real added value to site visitors.
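The “noindex” step mentioned above is typically implemented with a robots meta tag in each page's head (or an equivalent X-Robots-Tag HTTP header); a minimal sketch:

```html
<!-- Tells crawlers not to index this page while still following its links -->
<meta name="robots" content="noindex, follow">
```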
This has paid off, as the development of impressions and clicks in Google Search Console shows. Since 12 March, the figures have been clearly improving again.
The examples described show: Google is increasingly focusing on content quality and EAT. This applies in particular to websites in the sensitive YMYL area.
Webmasters of sites that have lost rankings in a core update should read Google's quality guidelines carefully. They contain valuable information on what Google understands by quality.
In particular, the expertise of a website's authors should be highlighted. Claims and theories should also be supported with references to reputable sources.
All of this is real work. Unfortunately, small adjustments here and there will no longer be enough to achieve lasting success.