On 1st August, Google confirmed it had released a ‘broad core algorithm update’, as it does several times a year. However, this one appears to be a much larger change to the algorithm than previous iterations, causing a large flux in rankings, probably the biggest we’ve seen since Penguin 4.0 in September 2016.
Which queries did the update affect?
This algorithm change is commonly being referred to as the ‘medic update’, due to rankings around health content experiencing the largest shifts.
Moz has a useful tool called MozCast which reports turbulence in Google rankings: the higher the temperature, the bigger the day-by-day variance in rankings.
More interestingly, they break down their temperatures into keyword categories evaluating changes by sector, which gives real insight into which websites or content may have been affected by an update:
Temperature by Keyword Category – August 1st
By a good margin, keywords falling under the category of health experienced the largest changes in search rankings on the 1st August, the day the update was rolled out.
Your money or your life (YMYL) content
Most of the signs from this update suggest that Google has significantly tweaked the ranking factors for ‘Your Money or Your Life’ (YMYL) content, a concept introduced in Google’s Search Quality Ratings Guidelines for “types of pages which could potentially impact the future happiness, health, financial stability, or safety of users”.
This fits with the ranking shifts within health content, but also the changes in finance rankings which other tools picked up on.
The search quality guidelines suggest that the following types of pages could fall under YMYL:
- Shopping or financial transaction pages
- Information or advice in the following areas: financial, medical or legal
- Information about local/state/national government processes, policies, people, and laws
This type of content is high risk for Google: serving inaccurate or misinformed content around financial, medical or legal topics could have significant ramifications for users who follow the advice given.
Say, for example, Google’s algorithm chooses incorrect content about prescription drug dosages to display in the top positions, and a user goes on to trust it and base a decision upon it. This could obviously have a very bad result for the user and, longer term, for Google.
In contrast, if the algorithm ranks the wrong information for putting up a flat-pack TV stand, the impact on the user is far less significant.
For queries potentially affecting YMYL, it’s even more crucial that Google identifies the sources with the most expertise, authority and trust – and ranks them accordingly.
Every niche saw some sort of ranking flux from the update; however, it looks like the signals Google changed in the algorithm were ‘turned up’ even more for YMYL queries.
Expertise, authority & trust (EAT)
Google’s search quality ratings guidelines make constant mention of expertise, authority and trust (EAT) as a signifier of high quality content.
This EAT signal appears to have increased in significance particularly for YMYL queries, but also potentially more generally across the algorithm – a lot of sectors beyond even the widest definitions of your money or your life content have seen large shifts over the last few weeks.
So, what exactly is Google looking for in terms of a website’s expertise, authority and trust?
Google doesn’t actually define clear examples of high EAT; however, it does provide examples of signals of the lowest EAT, which in turn tells us a lot about what the search engine is looking for.
Google’s example signals of pages with significantly low EAT:
- The creator of the content does not have adequate expertise in the topic. e.g. a tax form instruction video made by someone with no clear expertise in tax preparation
- The website is not an authoritative source for the topic of the page, e.g. tax information on a cooking website
- The content is generally not trustworthy, e.g. a shopping checkout page that has an insecure connection
Effectively, this suggests that Google is looking for content written by trusted authors in their niche, as well as being on a trustworthy, relevant platform.
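Of these three signals, the insecure connection is the easiest to audit yourself. As a rough sketch (the page HTML and URLs below are made up purely for illustration, and a real audit would cover far more than this), a short script can flag sub-resources still loaded over plain http://, which trigger mixed-content warnings when a page itself is served over HTTPS:

```python
from html.parser import HTMLParser


class InsecureResourceFinder(HTMLParser):
    """Collects src/href values that load sub-resources over plain HTTP."""

    RESOURCE_TAGS = {"img", "script", "link", "iframe", "source"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag not in self.RESOURCE_TAGS:
            return
        for name, value in attrs:
            # A sub-resource fetched over http:// on an HTTPS page
            # is mixed content, a basic trust red flag.
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append(value)


def find_insecure_resources(html: str) -> list:
    parser = InsecureResourceFinder()
    parser.feed(html)
    return parser.insecure


page = """
<html><body>
  <img src="http://cdn.example.com/logo.png">
  <script src="https://cdn.example.com/app.js"></script>
</body></html>
"""
print(find_insecure_resources(page))  # only the http:// image is flagged
```

This only checks markup, of course; serving every page over HTTPS in the first place is the baseline.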
What kind of signals could Google be using to evaluate EAT?
It’s hard to say for certain exactly which signals Google could be looking at to determine EAT, but with all the information available to them, we can make some reasonable suggestions as to the kind of things they’d be looking for.
You’d assume that Google is looking at both on-page and off-page signals to evaluate a writer’s authority, in the same way that this combination of factors is used to assess the majority of other parts in the algorithm:
Off-page - it’s very likely that Google is assessing a writer’s authority by how often they’re cited across different web pages, as well as how many different domains they’ve been featured on.
This will probably go beyond links, potentially looking at both author and publisher mentions. With all of the content in the index it’s not unrealistic to suggest they could evaluate this with good accuracy on a massive scale.
On-page - factors evaluating the content on a website will surely also play a part in the algorithm’s measurement of contextual EAT, where clear author profiles and company credentials on a website’s ‘About Us’ page could be good places to show off your EAT.
Interestingly, RankRanger’s winner and loser analysis suggests that sites displaying credible author profiles throughout articles, particularly in YMYL verticals, tended to perform well following the update, while many sites without authorship profiles lost visibility.
Authorship and building the credibility of both the writers on your website and the site itself seems to be the key to improving your EAT and receiving a positive assessment from this part of the algorithm.
Looking at the winners and losers from the latest update, another observation is that websites with the purest intent to inform a user look to have been those that have either remained safe from ranking shifts or seen large increases in visibility.
Commercial websites which host informational content often have split intent: they’ll be using their articles and guides to support another interest, whether that’s selling something, driving pageviews to serve ads, or supporting the rankings of another part of the site. This can be seen as impure intent, where they’re trying to serve both their own interests and the user’s.
A good example of a site with obvious split intent is draxe.com, a US medical site which sells recipes and diet plans, as well as providing advert space for other health products:
Through aggressively selling diet plans and more detailed health guides, as well as hosting adverts often for other medical products, draxe.com is one of the clearest examples of split intent.
On first landing on the Dr Axe website, it looks like a site with the pure intention of solely informing users; however, aggressive yet underhanded adverts and product promotions follow you around the site, raising clear questions as to how much you can really trust its content.
This suspect split intent is probably a key factor in Dr Axe’s ranking drops from the August update:
Dr Axe’s organic visibility in 2018
On the other hand, websites that can support themselves without hosting ads or selling a product or service are likely to be governmental, organisational or not-for-profit, with a genuine interest in purely informing users.
From Google’s perspective, these sites are easy candidates for the most trust, being the most likely to be free from bias and to provide valid and trustworthy information.
Take, for example, the NHS website: a wealth of medical content, written by doctors and experts, free from adverts, product placements and link-building strategies, with the sole intention of informing users without bias:
The NHS site has only one type of content, with a single intent.
This looks to be the type of content which Google is attempting to identify: the safest possible content to display to users in YMYL areas. Unsurprisingly, the NHS website saw a big jump in visibility from the update, surely as a result of having one of the highest levels of EAT available.
NHS.uk’s organic visibility in 2018
These sites with the purest informational intent are the ones which look to have seen the most traction from the August update, whilst the commercial sites producing informational content on the side look to be those with the biggest falls.
So, should you stop creating content with split intent?
Despite the latest update suggesting that it may be harder to get informational content to rank if you’re not a publisher with pure informational intent (particularly in YMYL), this isn’t to say you should be discouraged from producing new content for your website.
It’s worth going back to basics and thinking about what your content would be there for if search engines weren’t a thing. Supplementary content should have a genuine purpose beyond ranking organically and driving traffic to your site. Just because it doesn’t rank, it doesn’t mean it has no value.
If a user is 90% of the way to making a purchase and then has second thoughts, you need content like buying guides which will get them back on track. No matter which niche you’re in, informational content should have much broader aims than just organic traffic acquisition and can be a valuable tool to drive conversion from the users which are already present on your site.
As well as this, despite the fact that it may be now harder to rank for informational queries in YMYL areas if you have a commercially orientated website, this doesn’t mean you won’t rank at all. Google will still be looking for content to display for the longer-tail queries which the most obvious EAT publishers haven’t written on and it could just be a case of evaluating your competition more thoroughly during your content strategy processes to find the gaps.
Actionable strategy to improve your EAT
Managing your expertise, authority and trust signals is crucial if you’re a publisher or website producing content in YMYL areas.
However, the latest core update has also caused substantial movement across many different areas of search. As the quality rating guidelines state that the highest quality content will display a high level of EAT, it’s probably a good idea to review your website in light of this factor, regardless of your site’s niche.
So, what can you do?
Review your website’s intentions
Split intent can really damage Google’s perception of how much it can trust your website in high risk areas, but similarly it looks like content intent has been reviewed across informational topics as a whole, agnostic of niche.
If you’re an informational content producer, it’s worth considering the following:
- Splitting out your content by intent – many websites try to blur the lines between informing users and selling to them, which this update looks to have targeted. The main content of an informational page should be purely about informing the user rather than selling to them. If you want to target informational users for a commercial purpose, you need to do it without interfering with their route to the answer they’re looking for.
- Reviewing ads on your site – just because the advert space on your website may be run by a third party, it doesn’t mean it’s not part of your site’s content. If you’re writing on a particular topic, your website is going to look biased if the page includes promotional space for products or services related to that topic. This is even more crucial for YMYL content.
Create on-page authorship
It looks like websites with clear signs of authorship (and trusted authors) throughout their content have performed well in this update. The perceived success of authorship profiles could simply be a result of having the E-A-T which Google is looking for (content written by trusted authors is very likely to be profiled naturally), but it’s a trend which probably shouldn’t be ignored.
- Building out author profiles on your site - author profiles, potentially on their own page but also at the bottom of each informational article, can be a clear sign to Google that you’re profiling your authors. Authorship pages can also be a good source of organic link acquisition if the writers on your domain are popular within their niche. Regardless of whether they count for SEO purposes, authorship profiles can be a nice addition to your website and useful for the user.
- Show off the credentials of your writers - if your writers have impressive experience or qualifications relevant to the content they’re writing on, there’s no better place to put this than an on-page author bio or profile. Credential boasting signals to both users and search engines that your authors have the E-A-T to produce reliable content.
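One concrete way to surface authorship to search engines is schema.org structured data embedded as JSON-LD. The sketch below (the author name, job title and profile URL are hypothetical examples, not a recommendation from Google) builds a minimal Article snippet declaring the author as a Person, with sameAs links pointing at the kind of off-site profiles discussed later:

```python
import json


def author_jsonld(headline, author_name, job_title, profile_urls):
    """Builds a minimal schema.org Article snippet with an author Person.

    The sameAs URLs point at off-site pages (e.g. an industry body's
    member listing) that corroborate the author's credentials.
    """
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {
            "@type": "Person",
            "name": author_name,
            "jobTitle": job_title,
            "sameAs": profile_urls,
        },
    }


snippet = author_jsonld(
    "How to file a self-assessment tax return",
    "Jane Smith",                                 # hypothetical author
    "Chartered Tax Adviser",
    ["https://example.org/members/jane-smith"],   # hypothetical profile URL
)

# Embed the output in the page head or body as:
# <script type="application/ld+json">…</script>
print(json.dumps(snippet, indent=2))
```

Whether or not this markup feeds directly into E-A-T evaluation is Google’s secret, but it makes the authorship information you’re already publishing unambiguous to machines.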
Domain E-A-T signals
Reputation matters more than ever, both that of your writers and of the domain producing informational content. Where possible, show off as many expertise, authority and trust signals as your website or company has on relevant pages around your site. Pages to review for E-A-T signals could include:
- Your ‘About Us’ page
- FAQ pages
- ‘Meet the Team’ pages
- Your ‘Contact Us’ page
Things like the presence of positive reviews and money back guarantees, as well as security signals like HTTPS will likely all count towards a positive trust evaluation for your website.
Get the experts involved
Rather than trying to manipulate E-A-T signals, the most future-proof way of producing content that works for this part of the algorithm is to get the experts in your niche involved in your content production. In an ideal world the authoritative authors should be involved every step of the way, from planning the content to writing it - it’s not going to work as well if you have experts as the face of your content but low quality copy to go with it.
Of course, if you have experts writing your content, make sure you show it off on your website.
Build off-page authorship
Ensuring you have off-page trust signals for both your authors and your website is essential to coming out of this algorithm update well. In a similar way to building authoritative links to your domain, it’s worth trying to get your authors out there, whether that’s a link back to your domain or mentions of them on relevant web pages.
It’s hard to say the exact places to get your authors covered on the web, but the following places would be a good start:
- Industry bodies - trade associations in your niche may provide certification if you’re an accredited professional or company.
- Education or organisation accreditation - similarly, if you’ve earned a qualification or are part of an organisation, make sure you’ve made the most of opportunities to receive a mention on the accreditor’s website.
- Relevant publications - experts in a sector will probably have been quoted across the press for their view on industry news. It’s worth pursuing digital PR to build the profile of your experts in your business, which Google can then pick up on to confirm their E-A-T.