The way Google grades and rates websites has always been shrouded in secrecy - along the same lines as the recipe for Coca-Cola - but in November 2015, Google released its Search Quality Evaluator Guidelines (SQEG); the closest we've got so far to understanding its algorithm.
What we do know about the SQEG is that it's the set of guidelines Google's human raters use to check whether the algorithm "works" - that is, whether a website the algorithm rates highly is actually "good".
We know these guidelines don't directly influence rankings, but they are a good indication of the kinds of things Google prioritises and prefers when it comes to content.
For most marketers, nothing the SQEG revealed was surprising - Google likes websites that answer the searcher's question and fulfil their purpose, it likes content with high authority, expertise and trust, and it likes longer, more in-depth pieces.
But the SQEG does leave us with some questions unanswered. Here are five things the SQEG doesn’t tell us, and how we can find out the answers:
What the SQEG doesn’t tell us
1. Can you bluff expertise?
As a content marketing agency, we produce pieces of content for a huge range of publications in dozens of different industries.
While we have experts in the majority of our major specialisms - finance, fashion, travel etc. - we can't possibly have expertise across every single area we work in. So as far as the SQEG is concerned, we are not experts.

But what if we wrote a medical piece that used reputable medical journals as its source material? Would it matter that the content was not technically "written" by an expert? This is something the SQEG doesn't address.
2. How authentic does authorship need to be?
Quite often in content marketing we produce pieces based on interviews, which might end up being "authored" by the interviewee rather than the writer who actually produced the piece. It's clear that interviews are important when it comes to Google's Your Money or Your Life content (that is, content offering information that could affect the reader physically or financially - including ecommerce sites). So does it matter if the content is "authored" by someone who was interviewed but didn't actually write the piece?
3. Do raters research the authors of content?
How do the raters know if you are an expert? Do they research your background? Do they look at your LinkedIn profile or google your name? Would a doctor without a social media presence be rated lower than one with?
4. What is trustworthiness?
The third element of Google's "EAT" (Expertise, Authority, Trustworthiness) criteria is that a website should be trustworthy, but as with authorship and expertise, it's not entirely clear what denotes trustworthiness. It seems Google considers a website with a high number of backlinks more trustworthy, but what if those backlinks come at the expense of quality? What makes the BBC a more trustworthy source than a blog with a high number of guest posts?
5. Can you improve EAT?
How can you improve EAT? If you don't rate highly, can you make changes, and if so, when will your site be rated again? If you changed the author of all your website's content to one particular person, would that improve your authorship? Would better sourcing and fewer opinion pieces raise your expertise? We know what makes someone an expert, an authority or trustworthy offline, but how does this translate online, and can it be improved?
We asked our content manager, Joe Boyd, what the SQEG means for content marketers:
"The SQEG has confirmed a few things we already know, but also changed several others. For example, many marketers have underestimated how heavily Google weights reputation, expertise and trustworthiness. To produce the best content, we are going to have to make sure everything comes from a place of authority.
Interviews and original research should therefore be even more important for content marketers, ensuring everything we create is backed up by expertise."