At SMX Advanced in Seattle on June 4, the 2019 edition of the Periodic Table of SEO Success Factors was officially unveiled. The Periodic Table, first launched in 2011, is updated every two years to reflect changes in Google ranking factors, emerging trends in technology and the evolution of SEO strategy.
Hundreds of SEO experts voted in Search Engine Land’s online poll on the relative importance of factors like content freshness, site speed, backlinks, user experience and so on. As is their standard practice, Search Engine Land’s editorial team then reviewed the poll results, added their own input, and produced an updated Periodic Table and explanatory materials, which can be reviewed and downloaded here.
Onstage at SMX Advanced, Search Engine Land’s editor in chief Ginny Marvin was joined by members of the editorial team, including Barry Schwartz, Detlef Johnson and Jessica Bowman, to discuss the new edition of the table and share their own views on the success factors that matter most.
The common theme throughout the discussion, and in the updated Periodic Table itself, was the importance of high-quality, relevant content, delivered to users in a well-structured, easily digestible format with an emphasis on usability.
As Schwartz noted in his introductory remarks, in the contest between content and links, it’s now a settled fact that content is more important.
Content is, to be sure, one of the main categories of success factors in the chart, and has been in previous editions as well. In the first part of their discussion, the panel discussed these content-oriented factors, with Schwartz noting that Google is getting very good at determining the authority of sources, especially in verticals where authority is particularly important, such as healthcare.
Bowman emphasized the importance of training your writing staff to produce high-quality content that is at least as good as, if not better than, that of your nearest competitors. Johnson reminded us that Google’s increasingly sophisticated use of machine learning and neural networks is making it possible for the company to understand content on a conceptual level, rather than in terms of matching precise words.
The panel suggested that Google is also very good these days at detecting attempts to fake out its content quality criteria. Many SEOs know, for example, that freshness is one of the factors Google considers important, but this doesn’t mean all content calls for frequent updating. Think of the difference between blue whales and protein bars, Bowman suggested — facts about one topic might change much less frequently than facts about the other.
A testament to the dominance of content as a success factor is the fact that it came up frequently as a theme during the panel’s discussion of other factors. In discussing a range of topics around the theme of Architecture, for instance, the panel suggested that page speed, crawlability, page organization and other architectural factors are important primarily because they help to govern the extent to which Google will index the content you want it to index.
Schwartz even cited Google’s John Mueller as saying that the key to solving many problems (presumably even those that might have secondary attributable causes like site architecture) is simply to make your content better.
On the next topic, HTML, Johnson offered a useful overview of the semantic markup opportunities available in HTML5, suggesting that Google is looking more closely at these tags as website developers make better use of them. Tags that allow you to identify such semantic components as nav bars, article body text, and page sidebars can help Google properly understand your pages.
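As an illustration of the kind of markup Johnson was describing, a page skeleton built with HTML5 semantic elements might look like the sketch below (the element choices and placeholder content are illustrative, not a prescription):

```html
<body>
  <nav>
    <!-- Site navigation: Google can recognize this as boilerplate, not primary content -->
  </nav>
  <main>
    <article>
      <h1>Article title</h1>
      <p>Body text that the article element identifies as the page's primary content.</p>
    </article>
    <aside>
      <!-- Sidebar: related links or ads, semantically separated from the article body -->
    </aside>
  </main>
  <footer>
    <!-- Site footer -->
  </footer>
</body>
```

The point is that elements like `nav`, `article`, and `aside` label the role of each region explicitly, rather than leaving a crawler to infer it from generic `div` containers.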
Schwartz countered, however, that Google is painfully aware that most of the web’s HTML is “broken,” meaning that pages do not display proper markup for elements like title tags and that in light of this fact, Google has had to develop so many workarounds that semantic markup may be less effective than we would wish. In other words, Google’s distrust of markup consistency may cause it to discount even valid examples.
Still, Johnson’s suggestion highlights one more trend in the direction of “things not strings,” a topic I’ve also written on recently in light of parallel developments in travel search toward a broader realization of the semantic web.
On the topic of Trust, the panel again emphasized that the best way to build authority is to adhere to Google’s quality guidelines and its tenets of Expertise, Authoritativeness and Trustworthiness (E-A-T). Readers will recall that the core algorithm update last August brought the increased importance of E-A-T to the attention of the SEO community. These tenets, again, are merely another way of emphasizing relevant, high-quality content. Though Google likely still uses traditional signals like PageRank in the background, its emphasis on relevance is now the more prominent factor in ranking.
The panel also suggested that some factors once considered important on their own should today be seen in the broader contexts of authority and content relevance. Bowman, for instance, suggested that if the bounce rate is too high on pages that are supposed to be engaging, the solution is to look at improving the quality of content.
The last two positive categories in the Periodic Table, Links and User, were each covered somewhat quickly. The panel agreed that quality backlinks, the core of Google’s original algorithm, will probably never lose their importance entirely, despite Google’s suggestion that you don’t need any links these days to rank well.
As for user-related factors, Schwartz noted that Google has recently claimed it is not personalizing search results based on user search history as much as it once did. Whereas it used to be true that a user who searched a lot for “Jaguars” the football team would eventually see fewer results for “Jaguar” the car, these days Google claims to focus only on the user’s geolocation and immediate previous query when it comes to personalization.
Schwartz also suggested that Google has become more sophisticated in its analysis of the user experience offered by a web page, having developed tools over the last four years that allow for the rendering of full web pages during the crawling and indexing process, so that interface elements can be examined in a manner that is similar to the way humans interact with them.
Bowman noted that user intent, though not rated as highly in the Periodic Table as she might have expected, is critical in the sense that relevance is a measure of the match between intent and page content.
The discussion then turned to the Toxins section of the Periodic Table, which lists factors detrimental to ranking such as cloaking, keyword stuffing and obtrusive ad copy. The panel noted that these and many other factors should be avoided wherever possible, though many of them remain popular. Schwartz mentioned link schemes as an ongoing problem, and Bowman and Schwartz both noted that ad-heavy content is so common these days that many SEOs must work carefully to strike the right balance between ads, often a key source of site revenue, and primary content.
The final section of the chart, Emerging Verticals, lists factors that are too new (or too complex) to have a proven role in ranking. These include voice, local, images and videos.
Among these, voice is the most obviously new factor. The panel agreed that voice search, though still in its early stages, brings many tactics to the fore that are critical now and will become increasingly prominent in the future. These include featured snippets that offer answers to targeted questions, and content designed around how users’ needs differ by factors like time of day.
Schwartz noted that local was placed in the Emerging Verticals list “because it is constantly changing,” but as Bowman correctly pointed out, local has been around for quite a while and has a big impact in certain verticals.
Generally speaking, a criticism that could be applied to the Periodic Table project as a whole is that it does not speak to differing use cases and the emphasis that makes the most sense for each. On the topic of local, if your goal as an SEO is to attract traffic in local markets for a brick-and-mortar chain, you’ll be forced to think of local SEO as a central area of activity rather than a peripheral factor, and this emphasis may cause a significant reshuffling of other tactics.
Similarly, sites focused on conversion via e-commerce have very different goals from sites that hope to build authority via frequently updated content in areas like industry news and entertainment.
Certainly, though, it’s useful to take the temperature of the SEO community as to the factors that currently matter most. And the theme that resounds throughout the new chart and the discussion at SMX Advanced is that Google wants relevant, differentiated content above all else.
What do you think?