Ways to create an SEO strategy that is resistant to Google algorithm updates.

The goal is to stop the roller-coaster ride and keep rankings solid through Google updates.

Google’s algorithm updates can make it seem like the search engine is punishing publishers for mysterious reasons.

Website rankings are never guaranteed.

Improve ranking stability and formulate an SEO strategy more resistant to Google algorithm updates with these tips.

1. User intent is just the beginning:

User intent is important, but it’s just a starting point for creating content that generates revenue day after day, regardless of algorithms.

User intent is one of several ingredients in creating algorithm-resistant web pages.

Identifying user intent matters because it puts the user, not the keywords, first in your thinking, and that is where great SEO strategies begin.

2. Make the site visitors the centre of the universe:

One psychological writing trick that works well for web pages is to write content in a way that reflects the visitor’s need to see things through the lens of how they are affected.

Site visitors only engage with the pages that engage with them.

One smart pay-per-click marketer, for example, created landing pages tailored to a specific audience, including icons that the audience would identify with.

The A/B tests showed that the audience converted at a slightly higher rate with those icons on the web page.
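To make “slightly higher rate” concrete, here is a minimal sketch of a two-proportion z-test for an A/B test. The visit and conversion counts are hypothetical placeholders, not the marketer’s actual numbers.

```python
# A minimal sketch of a two-proportion z-test for a landing page A/B test.
# The visit and conversion counts below are hypothetical, not real data.
from math import sqrt

visits_a, conversions_a = 5000, 250   # control page (no icons)
visits_b, conversions_b = 5000, 285   # variant page (with the audience-tailored icons)

rate_a = conversions_a / visits_a
rate_b = conversions_b / visits_b

# Pooled conversion rate and standard error of the difference.
pooled = (conversions_a + conversions_b) / (visits_a + visits_b)
se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
z = (rate_b - rate_a) / se

print(f"Control: {rate_a:.1%}  Variant: {rate_b:.1%}  z = {z:.2f}")
# |z| above roughly 1.96 suggests the lift is unlikely to be random noise.
```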

When a site visitor accesses a web page, the world stops revolving around the sun.

It revolves around the site visitor instead, even in an e-commerce store.

3. Authoritative means more than just links:

There is no authority metric on Google, and yet Google says it wants to rank authoritative content.

Part of determining whether something is authoritative has to do with language.

For example, after Google Hummingbird, Google seemed to have started introducing language-related features to search results pages (SERPs).

Google began ranking university research pages for a two-word phrase that software companies had previously ranked for.

The commercial web pages had far more links pointing to their sites than those university research pages did.

Yet all but one of the commercial pages disappeared from the first two pages of the SERPs. The one commercial web page that remained had the word “research” in its content.

The university .edu web pages were not ranking because of .edu magic or because of links.

For a short period of time, Google associated this two-word phrase with a topic type (research) and chose to rank only pages that featured research, which at the time mainly consisted of university web pages.

Google mostly ranks informational web pages for that two-word keyword phrase.

In other words, informational content is what counts as authoritative for that phrase.

Links are the traditional measure of authority.

The sites with the most links have authority.

Language can also be a sign of authority.

This is evident in search results where the words used on a page influence what ranks more than the links do.

Links used to be the overwhelming deciding factor that propelled web pages to the top of the SERPs.

Natural language processing decides which race a web page gets to run in, and that race may take place on page two of the search results, depending on the user’s intent and what qualifies as authoritative for that type of content.

Informational content will compete on Track 1 (analogous to the top half of the SERPs) and pages with commercial intent could qualify for Track 2 (analogous to the bottom half of those SERPs).

No matter how many links that business page acquires, its content will never have enough authority to rank at the top of the results for that keyword phrase.

The content is either authoritative for what users are looking for or it is not, regardless of links, based on the content alone.

4. Holistic content vs. content that serves visitors:

When people think of authority, they sometimes think of content that is holistic, bigger, and more in-depth.

But authoritativeness can also be about understanding what users want and giving it to them in the way they want it.

For e-commerce, the authority could be a web page that helps the user make a decision and doesn’t assume they know all the jargon.

Authoritative content can be many things.

For example, a site visitor might have the user intent of wanting unfamiliar terms explained rather than assumed.

This can be particularly true for sites that are reviewing things that involve technical jargon.

A site that is summarizing the top ten budget products may choose to focus on a quick, easy-to-understand summary that doesn’t have to explain the jargon.

The full review web page may have an explainer in a sidebar or tooltips to explain the jargon.

There is a virtually inexhaustible supply of people who need to have things carefully explained to them, and serving them can become a winning strategy for long-term ranking success.

5. Let the search results be the guide to some extent:

In general, it’s best to let the search results be the guide, but only to a certain extent.

Trying to understand why Google ranks certain web pages is valuable.

But understanding why a page might be ranked doesn’t mean that the next step is to copy those pages.

One way to research search engine results pages is to map the keywords and intents of the top ten ranked web pages, especially the top three.

Those are the most important.
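Here is a minimal sketch of what such an intent map might look like. It assumes you have already collected the top-ranked URLs and titles for a keyword phrase; the entries below are hypothetical placeholders, not real search results.

```python
# A minimal sketch of a SERP "intent map" built from data you have already collected.
from dataclasses import dataclass

@dataclass
class SerpEntry:
    position: int
    url: str
    title: str
    intent: str  # e.g. "informational", "commercial", "navigational"

# Hypothetical entries for the keyword phrase being researched.
serp_map = [
    SerpEntry(1, "https://example.edu/research-overview", "What the research says", "informational"),
    SerpEntry(2, "https://example.org/faq", "Frequently asked questions", "informational"),
    SerpEntry(3, "https://example.com/buyers-guide", "Budget buyer's guide", "commercial"),
]

# The top three matter most, so summarize which intent dominates there.
top_three_intents = [entry.intent for entry in serp_map if entry.position <= 3]
dominant = max(set(top_three_intents), key=top_three_intents.count)
print(f"Dominant intent in the top three: {dominant}")
```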

This is where current SEO practices can be improved.

There are two common strategies that can be improved.

The first is the general practice of copying or emulating what the top-ranked sites are already doing.

The idea is that if the top-ranked sites have XYZ factors in common, those XYZ factors are presumed to be what Google wants to see on a web page to rank for a certain keyword phrase.

Outlier is a word from the field of statistics.

When web pages have certain factors in common, those pages are said to be normal.

Web pages that are different are called outliers.

For the purpose of analyzing search results, if the web page does not have the same count of words, keywords, phrases, and topics that the top-ranked sites contain, that web page is considered a statistical outlier.

The search analytics software will then recommend changes to make the outlier page more closely match the pages that are currently ranking.
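To illustrate, here is a minimal sketch of the kind of outlier test such software might apply, using word count as the single shared factor. The counts are hypothetical.

```python
# A minimal sketch of flagging a page as a statistical outlier against the
# currently ranking pages, using hypothetical word counts as the shared factor.
from statistics import mean, stdev

top_ranked_word_counts = [1800, 2100, 1950, 2200, 2050]  # hypothetical top-five pages
my_page_word_count = 950

mu = mean(top_ranked_word_counts)
sigma = stdev(top_ranked_word_counts)
z_score = (my_page_word_count - mu) / sigma

# More than two standard deviations from the mean counts as an outlier here.
if abs(z_score) > 2:
    print(f"Statistical outlier: {z_score:.1f} standard deviations from the top-ranked mean.")
else:
    print("Within the normal range of the top-ranked pages.")
```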

The problem with this approach is the underlying assumption that Google will rank content with qualities that exist on web pages that are already ranked in search results.

Yet a page that is statistically an outlier may still outperform the top three ranked pages.

In those cases, the outlier pages ended up having not only a different combination of keywords; their content, in general, was designed to better answer the question inherent in the search query.

That’s the difference between focusing on keywords and focusing on the search query.

Analyzing the search results is a good way to learn what the user’s intent is.

The next step should be to take that information and bring your best game to satisfying the need inherent in that intent.

The second strategy is to create content that is better or simply more than the content of the highest-ranked competitors.

Both are about beating the competition by mimicking competitors’ content but making it (vaguely) “better” or simply more extensive or more up-to-date.

So if they have 2,000 words of content, post 3,000 words of content.

And if they have a top ten list, beat them with a top 100 list.

The concept is similar to a scene in a comedy film in which a clearly deranged man explains his strategy to outsell the famous 8-Minute Abs video by creating a video called 7-Minute Abs.

Just because the content is longer or contains more than the competitor’s does not automatically make it better, easier to rank, or more likely to attract links.

So, rather than chasing the vague recommendation to be ten times better, or the completely arbitrary recommendation to simply have more than the competitor, there is a better approach.

Return to search results as a guide:

Mining the search results to understand why Google is ranking those web pages will yield useful information.

6. Create diversity in the promotional strategy:

It is never a good idea to promote a site in only one way. Anything that spreads the word is great.

Do podcasts, write a book, get interviewed on YouTube, appear on television, etc.

Be everywhere as much as possible, so that the ways people learn about the site come from many different places.

This will help build a strong foundation for the site that can overcome algorithm changes.

For example, if word of mouth signals become important, a site that has focused on word of mouth will be ready for it.

7. Work to prevent link rot:

Link rot occurs when the links pointing to a web page gradually disappear or lose their value, reducing the amount of influence they confer on the web page.

The solution for Link Rot is to maintain a link acquisition project, even if it is a modest endeavour.

This will help counteract the natural process in which links lose their value.
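As a starting point, here is a minimal sketch of a link rot check. It assumes you keep your own list of pages known to link to your site; the domain and URLs below are hypothetical placeholders.

```python
# A minimal sketch that checks whether known linking pages still resolve and
# still mention the monitored domain. All URLs and the domain are placeholders.
import requests

MY_DOMAIN = "example.com"
known_linking_pages = [
    "https://blog.example.org/resource-roundup",
    "https://news.example.net/industry-interview",
]

for page in known_linking_pages:
    try:
        response = requests.get(page, timeout=10)
        if response.status_code != 200:
            print(f"Possible rot: {page} returned HTTP {response.status_code}")
        elif MY_DOMAIN not in response.text:
            print(f"Possible rot: {page} no longer mentions {MY_DOMAIN}")
    except requests.RequestException as error:
        print(f"Possible rot: {page} unreachable ({error})")
```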

8. Website promotion:

Web pages should be promoted.

Lack of promotion can cause a website to slowly and steadily lose reach and be unable to connect with the people who need to view the content.

Promotion is simply making sure that people know the site is available.

It can be through social networks, participation in Facebook groups and forums, local promotions, cross-promotions with other companies, and many other techniques.

Branding is where a business name becomes almost synonymous with a type of product or website.

9. Diversity of links:

One of the reasons some sites bounce up and down in search results is an underlying weakness, and that weakness sometimes has to do with a lack of diversity in inbound links.

Anecdotal observations suggest that the sites that tend to rank at the top of search results are the kind that have different types of links from different types of websites.

This may no longer be the case with the advent of natural language processing (NLP) technologies that can put a greater emphasis on content than links.

Links continue to play a role, especially the right types of links.

Leaving aside the influence of NLP and focusing only on links, it can be helpful for a site to resist changes in Google’s linking algorithms by cultivating a diverse set of inbound links.

There are many types of links.

  • Resource links.
  • Links given within articles.
  • Recommendation links given by bloggers.
  • Links in news articles.

It matters less now if a publisher blocks a search engine from following a link by using the link attribute called nofollow.

Google can choose to follow those links.

Also, some links have value in generating popularity and awareness of a site.
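Here is a minimal sketch for auditing the rel attributes on links pointing to your site from one page you already know about. The domain and URL are hypothetical placeholders; the point is simply to see which links carry nofollow-style hints.

```python
# A minimal sketch that inspects rel attributes on links to the monitored domain.
# Domain and URL are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"
linking_page = "https://blog.example.org/resource-roundup"

html = requests.get(linking_page, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for anchor in soup.find_all("a", href=True):
    if MY_DOMAIN in anchor["href"]:
        rel_values = anchor.get("rel") or []  # rel is a multi-valued attribute
        hinted = any(value in ("nofollow", "sponsored", "ugc") for value in rel_values)
        print(f"{anchor['href']} -> {'nofollow/hint' if hinted else 'no hint'}")
```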

10. Ranking signals and E-A-T:

There are many signals that Google uses to rank a site. Google will even overlook spammy links or content in order to rank a site that is doing other things well.

In other words, there are qualities in a site that can outweigh spammy links or imperfect SEO.

11. Stay on top of changes:

To create a site that is resistant to algorithm changes, it is important to be aware of all announced changes to the Google algorithm.

It’s important to keep up with changes like passage ranking, BERT, and how Google ranks reviews.

When it comes to interpreting what an algorithm means, don’t speculate on the motives.

That’s always a bad idea and it never helps to form an actionable ranking strategy.

Recent changes in the way Google rates reviews could be interpreted as Google broadening the range of sites that must be trustworthy and accurate.

Focusing on the steps outlined can help to create a high-quality site that can withstand changes to the Google algorithm.