Technical SEO and user experience are increasingly important for ranking on Google.
Enterprise e-commerce sites, in particular, can benefit greatly from SEO.
The world of SEO is always changing as Google adjusts its algorithm based on click data, site speed, links, location, custom results, and many other factors.
In e-commerce, the pandemic has emphasized the need for organizations to be agile in their marketing strategies and focus on growing organic traffic, to protect the brand when the paid media budget needs to be cut.
Having a solid e-commerce SEO strategy is key to positioning the business to weather any storm.
SEO should be a core consideration when moving from offline selling to online selling.
Optimizing the website to rank highly and consistently on search engine results pages for target keywords can drive significant results for the business for years to come.
For any organization planning to adjust its online sales strategy, organic search is an important channel with the potential to generate significant traffic and revenue.
Here are three key areas of enterprise SEO for winning in e-commerce:
1. Information architecture:
Information architecture should be leveraged to structure websites in such a way as to improve searchability and usability for both search engines and end-users.
In simpler terms, it takes the complexity of the website and turns it into an understandable structure.
If a person can’t easily find something on the website, Google’s search bots won’t find it easily either (let alone shoppers).
Organizing site information becomes more complicated the deeper you go into the site taxonomy.
Most e-commerce sites are structured into categories and subcategories, and then faceted navigation pages.
For an outdoor retailer like REI, the header navigation includes categories such as camping and hiking, rowing, running, biking, etc.
The challenge is to organize filtered content in a simple and effective way while giving users options for how they filter and find information.
Products aren’t the only thing to organize: other areas like blogs, community forums, or guides should also be structured in a way that helps search engines understand how relevant they are to a search query.
Once all the information is inventoried, a plan can be made for arranging it on the site in a way that makes it easier for search engines to crawl.
Many enterprise sites organize topics into keyword groups built around general topic areas.
This helps to show whether content has been organized and created around the core subject areas.
Planning all of this can be tricky, but I suggest defining the content buckets first.
In REI’s case, they could take the main backpacks bucket and start organizing it like this:
[Image: keyword groups for backpack-related terms]
Pro Tip: Tools like MarketMuse can help organizations begin to understand the gaps in their SEO strategy.
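For illustration only, here is a minimal sketch of what those keyword buckets might look like once captured in a structured format; every keyword below is hypothetical and would come from actual keyword research:

```json
{
  "backpacks": {
    "category_terms": ["hiking backpacks", "backpacking packs"],
    "subcategory_terms": ["ultralight backpacks", "daypacks", "travel backpacks"],
    "guide_terms": ["how to size a backpack", "how to pack a backpack for a week"]
  }
}
```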
Once the site information is collected and organized, the two most important things in optimizing the information architecture for SEO are searchability and usability.
2. Searchability:
Searchability refers to the ability of search bots and end users to find content on the website using internal links, external links, sitemaps, and internal site search.
Well-organized information ranks higher in Google Search because it is easier to discover and more visible.
Google uses a crawl budget, based in part on the site’s authority, to crawl and process the information on a website.
A critical part of enterprise SEO is helping Google and other search engines find the most important information without exhausting the crawl budget on JavaScript, CSS, and other files that should be excluded from indexing.
A great way to understand what information Google finds on the site is to look at the Google Search Console (GSC) data.
The goal is to help Google be efficient when it visits the site and find useful information as quickly as possible.
By analyzing GSC data, one can quickly understand which pages Google is finding and indexing.
Key takeaway: a robots.txt Disallow rule tells Google that it shouldn’t crawl a page, but it won’t remove URLs that are already indexed. A common workflow for cleaning up low-value URLs (sketched below) is:
1) Add noindex meta tags to the URLs.
2) Be patient and check GSC to confirm that Google drops the pages once it recrawls them.
3) Manually check that the URLs have been removed from the Google index by doing a site: URL search.
4) Then disallow those URL parameters in the robots.txt file so they no longer consume crawl budget.
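Here is a minimal sketch of both pieces; the parameter names and paths are hypothetical and would depend on how the site’s faceted navigation builds its URLs.

A noindex meta tag placed in the head of each low-value page (step 1):

```html
<meta name="robots" content="noindex, follow">
```

A robots.txt rule added only after those URLs have dropped out of the index (step 4):

```
# Block parameterized faceted-navigation URLs from being crawled
User-agent: *
Disallow: /*?sort=
Disallow: /*?color=
Disallow: /search/
```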
3. Usability:
Usability is commonly known as user experience, the combination of user interface and information architecture.
Google announced that it will introduce a new page experience ranking algorithm specifically designed to judge and rank pages based on how users perceive the experience of interacting with the web page.
When creating an effective information structure for SEO, it is critical to consider which pages are most valuable to the organization.
The higher a page is in the site architecture, the more likely it is to rank for competitive keywords.
Start wide and dive deep into conversion-focused pages.
This approach takes users from the top of the sales funnel to the bottom funnel conversion pages naturally as users research, learn, and convert.
Leveraging keyword research will help guide the decision on how to most effectively structure the site for organic search.
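As a purely illustrative sketch (hypothetical domain and URLs), a typical architecture moves from broad, competitive terms at the top to conversion-focused pages deeper in the hierarchy:

```
example.com/                                          home: broad brand and head terms
example.com/backpacks/                                category: competitive head keywords
example.com/backpacks/hiking/                         subcategory: mid-funnel keywords
example.com/backpacks/hiking/hypothetical-55l-pack/   product: bottom-funnel conversion page
```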
Site optimizations to improve e-commerce rankings:
Enterprise-level organizations have a wide variety of locations, products, services, and support resources that need to be optimized.
With hundreds of thousands of web pages to optimize, this level of optimization can be time consuming and expensive.
Manual vs. automated SEO:
Depending on the complexity of the system being used, manual SEO could take a lot of time and resources to implement the most basic changes to each page, such as the title, meta description, and content container elements.
Meanwhile, automated optimizations involve using software to implement the desired strategies, which means it is faster but more limited.
Some general rules of thumb on what to automate and what to do manually include:
Manual SEO elements:
Top landing pages.
*Category.
*Subcategory.
Main elements of the landing page (that is, title tags, meta description tags, header and content tags; see the sketch after this list).
Meta Robots tags for main pages (i.e. Noindex / Index / Nofollow / Follow tags).
Content container for product pages.
The main structure of the landing page URL.
301 redirects.
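As a minimal sketch of the manually managed elements for a top category landing page, assuming a hypothetical domain and copy:

```html
<head>
  <title>Hiking Backpacks | Hypothetical Outdoor Store</title>
  <meta name="description" content="Shop our full range of hiking backpacks, from ultralight daypacks to 70L expedition packs.">
  <meta name="robots" content="index, follow">
  <link rel="canonical" href="https://www.example.com/backpacks/hiking/">
</head>
```

For pages this valuable, the title and description are usually written by hand rather than generated from a template.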
Automated SEO elements:
Items on product pages.
*Title tags.
*Meta description tags.
*Header tags.
*Image alt text and image title text.
*Image size and compression.
Creation of XML Sitemap.
Canonical tag elements.
*Product pages.
*Faceted navigation pages.
*Blog tags/category tags.
Breadcrumb generation.
Related product links.
Schema markup (see the JSON-LD sketch after this list).
*Prices.
*Product inventory level (that is, in stock or out of stock).
*Product name.
*Product description.
*Product rating and star rating.
Meta Robots tags for filter navigation pages (i.e. Noindex / Index / Nofollow / Follow tags).
Product page redirects (that is, based on whether the product will be restocked).
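Schema markup is a good candidate for automation because the values can be populated directly from the product catalog. A minimal JSON-LD sketch for a product page, with hypothetical product data, might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Hypothetical 55L Hiking Backpack",
  "description": "Product description pulled from the catalog.",
  "offers": {
    "@type": "Offer",
    "price": "149.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```

Each field maps to one of the automated items above: price, inventory status, product name, description, and star rating.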
Consider striking a balance between these two approaches so that more of the site’s pages are optimized while high-value pages still receive hands-on attention, producing stronger SEO results.
Technical SEO and site improvements:
Numerous tools can help find performance issues on a site, making it easier to get started with developers and create a plan to clean things up.
DeepCrawl, Botify, Screaming Frog, and the like are useful for crawling massive numbers of pages on a site, showing how a search engine sees it and surfacing problems like broken links and code errors.
A quick way to understand a site’s performance is to use Chrome DevTools to generate a Lighthouse performance report.
This is an excellent starting point for discovering some basic problems.
Depending on the platform the site is built on, some items are quick to fix while others require significant time and resources to solve well.
Here’s a quick list of things one can dive into to make sure SEO isn’t negatively affected:
Page speed:
With Google’s shift to mobile-first indexing and its announcement that page speed will influence user experience rankings, it is critical that all pages are fully optimized for fast load times.
Here are the site improvements one should consider adopting to increase page speed:
Minify JavaScript, HTML, and CSS:
Have a developer optimize the code by removing code comments, unnecessary formatting, and unused code. Google recommends using tools like CSSNano and UglifyJS to help with this.
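As a rough sketch of what that looks like on the command line, assuming the tools are installed via npm (cssnano is typically run as a PostCSS plugin):

```
# Minify a JavaScript bundle with UglifyJS (compress and mangle)
npx uglifyjs main.js --compress --mangle -o main.min.js

# Minify a stylesheet with cssnano via postcss-cli
npx postcss styles.css --use cssnano -o styles.min.css
```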
Reduce redirects:
Every time one page redirects to another, the visitor waits for an extra HTTP request cycle to complete.
Make sure redirects take the shortest possible path, avoiding chains of redirects, to improve the user experience.
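For example, in an nginx configuration (hypothetical paths), a chain of old URLs can be collapsed so each one points directly at the final destination:

```nginx
# Instead of /old-category/ -> /mid-category/ -> /new-category/,
# send both legacy URLs straight to the final page with a single 301.
location = /old-category/ { return 301 /new-category/; }
location = /mid-category/ { return 301 /new-category/; }
```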
Browser caching:
Browsers will cache resources so that when a visitor returns to the site, they do not have to re-download the entire web page. Using a tool that allows setting an expiration date for the cache gives more control as changes are made to the site.
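One common way to set that expiration is through the web server’s cache headers; a minimal nginx sketch (the file-type list is illustrative) might look like this:

```nginx
# Let browsers cache static assets for 30 days
location ~* \.(css|js|png|jpg|svg|woff2)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```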
Crawl budget:
Crawl budget refers to the number of pages that a search engine will crawl and index within a specified period of time.
Businesses should define the appropriate crawl speed for their website to ensure that the most popular pages are crawled and indexed in the most efficient way.
Crawling-related issues can limit a search engine’s ability to crawl and index the most important content before the crawl budget is exhausted.
XML sitemap:
An XML sitemap tells Google where information can be found on the website and helps search bots find and rank that information accordingly.
Effective sitemaps consider the following (a minimal example follows the list):
*Format the sitemap in an XML document.
*Follow the XML sitemap protocol.
*Properly update the sitemap to include all the pages of the website.
*Successfully submit the sitemap to Google Search Console.
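A minimal sitemap following the sitemaps.org protocol, with hypothetical URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/backpacks/hiking/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/backpacks/hiking/hypothetical-55l-pack/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>
```

Large sites typically generate this file automatically and reference it in Google Search Console.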
Strategically, enterprise SEO can increase brand awareness, generate consistent revenue, and build industry dominance without paying for every click to the site.
Brands that don’t properly invest in and develop an enterprise SEO strategy risk losing critical space on search engine results pages, which can lead to significant drops in revenue.
As an enterprise company, the brand likely already has strong awareness.
Maintain that industry dominance through effective SEO and constant monitoring.