The vast world of SEO introduces terms like canonicals, alt tags, meta descriptions, robots.txt, indexation, keyword cannibalization… it can seem like a maze if it’s not your realm of work, especially if you’re a business owner (you’ve got better things to do).

Search Engine Optimization – equal parts an art, a science, and a guessing game – is an ever-evolving process that requires the work of a web professional. However, basic knowledge of SEO is critical as you develop your digital marketing strategy and expand your new-to-file customer base in 2018.

To understand SEO as a process, let’s take it back to the basics and review how search engines – specifically Google – organically rank websites in the first place.

Understanding Google’s Algorithms

Let’s say you’re roasting carrots and need to know what temperature to set the oven to. You pull up Google on your phone and type in “best temperature to bake carrots”. In that fraction of a second, Google’s ranking systems sort through billions of webpages in their index to give you the most relevant results possible.

Remember: Google’s primary goal is to provide you with the best, highest-quality sources for your search.

These ranking systems are made up of hundreds of algorithms, or artificial intelligence (AI) systems, that assess what it is you are looking for and decide what information to give you in return. As the algorithms evolve, they get smarter – analyzing your searches and their results in finer detail to provide users with the best search experience.

Crawling & Indexing

Google has crawled nearly 130 trillion webpages. Every webpage you see in a search result (commonly referred to as a Search Engine Results Page, or SERP) has been crawled and indexed by Google.

Think of Google like a colony of spiders – the judging panel. Every time a website owner publishes a new web address, it is fed to these spiders, or bots, to crawl and render. As crawlers visit websites, they use the links on those sites to discover other pages. They jump from link to link and send data about those pages back to Google’s servers to be indexed – similar to sorting books in a library. Along the way they take note of important signals (and penalties!) and keep track of it all in the Search index.
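To make that link-hopping concrete, here is a minimal sketch of a crawler – a toy illustration, not Google’s actual system – that starts at one page, collects its links, and queues them up for the next visit:

    # A toy crawler for illustration only – real search-engine bots are far
    # more sophisticated (politeness rules, JavaScript rendering, and so on).
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        # Collects the href of every <a> tag on a page.
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=10):
        queue, seen = [start_url], set()
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url).read().decode("utf-8", errors="ignore")
            except OSError:
                continue  # page failed to load; skip it
            collector = LinkCollector()
            collector.feed(html)
            # Resolve relative links and queue them for a future visit.
            queue.extend(urljoin(url, link) for link in collector.links)
            print(f"Crawled {url} – found {len(collector.links)} links")
        return seen

    # crawl("https://www.example.com")  # hypothetical starting point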

A webpage is not crawled just once – Google’s bots pay attention to new websites, changes to existing sites, and red flags such as broken links or duplicate content. They decide which sites to crawl, when, how often, and how many pages to index from each site.
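Website owners get a say in this, too. A robots.txt file – one of the terms from the top of this article – sits at the root of a site and tells well-behaved bots which areas they may and may not crawl. A simple, hypothetical example:

    # robots.txt – lives at the root of a (hypothetical) domain:
    # https://www.example.com/robots.txt
    User-agent: *          # these rules apply to all bots
    Disallow: /admin/      # please don't crawl the admin area
    Disallow: /cart/       # ...or shopping-cart pages
    Sitemap: https://www.example.com/sitemap.xml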

SEO specialists can also tell Google’s bots that a webpage is ready to be re-crawled (say, because they updated the page’s content or changed its URL) via Google Webmaster Tools, now known as Google Search Console.
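Inside Search Console this is a point-and-click affair, but one long-standing do-it-yourself route has been to “ping” Google with the address of an updated sitemap. A minimal sketch, assuming a hypothetical sitemap URL:

    # Ping Google with an updated sitemap (the sitemap URL is hypothetical).
    from urllib.parse import quote
    from urllib.request import urlopen

    sitemap = "https://www.example.com/sitemap.xml"
    ping = "https://www.google.com/ping?sitemap=" + quote(sitemap, safe="")
    with urlopen(ping) as resp:
        print(resp.status)  # 200 means Google received the ping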

Google Ranking Factors

As mentioned, Google takes note of important signals as its bots crawl your webpages for indexing.

In short, signals are aspects of a website or webpage that Google uses to help determine how to rank it in search. These algorithms – and the signals they weigh – change frequently, and Google does not spill all of its secrets; we can never know exactly why Google ranks some pages higher than others. (Remember the guessing game?) Luckily, Google periodically announces changes and updates to its algorithms on its blog.

In the old days, Google would analyze a webpage to see how many times a specific keyword appeared, but Google’s algorithms are much smarter than they were at the dawn of search. Instead of only measuring content, they now focus on context.

Below are a few of the key signals Google considers:

  • Website quality and architecture – is the website mobile-friendly? Does it have an SSL certificate (https://)? Does it load quickly?
  • HTML attributes – the quality of a page’s title tag, meta description, and URL slug; in other words, how your page is structured and appears in search – see the markup sketch after this list. (Remember SEO as a science?)
  • Content quality – high-quality, well-written content, often recommended to run at least 1,500 words (written for humans – and no keyword stuffing).
  • Link quality – through links, crawlers can discover how pages are related to one another and in what ways. Search engines use links like streets – like this external link to my source!

A link from one site to another is like giving Google a nod of approval.

  • Domain authority – the trustworthiness and quality of your website domain, using links as a guide. (Moz’s Domain Authority metric, a popular proxy for this, scores domains from 1 to 100, with higher scores corresponding to a greater ability to rank.)
  • Content freshness – Google likes pages that are updated for relevancy.
  • User experience – Does the website provide a good, flowing user experience? Are users staying on the webpage or website for a long time?
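To tie the HTML attributes bullet above to the page itself, here is roughly what those pieces look like in a page’s markup (the title, description, and slug below are all made-up placeholders):

    <!-- URL slug: the readable tail of the address, e.g. /blog/seo-basics -->
    <head>
      <!-- Title tag: the clickable headline shown on the SERP -->
      <title>SEO Basics for Business Owners | Example Co.</title>
      <!-- Meta description: the short blurb shown beneath that headline -->
      <meta name="description"
            content="A plain-English look at how Google crawls, indexes, and ranks your website.">
    </head>

A clear title and an honest description won’t rank a page on their own, but they shape how your page appears – and gets clicked – in search results.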

Tune in next week to read about this past decade’s biggest algorithm updates!