Rewriting the Beginner’s Guide to SEO, Chapter 2: Crawling, Indexing, and Ranking

Posted by BritneyMuller

It’s been a few months since we last shared our work-in-progress rewrite of the Beginner’s Guide to SEO, but after a brief hiatus, we’re back to share our draft of Chapter Two with you! This wouldn’t have been possible without the help of Kameron Jenkins, who has thoughtfully contributed her great talent for wordsmithing throughout this piece.

This is your resource, the guide that likely kicked off your interest in and knowledge of SEO, and we want to do right by you. You left amazingly helpful commentary on our outline and draft of Chapter One, and we’d be honored if you would take the time to let us know what you think of Chapter Two in the comments below.


Chapter 2: How Search Engines Work – Crawling, Indexing, and Ranking

First, show up.

As we mentioned in Chapter 1, search engines are answer machines. They exist to discover, understand, and organize the internet’s content in order to offer the most relevant results to the questions searchers are asking.

In order to show up in search results, your content needs to first be visible to search engines. It’s arguably the most important piece of the SEO puzzle: if your site can’t be found, there’s no way you’ll ever show up in the SERPs (Search Engine Results Pages).

How do search engines work?

Search engines have three primary functions:

  1. Crawl: Scour the Internet for content, looking over the code/content for each URL they find.
  2. Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
  3. Rank: Provide the pieces of content that will best answer a searcher’s query, ordering the results from most helpful to least helpful for that particular query.

What is search engine crawling?

Crawling is the discovery process in which search engines send out a team of robots (known as crawlers or spiders) to find new and updated content. Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered via links.

The bot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, crawlers are able to find new content and add it to their index — a massive database of discovered URLs — to later be retrieved when a searcher is seeking information that the content on that URL is a good match for.

What is a search engine index?

Search engines process and store information they find in an index, a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.

Search engine ranking

When someone performs a search, search engines scour their index for highly relevant content and then order that content in the hopes of solving the searcher’s query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.

It’s possible to block search engine crawlers from part or all of your site, or instruct search engines to avoid storing certain pages in their index. While there can be reasons for doing this, if you want your content found by searchers, you have to first make sure it’s accessible to crawlers and is indexable. Otherwise, it’s as good as invisible.

By the end of this chapter, you’ll have the context you need to work with the search engine, rather than against it!

Note: In SEO, not all search engines are equal

Many beginners wonder about the relative importance of particular search engines. Most people know that Google has the largest market share, but how important is it to optimize for Bing, Yahoo, and others? The truth is that despite the existence of more than 30 major web search engines, the SEO community really only pays attention to Google. Why? The short answer is that Google is where the vast majority of people search the web. If we include Google Images, Google Maps, and YouTube (a Google property), more than 90% of web searches happen on Google — that’s nearly 20 times Bing and Yahoo combined.

Crawling: Can search engines find your site?

As you’ve just learned, making sure your site gets crawled and indexed is a prerequisite for showing up in the SERPs. First things first: You can check to see how many and which pages of your website have been indexed by Google using “site:yourdomain.com”, an advanced search operator.

Head to Google and type “site:yourdomain.com” into the search bar. This will return results Google has in its index for the site specified:

The number of results Google displays (the “About __ results” count) isn’t exact, but it does give you a solid idea of which pages are indexed on your site and how they are currently showing up in search results.
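The site: operator can also be scoped more narrowly and combined with other advanced operators. A few illustrative queries (using moz.com as a stand-in for your own domain):

    site:moz.com              roughly all pages Google reports as indexed for the domain
    site:moz.com/blog         only indexed pages within the /blog section
    site:moz.com intitle:seo  indexed pages on the domain with “seo” in the title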

For more accurate results, use the Index Coverage report in Google Search Console. You can sign up for a free Google Search Console account if you don’t currently have one. With this tool, you can submit sitemaps for your site and monitor how many submitted pages have actually been added to Google’s index, among other things.

If you’re not showing up anywhere in the search results, there are a few possible reasons why:

  • Your site is brand new and hasn’t been crawled yet.
  • Your site isn’t linked to from any external websites.
  • Your site’s navigation makes it hard for a robot to crawl it effectively.
  • Your site contains some basic code called crawler directives that is blocking search engines.
  • Your site has been penalized by Google for spammy tactics.

If your site doesn’t have any other sites linking to it, you still might be able to get it indexed by submitting your XML sitemap in Google Search Console or manually submitting individual URLs to Google. There’s no guarantee they’ll include a submitted URL in their index, but it’s worth a try!

Can search engines see your whole site?

Sometimes a search engine will be able to find parts of your site by crawling, but other pages or sections might be obscured for one reason or another. It’s important to make sure that search engines are able to discover all the content you want indexed, and not just your homepage.

Ask yourself this: Can the bot crawl through your website, and not just to it?

Is your content hidden behind login forms?

If you require users to log in, fill out forms, or answer surveys before accessing certain content, search engines won’t see those protected pages. A crawler is definitely not going to log in.

Are you relying on search forms?

Some people believe that if they place a search box on their site, search engines will be able to find everything their visitors search for. They can’t: robots cannot use search forms.

Is text hidden within non-text content?

Non-text media forms (images, video, GIFs, etc.) should not be used to display text that you wish to be indexed. While search engines are getting better at recognizing images, there’s no guarantee they will be able to read and understand them just yet. It’s always best to add text within the HTML markup of your webpage.
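For example, instead of baking a headline into a banner image, keep the words in the markup itself and describe the image with alt text. A hypothetical snippet:

    <img src="sale-banner.png" alt="Spring sale banner: 20% off all shoes">
    <h2>Spring sale: 20% off all shoes</h2>

This way, the headline is indexable text, and the image still carries a description crawlers can read.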

Can search engines follow your site navigation?

Just as a crawler needs to discover your site via links from other sites, it needs a path of links on your own site to guide it from page to page. If you’ve got a page you want search engines to find but it isn’t linked to from any other pages, it’s as good as invisible. Many sites make the critical mistake of structuring their navigation in ways that are inaccessible to search engines, hindering their ability to get listed in search results.

Common navigation mistakes that can keep crawlers from seeing all of your site:

  • Having a mobile navigation that shows different results than your desktop navigation
  • Any type of navigation where the menu items are not in the HTML, such as JavaScript-enabled navigations. Google has gotten much better at crawling and understanding JavaScript, but it’s still not a perfect process. The more surefire way to ensure something gets found, understood, and indexed by Google is by putting it in the HTML.
  • Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler
  • Forgetting to link to a primary page on your website through your navigation — remember, links are the paths crawlers follow to new pages!

This is why it’s essential that your website has a clear navigation and helpful URL folder structures.
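To illustrate the HTML point above, here’s a sketch (with hypothetical pages) of a navigation crawlers can reliably follow versus one whose links only exist after JavaScript runs:

    <!-- Crawlable: the links are plain anchor elements in the HTML -->
    <nav>
      <a href="/products/">Products</a>
      <a href="/blog/">Blog</a>
      <a href="/contact/">Contact</a>
    </nav>

    <!-- Riskier: the links only appear once a script executes -->
    <nav id="menu"></nav>
    <script>
      document.getElementById("menu").innerHTML =
        '<a href="/products/">Products</a>';
    </script>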

Information architecture

Information architecture is the practice of organizing and labeling content on a website to improve efficiency and findability for users. The best information architecture is intuitive, meaning that users shouldn’t have to think very hard to flow through your website or to find something.

Your site should also have a useful 404 (page not found) page for when a visitor clicks on a dead link or mistypes a URL. The best 404 pages allow users to click back into your site so they don’t bounce off just because they tried to access a nonexistent link.

Tell search engines how to crawl your site

In addition to making sure crawlers can reach your most important pages, it’s also pertinent to note that you’ll have pages on your site you don’t want them to find. These might include things like old URLs that have thin content, duplicate URLs (such as sort-and-filter parameters for e-commerce), special promo code pages, staging or test pages, and so on.

Blocking pages from search engines can also help crawlers prioritize your most important pages and maximize your crawl budget (the average number of pages a search engine bot will crawl on your site).

Crawler directives allow you to control what you want Googlebot to crawl and index using a robots.txt file, meta tag, sitemap.xml file, or Google Search Console.

Robots.txt

Robots.txt files are located in the root directory of websites (ex. yourdomain.com/robots.txt) and suggest which parts of your site search engines should and shouldn’t crawl via specific robots.txt directives. This is a great solution when trying to block search engines from non-private pages on your site.

You wouldn’t want to block private/sensitive pages from being crawled here because the file is easily accessible by users and bots.
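A minimal robots.txt might look something like this (the paths are hypothetical; the directives follow the standard robots.txt syntax):

    # Ask all crawlers to skip the staging area and promo-code pages
    User-agent: *
    Disallow: /staging/
    Disallow: /promo-codes/

    # Point crawlers at the sitemap
    Sitemap: https://www.yourdomain.com/sitemap.xml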

Pro tip:

  • If Googlebot can’t find a robots.txt file for a site (40X HTTP status code), it proceeds to crawl the site.
  • If Googlebot finds a robots.txt file for a site (20X HTTP status code), it will usually abide by the suggestions and proceed to crawl the site.
  • If Googlebot finds neither a 20X nor a 40X HTTP status code (ex. a 501 server error), it can’t determine if you have a robots.txt file or not and won’t crawl your site.

Meta directives

The two types of meta directives are the meta robots tag (more commonly used) and the x-robots-tag. Each provides crawlers with stronger instructions on how to crawl and index a URL’s content.

The x-robots-tag provides more flexibility and functionality if you want to block search engines at scale, because you can use regular expressions, block non-HTML files, and apply sitewide noindex tags.

These are the best options for blocking more sensitive*/private URLs from search engines.

*For very sensitive URLs, it is best practice to remove them from the site entirely or require a secure login to view the pages.
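For reference, the meta robots tag lives in a page’s <head>, while the x-robots-tag is sent as an HTTP response header, which is what lets it cover non-HTML files like PDFs. A sketch of both (the Apache snippet assumes the mod_headers module is enabled):

    <!-- Meta robots tag: keep this page out of the index and don't follow its links -->
    <meta name="robots" content="noindex, nofollow">

    # X-Robots-Tag via Apache config: noindex every PDF on the site
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex"
    </FilesMatch>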

WordPress Tip: In Dashboard > Settings > Reading, make sure the “Search Engine Visibility” box is not checked. When checked, it blocks search engines from your site via your robots.txt file!

Avoid these common pitfalls, and you’ll have clean, crawlable content that will allow bots easy access to your pages.


Sitemaps

A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. One of the easiest ways to ensure Google is finding your highest priority pages is to create a file that meets Google’s standards and submit it through Google Search Console. While submitting a sitemap doesn’t replace the need for good site navigation, it can certainly help crawlers follow a path to all of your important pages.
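The file itself follows the sitemaps.org XML protocol. A stripped-down example with hypothetical URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.yourdomain.com/</loc>
        <lastmod>2018-07-01</lastmod>
      </url>
      <url>
        <loc>https://www.yourdomain.com/top-product-page/</loc>
      </url>
    </urlset>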

Google Search Console

Some sites (most common with e-commerce) make the same content available on multiple different URLs by appending certain parameters to URLs. If you’ve ever shopped online, you’ve likely narrowed down your search via filters. For example, you may search for “shoes” on Amazon, and then refine your search by size, color, and style. Each time you refine, the URL changes slightly. How does Google know which version of the URL to serve to searchers? Google does a pretty good job at figuring out the representative URL on its own, but you can use the URL Parameters feature in Google Search Console to tell Google exactly how you want them to treat your pages.
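To make that concrete, a faceted search can generate many URL variations for what is essentially one page (hypothetical URLs):

    https://www.example.com/shoes
    https://www.example.com/shoes?color=red
    https://www.example.com/shoes?color=red&size=9&sort=price-asc

Without guidance, each of these could be crawled and treated as a separate page.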

Indexing: How do search engines understand and remember your site?

Once you’ve ensured your site has been crawled, the next order of business is to make sure it can be indexed. That’s right — just because your site can be discovered and crawled by a search engine doesn’t necessarily mean that it will be stored in its index. In the previous section on crawling, we discussed how search engines discover your web pages. The index is where your discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page’s contents. All of that information is stored in its index.

Read on to learn about how indexing works and how you can make sure your site makes it into this all-important database.

Can I see how a Googlebot crawler sees my pages?

Yes, the cached version of your page will reflect a snapshot of the last time Googlebot crawled it.

Google crawls and caches web pages at different frequencies. More established, well-known sites that post frequently like https://www.nytimes.com will be crawled more frequently than the much-less-famous website for Roger the Mozbot’s side hustle, http://www.rogerlovescupcakes.com (if only it were real…)

You can view what your cached version of a page looks like by clicking the drop-down arrow next to the URL in the SERP and choosing “Cached”:

You can also view the text-only version of your site to determine if your important content is being crawled and cached effectively.
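You can also jump straight to a page’s cached copy with Google’s cache: search operator, for example:

    cache:moz.com/blog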

Are pages ever removed from the index?

Yes, pages can be removed from the index! Some of the main reasons why a URL might be removed include:

  • The URL is returning a “not found” error (4XX) or server error (5XX) – This could be accidental (the page was moved and a 301 redirect was not set up; see the example after this list) or intentional (the page was deleted and 404ed in order to get it removed from the index)
  • The URL had a noindex meta tag added – This tag can be added by site owners to instruct the search engine to omit the page from its index.
  • The URL has been manually penalized for violating the search engine’s Webmaster Guidelines and, as a result, was removed from the index.
  • The URL has been blocked from crawling by the addition of a password requirement before visitors can access the page.
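For the accidental case above, the fix is a 301 (permanent) redirect from the old URL to the new one. In an Apache .htaccess file, that can be as simple as (hypothetical paths):

    Redirect 301 /old-page/ https://www.yourdomain.com/new-page/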

If you believe that a page on your website that was previously in Google’s index is no longer showing up, you can manually submit the URL to Google by navigating to the “Submit URL” tool in Search Console.

Ranking: How do search engines rank URLs?

How do search engines ensure that when someone types a query into the search bar, they get relevant results in return? That process is known as ranking, or the ordering of search results by most relevant to least relevant to a particular query.

To determine relevance, search engines use algorithms: processes or formulas by which stored information is retrieved and ordered in meaningful ways. These algorithms have gone through many changes over the years in order to improve the quality of search results. Google, for example, makes algorithm adjustments every day — some of these updates are minor quality tweaks, whereas others are core/broad algorithm updates deployed to tackle a specific issue, like Penguin to tackle link spam. Check out our Google Algorithm Change History for a list of both confirmed and unconfirmed Google updates going back to the year 2000.

Why does the algorithm change so often? Is Google just trying to keep us on our toes? While Google doesn’t always reveal specifics as to why they do what they do, we do know that Google’s aim when making algorithm adjustments is to improve overall search quality. That’s why, in response to algorithm update questions, Google will answer with something along the lines of: “We’re making quality updates all the time.” If your site suffered after an algorithm adjustment, compare it against Google’s Quality Guidelines and Search Quality Rater Guidelines; both are very telling in terms of what search engines want.

What do search engines want?

Search engines have always wanted the same thing: to provide useful answers to searchers’ questions in the most helpful formats. If that’s true, then why does it appear that SEO is different now than in years past?

Think about it in terms of someone learning a new language.

At first, their understanding of the language is very rudimentary — “See Spot Run.” Over time, their understanding starts to deepen, and they learn semantics — the meaning behind language and the relationship between words and phrases. Eventually, with enough practice, the student knows the language well enough to even understand nuance, and is able to provide answers to even vague or incomplete questions.

When search engines were just beginning to learn our language, it was much easier to game the system by using tricks and tactics that actually go against quality guidelines. Take keyword stuffing, for example. If you wanted to rank for a particular keyword like “funny jokes,” you might add the words “funny jokes” a bunch of times onto your page, and make it bold, in hopes of boosting your ranking for that term:

Welcome to funny jokes! We tell the funniest jokes in the world. Funny jokes are fun and crazy. Your funny joke awaits. Sit back and read funny jokes because funny jokes can make you happy and funnier. Some funny favorite funny jokes.

This tactic made for terrible user experiences, and instead of laughing at funny jokes, people were bombarded by annoying, hard-to-read text. It may have worked in the past, but this is never what search engines wanted.

The role links play in SEO

When we talk about links, we could mean two things. Backlinks or “inbound links” are links from other websites that point to your website, while internal links are links on your own site that point to your other pages (on the same site).

Links have historically played a big role in SEO. Very early on, search engines needed help figuring out which URLs were more trustworthy than others to help them determine how to rank search results. Calculating the number of links pointing to any given site helped them do this.

Backlinks work very similarly to real-life word-of-mouth (WOM) referrals. Let’s take a hypothetical coffee shop, Jenny’s Coffee, as an example:

  • Referrals from others = good sign of authority
    Example: Many different people have all told you that Jenny’s Coffee is the best in town
  • Referrals from yourself = biased, so not a good sign of authority
    Example: Jenny claims that Jenny’s Coffee is the best in town
  • Referrals from irrelevant or low-quality sources = not a good sign of authority and could even get you flagged for spam
    Example: Jenny paid to have people who have never visited her coffee shop tell others how good it is.
  • No referrals = unclear authority
    Example: Jenny’s Coffee might be good, but you’ve been unable to find anyone who has an opinion so you can’t be sure.

This is why PageRank was created. PageRank (part of Google’s core algorithm) is a link analysis algorithm named after one of Google’s founders, Larry Page. PageRank estimates the importance of a web page by measuring the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will have earned.
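The original PageRank paper sketched this idea with a simple recursive formula, where pages T1…Tn link to page A, C(T) is the number of outbound links on page T, and d is a damping factor (commonly cited as 0.85):

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

In plain terms: a page’s importance is the sum of the importance of the pages linking to it, diluted by how many other links each of those pages casts.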

The more natural backlinks you have from high-authority (trusted) websites, the better your odds are to rank higher within search results.

The role content plays in SEO

There would be no point to links if they didn’t direct searchers to something. That something is content! Content is more than just words; it’s anything meant to be consumed by searchers — there’s video content, image content, and of course, text. If search engines are answer machines, content is the means by which the engines deliver those answers.

Any time someone performs a search, there are thousands of possible results, so how do search engines decide which pages the searcher is going to find valuable? A big part of determining where your page will rank for a given query is how well the content on your page matches the query’s intent. In other words, does this page match the words that were searched and help fulfill the task the searcher was trying to accomplish?

Because of this focus on user satisfaction and task accomplishment, there are no strict benchmarks for how long your content should be, how many times it should contain a keyword, or what you put in your header tags. All of those can play a role in how well a page performs in search, but the focus should be on the users who will be reading the content.

Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher’s intent), and RankBrain.

What is RankBrain?

RankBrain is the machine learning component of Google’s core algorithm. Machine learning is the process by which a computer program improves its predictions over time through new observations and training data. In other words, it’s always learning, and because it’s always learning, search results should be constantly improving.

For example, if RankBrain notices a lower ranking URL providing a better result to users than the higher ranking URLs, you can bet that RankBrain will adjust those results, moving the more relevant result higher and demoting the less relevant pages as a byproduct.

Like most things with the search engine, we don’t know exactly what comprises RankBrain, but apparently, neither do the folks at Google.

What does this mean for SEOs?

Because Google will continue leveraging RankBrain to promote the most relevant, helpful content, we need to focus on fulfilling searcher intent more than ever before. Provide the best possible information and experience for searchers who might land on your page, and you’ve taken a big first step to performing well in a RankBrain world.

Engagement metrics: correlation, causation, or both?

With Google rankings, engagement metrics are most likely part correlation and part causation.

When we say engagement metrics, we mean data that represents how searchers interact with your site from search results. This includes things like:

  • Clicks (visits from search)
  • Time on page (amount of time the visitor spent on a page before leaving it)
  • Bounce rate (the percentage of all website sessions where users viewed only one page)
  • Pogo-sticking (clicking on an organic result and then quickly returning to the SERP to choose another result)

Many tests, including Moz’s own ranking factor survey, have indicated that engagement metrics correlate with higher ranking, but causation has been hotly debated. Are good engagement metrics just indicative of highly ranked sites? Or are sites ranked highly because they possess good engagement metrics?

What Google has said

While they’ve never used the term “direct ranking signal,” Google has been clear that they absolutely use click data to modify the SERP for particular queries.

According to Google’s former Chief of Search Quality, Udi Manber:

“The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we’ll switch it.”

Another comment from former Google engineer Edmond Lau corroborates this:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. The actual mechanics of how click data is used is often proprietary, but Google makes it obvious that it uses click data with its patents on systems like rank-adjusted content items.”

Because Google needs to maintain and improve search quality, it seems inevitable that engagement metrics are more than correlation. But Google stops short of calling engagement metrics a “ranking signal,” because those metrics are used to improve search quality, and the rank of individual URLs is just a byproduct of that.

What tests have confirmed

Various tests have confirmed that Google will adjust SERP order in response to searcher engagement:

  • Rand Fishkin’s 2014 test resulted in a #7 result moving up to the #1 spot after getting around 200 people to click on the URL from the SERP. Interestingly, ranking improvement seemed to be isolated to the location of the people who visited the link. The rank position spiked in the US, where many participants were located, whereas it remained lower on the page in Google Canada, Google Australia, etc.
  • Larry Kim’s comparison of top pages and their average dwell time pre- and post-RankBrain seemed to indicate that the machine-learning component of Google’s algorithm demotes the rank position of pages that people don’t spend as much time on.
  • Darren Shaw’s testing has shown user behavior’s impact on local search and map pack results as well.

Since user engagement metrics are clearly used to adjust the SERPs for quality, and rank position changes as a byproduct, it’s safe to say that SEOs should optimize for engagement. Engagement doesn’t change the objective quality of your web page, but rather your value to searchers relative to other results for that query. That’s why, after no changes to your page or its backlinks, it could decline in rankings if searchers’ behavior indicates they like other pages better.

In terms of ranking web pages, engagement metrics act like a fact-checker. Objective factors such as links and content first rank the page, then engagement metrics help Google adjust if they didn’t get it right.

The evolution of search results

Back when search engines lacked a lot of the sophistication they have today, the term “10 blue links” was coined to describe the flat structure of the SERP. Any time a search was performed, Google would return a page with 10 organic results, each in the same format.

In this search landscape, holding the #1 spot was the holy grail of SEO. But then something happened. Google began adding results in new formats on their search result pages, called SERP features. Some of these SERP features include:

  • Paid advertisements
  • Featured snippets
  • People Also Ask boxes
  • Local (map) pack
  • Knowledge panel
  • Sitelinks

And Google is adding new ones all the time. It even experimented with “zero-result SERPs,” a phenomenon where only one result from the Knowledge Graph was displayed on the SERP with no results below it except for an option to “view more results.”

The addition of these features caused some initial panic for two main reasons. For one, many of these features caused organic results to be pushed down further on the SERP. For another, fewer searchers are clicking on the organic results, since more queries are being answered on the SERP itself.

So why would Google do this? It all goes back to the search experience. User behavior indicates that some queries are better satisfied by different content formats. Notice how the different types of SERP features match the different types of query intents.

Query Intent | Possible SERP Feature Triggered
Informational | Featured Snippet
Informational with one answer | Knowledge Graph / Instant Answer
Local | Map Pack
Transactional | Shopping

We’ll talk more about intent in Chapter 3, but for now, it’s important to know that answers can be delivered to searchers in a wide array of formats, and how you structure your content can impact the format in which it appears in search.

Localized search

A search engine like Google has its own proprietary index of local business listings, from which it creates local search results.

If you are performing local SEO work for a business that has a physical location customers can visit (ex: dentist) or for a business that travels to visit their customers (ex: plumber), make sure that you claim, verify, and optimize a free Google My Business listing.

When it comes to localized search results, Google uses three main factors to determine ranking:

  1. Relevance
  2. Distance
  3. Prominence

Relevance

Relevance is how well a local business matches what the searcher is looking for. To ensure that the business is doing everything it can to be relevant to searchers, make sure the business’ information is thoroughly and accurately filled out.

Distance

Google uses your geo-location to better serve you local results. Local search results are extremely sensitive to proximity, which refers to the location of the searcher and/or the location specified in the query (if the searcher included one).

Organic search results are sensitive to a searcher’s location, though seldom as pronounced as in local pack results.

Prominence

With prominence as a factor, Google is looking to reward businesses that are well-known in the real world. In addition to a business’ offline prominence, Google also looks to some online factors to determine local ranking, such as:

Reviews

The number of Google reviews a local business receives, and the sentiment of those reviews, have a notable impact on its ability to rank in local results.

Citations

A “business citation” or “business listing” is a web-based reference to a local business’ “NAP” (name, address, phone number) on a localized platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).

Local rankings are influenced by the number and consistency of local business citations. Google continuously pulls data from a wide variety of sources to build its local business index. When Google finds multiple consistent references to a business’s name, location, and phone number, it strengthens Google’s “trust” in the validity of that data. This then leads to Google being able to show the business with a higher degree of confidence. Google also uses information from other sources on the web, such as links and articles.
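One common way to state your NAP unambiguously on your own site (a technique beyond the citation platforms above, sketched here with hypothetical business details) is schema.org LocalBusiness markup:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Jenny's Coffee",
      "telephone": "+1-555-555-0100",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Anytown",
        "addressRegion": "CA",
        "postalCode": "90210"
      }
    }
    </script>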

Check a local business’ citation accuracy here.

Organic ranking

SEO best practices also apply to local SEO, since Google also considers a website’s position in organic search results when determining local ranking.

In the next chapter, you’ll learn on-page best practices that will help Google and users better understand your content.

[Bonus!] Local engagement

Although Google doesn’t list engagement as a local ranking determiner, its role is only going to increase as time goes on. Google continues to enrich local results by incorporating real-world data like popular times to visit and average length of visits…

Screenshot of Google SERP result for a local business showing busy times of day

…and even provides searchers with the ability to ask the business questions!

Screenshot of the Questions & Answers portion of a local Google SERP result

Now more than ever before, local results are being influenced by real-world data: how searchers interact with and respond to local businesses, rather than purely static (and gameable) information like links and citations.

Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real-time engagement metrics to determine quality and relevance.


You don’t have to know the ins and outs of Google’s algorithm (that remains a mystery!), but by now you should have a great baseline knowledge of how the search engine finds, interprets, stores, and ranks content. Armed with that knowledge, let’s learn about choosing the keywords your content will target!


Troubleshooting Local Ranking Failures [Updated for 2018]

Posted by MiriamEllis

I love a mystery… especially a local search ranking mystery I can solve for someone.

Now, the truth is, some ranking puzzles are so complex, they can only be solved by a formal competitive audit. But there are many others that can be cleared up by spending 15 minutes or less going through an organized 10-point checklist of the most common problems that can cause a business to rank lower than the owner thinks it should. By zipping through the following checklist, there’s a good chance you’ll be able to find one or more obvious “whodunits” contributing to poor Google local pack visibility for a given search.

Since I wrote the original version of this post in 2014, so much has changed. Branding, tools, tactics — things are really different in 2018. Definitely time for a complete overhaul, with the goal of making you a super sleuth for your forum friends, clients, agency teammates, or executive superiors.

Let’s emulate the Stratemeyer Syndicate, which earned lasting fame by hitting on a simple formula for surfacing and solving mysteries in a most enjoyable way.

Before we break out our magnifying glass, it’s critical to stress one very important thing. The local rankings I see from an office in North Beach, San Francisco are not the rankings you see while roaming around Golden Gate park in the same city. The rankings your client in Des Moines sees for things in his town are not the same rankings you see from your apartment in Albuquerque when you look at Des Moines results. With the user having become the centroid of search for true local searches, it is no mystery at all that we see different results when we are different places, and it is no cause for concern.

And now that we’ve gotten that out of the way and are in the proper detective spirit, let’s dive into how to solve for each item on our checklist!


☑ Google updates/bugs

The first thing to ask if a business experiences a sudden change in rankings is whether Google has done something. Search Engine Land strikes me as the fastest reporter of Google updates, with MozCast offering an ongoing weather report of changes in the SERPs. Also, check out the Moz Google Algo Change history list and the Moz Blog for some of the most in-depth strategic coverage of updates, penalties, and filters.

For local-specific bugs (or even just suspected tests), check out the Local Search Forum, the Google My Business forum, and Mike Blumenthal’s blog. See if the effects being described match the weirdness you are seeing in your local packs. If so, it’s a matter of fixing a problematic practice (like iffy link building) that has been caught in an update, waiting to see how the update plays out, or waiting for Google to fix a bug or turn a dial down to normalize results.

Pro tip: Don’t make the mistake of thinking organic updates have nothing to do with local SEO. Crack detectives know organic and local are closely connected.

☑ Eligibility to list and rank

When a business owner wants to know why he isn’t ranking well locally, always ask these four questions:

  1. Does the business have a real address? (Not a PO box, virtual office, or a string of employees’ houses!)
  2. Does the business make face-to-face contact with its customers?
  3. What city is the business in?
  4. What is the exact keyword phrase they are hoping to rank for?

If the answer is “no” to either of the first two questions, the business isn’t eligible for a Google My Business listing. And while spam does flow through Google, a lack of eligibility could well be the key to a lack of rankings.

For the third question, you need to know the city the business is in so that you can see if it’s likely to rank for the search phrase cited in the fourth question. For example, a plumber with a street address in Sugar Land, TX should not expect to rank for “plumber Dallas TX.” If a business lacks a physical location in a given city, it’s atypical for it to rank for queries that stem from or relate to that locale. It’s amazing just how often this simple fact solves local pack mysteries.

☑ Guideline spam

To be an ace local sleuth, you must commit to memory the guidelines for representing your business on Google so that you can quickly spot violations. Common acts of spam include:

  • Keyword stuffing the business name field
  • Improper wording of the business name field
  • Creating listings for ineligible locations, departments, or people
  • Category spam
  • Incorrect phone number implementation
  • Incorrect website URL implementation
  • Review guideline violations

If any of the above conundrums are new to you, definitely spend 10 minutes reading the guidelines. Make flash cards, if necessary, to test yourself on your spam awareness until you can instantly detect glaring errors. With this enhanced perception, you’ll be able to see problems that may possibly be leading to lowered rankings, or even… suspensions!

☑ Suspensions

There are two key things to look for here when a local business owner comes to you with a ranking woe:

  1. If the listing was formerly verified, but has mysteriously become unverified, you should suspect a soft suspension. Soft suspensions might occur around something like a report of keyword-stuffing the GMB business name field. Oddly, however, there is little anecdotal evidence to support the idea that soft suspensions cause ranking drops. Nevertheless, it’s important to spot the un-verification clue and tell the owner to stop breaking guidelines. It’s possible that the listing may lose reviews or images during this type of suspension, but in most cases, the owner should be able to re-verify his listing. Just remember: a soft suspension is not a likely cause of low local pack rankings.
  2. If the listing’s rankings totally disappear and you can’t even find the listing via a branded search, it’s time to suspect a hard suspension. Hard suspensions can result from a listing falling afoul of a Google guideline or new update, a Google employee, or just a member of the public who has reported the business for something like an ineligible location. If the hard suspension is deserved, as in the case of creating a listing at a fake address, then there’s nothing you can do about it. But, if a hard suspension results from a mistake, I recommend taking it to the Google My Business forum to plead for help. Be prepared to prove that you are 100% guideline-compliant and eligible in hopes of getting your listing reinstated with its authority and reviews intact.

☑ Duplicates

Notorious for their ability to divide ranking strength, duplicate listings are at their worst when there is more than one verified listing representing a single entity. If you encounter a business that seems like it should be ranking better than it is for a given search, always check for duplicates.

The quickest way to do this is to get all present and past NAP (name, address, phone) from the business and plug it into the free Moz Check Listing tool. Pay particular attention to any GMB duplicates the tool surfaces. Then:

  1. If the entity is a brick-and-mortar business or service area business, and the NAP exactly matches between the duplicates, contact Google to ask them to merge the listings. If the NAP doesn’t match and represents a typo or error on the duplicate, use the “suggest an edit” link in Google Maps to set the “yes/no” toggle to “yes,” and then select the radio button for “never existed.”
  2. If the duplicates represent partners in a multi-practitioner business, Google won’t simply delete them. Things get quite complicated in this scenario, and if you discover practitioner duplicates, tread carefully. There are half a dozen nuances here, including whether you’re dealing with actual duplicates, whether they represent current or past staffers, whether they are claimed or unclaimed, and even whether a past partner is deceased. There isn’t perfect industry agreement on the handling of all of the ins-and-outs of practitioner listings. Given this, I would advise an affected business to read up on all of these nuances before making a move in any direction.

☑ Missing/inaccurate listings

While you’ve got Moz Check Listing fired up, pay attention to anything it tells you about missing or inaccurate listings. The tool will show you how accurate and complete your listings are on the major local business data aggregators, plus other important platforms like Google My Business, Facebook, Factual, Yelp, and more. Why does this matter?

  1. Google can pull information from anywhere on the web and plunk it into your Google My Business listing.
  2. While no one can quantify the exact degree to which citation/listing consistency directly impacts Google local rankings for every possible search query, it has been a top 5 ranking factor in the annual Local Search Ranking Factors survey as far back as I can remember. Recently, I’ve seen some industry discussion as to whether citations still matter, with some practitioners claiming they can’t see the difference they make. I believe that conclusion may stem from working mainly in ultra-competitive markets where everyone has already got their citations in near-perfect order, forcing practitioners to look for differentiation tactics beyond the basics. But without those basics, you’re missing table stakes in the game.
  3. Indirectly, listing absence or inconsistency impacts local rankings in that it undermines the quest for good local KPIs as well as organic authority. Every lost or misdirected consumer represents a failure to have someone click-for-directions, click-to-call, click-to-your website, or find your website at all. Online and offline traffic, conversions, reputation, and even organic authority all hang in the balance of active citation management.

☑ Lack of organic authority

Full website or competitive audits are not the work of a minute. They really take time, and deep delving. But, at a glance, you can access some quick metrics to let you know whether a business’ lack of achievement on the organic side of things could be holding them back in the local packs. Get yourself the free MozBar SEO toolbar and try this:

  1. Turn the MozBar on by clicking the little “M” at the top of your browser so that it is blue.
  2. Perform your search and look at the first few pages of the organic results, ignoring anything from major directory sites like Yelp (they aren’t competing with you for local pack rankings, eh?).
  3. Note down the Page Authority, Domain Authority, and link counts for each of the businesses coming up on the first 3 pages of the organic results.
  4. Finally, bring up the website of the business you’re investigating. If you see that the top competitors have Domain Authorities of 50 and links numbering in the hundreds or thousands, whereas your target site is well below in these metrics, chances are good that organic authority is playing a strong role in lack of local search visibility. How do we know this is true? Do some local searches and note just how often the businesses that make it into the 3-pack or the top of the local finder view have correlating high organic rankings.

Where organic authority is poor, a business has a big job ahead. They need to focus on content dev + link building + social outreach to begin building up their brand in the minds of consumers and the “RankBrain” of Google.

One other element needs to be mentioned here, and that’s the concept of how time affects authority. When you’re talking to a business with a ranking problem, it’s very important to ascertain whether they just launched their website or just built their local business listings last week, or even just a few months ago. Typically, if they have, the fruits of their efforts have yet to fully materialize. That being said, it’s not a given that a new business will have little authority. Large brands have marketing departments which exist solely to build tremendous awareness of new assets before they even launch. It’s important to keep that in mind, while also realizing that if the business is smaller, building authority will likely represent a longer haul.

☑ Possum effect

Where local rankings are absent, always ask:

“Are there any other businesses in your building or even on your street that share your Google category?”

If the answer is “yes,” search for the business’ desired keyword phrase and look at the local finder view in Google Maps. Note which companies are ranking. Then begin to zoom in on the map, level by level, noting changes in the local finder as you go. If, a few levels in, the business you’re advising suddenly appears on the map and in the local finder, chances are good it’s the Possum filter that’s causing their apparent invisibility at the automatic zoom level.

Google Possum rolled out in September 2016, and its observable effects included a geographic diversification of the local results, filtering out many listings that share a category and are in close proximity to one another. Then, about one year later, Google initiated the Hawk update, which appears to have tightened the radius of Possum, with the result that while many businesses in the same building are still being filtered out, a number of nearby neighbors have reappeared at the automatic zoom level of the results.

If your sleuthing turns up a brand that is being impacted by Possum/Hawk, the only surefire way to beat the filter is to put in the necessary work to become the most authoritative answer for the desired search phrase. It’s important to remember that filters are the norm in Google’s local results, and have long been observed impacting listings that share an address, share a phone number, etc. If it’s vital for a particular listing to outrank all others that possess shared characteristics, then authority must be built around it in every possible way to make it one of the most dominant results.

☑ Local Service Ads effect

The question you ask here is:

“Is yours a service-area business?”

And if the answer is “yes,” then brace yourself for ongoing results disruption in the coming year.

Google’s Local Service Ads (formerly Home Service Ads) make Google the middleman between consumers and service providers, and in the 2+ years since first early testing, they’ve caused some pretty startling things to happen to local search results.

Suffice it to say, rollout to an ever-increasing number of cities and categories hasn’t been for the faint of heart, and I would hazard a guess that Google’s recent re-brand of this program signifies their intention to move beyond the traditional SAB market. One possible benefit of Google getting into this type of lead gen is that it could decrease spam, but I’m not sold on this, given that fake locations have ended up qualifying for LSA inclusion. While I honor Google’s need to be profitable, I share some of the qualms business owners have expressed about the potential impacts of this venture.

Since I can’t offer a solid prediction of what precise form these impacts will take in the coming months, the best I can do here is to recommend that if an SAB experiences a ranking change/loss, the first thing to look for is whether LSA has come to town. If so, alteration of the SERPs may be unavoidable, and the only strategy left for overcoming vanished visibility may be to pay for it… by qualifying for the program.

☑ GMB neglect

Sometimes, a lack of competitive rankings can simply be chalked up to a lack of effort. If a business wonders why they’re not doing better in the local packs, pull up their GMB listing and do a quick evaluation of:

  • Verification status – While you can rank without verifying, lack of verification is a hallmark of listing neglect.
  • Basic accuracy – If NAP or map markers are incorrect, it’s a sure sign of neglect.
  • Category choices – Wrong categories make right rankings impossible.
  • Image optimization – Every business needs a good set of the most professional, persuasive photos it can acquire, and should even consider periodic new photo shoots for seasonal freshness; imagery impacts KPIs, which are believed to impact rank.
  • Review count, sentiment, and management – Too few reviews, low ratings, and lack of responses = utter neglect of this core rank/reputation-driver.
  • Hours of operation – If they’re blank or incorrect, conversions are being missed.
  • Main URL choice – Does the GMB listing point to a strong, authoritative website page or a weak one?
  • Additional URL choices – If menus, bookings, reservations, or placing orders is part of the business model, a variety of optional URLs are supported by Google and should be explored.
  • Google Posts – Early-days testing indicates that regular posting may impact rank.
  • Google Questions and Answers – Pre-populate with best FAQs and actively manage incoming questions.

There is literally no business, large or small, with a local footprint that can afford to neglect its Google My Business listing. And while some fixes and practices move the ranking needle more than others, the increasing number of consumer actions that take place within Google is reason enough to put active GMB management at the top of your list.


Closing the case

The Hardy Boys never went anywhere without their handy kit of detection tools. Their father was so confident in their utter preparedness that he even let them chase down gangs in Hong Kong and dictators in the Guyanas (which, on second thought, doesn’t seem terribly wise.) But I have that kind of confidence in you. I hope my troubleshooting checklist is one you’ll bookmark and share to be prepared for the local ranking mysteries awaiting you and your digital marketing colleagues in 2018. Happy sleuthing!


Local search ranking factors: What’s working in 2017 [Podcast]

In our new episode, we chat with Darren Shaw about the just-released Local Search Ranking Factors survey and discuss what marketers need to know about local SEO in 2017.


Announcing the 2017 Local Search Ranking Factors Survey Results

Posted by Whitespark

Since its inception in 2008, David Mihm has been running the Local Search Ranking Factors survey. It is the go-to resource for helping businesses and digital marketers understand what drives local search results and what they should focus on to increase their rankings. This year, David is focusing on his new company, Tidings, a genius service that automatically generates perfectly branded newsletters by pulling in the content from your Facebook page and leading content sources in your industry. While he will certainly still be connected to the local search industry, he’s spending less time on local search research, and has passed the reins to me to run the survey.

David is one of the smartest, nicest, most honest, and most generous people you will ever meet. In so many ways, he has helped direct and shape my career into what it is today. He has mentored me and promoted me by giving me my first speaking opportunities at Local U events, collaborated with me on research projects, and recommended me as a speaker at important industry conferences. And now, he has passed on one of the most important resources in our industry into my care. I am extremely grateful.

Thank you, David, for all that you have done for me personally, and for the local search industry. I am sure I speak for all who know you personally and those that know you through your work in this space; we wish you great success with your new venture!

I’m excited to dig into the results, so without further ado, read below for my observations, or:

Click here for the full results!

Shifting priorities

Here are the results of the thematic factors in 2017, compared to 2015:

Thematic Factors | 2015 | 2017 | Change
GMB Signals | 21.63% | 19.01% | -12.11%
Link Signals | 14.83% | 17.31% | +16.73%
On-Page Signals | 14.23% | 13.81% | -2.95%
Citation Signals | 17.14% | 13.31% | -22.36%
Review Signals | 10.80% | 13.13% | +21.53%
Behavioral Signals | 8.60% | 10.17% | +18.22%
Personalization | 8.21% | 9.76% | +18.81%
Social Signals | 4.58% | 3.53% | -22.89%

If you look at the Change column, you might get the impression that there were some major shifts in priorities this year, but the Change number doesn’t tell the whole story. Social factors may have seen the biggest drop with a -22.89% change, but a shift in emphasis on social factors from 4.58% to 3.53% isn’t particularly noteworthy.

The decreased emphasis on citations, compared to the increased emphasis on link and review factors, is reflective of shifting focus, but as I’ll discuss below, citations are still crucial to laying down a proper foundation in local search. We’re just getting smarter about how far you need to go with them.

The importance of proximity

For the past two years, Physical Address in City of Search has been the #1 local pack/finder ranking factor. This makes sense. It’s tough to rank in the local pack of a city that you’re not physically located in.

Well, as of this year’s survey, the new #1 factor is… drumroll please…

Proximity of Address to the Point of Search

This factor has been climbing from position #8 in 2014, to position #4 in 2015, to claim the #1 spot in 2017. I’ve been seeing this factor’s increased importance for at least the past year, and clearly others have noticed as well. As I note in my recent post on proximity, this leads to poor results in most categories. I’m looking for the best lawyer in town, not the closest one. Hopefully we see the dial get turned down on this in the near future.

While Proximity of Address to the Point of Search is playing a stronger role than ever in the rankings, it’s certainly not the only factor impacting rankings. Businesses with higher relevancy and prominence will rank in a wider radius around their business and take a larger percentage of the local search pie. There’s still plenty to be gained from investing in local search strategies.

Here’s how the proximity factors changed from 2015 to 2017:

Proximity Factors | 2015 | 2017 | Change
Proximity of Address to the Point of Search | #4 | #1 | +3
Proximity of Address to Centroid of Other Businesses in Industry | #20 | #30 | -10
Proximity of Address to Centroid | #16 | #50 | -34

While we can see that Proximity to the Point of Search has seen a significant boost to become the new #1 factor, the other proximity factors which we once thought were extremely important have seen a major drop.

I’d caution people against ignoring Proximity of Address to Centroid, though. There is a situation where I think it still plays a role in local rankings. When you’re searching from outside of a city for a key phrase that contains the city name (Ex: Denver plumbers), then I believe Google geo-locates the search to the centroid and Proximity of Address to Centroid impacts rankings. This is important for business categories that are trying to attract searchers from outside of their city, such as attractions and hotels.

Local SEOs love links

Looking through the results and the comments, a clear theme emerges: Local SEOs are all about the links these days.

In this year’s survey results, we’re seeing significant increases for link-related factors across the board:

| Local Pack/Finder Link Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Quality/Authority of Inbound Links to Domain | #12 | #4 | +8 |
| Domain Authority of Website | #6 | #6 | — |
| Diversity of Inbound Links to Domain | #27 | #16 | +11 |
| Quality/Authority of Inbound Links to GMB Landing Page URL | #15 | #11 | +4 |
| Quantity of Inbound Links to Domain | #34 | #17 | +17 |
| Quantity of Inbound Links to Domain from Locally Relevant Domains | #31 | #20 | +11 |
| Page Authority of GMB Landing Page URL | #24 | #22 | +2 |
| Quantity of Inbound Links to Domain from Industry-Relevant Domains | #41 | #28 | +13 |
| Product/Service Keywords in Anchor Text of Inbound Links to Domain | n/a | #33 | +17 |
| Location Keywords in Anchor Text of Inbound Links to Domain | #45 | #38 | +7 |
| Diversity of Inbound Links to GMB Landing Page URL | n/a | #39 | +11 |
| Quantity of Inbound Links to GMB Landing Page URL from Locally Relevant Domains | n/a | #48 | +2 |

Google is still leaning heavily on links as a primary measure of a business’ authority and prominence, and the local search practitioners that invest time and resources to secure quality links for their clients are reaping the ranking rewards.

Fun fact: “links” appears 76 times in the commentary.

By comparison, “citations” appears 32 times, and “reviews” 45 times.

Shifting priorities with citations

Glancing at all the declining factors in the table below, you might think that, yes, citations have declined in importance, but the situation is more nuanced than that.

| Local Pack/Finder Citation Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Consistency of Citations on The Primary Data Sources | n/a | #5 | n/a |
| Quality/Authority of Structured Citations | #5 | #8 | -3 |
| Consistency of Citations on Tier 1 Citation Sources | n/a | #9 | n/a |
| Quality/Authority of Unstructured Citations (Newspaper Articles, Blog Posts, Gov Sites, Industry Associations) | #18 | #21 | -3 |
| Quantity of Citations from Locally Relevant Domains | #21 | #29 | -8 |
| Prominence on Key Industry-Relevant Domains | n/a | #37 | n/a |
| Quantity of Citations from Industry-Relevant Domains | #19 | #40 | -21 |
| Enhancement/Completeness of Citations | n/a | #44 | n/a |
| Proper Category Associations on Aggregators and Tier 1 Citation Sources | n/a | #45 | n/a |
| Quantity of Structured Citations (IYPs, Data Aggregators) | #14 | #47 | -33 |
| Consistency of Structured Citations | #2 | n/a | n/a |
| Quantity of Unstructured Citations (Newspaper Articles, Blog Posts) | #39 | n/a | -11 |

You’ll notice that there are many “n/a” cells in this table. This is because I made some changes to the citation factors. I elaborate on this in the survey results, but for your quick reference here:

  1. To reflect the reality that you don’t need to clean up your citations on hundreds of sites, Consistency of Structured Citations has been broken down into 4 new factors:
    1. Consistency of Citations on The Primary Data Sources
    2. Consistency of Citations on Tier 1 Citation Sources
    3. Consistency of Citations on Tier 2 Citation Sources
    4. Consistency of Citations on Tier 3 Citation Sources
  2. I added these new citation factors:
    1. Enhancement/Completeness of Citations
    2. Presence of Business on Expert-Curated “Best of” and Similar Lists
    3. Prominence on Key Industry-Relevant Domains
    4. Proper Category Associations on Aggregators and Top Tier Citation Sources

Note that there are now more citation factors showing up, so some of the scores given to citation factors in 2015 are now being split across multiple factors in 2017:

  • In 2015, there were 7 citation factors in the top 50
  • In 2017, there are 10 citation factors in the top 50

That said, overall, I do think that the emphasis on citations has seen some decline (certainly in favor of links), and rightly so. In particular, there is an increasing focus on quality over quantity.

I was disappointed to see that Presence of Business on Expert-Curated “Best of” and Similar Lists didn’t make the top 50. I think this factor can provide a significant boost to a business’ local prominence and, in turn, their rankings. Granted, it’s a challenging factor to directly influence, but I would love to see an agency make a concerted effort to outreach to get their clients listed on these, measure the impact, and do a case study. Any takers?

GMB factors

There is no longer an editable description on your GMB listing, so any factors related to the GMB description field were removed from the survey. This is a good thing, since the field was typically poorly used, or abused, in the past. Google is on record saying that they didn’t use it for ranking, so stuffing it with keywords has always been more likely to get you penalized than to help you rank.

Here are the changes in GMB factors:

| GMB Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Proper GMB Category Associations | #3 | #3 | — |
| Product/Service Keyword in GMB Business Title | #7 | #7 | — |
| Location Keyword in GMB Business Title | #17 | #12 | +5 |
| Verified GMB Listing | #13 | #13 | — |
| GMB Primary Category Matches a Broader Category of the Search Category (e.g. primary category=restaurant & search=pizza) | #22 | #15 | +7 |
| Age of GMB Listing | #23 | #25 | -2 |
| Local Area Code on GMB Listing | #33 | #32 | +1 |
| Association of Photos with GMB Listing | n/a | #36 | +14 |
| Matching Google Account Domain to GMB Landing Page Domain | #36 | n/a | -14 |

While we did see some upward movement in the Location Keyword in GMB Business Title factor, I’m shocked to see that Product/Service Keyword in GMB Business Title did not also go up this year. It is hands-down one of the strongest factors in local pack/finder rankings. Maybe THE strongest, after Proximity of Address to the Point of Search. It seems to me that everyone and their dog is complaining about how effective this is for spammers.

Be warned: if you decide to stuff your business title with keywords, international spam hunter Joy Hawkins will probably hunt your listing down and get you penalized. 🙂

Also, remember what happened back when everyone was spamming links with private blog networks, and then got slapped by the Penguin Update? Google has a complete history of changes to your GMB listing, and they could decide at any time to roll out an update that will retroactively penalize your listing. Is it really worth the risk?

Age of GMB Listing might have dropped two spots, but it was ranked extremely high by Joy Hawkins and Colan Neilsen. They’re both top contributors at the Google My Business forum, and I’m not saying they know something we don’t know, but uh, maybe they know something we don’t know.

Association of Photos with GMB Listing is a factor that I’ve heard some chatter about lately. It didn’t make the top 50 in 2015, but now it’s coming in at #36. Apparently, some Google support people have said it can help your rankings. I suppose it makes sense as a quality consideration: listings with photos might indicate a more engaged business owner. I wonder if it matters whether the photos are uploaded by the business owner or by the general public; a steady stream of user-uploaded photos might well signal a popular and important business.

No Hours of Operation on GMB Listing might be something to pay attention to as well, even though it came in as somewhat benign in the Negative Factors section (#26). Nick Neels noted in the comments:

Our data showed listings that were incomplete and missing hours of operation were highly likely to be filtered out of the results and lose visibility. As a result, we worked with our clients to gather hours for any listings missing them. Once the hours of operation were uploaded, the listings no longer were filtered.

Behavioral factors

Here are the numbers:

| Behavioral Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Clicks to Call Business | #38 | #35 | +3 |
| Driving Directions to Business Clicks | #29 | #43 | -14 |

Not very exciting, but these numbers do NOT reflect the serious impact behavioral factors are having on local search rankings, or the increased impact they will have in the future. In fact, we’re never going to get numbers that truly reflect their value, because many of the behavioral signals Google has access to are invisible to and unmeasurable by SEOs. The best place to get a sense of their impact is in the comments. When asked what he’s seeing drive rankings this year, Phil Rozek notes:

There seem to be more “black box” ranking scenarios, which to me suggests that behavioral factors have grown in importance. What terms do people type in before clicking on you? Where do those people search from? How many customers click on you rather than on the competitor one spot above you? If Google moves you up or down in the rankings, will many people still click? I think we’re somewhere past the beginning of the era of mushy ranking factors.

Mike Blumenthal also talks about behavioral factors in his comments:

Google is in a transition period from a web-based linking approach to a knowledge graph semantic approach. As we move towards a mobile-first index, the lack of linking as a common mobile practice, voice search, and single-response answers, Google needs to and has been developing ranking factors that are not link-dependent. Content, actual in-store visitations, on-page verifiable truth, third-party validation, and news-worthiness are all becoming increasingly important.

But Google never throws anything away. Citations and links as we have known them will continue to play a part in the ranking algo, but they will be less and less important as Google increases their understanding of entity prominence and the real world.

And David Mihm says:

It’s a very difficult concept to survey about, but the overriding ranking factor in local — across both pack and organic results — is entity authority. Ask yourself, “If I were Google, how would I define a local entity, and once I did, how would I rank it relative to others?” and you’ll have the underlying algorithmic logic for at least the next decade.

    • How widely known is the entity? Especially locally, but oh man, if it’s nationally known, searchers should REALLY know about it.
    • What are people saying about the entity? (It should probably rank for similar phrases)
    • What is the engagement with the entity? Do people recognize it when they see it in search results? How many Gmail users read its newsletter? How many call or visit it after seeing it in search results? How many visit its location?

David touches on this topic in the survey response above, and then goes full BEAST MODE on the future of local rankings in his must-read post on Tidings, The Difference-Making Local Ranking Factor of 2020. (David, thank you for letting me do the Local Search Ranking Factors, but please, don’t ever leave us.)

The thing is, Google has access to so much additional data now through Chrome, Android, Maps, Ads, and Search. They’d be crazy to not use this data to help them understand which businesses are favored by real, live humans, and then rank those businesses accordingly. You can’t game this stuff, folks. In the future, my ranking advice might just be: “Be an awesome business that people like and that people interact with.” Fortunately, David thinks we have until 2020 before this really sets in, so we have a few years left of keyword-stuffing business titles and building anchor text-optimized links. Phew.

To survey or to study? That is not the question

I’m a fan of Andrew Shotland’s and Dan Leibson’s Local SEO Ranking Factors Study. I think that the yearly Local Search Ranking Factors Survey and the yearly (hopefully) Local SEO Ranking Factors Study nicely complement each other. It’s great to see some hard data on what factors correlate with rankings. It confirms a lot of what the contributors to this survey are intuitively seeing impact rankings for their clients.

There are some factors that you just can’t get data for, though, and the number of these “black box” factors will continue to grow over the coming years. Factors such as:

  • Behavioral factors and entity authority, as described above. I don’t think Google is going to give SEOs this data anytime soon.
  • Relevancy. It’s tough to measure a general relevancy score for a business from all the different sources Google could be pulling this data from.
  • Even citation consistency is hard to measure. You can get a general sense of it from tools like Moz Local or Yext, but there is no single citation consistency metric you can use to score businesses by. The ecosystem is too large, too complicated, and too nuanced to get one value for consistency across all the location data that Google has access to (see the sketch after this list).
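To illustrate why no single consistency metric exists, here’s a deliberately naive sketch: it exact-matches normalized NAP (name, address, phone) strings, and promptly fails on listings any human would call consistent. The listings are made up:

```python
import re

def normalize(nap: str) -> str:
    """Crude normalization: lowercase, strip punctuation, collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", nap.lower())).strip()

# Hypothetical listings for one business, pulled from three citation sources.
listings = [
    "Acme Plumbing, 123 Main St., Denver, CO 80202, (303) 555-0100",
    "Acme Plumbing | 123 Main Street, Denver CO 80202 | 303-555-0100",
    "ACME Plumbing Co, 123 Main St, Denver, CO, 303.555.0100",
]

canonical = normalize(listings[0])
matches = sum(normalize(listing) == canonical for listing in listings)
print(f"naive consistency score: {matches}/{len(listings)}")  # 1/3, despite equivalent data
```

Real matching needs fuzzy logic for abbreviations, suffixes, and phone formats, which is exactly why every tool scores consistency differently.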

The survey, on the other hand, aggregates opinions from the people that are practicing and studying local search day in and day out. They do work for clients, test things, and can see what had a positive impact on rankings and what didn’t. They can see that when they built out all of the service pages for a local home renovations company, their rankings across the board went up through increased relevancy for those terms. You can’t analyze these kinds of impacts with a quantitative study like the Local SEO Ranking Factors Study. It takes some amount of intuition and insight, and while the survey approach certainly has its flaws, it does a good job of surfacing those insights.

Going forward, I think there is great value in both the survey to get the general sense of what’s impacting rankings, and the study to back up any of our theories with data — or to potentially refute them, as they may have done with city names in webpage title tags. Andrew and Dan’s empirical study gives us more clues than we had before, so I’m looking forward to seeing what other data sources they can pull in for future editions.

Possum’s impact has been negligible

Other than Proper GMB Category Associations, which is definitely seeing a boost because of Possum, you can look at the results in this section more from the perspective of “this is what people are focusing on more IN GENERAL.” Possum hasn’t made much of an impact on what we do to rank businesses in local. It has simply added another point of failure in cases where a business gets filtered.

One question that’s still outstanding in my mind is: what do you do if you are filtered? Why is one business filtered and not the other? Can you do some work to make your business rank and demote the competitor to the filter? Is it more links? More relevancy? Hopefully someone puts out some case studies soon on how to defeat the dreaded Possum filter (paging Joy Hawkins).

Focusing on More Since Possum

  1. Proximity of Address to the Point of Search
  2. Proper GMB Category Associations
  3. Quality/Authority of Inbound Links to Domain
  4. Quantity of Inbound Links to Domain from Locally Relevant Domains
  5. Click-Through Rate from Search Results

Focusing on Less Since Possum

  1. Proximity of Address to Centroid
  2. Physical Address in City of Search
  3. Proximity of Address to Centroid of Other Businesses in Industry
  4. Quantity of Structured Citations (IYPs, Data Aggregators)
  5. Consistency of Citations on Tier 3 Citation Sources

Foundational factors vs. competitive difference-makers

There are many factors in this survey that I’d consider table stakes. To get a seat at the rankings table, you must at least have these factors in order. Then there are the factors which I’d consider competitive difference-makers. These are the factors that, once you have a seat at the table, will move your rankings beyond your competitors. It’s important to note that you need BOTH. You probably won’t rank with only the foundation unless you’re in an extremely low-competition market, and you definitely won’t rank if you’re missing that foundation, no matter how many links you have.

This year I added a section to try to get a sense of what the local search experts consider foundational factors and what they consider to be competitive difference-makers. Here are the top 5 in these two categories:

| Rank | Foundational | Competitive Difference-Makers |
|---|---|---|
| #1 | Proper GMB Category Associations | Quality/Authority of Inbound Links to Domain |
| #2 | Consistency of Citations on the Primary Data Sources | Quantity of Inbound Links to Domain from Industry-Relevant Domains |
| #3 | Physical Address in City of Search | Quality/Authority of Inbound Links to GMB Landing Page URL |
| #4 | Proximity of Address to the Point of Search (Searcher-Business Distance) | Quantity of Inbound Links to Domain from Locally Relevant Domains |
| #5 | Consistency of Citations on Tier 1 Citation Sources | Quantity of Native Google Reviews (with text) |

I love how you can look at just these 10 factors and pretty much extract the basics of how to rank in local:

“You need to have a physical location in the city you’re trying to rank in, and it’s helpful for it to be close to the searcher. Then, make sure to have the proper categories associated with your listing, and get your citations built out and consistent on the most important sites. Now, to really move the needle, focus on getting links and reviews.”

This is the much over-simplified version, of course, so I suggest you dive into the full survey results for all the juicy details. The amount of commentary from participants is double what it was in 2015, and it’s jam-packed with nuggets of wisdom. Well worth your time.

Got your coffee? Ready to dive in?

Take a look at the full results


Local SEO & Beyond: Ranking Your Local Business in 2017

Posted by Casey_Meraz

In 2016, I predicted that ranking in the 3-pack was hard and it would continually get more competitive. I maintain that prediction for 2017, but I want to make one thing clear. If you haven’t done so, I believe local businesses should start to look outside of a local-SEO-3-Pack-ONLY focused strategy.

While local SEO still presents a tremendous opportunity to grow your business, I’m going to look at some supplementary organic strategies you can take into your local marketing campaign, as well.

In this post I’m going to address:

  • How local search has changed since last year
  • Why & how your overall focus may need to change in 2017
  • Actionable advice on how to rank better to get more local traffic & more business

In local search success, one thing is clear

The days of getting in the 3-pack and having a one-trick-pony strategy are over. Every business wants the free traffic from Google’s local results, but the chances of getting it are getting slimmer every day. Not only are you fighting against all of your competitors trying to get the same rankings, but now you’re also fighting against even more ads.

If you think it’s hard to get top placement in the local pack today, consider that you’re also fighting against 4+ ads before customers even have a chance of seeing your business.

Today’s SERPs are ad-rich with 4 paid ads at the top, and now it’s not uncommon to find paid listings prioritized in local results. Just take a look at this example that Gyi Tsakalakis shared with me, showing one ad in the local pack on mobile ranking above the 3-pack results. Keep in mind, there are four other ads above this.

If you’re on desktop and you click on one of the 3-pack results, you’re taken to the local finder. In the desktop search example below, once you make it to the local finder, you’ll see two paid local results above the other businesses.

Notice how only the companies participating in paid ads have stars. Do you think that gives them an advantage? I do.


Don’t worry though, I’m not jaded by ads

After all of that gloomy ad SERP talk, you’re probably getting a little depressed. Don’t. With every change there comes new opportunity, and we’ve seen many of our clients excel in search by focusing on multiple strategies that work for their business.

Focusing on the local pack should still be a strong priority for you, even if you don’t have a pay-to-play budget for ads. Getting listed in the local finder can still result in easy wins — especially if you have the most reviews, as Google has very handy sorting options.

If you have the highest rating score, you can easily get clicks when users decide to sort the results they see by the business rating. Below is an example of how users can easily sort by ratings.

But what else can you do to compete effectively in your local market?


Consider altering your local strategy

Most businesses I speak with seem to have tunnel vision. They think it’s more important to rank in the local pack and, in some cases, even prioritize this over the real goal: more customers.

Every day, I talk to new businesses and marketers that seem to have a single area of focus. While it’s not necessarily a bad thing to do one thing really well, the ones that are most successful are managing a variety of campaigns tied to their business goals.

Instead of taking a single approach of focusing on just free local clicks, expand your horizon a bit and ask yourself this question: Where are my customers looking and how can I get in front of them?

Sometimes taking a step back and looking at things from the 30,000-ft view is beneficial.


You can start by asking yourself these questions by examining the SERPs:

1. What websites, OTHER THAN MY OWN, have the most visibility for the topics and keywords I’m interested in?

You can bet people are clicking on results other than your own website underneath the local results. Are they websites you can show up on? How do you increase that visibility?

I think STAT has a great tracking tool for this. You simply set up the keywords you want to track and their Share of Voice feature shows who’s ranking where and what percentage of visibility they have in your specific market.

In the example below, you can see the current leaders in a space I’m tracking. Notice how Findlaw & Yelp show up there. With a little further research I can find out if they have number 1–2 rankings (which they do) and determine whether I should put in place a strategy to rank there. This is called barnacle SEO.
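STAT doesn’t publish its Share of Voice formula, but a common approximation weights each ranking by an assumed click-through-rate curve and normalizes across the keyword set. This is a rough sketch of that idea; the CTR curve, keywords, domains, and positions below are all hypothetical:

```python
from collections import defaultdict

# Hypothetical CTR curve by ranking position; illustrative, not STAT's actual model.
CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

# Hypothetical rankings: keyword -> {domain: position}.
rankings = {
    "denver personal injury lawyer": {"findlaw.com": 1, "yelp.com": 2, "mysite.com": 4},
    "car accident attorney denver": {"findlaw.com": 2, "mysite.com": 3, "yelp.com": 5},
}

visibility = defaultdict(float)
for positions in rankings.values():
    for domain, position in positions.items():
        visibility[domain] += CTR.get(position, 0.0)  # positions past 5 count as 0

total = sum(visibility.values())
for domain, score in sorted(visibility.items(), key=lambda kv: -kv[1]):
    print(f"{domain}: {score / total:.0%} share of voice")
```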

2. Are my customers using voice search?

Maybe it’s just me, but I find it strange to talk to my computer. That being said, I have no reservations about talking to my phone — even when I’m in places I shouldn’t. Stone Temple recently published a great study on voice command search, which you can check out here.

Some of the cool takeaways from that study were where people search from. It seems people are more likely to search from the privacy of their own home, but most mobile devices out there today have voice search integrated. I wonder how many people are doing this from their cars?

This goes to show that local queries are not just about the 3-pack. While many people may ask their device “What’s the nearest pizza place?”, others may ask a variety of questions, like:

  • “Where is the highest-rated pizza place nearby?”
  • “Who makes the best pizza in Denver?”
  • “What’s the closest pizza place near me?”

Don’t ignore voice search when thinking about your localized organic strategy. Voice is mobile and voice can sure be local. What localized searches would someone be interested in when looking for my business? What questions might they be asking that would drive them to my local business?

3. Is my website optimized for “near me” searches?

“Near me” searches have been on the rise over the past five years, and I don’t expect that to stop. Sometimes customers are just looking for something close by. Google Trends data shows how this has changed over the past five years.

Are you optimizing for a “near me” strategy for your business? Recently, the guys over at Local SEO Guide did a study of “near me” local SEO ranking factors. Optimizing for “near me” searches is important, and it falls right in line with some of the tactical advice we have for increasing your Google My Business rankings as well. More on that later.
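If you want to pull that “near me” trend series yourself, here’s a minimal sketch using pytrends, an unofficial Google Trends client; being unofficial, it can break whenever Google changes the endpoint it relies on:

```python
# pip install pytrends -- an unofficial Google Trends client, so treat it as
# a convenience that may break without notice.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["near me"], timeframe="today 5-y", geo="US")
interest = pytrends.interest_over_time()  # pandas DataFrame, weekly 0-100 index
print(interest["near me"].tail())
```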

4. Should my business stay away from ads?

Let’s start by looking at some facts. Google makes money off of their paid ads. According to an article from Adweek, “During the second quarter of 2016, Alphabet’s revenue hit $21.5 billion, a 21% year-over-year increase. Of that revenue, $19.1 billion came from Google’s advertising business, up from $16 billion a year ago.”

This roughly translates to: “Ads aren’t going anywhere, and Google is going to do whatever they can to put them in your face.” If you didn’t see the Home Service ad test with all ads that Mike Blumenthal pointed out, you can check it out below. Google is trying to find more creative ways to monetize local search.

In case you haven’t heard it before, having both organic and paid listings ranking highly increases your overall click-through rate.

Although the last study I found was from Google in 2012, we’ve found that our clients have the most success when they rank strongly in organic and local results and also have paid placements. All of these things tie together. If potential customers are already searching for your business, you’ll see great results by being involved in all of these areas.

While I’m not a fan of only taking a pay-to-play approach, you need to at least start considering it and testing it for your niche to see if it works for you. Combine it with your overall local and organic strategy.

5. Are we ignoring the featured snippets?

Searches with local intent can still trigger featured snippets. One example that I saw recently and really liked was the snowboard size chart example, which you can see below. In this example, someone who is interested in snowboards gets an answer box that showcases a company. If someone is doing this type of research, there’s a likelihood that they may wish to purchase a snowboard soon.

Depending on your niche, there are plenty of opportunities to increase your local visibility by not ignoring featured snippets and creating content to rank there. Check out this Whiteboard Friday to learn more about how you can get featured snippets.

Now that we’ve looked at some ways you can expand your strategies, let’s look at some tactical steps you can take to move the needle.


Here’s how you can gain more visibility

Now that you have an open mind, let’s take a look at the actionable things you can do to improve your overall visibility and rankings in locally centric campaigns. As much as I like to think local SEO is rocket science, it really isn’t. You really need to focus your attention on the things that are going to move the needle.

I’m also going to assume you’ve already done the basics, like optimize your listing by filling out the profile 100%.

Late last year, Local SEO Guide and Placescout did a great study that looked at 100+ variables from 30,000 businesses to determine what factors might have the most overall impact on local 3-pack rankings. If you have some spare time, I recommend checking it out. It verified that the signals we put the most effort into seem to have the greatest overall effect.

I’m only going to dive into a few of those factors, but here are the things I would do to focus on a results-first strategy:

Start with a solid website/foundation

What good are rankings without conversions? The answer is they aren’t any good. If you’re always keeping your business goals in mind, start with the basics. If your website isn’t loading fast, you’re losing conversions and you may experience a reduced crawl budget.

My #1 recommendation that affects all aspects of SEO and conversions is to start with a solid website. Ignoring this usually creates bigger problems later down the road and can negatively impact your overall rankings.

Your website should be SEO-friendly and score in the 90s on Google’s PageSpeed Insights. You can also see how fast your website loads for real users with tools like GTmetrix. Google seems to reduce the visibility of slower websites, so if you’re ignoring the foundation, you’re going to have issues. Here are 6 tips you can use for a faster WordPress website.
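You don’t have to check pages by hand, either. Here’s a minimal sketch that queries the public PageSpeed Insights API (the current v5 endpoint; for any real volume you’d also pass an API key):

```python
import requests

# Public PageSpeed Insights API, v5 endpoint; add a "key" param for real volume.
PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(PSI, params={"url": "https://www.example.com", "strategy": "mobile"})
resp.raise_for_status()

# Lighthouse reports performance as 0-1; multiply to get the familiar 0-100 score.
score = resp.json()["lighthouseResult"]["categories"]["performance"]["score"]
print(f"mobile performance score: {score * 100:.0f}/100")
```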

Crawl errors can also wreak havoc on your website. You should always strive to maintain a healthy site. Check up on your website using Google’s Search Console, and use Moz Pro to monitor your clients’ campaigns by actively tracking site health, crawl issues, and domain health over time. Higher scores and fewer errors should be your focus.

Continue with a strong review generation strategy

I’m sure many of you took a deep breath earlier this month when Google changed the review threshold to only one review. That’s right. In case you didn’t hear, Google now gives every business a review score based on any number of reviews, as you can see in the example below:

I know a lot of my colleagues were big fans of this change, but I have mixed feelings, since Google isn’t taking any serious measures to reduce review spam or penalize manipulative businesses at this point.

Don’t ignore the other benefits of reviews, either. Earlier, I mentioned that users can sort by review stars; having more reviews will increase your overall CTR. Plus, after talking to many local businesses, we’ve heard a lot of feedback that consumers are actively using these scores more than ever.

So, how do you get more reviews?

Luckily, Google’s current Review and Photo Policies do not prohibit the direct solicitation of reviews at this point (unlike Yelp).

Start by soliciting past customers on your list
If you’re not already collecting customer information on your website or in-store, you’re behind the times and you need to start doing so immediately.

I work mainly with attorneys. Working in that space, there are regulations we have to follow, and the number of clients is typically substantially lower than a pizza joint’s. In pickles like this, where the volume is low, we can take a manual approach where we identify the happiest clients and reach out to them using this process. This particular process also creates happy employees. 🙂

  1. List creation: We start by screening for the happiest clients. We then sort these by who has a Gmail account, for priority’s sake.
  2. Outreach by phone: I don’t know why digital marketers are afraid of the phone, but we’ve had a lot of success calling our prior clients. The main point-of-contact from the business who worked with them before calls, asks how the service was, and explains that they have a favor to ask and that their overall job performance is partially based on client feedback. The caller then confirms it’s OK with the customer to send a follow-up email.
  3. Send the review request email: We then use a Google review link generator, which creates an exact URL that opens the review box for the person if they’re logged into their Gmail account (see the sketch after this list).
  4. Follow up: Sometimes emails get lost. We follow up a few times to make sure the client leaves the review…
  5. You have a new review!
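For reference, the link those generators produce is built from your business’s Google Place ID. The writereview URL pattern below is current as of this writing and may change; the Place ID shown is just a placeholder (the sample ID from Google’s Places documentation):

```python
# Build a direct "leave us a review" link from a Google Place ID.
# The Place ID below is a placeholder; swap in your own business's ID.
place_id = "ChIJN1t_tDeuEmsRUsoyG83frY4"
review_link = f"https://search.google.com/local/writereview?placeid={place_id}"
print(review_link)  # opens the review box for users signed in to their Google account
```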

The method above works great for low-volume businesses. If you’re a higher-volume business or have a lot of contacts, I recommend using a more automated service to prepare for future and ongoing reviews, as it’ll make the process a heck of a lot easier. Typically we use Get Five Stars or Infusionsoft integrations to complete this for our clients.

If you run a good business that people like, you can see results like this. This is a local business which had 7 reviews in 2015. Look where they are now with a little automation asking happy customers to leave a review:

Don’t ignore & don’t be afraid of links

One thing Google has succeeded at is scaring people away from manipulative links. In many areas, that went too far: it resulted in people not pursuing links at all, dismissing their value as a ranking factor, and telling the world that links are dead.

Well, I’m here to tell you that you need good links to your website. If you want to rank in competitive niches or in certain geographic areas, the anchor text can make a big difference. Multiple studies have shown the effectiveness of links to this very day, and their importance cannot be overlooked.

This table outlines which link tactics work best for each strategy:

| Strategy Type | Link Tactic |
|---|---|
| Local SEO (3-Pack) | Links to the local GMB-connected landing page will help 3-pack rankings. City, state, and keyword-included anchor text is beneficial. |
| Featured Snippets | Links to pages where you want to get a featured snippet will help boost the authority of that page. |
| Paid Ads | Links will not help your paid ads. |
| “Near Me” Searches | Links with city, state, or area anchor text will help you in “near me” searches. |
| Voice Search | Links to pages that are FAQs or consist of long-tail keyword content will help them rank better organically. |
| Barnacle SEO | Links to websites you don’t own can help those pages rank better. Focus on high-authority profiles or business listings. |

There are hundreds of ways to build links for your firm. You need to avoid paying for links and spammy tactics because they’re just going to hurt you. Focus on strong and sustainable strategies — if you want to do it right, there aren’t any shortcuts.

Since there are so many great link building resources out there, I’ve linked below to a few of my favorites, where you can get tactical advice and start building links.

For specific tactical link building strategies, check out these resources:

If you participate in outreach or broken link building, check out this new post from Directive Consulting — “How We Increased Our Email Response Rate from ~8% to 34%” — to increase the effectiveness of your outreach.

Get relevant & high-authority citations

While citations have taken a dive as a major ranking factor in recent years, they still carry quite a bit of weight.

Do you remember the example from earlier in this post, where we saw Findlaw and Yelp having strong visibility in the market? These websites get traffic, and if a potential customer is looking for you somewhere you’re not, that’s one touchpoint lost. You still need to prioritize quality over quantity. The days of chasing 1,000 citations are over, and have been for many years. If you have 1,000 citations, you probably have a lot of spam links to your website. We don’t need those. What we do need are highly relevant directories for our city or niche.

This post I wrote over 4 years ago is still a pretty relevant guide to finding these citations and building them consistently. Remember that high-authority citations can also be unstructured (not a typical business directory). They can also be very high-quality links if the site is authoritative and has fewer business listings. There are millions of listings on Yelp, but maybe fewer than one hundred on some other powerful, very niche-specific websites.

Citation and link idea: What awards was your business eligible for, or nominated for?

One way to get these is to consider awards where you can earn an authoritative citation and link to your website. Take a look at the example below of a legal website. This site is a peanut compared to a directory like Yelp. Sure, it doesn’t carry nearly as much authority, but the link equity is more evenly distributed.


Lastly, stay on point

2017 is sure to be a volatile year for local search, but it’s important to stay on point. Spread your wings, open your mind, and diversify with strategies that are going to get your business more customers.

Now it’s time to tell me what you think! Is something I didn’t mention working better for you? Where are you focusing your efforts in local search?


Results from the Local SEO Ranking Factors Study presented at SMX East

Wonder what factors correlate with strong local rankings? Contributor Christine Churchill summarizes the results of a recent local ranking factors study presented at SMX East 2016.


When You Rank High Organically But Not Locally (Case Study)

You’ve done everything right in terms of local SEO — you’re even ranking high in organic results — but you just can’t seem to get a place in the map pack. What’s wrong? Columnist Joy Hawkins explores.


Local Search Ranking Factors 2015

The 2015 Local Search Ranking Factors report is out, and it’s a must-read for anyone in the local SEO arena. As you may know, the survey polls roughly 40 leading local SEO practitioners on what they believe to be the variables most responsible for driving rankings in Google local search…
