Troubleshooting Local Ranking Failures [Updated for 2018]

Posted by MiriamEllis

I love a mystery… especially a local search ranking mystery I can solve for someone.

Now, the truth is, some ranking puzzles are so complex, they can only be solved by a formal competitive audit. But there are many others that can be cleared up by spending 15 minutes or less going through an organized 10-point checklist of the commonest problems that can cause a business to rank lower than the owner thinks it should. By zipping through the following checklist, there’s a good chance you’ll be able to find one or more obvious “whodunits” contributing to poor Google local pack visibility for a given search.

Since I wrote the original version of this post in 2014, so much has changed. Branding, tools, tactics — things are really different in 2018. Definitely time for a complete overhaul, with the goal of making you a super sleuth for your forum friends, clients, agency teammates, or executive superiors.

Let’s emulate the Stratemeyer Syndicate, which earned lasting fame by hitting on a simple formula for surfacing and solving mysteries in a most enjoyable way.

Before we break out our magnifying glass, it’s critical to stress one very important thing. The local rankings I see from an office in North Beach, San Francisco are not the rankings you see while roaming around Golden Gate Park in the same city. The rankings your client in Des Moines sees for things in his town are not the same rankings you see from your apartment in Albuquerque when you look at Des Moines results. With the user having become the centroid of search for true local searches, it is no mystery at all that we see different results when we are in different places, and it is no cause for concern.

And now that we’ve gotten that out of the way and are in the proper detective spirit, let’s dive into how to solve for each item on our checklist!


☑ Google updates/bugs

The first thing to ask if a business experiences a sudden change in rankings is whether Google has done something. Search Engine Land strikes me as the fastest reporter of Google updates, with MozCast offering an ongoing weather report of changes in the SERPs. Also, check out the Moz Google Algo Change history list and the Moz Blog for some of the most in-depth strategic coverage of updates, penalties, and filters.

For local-specific bugs (or even just suspected tests), check out the Local Search Forum, the Google My Business forum, and Mike Blumenthal’s blog. See if the effects being described match the weirdness you are seeing in your local packs. If so, it’s a matter of fixing a problematic practice (like iffy link building) that has been caught in an update, waiting to see how the update plays out, or waiting for Google to fix a bug or turn a dial down to normalize results.

Pro tip: Don’t make the mistake of thinking organic updates have nothing to do with local SEO. Crack detectives know organic and local are closely connected.

☑ Eligibility to list and rank

When a business owner wants to know why he isn’t ranking well locally, always ask these four questions:

  1. Does the business have a real address? (Not a PO box, virtual office, or a string of employees’ houses!)
  2. Does the business make face-to-face contact with its customers?
  3. What city is the business in?
  4. What is the exact keyword phrase they are hoping to rank for?

If the answer is “no” to either of the first two questions, the business isn’t eligible for a Google My Business listing. And while spam does flow through Google, a lack of eligibility could well be the key to a lack of rankings.

For the third question, you need to know the city the business is in so that you can see if it’s likely to rank for the search phrase cited in the fourth question. For example, a plumber with a street address in Sugar Land, TX should not expect to rank for “plumber Dallas TX.” If a business lacks a physical location in a given city, it’s atypical for it to rank for queries that stem from or relate to that locale. It’s amazing just how often this simple fact solves local pack mysteries.
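The first two questions act as hard eligibility gates, and the last two amount to a location check. A minimal sketch of that screening logic, assuming hypothetical field names (nothing here comes from any real Google API):

```python
# Hypothetical intake fields; not from any real Google API.
def gmb_eligibility_issues(business: dict) -> list:
    """Return reasons a business may not be eligible to rank for its target query."""
    issues = []
    if not business.get("has_real_address"):
        issues.append("no real street address (PO box/virtual office is ineligible)")
    if not business.get("face_to_face_contact"):
        issues.append("no face-to-face contact with customers")
    city = (business.get("city") or "").lower()
    target = (business.get("target_city") or "").lower()
    if city and target and city != target:
        issues.append("no physical location in the targeted city")
    return issues

# A Sugar Land plumber hoping to rank for "plumber Dallas TX":
print(gmb_eligibility_issues({
    "has_real_address": True,
    "face_to_face_contact": True,
    "city": "Sugar Land",
    "target_city": "Dallas",
}))  # ['no physical location in the targeted city']
```

Any non-empty result means the mystery may already be solved before you dig deeper.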

☑ Guideline spam

To be an ace local sleuth, you must commit to memory the guidelines for representing your business on Google so that you can quickly spot violations. Common acts of spam include:

  • Keyword stuffing the business name field
  • Improper wording of the business name field
  • Creating listings for ineligible locations, departments, or people
  • Category spam
  • Incorrect phone number implementation
  • Incorrect website URL implementation
  • Review guideline violations

If any of the above conundrums are new to you, definitely spend 10 minutes reading the guidelines. Make flash cards, if necessary, to test yourself on your spam awareness until you can instantly detect glaring errors. With this enhanced perception, you’ll be able to see problems that may be leading to lowered rankings, or even… suspensions!

☑ Suspensions

There are two key things to look for here when a local business owner comes to you with a ranking woe:

  1. If the listing was formerly verified, but has mysteriously become unverified, you should suspect a soft suspension. Soft suspensions might occur around something like a report of keyword-stuffing the GMB business name field. Oddly, however, there is little anecdotal evidence to support the idea that soft suspensions cause ranking drops. Nevertheless, it’s important to spot the un-verification clue and tell the owner to stop breaking guidelines. It’s possible that the listing may lose reviews or images during this type of suspension, but in most cases, the owner should be able to re-verify his listing. Just remember: a soft suspension is not a likely cause of low local pack rankings.
  2. If the listing’s rankings totally disappear and you can’t even find the listing via a branded search, it’s time to suspect a hard suspension. Hard suspensions can result from a listing falling afoul of a Google guideline or new update, a Google employee, or just a member of the public who has reported the business for something like an ineligible location. If the hard suspension is deserved, as in the case of creating a listing at a fake address, then there’s nothing you can do about it. But, if a hard suspension results from a mistake, I recommend taking it to the Google My Business forum to plead for help. Be prepared to prove that you are 100% guideline-compliant and eligible in hopes of getting your listing reinstated with its authority and reviews intact.

☑ Duplicates

Notorious for their ability to divide ranking strength, duplicate listings are at their worst when there is more than one verified listing representing a single entity. If you encounter a business that seems like it should be ranking better than it is for a given search, always check for duplicates.

The quickest way to do this is to get all present and past NAP (name, address, phone) from the business and plug it into the free Moz Check Listing tool. Pay particular attention to any GMB duplicates the tool surfaces. Then:

  1. If the entity is a brick-and-mortar business or service area business, and the NAP exactly matches between the duplicates, contact Google to ask them to merge the listings. If the NAP doesn’t match and represents a typo or error on the duplicate, use the “suggest an edit” link in Google Maps, switch the “yes/no” toggle to “yes,” and then select the radio button for “never existed.”
  2. If the duplicates represent partners in a multi-practitioner business, Google won’t simply delete them. Things get quite complicated in this scenario, and if you discover practitioner duplicates, tread carefully. There are half a dozen nuances here, including whether you’re dealing with actual duplicates, whether they represent current or past staffers, whether they are claimed or unclaimed, and even whether a past partner is deceased. There isn’t perfect industry agreement on the handling of all of the ins-and-outs of practitioner listings. Given this, I would advise an affected business to read up thoroughly on each of these nuances before making a move in any direction.
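The “exact NAP match” test in step 1 can be sketched as a normalize-then-compare helper, so trivial formatting differences (casing, punctuation, phone formatting) don’t mask a true duplicate. The listing fields here are hypothetical:

```python
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    """Lowercase, strip punctuation, and keep phone digits only before comparing."""
    clean = lambda s: re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()
    return (clean(name), clean(address), re.sub(r"\D", "", phone))

def is_merge_candidate(a: dict, b: dict) -> bool:
    """True when two listings share identical normalized NAP and should be merged."""
    return normalize_nap(**a) == normalize_nap(**b)

listing = {"name": "Acme Plumbing", "address": "123 Main St.", "phone": "(415) 555-0100"}
duplicate = {"name": "ACME Plumbing", "address": "123 Main St", "phone": "415-555-0100"}
print(is_merge_candidate(listing, duplicate))  # True: ask Google to merge, don't delete
```

When the normalized NAPs differ, you’re likely looking at a typo’d duplicate instead, which is the “suggest an edit” scenario rather than a merge request.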

☑ Missing/inaccurate listings

While you’ve got Moz Check Listing fired up, pay attention to anything it tells you about missing or inaccurate listings. The tool will show you how accurate and complete your listings are on the major local business data aggregators, plus other important platforms like Google My Business, Facebook, Factual, Yelp, and more. Why does this matter?

  1. Google can pull information from anywhere on the web and plunk it into your Google My Business listing.
  2. While no one can quantify the exact degree to which citation/listing consistency directly impacts Google local rankings for every possible search query, it has been a top 5 ranking factor in the annual Local Search Ranking Factors survey as far back as I can remember. Recently, I’ve seen some industry discussion as to whether citations still matter, with some practitioners claiming they can’t see the difference they make. I believe that conclusion may stem from working mainly in ultra-competitive markets where everyone has already got their citations in near-perfect order, forcing practitioners to look for differentiation tactics beyond the basics. But without those basics, you’re missing table stakes in the game.
  3. Indirectly, listing absence or inconsistency impacts local rankings in that it undermines the quest for good local KPIs as well as organic authority. Every lost or misdirected consumer represents a failure to have someone click-for-directions, click-to-call, click-to-your website, or find your website at all. Online and offline traffic, conversions, reputation, and even organic authority all hang in the balance of active citation management.

☑ Lack of organic authority

Full website or competitive audits are not the work of a minute. They really take time, and deep delving. But, at a glance, you can access some quick metrics to let you know whether a business’ lack of achievement on the organic side of things could be holding them back in the local packs. Get yourself the free MozBar SEO toolbar and try this:

  1. Turn the MozBar on by clicking the little “M” at the top of your browser so that it is blue.
  2. Perform your search and look at the first few pages of the organic results, ignoring anything from major directory sites like Yelp (they aren’t competing with you for local pack rankings, eh?).
  3. Note down the Page Authority, Domain Authority, and link counts for each of the businesses coming up on the first 3 pages of the organic results.
  4. Finally, bring up the website of the business you’re investigating. If you see that the top competitors have Domain Authorities of 50 and links numbering in the hundreds or thousands, whereas your target site is well below in these metrics, chances are good that organic authority is playing a strong role in lack of local search visibility. How do we know this is true? Do some local searches and note just how often the businesses that make it into the 3-pack or the top of the local finder view have correlating high organic rankings.
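The gap check in step 4 might look like this, with made-up numbers standing in for the Page Authority, Domain Authority, and link counts you noted from the MozBar:

```python
# Made-up metrics recorded while stepping through the organic results; not real data.
competitors = [
    {"site": "competitor-a.com", "da": 52, "pa": 45, "links": 1800},
    {"site": "competitor-b.com", "da": 48, "pa": 41, "links": 950},
    {"site": "competitor-c.com", "da": 55, "pa": 50, "links": 3200},
]
target = {"site": "client-site.com", "da": 18, "pa": 22, "links": 40}

avg_da = sum(c["da"] for c in competitors) / len(competitors)
avg_links = sum(c["links"] for c in competitors) / len(competitors)

# A large DA gap or an order-of-magnitude link deficit suggests organic
# authority is what's holding back local pack visibility.
if avg_da - target["da"] > 20 or target["links"] < avg_links / 10:
    print(f"Likely authority gap: DA {target['da']} vs. pack average {avg_da:.0f}")
```

The thresholds are judgment calls, not fixed rules; the point is simply to make the disparity between the target site and the pack leaders explicit.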

Where organic authority is poor, a business has a big job of work ahead. They need to focus on content dev + link building + social outreach to begin building up their brand in the minds of consumers and the “RankBrain” of Google.

One other element needs to be mentioned here, and that’s the concept of how time affects authority. When you’re talking to a business with a ranking problem, it’s very important to ascertain whether they just launched their website or just built their local business listings last week, or even just a few months ago. Typically, if they have, the fruits of their efforts have yet to fully materialize. That being said, it’s not a given that a new business will have little authority. Large brands have marketing departments which exist solely to build tremendous awareness of new assets before they even launch. It’s important to keep that in mind, while also realizing that if the business is smaller, building authority will likely represent a longer haul.

☑ Possum effect

Where local rankings are absent, always ask:

“Are there any other businesses in your building or even on your street that share your Google category?”

If the answer is “yes,” search for the business’ desired keyword phrase and look at the local finder view in Google Maps. Note which companies are ranking. Then begin to zoom in on the map, level by level, noting changes in the local finder as you go. If, a few levels in, the business you’re advising suddenly appears on the map and in the local finder, chances are good it’s the Possum filter that’s causing their apparent invisibility at the automatic zoom level.

Google Possum rolled out in September 2016, and its observable effects included a geographic diversification of the local results, filtering out many listings that share a category and are in close proximity to one another. Then, about one year later, Google initiated the Hawk update, which appears to have tightened the radius of Possum, with the result that while many businesses in the same building are still being filtered out, a number of nearby neighbors have reappeared at the automatic zoom level of the results.

If your sleuthing turns up a brand that is being impacted by Possum/Hawk, the only surefire way to beat the filter is to put in the necessary work to become the most authoritative answer for the desired search phrase. It’s important to remember that filters are the norm in Google’s local results, and have long been observed impacting listings that share an address, share a phone number, etc. If it’s vital for a particular listing to outrank all others that possess shared characteristics, then authority must be built around it in every possible way to make it one of the most dominant results.

☑ Local Service Ads effect

The question you ask here is:

“Is yours a service-area business?”

And if the answer is “yes,” then brace yourself for ongoing results disruption in the coming year.

Google’s Local Service Ads (formerly Home Service Ads) make Google the middleman between consumers and service providers, and in the 2+ years since first early testing, they’ve caused some pretty startling things to happen to local search results.

Suffice it to say, rollout to an ever-increasing number of cities and categories hasn’t been for the faint of heart, and I would hazard a guess that Google’s recent re-brand of this program signifies their intention to move beyond the traditional SAB market. One possible benefit of Google getting into this type of lead gen is that it could decrease spam, but I’m not sold on this, given that fake locations have ended up qualifying for LSA inclusion. While I honor Google’s need to be profitable, I share some of the qualms business owners have expressed about the potential impacts of this venture.

Since I can’t offer a solid prediction of what precise form these impacts will take in the coming months, the best I can do here is to recommend that if an SAB experiences a ranking change/loss, the first thing to look for is whether LSA has come to town. If so, alteration of the SERPs may be unavoidable, and the only strategy left for overcoming vanished visibility may be to pay for it… by qualifying for the program.

☑ GMB neglect

Sometimes, a lack of competitive rankings can simply be chalked up to a lack of effort. If a business wonders why they’re not doing better in the local packs, pull up their GMB listing and do a quick evaluation of:

  • Verification status – While you can rank without verifying, lack of verification is a hallmark of listing neglect.
  • Basic accuracy – If NAP or map markers are incorrect, it’s a sure sign of neglect.
  • Category choices – Wrong categories make right rankings impossible.
  • Image optimization – Every business needs a good set of the most professional, persuasive photos it can acquire, and should even consider periodic new photo shoots for seasonal freshness; imagery impacts KPIs, which are believed to impact rank.
  • Review count, sentiment and management – Too few reviews, low ratings, and lack of responses = utter neglect of this core rank/reputation-driver.
  • Hours of operation – If they’re blank or incorrect, conversions are being missed.
  • Main URL choice – Does the GMB listing point to a strong, authoritative website page or a weak one?
  • Additional URL choices – If menus, bookings, reservations, or placing orders is part of the business model, a variety of optional URLs are supported by Google and should be explored.
  • Google Posts – Early-days testing indicates that regular posting may impact rank.
  • Google Questions and Answers – Pre-populate with best FAQs and actively manage incoming questions.
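If you audit many listings, the evaluation above can be scripted as a simple checklist run. All the listing fields and thresholds here are hypothetical, stand-ins for what you’d record from a manual GMB review:

```python
# Hypothetical listing fields; thresholds are illustrative judgment calls.
CHECKS = {
    "verified": lambda l: l.get("verified", False),
    "categories set": lambda l: bool(l.get("categories")),
    "photos uploaded": lambda l: l.get("photo_count", 0) >= 5,
    "reviews responded to": lambda l: l.get("review_response_rate", 0.0) >= 0.5,
    "hours listed": lambda l: bool(l.get("hours")),
    "website URL set": lambda l: bool(l.get("website")),
}

def audit_gmb(listing: dict) -> list:
    """Return the checklist items a listing is neglecting."""
    return [name for name, passed in CHECKS.items() if not passed(listing)]

neglected = audit_gmb({"verified": True, "categories": ["Plumber"], "photo_count": 2})
print(neglected)
# ['photos uploaded', 'reviews responded to', 'hours listed', 'website URL set']
```

Each returned item maps back to a bullet above and becomes a to-do for the business owner.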

There is literally no business, large or small, with a local footprint that can afford to neglect its Google My Business listing. And while some fixes and practices move the ranking needle more than others, the increasing number of consumer actions that take place within Google is reason enough to put active GMB management at the top of your list.


Closing the case

The Hardy Boys never went anywhere without their handy kit of detection tools. Their father was so confident in their utter preparedness that he even let them chase down gangs in Hong Kong and dictators in the Guyanas (which, on second thought, doesn’t seem terribly wise.) But I have that kind of confidence in you. I hope my troubleshooting checklist is one you’ll bookmark and share to be prepared for the local ranking mysteries awaiting you and your digital marketing colleagues in 2018. Happy sleuthing!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


The Linkbait Bump: How Viral Content Creates Long-Term Lift in Organic Traffic – Whiteboard Friday

Posted by randfish

A single fantastic (or “10x”) piece of content can lift a site’s traffic curves long beyond the popularity of that one piece. In today’s Whiteboard Friday, Rand talks about why those curves settle into a “new normal,” and how you can go about creating the content that drives that change.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about the linkbait bump, a classic phrase in the SEO world, and almost a little dated. I think today we’re talking a little bit more about viral content and how high-quality content, the kind that really is the cornerstone of a brand’s or a website’s presence, can be an incredible and powerful driver of traffic, not just when it initially launches but over time.

So let’s take a look.

This is a classic linkbait bump, viral content bump analytics chart. I’m seeing over here my traffic and over here the different months of the year. You know, January, February, March, like I’m under a thousand. Maybe I’m at 500 visits or something, and then I have this big piece of viral content. It performs outstandingly well from a relative standpoint for my site. It gets 10,000 or more visits, drives a ton more people to my site, and then what happens is that that traffic falls back down. But the new normal down here, new normal is higher than the old normal was. So the new normal might be at 1,000, 1,500 or 2,000 visits whereas before I was at 500.
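The baseline shift in that chart can be sketched with made-up monthly visit counts (nothing here is real analytics data):

```python
# Made-up monthly visits showing a viral spike that settles into a higher baseline.
visits = [500, 520, 480, 10_000, 2_600, 1_800, 1_500, 1_450, 1_500]

spike_month = visits.index(max(visits))
old_normal = sum(visits[:spike_month]) / spike_month   # average before the spike
new_normal = sum(visits[-3:]) / 3                      # average once traffic settles

print(f"old normal ~{old_normal:.0f}/mo, new normal ~{new_normal:.0f}/mo")
# old normal ~500/mo, new normal ~1483/mo: a lift that outlasts the spike itself
```

The interesting number isn’t the peak; it’s the gap between the two baselines once the spike has faded.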

Why does this happen?

A lot of folks see an analytics chart like this, see examples of content that’s done this for websites, and they want to know: Why does this happen and how can I replicate that effect? The reasons why are it sort of feeds back into that viral loop or the flywheel, which we’ve talked about in previous Whiteboard Fridays, where essentially you start with a piece of content. That content does well, and then you have things like more social followers on your brand’s accounts. So now next time you go to amplify content or share content socially, you’re reaching more potential people. You have a bigger audience. You have more people who share your content because they’ve seen that that content performs well for them in social. So they want to find other content from you that might help their social accounts perform well.

You see more RSS and email subscribers because people see your interesting content and go, “Hey, I want to see when these guys produce something else.” You see more branded search traffic because people are looking specifically for content from you, not necessarily just around this viral piece, although that’s often a big part of it, but around other pieces as well, especially if you do a good job of exposing them to that additional content. You get more bookmark and type in traffic, more searchers biased by personalization because they’ve already visited your site. So now when they search and they’re logged into their accounts, they’re going to see your site ranking higher than they normally would otherwise, and you get an organic SEO lift from all the links and shares and engagement.

So there’s a ton of different factors that feed into this, and you kind of want to hit all of these things. If you have a piece of content that gets a lot of shares, a lot of links, but then doesn’t promote engagement, doesn’t get more people signing up, doesn’t get more people searching for your brand or searching for that content specifically, then it’s not going to have the same impact. Your traffic might fall further and more quickly.

How do you achieve this?

How do we get content that’s going to do this? Well, we’re going to talk through a number of things that we’ve talked about previously on Whiteboard Friday. But there are some additional ones as well. This isn’t just creating good content or creating high quality content, it’s creating a particular kind of content. So for this what you want is a deep understanding, not necessarily of what your standard users or standard customers are interested in, but a deep understanding of what influencers in your niche will share and promote and why they do that.

This often means that you follow a lot of sharers and influencers in your field, and you understand, hey, they’re all sharing X piece of content. Why? Oh, because it does this, because it makes them look good, because it helps their authority in the field, because it provides a lot of value to their followers, because they know it’s going to get a lot of retweets and shares and traffic. Whatever that because is, you have to have a deep understanding of it in order to have success with viral kinds of content.

Next, you want to have empathy for users and what will give them the best possible experience. So if you know, for example, that a lot of people are coming on mobile and are going to be sharing on mobile, which is true of almost all viral content today, FYI, you need to be providing a great mobile and desktop experience. Oftentimes that mobile experience has to be different, not just responsive design, but actually a different format, a different way of being able to scroll through or watch or see or experience that content.

There are some good examples out there of content that does that. It makes a very different user experience based on the browser or the device you’re using.

You also need to be aware of what will turn them off. So promotional messages, pop-ups, trying to sell to them, oftentimes that diminishes user experience. It means that content that could have been more viral, that could have gotten more shares won’t.

Unique value and attributes that separate your content from everything else in the field. So if there’s like ABCD and whoa, what’s that? That’s very unique. That stands out from the crowd. That provides a different form of value in a different way than what everyone else is doing. That uniqueness is often a big reason why content spreads virally, why it gets more shared than just the normal stuff.

I’ve talked about this a number of times, but content that’s 10X better than what the competition provides. So unique value from the competition, but also quality that is not just a step up, but 10X better, massively, massively better than what else you can get out there. That makes it unique enough. That makes it stand out from the crowd, and that’s a very hard thing to do, but that’s why this is so rare and so valuable.

This is a critical one, and I think one that, I’ll just say, many organizations fail at. That is the freedom and support to fail many times, to try to create these types of effects, to have this impact many times before you hit on a success. A lot of managers and clients and teams and execs just don’t give marketing teams and content teams the freedom to say, “Yeah, you know what? You spent a month and developer resources and designer resources and spent some money to go do some research and contracted with this third party, and it wasn’t a hit. It didn’t work. We didn’t get the viral content bump. It just kind of did okay. You know what? We believe in you. You’ve got a lot of chances. You should try this another 9 or 10 times before we throw it out. We really want to have a success here.”

That is something that very few teams invest in. The powerful thing is because so few people are willing to invest that way, the ones that do, the ones that believe in this, the ones that invest long term, the ones that are willing to take those failures are going to have a much better shot at success, and they can stand out from the crowd. They can get these bumps. It’s powerful.

Not a requirement, but it really, really helps to have a strong engaged community, either on your site and around your brand, or at least in your niche and your topic area that will help, that wants to see you, your brand, your content succeed. If you’re in a space that has no community, I would work on building one, even if it’s very small. We’re not talking about building a community of thousands or tens of thousands. A community of 100 people, a community of 50 people even can be powerful enough to help content get that catalyst, that first bump that’ll boost it into viral potential.

Then finally, for this type of content, you need to have a logical and not overly promotional match between your brand and the content itself. You can see many sites in what I call sketchy niches. So like a criminal law site or a casino site or a pharmaceutical site that’s offering like an interactive musical experience widget, and you’re like, “Why in the world is this brand promoting this content? Why did they even make it? How does that match up with what they do? Oh, it’s clearly just intentionally promotional.”

Look, many of these brands go out there and they say, “Hey, the average web user doesn’t know and doesn’t care.” I agree. But the average web user is not an influencer. Influencers know. Well, they’re very, very suspicious of why content is being produced and promoted, and they’re very skeptical of promoting content that they don’t think is altruistic. So this kills a lot of content for brands that try and invest in it when there’s no match. So I think you really need that.

Now, when you do these linkbait bump kinds of things, I would strongly recommend that you follow up: keep considering the quality of the content you’re producing thereafter, invest in maintaining these resources and keeping them updated, and don’t simply give up on content production after this. However, if you’re a small business site, a small or medium business, you might think about only doing one or two of these a year. If you are a heavy content player and content marketing is how you’re investing in web traffic, I’d probably be considering these weekly or monthly at the least.

All right, everyone. Look forward to your experiences with the linkbait bump, and I will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Why We Can’t Do Keyword Research Like It’s 2010 – Whiteboard Friday

Posted by randfish

Keyword research is a very different field than it was just five years ago, and if we don't keep up with the times we might end up doing more harm than good. From the research itself to the selection and targeting process, in today's Whiteboard Friday Rand explains what has changed and what we all need to do to conduct effective keyword research today.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

What do we need to change to keep up with the changing world of keyword research?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about keyword research, why it’s changed from the last five, six years and what we need to do differently now that things have changed. So I want to talk about changing up not just the research but also the selection and targeting process.

There are three big areas that I’ll cover here. There’s lots more in-depth stuff, but I think we should start with these three.

1) The AdWords Keyword Tool hides data!

This is where almost all of us in the SEO world start and oftentimes end with our keyword research. We go to the AdWords Keyword Tool, what used to be the external Keyword Tool and is now the Keyword Planner inside AdWords. We go inside that tool, and we look at the volume that's reported, and we sort of record that as, well, it's not good, but it's the best we're going to do.

However, I think there are a few things to consider here. First off, that tool is hiding data. What I mean by that is not that they're not telling the truth, but they're not telling the whole truth, and they're not telling nothing but the truth, because those rounded-off numbers that you always see, you know that those are inaccurate. Anytime you've bought keywords, you've seen that the impression count never matches the count that you see in the AdWords tool. It's not usually massively off, but it's often off by a good degree, and the only thing it's great for is telling relative volume from one term to another.

But because AdWords hides data essentially by saying like, “Hey, you’re going to type in . . .” Let’s say I’m going to type in “college tuition,” and Google knows that a lot of people search for how to reduce college tuition, but that doesn’t come up in the suggestions because it’s not a commercial term, or they don’t think that an advertiser who bids on that is going to do particularly well and so they don’t show it in there. I’m giving an example. They might indeed show that one.

But because that data is hidden, we need to go deeper. We need to go beyond and look at things like Google Suggest and related searches, which are down at the bottom. We need to start conducting customer interviews and staff interviews, which hopefully has always been part of your brainstorming process but really needs to be now. Then you can apply that to AdWords. You can apply that to suggest and related.

The beautiful thing is once you get these terms from places like forums or communities and discussion boards, seeing what terms and phrases people are using, you can collect all this stuff up, plug it back into AdWords, and now they will tell you how much volume they've got. So you take that "how to reduce college tuition" term, you plug it into AdWords, and they will show you a number, a non-zero number. They were just hiding it in the suggestions because they thought, "Hey, you probably don't want to bid on that. That won't bring you a good ROI." So you've got to be careful with that, especially when it comes to SEO kinds of keyword research.

2) Building separate pages for each term or phrase doesn’t make sense

It used to be the case that we built separate pages for every single term and phrase that was in there, because we wanted to have the maximum keyword targeting that we could. So it didn’t matter to us that college scholarship and university scholarships were essentially people looking for exactly the same thing, just using different terminology. We would make one page for one and one page for the other. That’s not the case anymore.

Today, we need to group by the same searcher intent. If two searchers are searching for two different terms or phrases but both of them have exactly the same intent, they want the same information, they’re looking for the same answers, their query is going to be resolved by the same content, we want one page to serve those, and that’s changed up a little bit of how we’ve done keyword research and how we do selection and targeting as well.
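That grouping step can be sketched with a tiny helper. The keywords and intent labels below are hypothetical; in practice, the intent label comes from your own research into what the searcher actually wants, not from any tool:

```python
from collections import defaultdict

# Hypothetical (keyword phrase, searcher intent) pairs.
keywords = [
    ("college scholarships", "scholarship-info"),
    ("university scholarships", "scholarship-info"),
    ("how to reduce college tuition", "tuition-reduction"),
]

def group_by_intent(pairs):
    """Group keyword phrases that share the same searcher intent,
    so each group can be served by a single page."""
    groups = defaultdict(list)
    for phrase, intent in pairs:
        groups[intent].append(phrase)
    return dict(groups)

pages = group_by_intent(keywords)
# "college scholarships" and "university scholarships" land on one page,
# since they resolve to the same intent.
```

The point of the sketch is simply that the unit of targeting becomes the intent group, not the individual phrase.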

3) Build your keyword consideration and prioritization spreadsheet with the right metrics

Everybody’s got an Excel version of this, because I think there’s just no awesome tool out there that everyone loves yet that kind of solves this problem for us, and Excel is very, very flexible. So we go into Excel, we put in our keyword, the volume, and then a lot of times we almost stop there. We did keyword volume and then like value to the business and then we prioritize.

What are all these new columns you're showing me, Rand? Well, here's what I'm seeing sophisticated, modern SEOs, the ones at the more advanced agencies and the more advanced in-house practitioners, add to the keyword process.

Difficulty

A lot of folks have done this, but difficulty helps us say, “Hey, this has a lot of volume, but it’s going to be tremendously hard to rank.”

The difficulty score that Moz uses and attempts to calculate is a weighted average of the top 10 domain authorities. It also uses page authority, so it’s kind of a weighted stack out of the two. If you’re seeing very, very challenging pages, very challenging domains to get in there, it’s going to be super hard to rank against them. The difficulty is high. For all of these ones it’s going to be high because college and university terms are just incredibly lucrative.
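As a rough sketch of that idea (not Moz's actual formula, whose weighting is proprietary), a difficulty score could blend the domain and page authorities of the current top 10 results. The 50/50 weight and the authority numbers below are illustrative assumptions:

```python
def keyword_difficulty(results, domain_weight=0.5):
    """Rough difficulty sketch: a weighted blend of the average
    domain authority and average page authority of the top 10
    ranking results. The weighting is illustrative, not Moz's."""
    top10 = results[:10]
    avg_da = sum(r["domain_authority"] for r in top10) / len(top10)
    avg_pa = sum(r["page_authority"] for r in top10) / len(top10)
    return domain_weight * avg_da + (1 - domain_weight) * avg_pa

# Hypothetical top-10 authorities for a lucrative university-related term:
serp = [{"domain_authority": 90, "page_authority": 70} for _ in range(10)]
print(keyword_difficulty(serp))  # 80.0 -- a high score, hard to rank
```

A high score like this is exactly the bias-against signal described above: strong domains and strong pages across the whole top 10.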

That difficulty can help bias you against chasing after terms and phrases for which you are very unlikely to rank, at least early on. If you feel like, "Hey, I already have a powerful domain. I can rank for everything I want. I am the thousand-pound gorilla in my space," great. Go after the difficulty of your choice, but this helps prioritize.

Opportunity

This is actually very rarely used, but I think sophisticated marketers are using it extremely intelligently. Essentially what they’re saying is, “Hey, if you look at a set of search results, sometimes there are two or three ads at the top instead of just the ones on the sidebar, and that’s biasing some of the click-through rate curve.” Sometimes there’s an instant answer or a Knowledge Graph or a news box or images or video, or all these kinds of things that search results can be marked up with, that are not just the classic 10 web results. Unfortunately, if you’re building a spreadsheet like this and treating every single search result like it’s just 10 blue links, well you’re going to lose out. You’re missing the potential opportunity and the opportunity cost that comes with ads at the top or all of these kinds of features that will bias the click-through rate curve.

So what I’ve seen some really smart marketers do is essentially build some kind of a framework to say, “Hey, you know what? When we see that there’s a top ad and an instant answer, we’re saying the opportunity if I was ranking number 1 is not 10 out of 10. I don’t expect to get whatever the average traffic for the number 1 position is. I expect to get something considerably less than that. Maybe something around 60% of that, because of this instant answer and these top ads.” So I’m going to mark this opportunity as a 6 out of 10.

There are two top ads here, so I'm giving this a 7 out of 10. This has two top ads and then it has a news block below the first position. So again, I'm going to reduce that click-through rate. I think that's going down to a 6 out of 10.

You can get more and less scientific and specific with this. Click-through rate curves are imperfect by nature because we truly can’t measure exactly how those things change. However, I think smart marketers can make some good assumptions from general click-through rate data, which there are several resources out there on that to build a model like this and then include it in their keyword research.

This does mean that you have to run a query for every keyword you're thinking about, but you should be doing that anyway. You want to get a good look at who's ranking in those search results and what kind of content they're building. If you're running a keyword difficulty tool, you are already getting something like that.
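A minimal sketch of that discounting model might look like the following. Every discount value here is an assumption to tune against your own click-through data; nobody knows the true impact of each SERP feature:

```python
# Illustrative CTR discounts per SERP feature (fractions of a perfect 10).
# These numbers are assumptions, not measured values.
FEATURE_DISCOUNT = {
    "instant_answer": 0.25,
    "top_ad": 0.10,        # applied once per top-of-page ad
    "news_block": 0.10,
    "knowledge_graph": 0.15,
}

def opportunity_score(features):
    """Start from a perfect 10 ('ten blue links') and subtract a
    discount for each SERP feature that biases clicks away from
    the organic results. Floored at 0."""
    score = 10.0
    for feature in features:
        score -= 10 * FEATURE_DISCOUNT.get(feature, 0)
    return max(score, 0.0)

# One instant answer plus one top ad, in the spirit of the 6/10 example:
print(opportunity_score(["instant_answer", "top_ad"]))  # 6.5
```

The exact numbers matter far less than consistency: as long as every keyword is scored with the same assumptions, the column still ranks opportunities correctly relative to each other.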

Business value

This is a classic one. Business value is essentially saying, “What’s it worth to us if visitors come through with this search term?” You can get that from bidding through AdWords. That’s the most sort of scientific, mathematically sound way to get it. Then, of course, you can also get it through your own intuition. It’s better to start with your intuition than nothing if you don’t already have AdWords data or you haven’t started bidding, and then you can refine your sort of estimate over time as you see search visitors visit the pages that are ranking, as you potentially buy those ads, and those kinds of things.

You can get more sophisticated around this. I think a 10 point scale is just fine. You could also use a one, two, or three there, that’s also fine.

Requirements or Options

Then I don’t exactly know what to call this column. I can’t remember who showed me the version that had it in there. I think they called it Optional Data or Additional SERPs Data, but I’m going to call it Requirements or Options. Requirements because this is essentially saying, “Hey, if I want to rank in these search results, am I seeing that the top two or three are all video? Oh, they’re all video. They’re all coming from YouTube. If I want to be in there, I’ve got to be video.”

Or something like, “Hey, I’m seeing that most of the top results have been produced or updated in the last six months. Google appears to be biasing to very fresh information here.” So, for example, if I were searching for “university scholarships Cambridge 2015,” well, guess what? Google probably wants to bias toward results that are either the official page on Cambridge’s website or articles from this year about getting into that university and the scholarships that are available or offered. In both the college and university scholarships examples, I saw a significant number of SERPs where a fresh bump appeared to be required. You can often tell because the date is shown ahead of the description, and it’s very fresh, sometime in the last six months or a year.
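Those two checks, format and freshness, can be sketched as a small flagging function. The input is a hypothetical list of result records you'd fill in by eyeballing the SERP yourself; the 180-day window and the "half the results" threshold are assumptions:

```python
from datetime import date

def serp_requirements(results, fresh_days=180, today=None):
    """Flag format or freshness 'requirements' observed in a SERP.
    `results` is a hypothetical list of dicts with 'type' and
    'published' keys, collected manually from the search results."""
    today = today or date.today()
    flags = []
    top3 = results[:3]
    # If the whole top 3 is video, treat video as a requirement.
    if top3 and all(r["type"] == "video" for r in top3):
        flags.append("video required")
    # If at least half the results are recently dated, flag freshness.
    fresh = [r for r in results
             if r.get("published")
             and (today - r["published"]).days <= fresh_days]
    if results and len(fresh) >= len(results) / 2:
        flags.append("freshness bump likely required")
    return flags

observed = [
    {"type": "video", "published": date(2015, 5, 1)},
    {"type": "video", "published": date(2015, 4, 1)},
    {"type": "video", "published": None},
]
print(serp_requirements(observed, today=date(2015, 6, 15)))
```

Either flag feeds straight into the column: it tells you what your content has to be before rankings are even on the table.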

Prioritization

Then finally I can build my prioritization. So based on all the data I had here, I essentially said, “Hey, you know what? These are not 1 and 2. This is actually 1A and 1B, because these are the same concepts. I’m going to build a single page to target both of those keyword phrases.” I think that makes good sense. Someone who is looking for college scholarships, university scholarships, same intent.

I am giving it a slight prioritization, 1A versus 1B, and the reason I do this is because I always have one keyword phrase that I’m leaning on a little more heavily. Because Google isn’t perfect around this, the search results will be a little different. I want to bias to one versus the other. In this case, since I’m targeting university more than college, my title tag might say something like “college and university scholarships,” so that “university” and “scholarships” are nicely together, near the front of the title, that kind of thing. Then 1B, 2, 3.
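Pulling the columns together, the final prioritization can be sketched as one combined score. The formula and every number below are illustrative assumptions, not a standard weighting; the point is only that volume, difficulty, opportunity, and business value all pull on the final ranking:

```python
def priority(volume, difficulty, opportunity, business_value):
    """Illustrative priority score: reward volume, opportunity (0-10),
    and business value (0-10); penalize difficulty (0-100).
    The weighting is an assumption, not a standard formula."""
    return (volume
            * (opportunity / 10)
            * (business_value / 10)
            * (1 - difficulty / 100))

# Hypothetical spreadsheet rows: (volume, difficulty, opportunity, value).
rows = {
    "university scholarships":       (9000, 80, 6.0, 9),
    "college scholarships":          (8000, 80, 6.5, 9),
    "how to reduce college tuition": (1200, 40, 8.0, 7),
}

ranked = sorted(rows, key=lambda k: priority(*rows[k]), reverse=True)
print(ranked)
# The two scholarship terms score nearly identically -- the 1A/1B
# situation above -- while the tuition term trails well behind.
```

With numbers this close for the scholarship pair, the tie-break is exactly the judgment call described above: pick the phrase you lean on more heavily and call it 1A.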

This is kind of the way that modern SEOs are building a more sophisticated process with better data, more inclusive data that helps them select the right kinds of keywords and prioritize to the right ones. I’m sure you guys have built some awesome stuff. The Moz community is filled with very advanced marketers, probably plenty of you who’ve done even more than this.

I look forward to hearing from you in the comments. I would love to chat more about this topic, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.



Reblogged 3 years ago from tracking.feedpress.it