The Local Algorithm: Relevance, Proximity, and Prominence

Posted by MaryBowling

How does Google decide what goes into the local pack? It doesn’t have to be a black box — there’s logic behind the order. In this week’s Whiteboard Friday, renowned local SEO expert Mary Bowling lays out the three factors that drive Google’s local algorithm and local rankings in a simple and concise way anyone can understand.


Video Transcription

Hi, Moz fans. This is Mary Bowling from Ignitor Digital, and today I want to talk to you about the local algorithm. I’d like to make this as simple as possible for people to understand, because I think it’s a very confusing thing for a lot of SEOs who don’t do this every day.

The local algorithm has always been based on relevance, prominence, and proximity

1. Relevance

For relevance, what the algorithm is asking is, "Does this business do or sell or have the attributes that the searcher is looking for?" That's pretty simple. So that gives us all these businesses over here that might be relevant.

2. Proximity

For proximity, the question really is, "Is the business close enough to the searcher to be considered a good answer for this query?" This is what trips people up. This is what really defines the local algorithm: proximity. So I'm going to try to explain that in very simple terms here today.

Let’s say we have a searcher in a particular location, and she’s really hungry today and she wants some egg rolls. So her query is egg rolls. If she were to ask for egg rolls near me, these businesses are the ones that the algorithm would favor.

3. Prominence

For prominence, the algorithm is asking, "Which businesses are the most popular and the most well regarded in their local market area?" Those businesses are the closest to her, and Google would most likely rank them by their prominence. If she were to ask for something in a particular place, let's say this is a downtown area and she asked for egg rolls downtown because she didn't want to be away from work too long, then the algorithm is actually going to favor the businesses that sell egg rolls in the downtown area, even though that's further away from where the searcher is.

If she were to ask for egg rolls open now, there might be a business here and a business here and a business here that are open now, and they would be the ones that the algorithm would consider. So relevance is kicking in on the query. If she were to ask for the cheapest egg rolls, that might be here and here.

If she were to ask for the best egg rolls, that might be very, very far away, or it could be a combination of all kinds of locations. So you really need to think of proximity as a fluid thing. It's like a rubber band: what Google shows in that local pack depends on

  • the query
  • the searcher's location
  • the relevance to the query
  • and the prominence of the business
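To make that rubber band concrete, here is a toy sketch of how a query-dependent radius might combine with relevance and prominence. To be clear, this is an illustration only: the weights, radii, and scores below are invented, not Google's actual formula.

```typescript
// Toy model of the local pack: NOT Google's actual formula.
// All weights, radii, and scores are invented for illustration.

interface Business {
  name: string;
  relevance: number;   // 0..1: does it match what the searcher wants?
  prominence: number;  // 0..1: reviews, links, reputation
  distanceKm: number;  // distance from the searcher
}

// The "rubber band": different queries tolerate different distances.
function maxRadiusKm(query: string): number {
  if (query.includes("near me")) return 2;  // tight radius
  if (query.includes("best")) return 25;    // radius stretches way out
  return 8;                                 // default
}

function localPack(query: string, businesses: Business[]): Business[] {
  const radius = maxRadiusKm(query);
  return businesses
    .filter(b => b.relevance > 0.5 && b.distanceKm <= radius) // relevant and close enough
    .sort((a, b) => b.prominence - a.prominence)              // then ranked by prominence
    .slice(0, 3);                                             // the three-pack
}

const eggRollSpots: Business[] = [
  { name: "Egg Roll Express",   relevance: 0.9, prominence: 0.5,  distanceKm: 1 },
  { name: "Downtown Dumplings", relevance: 0.9, prominence: 0.8,  distanceKm: 6 },
  { name: "Famous Far Wok",     relevance: 0.9, prominence: 0.95, distanceKm: 20 },
];

console.log(localPack("egg rolls near me", eggRollSpots)); // only Egg Roll Express is close enough
console.log(localPack("best egg rolls", eggRollSpots));    // the band stretches; all three compete
```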

I hope that makes it much clearer to those of you who haven't understood the local algorithm. If you have some comments or suggestions, please leave them below, and thanks for listening.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


How Does the Local Algorithm Work? – Whiteboard Friday

Posted by JoyHawkins

When it comes to Google’s algorithms, there’s quite a difference between how they treat local and organic. Get the scoop on which factors drive the local algorithm and how it works from local SEO extraordinaire, Joy Hawkins, as she offers a taste of her full talk from MozCon 2019.


Video Transcription

Hello, Moz fans. I’m Joy Hawkins. I run a local SEO agency from Toronto, Canada, and a search forum known as the Local Search Forum, which basically is devoted to anything related to local SEO or local search. Today I’m going to be talking to you about Google’s local algorithm and the three main factors that drive it. 

If you’re wondering what I’m talking about when I say the local algorithm, this is the algorithm that fuels what we call the three-pack here. When you do a local search, or a search that Google thinks has local intent, like plumbers let’s say, you traditionally get three results at the top with the map, and then everything below it I refer to as organic. The algorithm I’ll be breaking down is what fuels this three-pack, also known as Google My Business listings or Google Maps listings.

Those are all names for the exact same thing. If you search Google’s Help Center for what they look at when ranking these entities, they tell you that there are three main things that fuel this algorithm: proximity, prominence, and relevance. I’m going to break down each one and explain how the factors work.

1. Proximity

I’ll start here with proximity. Proximity is basically your location when you’re searching: the place Google thinks you’re located when you type something in on your phone or your computer. If you’re not really sure, you can often scroll down to the bottom of the page, where Google will list the zip code it thinks you’re in.

Zip code (desktop)

The other way to tell is if you’re on a phone, where you can sometimes see a little blue dot on the map, which is exactly where Google thinks you’re located. On a high level, we often assume Google only places us in a city, but that’s actually pretty false. As has come up a lot at MozCon, Google pretty much always knows more precisely than that where users are located.

Generally speaking, if you’re on a computer, they know what zip code you’re in, and they’ll list that at the bottom. There are a variety of tools that can help you check ranking based on zip codes, some of which would be Moz Check Your Presence Tool, BrightLocal, Whitespark, or Places Scout. All of these tools have the ability to track at the zip code level. 

Geo coordinates (mobile)

However, when you’re on a phone, Google usually knows your location in even more detail: it generally knows the geo coordinates of your actual location, and it pinpoints this using that little blue dot.

That goes beyond the zip code; Google knows where you’re actually standing, which is a bit creepy. But there are a couple of tools that will let you see results based on geo coordinates, which is really cool and very accurate: the Local Falcon, and a 100% free Chrome extension you can put in your browser called GS Location Changer.

I use this all the time in an incognito browser if I want to see what search results look like from a very, very specific location. Depending on what industry you work in, it’s really important to know which of these two levels you need to be looking at. If you work with lawyers, for example, zip code level is usually good enough.

There aren’t enough lawyers for rankings to vary much between specific points inside a given zip code. However, if you work with dentists or restaurants, let’s say, you really need to be looking at the geo coordinate level. We have seen lots of cases where we scan a specific keyword using these two tools and, depending on where in the zip code we are, we see completely different three-packs.

It’s very, very key to know that this factor here for proximity really influences the results that you see. This can be challenging, because when you’re trying to explain this to clients or business owners, they search from their home, and they’re like, “Why am I not there?” It’s because their proximity or their location is different than where their office is located.

I realize that how to represent this is a challenging problem for a lot of agencies to solve, but those are the tools you need to look at and use.
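For the geo-coordinate level, the grid scans these tools run are conceptually simple. Here’s a rough sketch of the idea; checkRank is a hypothetical stand-in for whatever rank-checking service you plug in (Local Falcon, Places Scout, and the rest each have their own APIs and interfaces):

```typescript
// Sketch of a geo-grid rank scan (the idea behind tools like Local Falcon).
// `checkRank` is a hypothetical stand-in for your rank-checking service.

type RankChecker = (keyword: string, lat: number, lng: number) => Promise<number | null>;

async function gridScan(
  keyword: string,
  centerLat: number,
  centerLng: number,
  radiusKm: number,
  gridSize: number, // e.g. 5 => a 5x5 grid of scan points
  checkRank: RankChecker
): Promise<(number | null)[][]> {
  const kmPerDegLat = 111.32;                                         // ~km per degree of latitude
  const kmPerDegLng = 111.32 * Math.cos((centerLat * Math.PI) / 180); // shrinks away from the equator
  const grid: (number | null)[][] = [];

  for (let i = 0; i < gridSize; i++) {
    const row: (number | null)[] = [];
    for (let j = 0; j < gridSize; j++) {
      // Spread scan points evenly from -radius to +radius around the center
      const fracY = gridSize === 1 ? 0 : (i / (gridSize - 1)) * 2 - 1;
      const fracX = gridSize === 1 ? 0 : (j / (gridSize - 1)) * 2 - 1;
      const lat = centerLat + (fracY * radiusKm) / kmPerDegLat;
      const lng = centerLng + (fracX * radiusKm) / kmPerDegLng;
      row.push(await checkRank(keyword, lat, lng)); // pack position at this point, or null
    }
    grid.push(row);
  }
  return grid;
}
```

Scanned over a 5×5 or 7×7 grid, the output is the heat-map view these tools display, and it makes the "different three-packs inside one zip code" problem very visible to clients.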

2. Prominence

Moving to the next factor: prominence. This is basically how important Google thinks you are. Is this business a big deal, or is it just some random, crappy business, or a new business that Google doesn’t know much about?

  • This looks at things like links, for example. 
  • Store visits: if you are a brick-and-mortar business and you get no foot traffic, Google likely won’t think you’re very prominent. 
  • Reviews: the number of reviews often factors in here. We often see that businesses with a lot of reviews, including a lot of older reviews, generally have a lot of prominence.
  • Citations: the sheer number of citations a business has can also factor into prominence. 

3. Relevance

Moving into the relevance factor, relevance is basically, does Google think you are related to the query that is typed in? You can be as prominent as anyone else, but if you do not have content on your page that is structured well, that covers the topic the user is searching about, your relevance will be very low, and you will run into issues.

It’s very important to know that these three things all kind of work together, and it’s really important to make sure you are looking at all three. On the relevance end, it looks at things like:

  • content
  • onsite SEO, so your title tags, your meta tags, all that nice SEO stuff
  • Citations also factor in here, because Google looks at things like your address. Are you actually in this city? Are you relevant to the city the user is trying to find businesses in? 
  • Categories are huge here: your Google My Business categories. Google currently has just under 4,000 different categories, and they add an insane number every year and remove others. It’s very important to keep on top of that and make sure you have the correct categories on your listing, or you won’t rank well.
  • The business name is unfortunately a huge factor as well in here. Merely having keywords in your business name can often give you relevance to rank. It shouldn’t, but it does. 
  • Then review content. I know Mike Blumenthal did a really cool experiment on this a couple years ago, where he actually had a bunch of people write a bunch of fake reviews on Yelp mentioning certain terms to see if it would influence ranking on Google in the local results, and it did. Google is definitely looking at the content inside the reviews to see what words people are using so they can see how that impacts relevance. 

How to rank without proximity, prominence, or relevance

Obviously you want all three of these things, but it is possible to rank without all three, and I’ll give a couple of examples. Say you’re looking to expand your radius because you service a lot of people, not just the people on your block: “I serve the whole city of Chicago,” for example. You’re not likely to rank across all of Chicago for very common terms like dentist or personal injury attorney. However, if you have a lot of prominence and a really relevant page or content related to really niche terms, we often see that it is possible to really expand your radius for long-tail keywords, which is great.

Prominence is probably the number one thing that will expand your radius for competitive terms. We’ll often see Google bring in a business that sits slightly outside the area of the other businesses, just because it has an astronomical number of reviews, or because its domain authority is ridiculously high and it has all these linking domains.

Those two factors are definitely what influences the amount of area you cover with your local exposure. 

Spam and fake listings

On the flip side, spam is something I talk a lot about, and fake listings are a big problem in the local search space. Lead-gen providers create these listings, and they rank with zero prominence.

They have no prominence. They have no citations. They have no authority. They often don’t even have websites, and they still rank because of the other two factors. Create 100 listings in a city, and you are going to be close to someone searching. Then, if you stuff a bunch of keywords into your business name, you will have some relevance. By sidestepping the prominence factor entirely, they are able to get these listings to rank, which is very frustrating.

Obviously, Google is trying to evolve this algorithm over time. We are hoping that the weight of the prominence factor will increase over time to eliminate that problem, but ultimately we’ll have to see what Google does. We also ran a study recently to see which of these two factors carries more weight.

An experiment: Linking to your site within GMB

One thing I’ve highlighted here: when you link to a website inside your Google My Business listing, there’s often a debate. Should I link to my homepage, or should I link to my location page if I’ve got three or four or five offices? We did an experiment to see what happens when we switch a client’s Google My Business listing from their location page to their homepage, and we’ve almost always seen a positive impact from switching to the homepage, even if that homepage is not relevant at all.

In one example, we had a client that was in Houston, and they opened up a location in Dallas. Their homepage was optimized for Houston, but their location page was optimized for Dallas. I had a conversation with a couple of other SEOs, and they were like, “Oh, well, obviously link to the Dallas page on the Dallas listing. That makes perfect sense.”

But we were wondering what would happen if we linked to the homepage, which was optimized for Houston. We saw a lift in rankings and a lift in the number of search queries this business showed up for when we switched to the homepage, even though the homepage didn’t really mention Dallas at all. Something to think about. Make sure you’re always testing these different factors and chasing the right ones when you’re coming up with your local SEO strategy. Finally, one last thing, which I’ve got at the top of the whiteboard here.

Local algorithm vs organic algorithm

As far as the local algorithm versus the organic algorithm goes, some of you might be thinking, okay, these things really look at the same factors, and they more or less work the same way. Honestly, if that is your thinking, I would strongly recommend you change it. I’ll quote a recent Moz whitepaper, which found that only 8% of local pack listings had their website also appearing in the organic search results below.

I feel like the overlap between these two is definitely shrinking, which is kind of why I’m a bit obsessed with figuring out how the local algorithm works to make sure that we can have clients successful in both spaces. Hopefully you learned something. If you have any questions, please hit me up in the comments. Thanks for listening.

Video transcription by Speechpad.com


If you liked this episode of Whiteboard Friday, you’ll love all the SEO thought leadership goodness you’ll get from our newly released MozCon 2019 video bundle. Catch Joy’s full talk on the differences between the local and organic algorithm, plus 26 additional future-focused topics from our top-notch speakers:

Grab the sessions now!

We suggest scheduling a good old-fashioned knowledge share with your colleagues to educate the whole team — after all, who didn’t love movie day in school? 😉



How Often Does Google Update Its Algorithm?

Posted by Dr-Pete

In 2018, Google reported an incredible 3,234 improvements to search. That’s more than 8 times the number of updates they reported in 2009 — less than a decade ago — and an average of almost 9 per day. How have algorithm updates evolved over the past decade, and how can we possibly keep tabs on all of them? Should we even try?

To kick this off, here’s a list of every confirmed count we have (sources at end of post):

  • 2018 – 3,234 “improvements”
  • 2017 – 2,453 “changes”
  • 2016 – 1,653 “improvements”
  • 2013 – 890 “improvements”
  • 2012 – 665 “launches”
  • 2011 – 538 “launches”
  • 2010 – 516 “changes”
  • 2009 – 350–400 “changes”

Unfortunately, we don’t have confirmed data for 2014–2015 (if you know differently, please let me know in the comments).
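For what it’s worth, the headline figures in the intro come from simple division:

```typescript
// The arithmetic behind "more than 8 times" and "almost 9 per day"
const updates2018 = 3234;
const updates2009 = 375; // midpoint of the reported 350–400 range

console.log((updates2018 / updates2009).toFixed(1)); // "8.6" -> more than 8x 2009's count
console.log((updates2018 / 365).toFixed(2));         // "8.86" -> almost 9 per day
```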

A brief history of update counts

Our first peek into this data came in spring of 2010, when Google’s Matt Cutts revealed that “on average, [Google] tends to roll out 350–400 things per year.” It wasn’t an exact number, but given that SEOs at the time (and to this day) were tracking at most dozens of algorithm changes, the idea of roughly one change per day was eye-opening.

In fall of 2011, Eric Schmidt was called to testify before Congress, and revealed our first precise update count and an even more shocking scope of testing and changes:

“To give you a sense of the scale of the changes that Google considers, in 2010 we conducted 13,311 precision evaluations to see whether proposed algorithm changes improved the quality of its search results, 8,157 side-by-side experiments where it presented two sets of search results to a panel of human testers and had the evaluators rank which set of results was better, and 2,800 click evaluations to see how a small sample of real-life Google users responded to the change. Ultimately, the process resulted in 516 changes that were determined to be useful to users based on the data and, therefore, were made to Google’s algorithm.”

Later, Google would reveal similar data in an online feature called “How Search Works.” Unfortunately, some of the earlier years are only available via the Internet Archive, but here’s a screenshot from 2012:

Note that Google uses “launches” and “improvements” somewhat interchangeably. This diagram provided a fascinating peek into Google’s process, and also revealed a startling jump from 13,311 precision evaluations (changes that were shown to human evaluators) to 118,812 in just two years.

Is the Google algorithm heating up?

Since MozCast has kept the same keyword set since almost the beginning of data collection, we’re able to make some long-term comparisons. The graph below represents five years of temperatures. Note that the system was originally tuned (in early 2012) to an average temperature of 70°F. The redder the bar, the hotter the temperature …

[Figure: five years of daily MozCast temperatures, with hotter days shown as redder bars]

You’ll notice that the temperature ranges aren’t fixed — instead, I’ve split the scale into eight roughly equal buckets (i.e. they represent the same number of days). This gives us a little more sensitivity in the more common ranges.

The trend is pretty clear. The latter half of this 5-year timeframe has clearly been hotter than the first half. While a warming trend is evident, though, it’s not the steady increase over time that Google’s update counts might suggest. Instead, we see a stark shift in the fall of 2016 and a very hot summer of 2017. More recently, we’ve actually seen signs of cooling. Below are the means and medians for each year (note that 2014 and 2019 are partial years):

  • 2019 – 83.7° / 82.0°
  • 2018 – 89.9° / 88.0°
  • 2017 – 94.0° / 93.7°
  • 2016 – 75.1° / 73.7°
  • 2015 – 62.9° / 60.3°
  • 2014 – 65.8° / 65.9°

Note that search engine rankings are naturally noisy, and our error measurements tend to be large (making day-to-day changes hard to interpret). The difference from 2015 to 2017, however, is clearly significant.
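If you want to reproduce this kind of summary from your own flux data, the computation is just a per-year mean and median over daily readings. A minimal sketch, assuming you have daily temperatures exported in a simple date/temperature form:

```typescript
// Sketch: yearly mean/median from daily MozCast-style temperature readings.
// `readings` is assumed input — substitute your own daily data.

interface Reading {
  date: string;  // e.g. "2017-06-25"
  tempF: number; // that day's temperature
}

function yearlyStats(readings: Reading[]): Map<string, { mean: number; median: number }> {
  // Group temperatures by calendar year
  const byYear = new Map<string, number[]>();
  for (const r of readings) {
    const year = r.date.slice(0, 4);
    if (!byYear.has(year)) byYear.set(year, []);
    byYear.get(year)!.push(r.tempF);
  }

  // Compute mean and median per year
  const stats = new Map<string, { mean: number; median: number }>();
  for (const [year, temps] of byYear) {
    const sorted = [...temps].sort((a, b) => a - b);
    const mid = Math.floor(sorted.length / 2);
    const median = sorted.length % 2 === 1 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
    const mean = temps.reduce((sum, t) => sum + t, 0) / temps.length;
    stats.set(year, { mean, median });
  }
  return stats;
}
```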

Are there really 9 updates per day?

No, there are only 8.86 – feel better? Ok, that’s probably not what you meant. Even back in 2009, Matt Cutts said something pretty interesting that seems to have been lost in the mists of time…

“We might batch [algorithm changes] up and go to a meeting once a week where we talk about 8 or 10 or 12 or 6 different things that we would want to launch, but then after those get approved … those will roll out as we can get them into production.”

In 2016, I did a study of algorithm flux that demonstrated a weekly pattern evident during clearer episodes of ranking changes. From a software engineering standpoint, this just makes sense — updates have to be approved and tend to be rolled out in batches. So, while measuring a daily average may help illustrate the rate of change, it probably has very little basis in the reality of how Google handles algorithm updates.
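One simple way to check your own flux data for that weekly batching is to average it by day of the week; a consistent spike on particular weekdays suggests batched rollouts. A sketch, again assuming a simple daily series:

```typescript
// Sketch: average flux by day of week to surface a weekly release pattern.
// `flux` is assumed input: one temperature reading per calendar day.

interface FluxDay {
  date: string;  // e.g. "2016-06-21"
  tempF: number;
}

// Returns the mean temperature for each weekday, index 0 = Sunday.
function meanByWeekday(flux: FluxDay[]): number[] {
  const sums = new Array(7).fill(0);
  const counts = new Array(7).fill(0);
  for (const day of flux) {
    const dow = new Date(day.date + "T00:00:00Z").getUTCDay();
    sums[dow] += day.tempF;
    counts[dow] += 1;
  }
  return sums.map((sum, i) => (counts[i] ? sum / counts[i] : 0));
}
```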

Do all of these algo updates matter?

Some changes are small. Many improvements are likely not even things we in the SEO industry would consider “algorithm updates” — they could be new features, for example, or UI changes.

As SERP verticals and features evolve, and new elements are added, there are also more moving parts subject to being fixed and improved. Local SEO, for example, has clearly seen an accelerated rate of change over the past 2-3 years. So, we’d naturally expect the overall rate of change to increase.

A lot of this is also in the eye of the beholder. Let’s say Google makes an update to how they handle misspelled words in Korean. For most of us in the United States, that change isn’t going to be actionable. If you’re a Korean brand trying to rank for a commonly misspelled, high-volume term, this change could be huge. Some changes also are vertical-specific, representing radical change for one industry and little or no impact outside that niche.

On the other hand, you’ll hear comments in the industry along the lines of “There are 3,000 changes per year; stop worrying about it!” To me that’s like saying “The weather changes every day; stop worrying about it!” Yes, not every weather report is interesting, but I still want to know when it’s going to snow or if there’s a tornado coming my way. Recognizing that most updates won’t affect you is fine, but it’s a fallacy to stretch that into saying that no updates matter or that SEOs shouldn’t care about algorithm changes.

Ultimately, I believe it helps to know when major changes happen, if only to understand whether rankings shifted due to something we did or something Google did. It’s also clear that the rate of change has accelerated, no matter how you measure it, and there’s no evidence to suggest that Google is slowing down.


Appendix A: Update count sources

2009 – Google’s Matt Cutts, video (Search Engine Land)
2010 – Google’s Eric Schmidt, testifying before Congress (Search Engine Land)
2012 – Google’s “How Search Works” page (Internet Archive)
2013 – Google’s Amit Singhal, Google+ (Search Engine Land)
2016 – Google’s “How Search Works” page (Internet Archive)
2017 – Unnamed Google employees (CNBC)
2018 – Google’s “How Search Works” page (Google.com)



Will Google’s new search algorithm really penalize popovers?

The technology company has said that it will begin to punish sites that display interstitials or pop-ups that obscure indexed content. The change isn’t due to come into play until January 2017, but we wanted to take the opportunity to explore the extent of the new rules and their possible impact.

We know from Google’s past algorithm updates that the focus has turned to the ever-increasing number of mobile users. One change Google said it was “experimenting” with in relation to the ranking signal was mobile-friendly design. The company added a ‘Mobile-friendly’ label, which appeared in the search results when a site conformed to its criteria – such as using text that’s readable without zooming, sizing content to the screen or avoiding software like Flash.

It’s clear, then, that there are multiple factors in how Google rates websites on the mobile experience – so how much weight will it apply to those using pop-ups or interstitials? We won’t know until it happens, but we can speculate.

How do people browse the web on mobile?

Let’s think about people’s usage and habits when it comes to browsing the web on a mobile device.

Unlike on a laptop or desktop, those searching the web on their mobile tend to be picking up the device to look up something specific. These searches will often be long-tail keywords which draw up deeper links from a site, and this is where brands need to be careful. Popovers featured on these detail pages, which distract from the main content, can be a barrier to conversion and lead to bounces.

Rather, marketers need to be selective about how they use pop-ups and take a more considered approach when it comes to the user experience.

What constitutes a bad UX?

No one wants to create a bad user experience, because it can be detrimental to credibility, performance and conversions. However, if companies achieve good response rates to newsletter sign-up popovers, you could argue that they aren’t providing a negative web experience and that, in fact, it would be wrong to penalize them.

With the right tool, brands can also be cleverer about when, where and how popovers appear. If a company is trying to collect a steady stream of email addresses from new website visitors, it might make sense to host the popover somewhere on the homepage. After all, the homepage is your company’s shop window and its purpose is to lure people in.

It would also be wise to consider when it pops up. In order not to disrupt the journey and experience, you would want to prevent the popover from appearing immediately. And, of course, you would also want to prevent the pop-up from appearing on the next visit if the user had either signed up or dismissed it.
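Both of those rules, delaying the first appearance and suppressing repeat displays, take only a few lines of client-side code. A minimal sketch (the element IDs, storage key, and 15-second delay are placeholder choices, not anything Google prescribes):

```typescript
// Minimal popover etiquette: delay the first appearance, and never re-show
// after a sign-up or dismissal. IDs, key, and delay are placeholder choices.

const POPOVER_KEY = "newsletterPopoverDone";

function maybeShowPopover(): void {
  if (localStorage.getItem(POPOVER_KEY)) return; // already signed up or dismissed

  window.setTimeout(() => {
    const popover = document.getElementById("newsletter-popover");
    if (!popover) return;
    popover.hidden = false; // reveal only after the visitor has settled in

    popover.querySelector("#popover-dismiss")?.addEventListener("click", () => {
      popover.hidden = true;
      localStorage.setItem(POPOVER_KEY, "dismissed");
    });

    popover.querySelector("#popover-signup-form")?.addEventListener("submit", () => {
      localStorage.setItem(POPOVER_KEY, "signed-up");
    });
  }, 15_000); // wait 15 seconds rather than interrupting immediately
}

maybeShowPopover();
```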

Will it or will it not?

Let’s remember that the new signal in Google’s algorithm is just one of hundreds of signals that are used to determine rankings – so popovers could make up a small percentage of the overall score. What we take from it all is: if a page’s content is relevant, gets lots of clicks and has a decent dwell time, it may still rank highly (in fact, read the official Google blog post). If a popover is enhancing the experience by giving users another way to consume similar content, and there is positive uptake, we don’t see the harm.


Google updates Penguin, says it now runs in real time within the core search algorithm

The latest announced release, Penguin 4.0, will also be the last, given its new real-time nature.


[ccw-atrib-link]

Everything you need to know about Google’s ‘Possum’ algorithm update

Wondering what’s up with local search rankings lately? Columnist Joy Hawkins has the scoop on a recent local algorithm update that local SEO experts are calling ‘Possum.’

