How to Perform a Basic Local Business Competitive Audit

Posted by MiriamEllis

“Why are those folks outranking me in Google’s local pack?”

If you or a client is asking this question, the answer lies in competitive analysis. You’ve got to stack Business A up against Business B to identify the strengths and weaknesses of both competitors, and then make an educated guess as to which factors Google is weighting most in the results for a specific search term.

Today, I’d like to share a real-world example of a random competitive audit, including a chart that depicts which factors I’ve investigated and explanatory tips and tools for how I came up with the numbers and facts. Also included: a downloadable version of the spreadsheet that you can use for your own company or clients. Your goal with this audit is to identify exactly how one player is winning the game so that you can create a to-do list for any company trying to move up in the rankings. Alternatively, some competitive audits can be defensive, identifying a dominant player’s weaknesses so that they can be corrected to ensure continued high rankings.

It’s my hope that seeing this audit in action will help you better answer the question of why “this person is outranking that person,” and that you may share with our community some analytical tips of your own!

The scenario:

Search term: Chinese Restaurant San Rafael

Statistics about San Rafael: A large town of approximately 22 square miles in the San Francisco Bay Area with a population of 58,954 and 15+ Chinese restaurants.

Consistency of results: From 20 miles away to 2,000+ miles away, Ping’s Chinese Cuisine outranks Yet Wah Restaurant in Google’s local pack for the search term. We don’t look closer than 20 miles because, at that range, the searcher’s proximity creates too much diversity in the results.

The challenge: Why is Ping’s Chinese Cuisine outranking Yet Wah Restaurant in Google’s Local Pack for the search term?

The comparison chart

*Where there’s a clear winner, it’s noted in bolded, italicized text.

Basic business information

| Factor | Ping’s Chinese Cuisine | Yet Wah Restaurant |
|---|---|---|
| NAP | 248 Northgate Dr., San Rafael, CA 94903, (415) 492-8808 | 1238 4th St., San Rafael, CA 94901, (415) 460-9883 |
| GMB landing page URL | http://pingsnorthgate.com/ | http://www.yetwahchinese.com/ |
| Local Pack rank | ***1*** | 2 |
| Organic rank | 17 | ***5*** |
| Organic rank among business-owned sites (*remove directories and review platforms from the equation, as they typically shouldn’t be viewed as direct competitors*) | 8 | ***1*** |
| Business model eligible for GMB listing at this address? (*check Google’s Guidelines if unsure: https://support.google.com/business/answer/3038177…*) | Yes | Yes |
| Oddities | Ping’s has redirected pingschinesecuisine.com to pingsnorthgate.com, and has both a www and a non-www version of pingsnorthgate.com. | A second website for the same business at the same location, with the same phone number: http://yetwahsanrafael.com/. It ranks directly below the authoritative (GMB-linked) website in the organic SERP for the search in question. |

Business listings

| Factor | Ping’s Chinese Cuisine | Yet Wah Restaurant |
|---|---|---|
| GMB review count | 32 | ***38*** |
| GMB review rating | ***4.1*** | 3.8 |
| Most recent GMB review (*sort GMB reviews by the “most recent” filter*) | 1 week ago | 1 month ago |
| Proper GMB categories? | Yes | Yes |
| Estimated age of GMB listing (*estimated by the date of the oldest reviews and photos, so only an estimate*) | At least 2 years old | ***At least 6 years old*** |
| Moz Local score (completeness + accuracy + lack of duplicates) (*tool: https://moz.com/local/search*) | 49% | ***75%*** |
| Moz Local duplicate findings (*tool: https://moz.com/local/search*) | 0 | 1 (Facebook) |
| Keywords in GMB name | “chinese” | “restaurant” |
| Keywords in GMB website landing page title tag | Nothing at all; just “home page” | ***Yes*** |
| Spam in GMB title (*look at GMB photos, Google Street View, and the website to check for inconsistencies*) | No | Yes: “restaurant” is not in the website logo or street-level signage |
| Hours and photos on GMB? | Yes | Yes |
| Proximity to city centroid (*look up the city by name in Google Maps and see where it places the city’s name on the map; that’s the city “centroid.” Get driving directions from the business to an address located in the centroid.*) | 3.5 miles | 410.1 feet |
| Proximity to nearest competitor (*zoom in on the Google map to surface as many adjacent competitors as possible; can be a Possum factor in some cases*) | 1.1 miles | 0.2 miles |
| Within Google Maps boundaries? (*look up the city by name in Google Maps and note the pink border via which Google designates that city’s boundaries*) | Yes | Yes |

Website

| Factor | Ping’s Chinese Cuisine | Yet Wah Restaurant |
|---|---|---|
| Age of domain (*tool: http://smallseotools.com/domain-age-checker/*) | March 2013 | ***August 2011*** |
| Domain Authority (*tool: https://moz.com/products/pro/seo-toolbar*) | ***16*** | 8 |
| GMB landing page Page Authority (*tool: https://moz.com/products/pro/seo-toolbar*) | ***30*** | 21 |
| Links to domain (*tool: https://moz.com/researchtools/ose/*) | ***53*** | 2 |
| DA/PA of most authoritative link earned (*tool: https://moz.com/researchtools/ose/*) | ***72/32*** | 38/16 |
| Evaluation of website content (*a first-pass, visual gut check: read through the top-level pages of the website to see how they strike you in terms of quality*) | ***Extremely thin, just adequate to identify the restaurant, but at least the menu is on its own site. Of the two sites, this one has the most total text, by virtue of a sentence on the homepage and menus in real text.*** | Extremely thin; almost zero text on the homepage, and the menu link goes to another website. |
| Evaluation of website design | Outdated | Outdated, mostly images |
| Evaluation of website UX | Can be navigated, but few directives or CTAs | Can be navigated, but few directives or CTAs |
| Mobile-friendly? (*tool: https://search.google.com/test/mobile-friendly*) | Basic mobile design, but Google’s mobile-friendly test says both the www and non-www versions cannot be reached because the site is unavailable or blocked by robots.txt. Scripts, photos, Flash, images, and plugins have been disallowed. The mobile site URL is http://pingsnorthgate.com/#2962; both it and the other domains fail Google’s test. This needs to be further investigated and resolved. | ***Basic mobile design; passes Google’s mobile-friendly test*** |
| Evaluation of overall onsite SEO (*a first-pass visual look at the page code of top-level pages, checking for titles, descriptions, header tags, schema, and the presence of problems like Flash*) | Pretty much no optimization | ***Minimal, but a little effort made: some title tags, some schema, some header tags*** |
| HTML NAP on website? | Yes | Yes |
| Website NAP matches GMB NAP? | No (Northgate One instead of Northgate Drive) | Yes |

Total number of wins: Ping’s 7, Yet Wah 9.

Download your own version of my competitive audit spreadsheet by making a copy of the file.
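If you keep the chart as structured data, the final win tally can be scripted rather than counted by hand. A minimal sketch in Python; the factor subset and winner assignments below are illustrative, not the full chart:

```python
from collections import Counter

# Illustrative subset of the audit chart: for each factor, record which
# competitor came out ahead, or None for a tie / informational row.
winners = {
    "Local Pack rank": "Ping's",
    "Organic rank": "Yet Wah",
    "GMB review count": "Yet Wah",
    "GMB review rating": "Ping's",
    "Domain Authority": "Ping's",
    "Links to domain": "Ping's",
    "Mobile-friendly": "Yet Wah",
    "Hours and photos on GMB": None,  # tie: both have them
}

# Count wins, skipping ties.
tally = Counter(w for w in winners.values() if w is not None)
print(dict(tally))  # {"Ping's": 4, 'Yet Wah': 3}
```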

Takeaways from the comparison chart

Yet Wah significantly outranks Ping’s in the organic results, but is being beaten by them in the Local Pack. Looking at the organic factors, we see evidence that, despite the fact that Ping’s has greater DA, greater PA of the GMB landing page, more links, and stronger links, they are not outranking Yet Wah organically. This is something of a surprise that leads us to look at their content and on-page SEO.

While Ping’s has slightly better text content on their website, they have done almost zero optimization work, their URLs have canonical issues, and their robots.txt isn’t properly configured. Yet Wah has almost no on-site content, but they have modestly optimized their title tags, implemented H tags and some schema, and their site passes Google’s mobile-friendly test.
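As an aside, an over-restrictive robots.txt like the one described above can be sanity-checked with Python’s standard library. A sketch using a hypothetical robots.txt body (not Ping’s actual file):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt resembling the over-blocking described above:
# disallowing scripts and images breaks Google's mobile rendering even
# if the HTML page itself is crawlable.
robots_txt = """User-agent: *
Disallow: /scripts/
Disallow: /images/
Disallow: /plugins/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The page itself is fetchable...
print(parser.can_fetch("Googlebot", "http://pingsnorthgate.com/"))
# ...but the assets Google needs to render it are not.
print(parser.can_fetch("Googlebot", "http://pingsnorthgate.com/images/x.jpg"))
```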

So, our theory regarding Yet Wah’s superior organic ranking is that, in this particular case, Yet Wah’s moderate efforts with on-page SEO have managed to beat out Ping’s superior DA/PA/link metrics. Yet Wah’s website is also a couple of years older than Ping’s.

All that being said, Yet Wah’s organic win is failing to translate into a local win for them. How can we explain Ping’s local win? Ping’s has a slightly higher overall review rating, higher DA and GMB landing page PA, more total links, and higher authority links. They also have slightly more text content on their website, even if it’s not optimized.

So, our theory regarding Ping’s superior local rank is that, in this particular case, website authority/links appear to be winning the day for Ping’s. And the basic website text they have could possibly be contributing, despite lack of optimization.

In sum, basic on-page SEO appears to be contributing to Yet Wah’s organic win, while DA/PA/links appear to be contributing to Ping’s local win.

Things that bother me

I chose this competitive scenario at random because, when I took an initial look at the local and organic rankings, they bothered me a little. I would have expected Yet Wah to be first in the local pack, since they’re first in organic. Local and organic rankings correlate strongly so much of the time that this case seemed odd to me.

By the end of the audit, I’ve come up with a working theory, but I’m not 100% satisfied with it. It makes me ask questions like:

  • Is Ping’s better local rank stemming from some hidden factor no one knows about?
  • In this particular case, why does Google appear to value Ping’s links more than Yet Wah’s on-page SEO in determining local rank? Would I see the same trend across the board if I analyzed 1,000 restaurants? The industry says links are huge in local SEO right now; I guess we’re seeing proof of that here.
  • Why isn’t Google weighting Yet Wah’s superior citation set more than they apparently are? Ping’s citations are in bad shape. I’ve seen citation health play a much greater apparent role in other audits, but something feels weird here.
  • Why isn’t Google “punishing” Yet Wah in the organic results for that second website with duplicate NAP on it? That seems like it should matter.
  • Why isn’t age factoring in more here? My inspection shows that Yet Wah’s domain and GMB listing are significantly older. This could be moving the organic needle for them, but it’s not moving the local one.
  • Could user behavior be making Ping’s the local winner? This is a huge open question at the end of my basic audit.* See below.

*I don’t have access to either restaurant’s Google Analytics, GMB Insights, or Google Search Console accounts, so perhaps that would turn up penalties, traffic patterns, or things like superior clicks-to-call, clicks-for-directions, or clicks-to-website that would make Ping’s local win easier to explain. If one of these restaurants were your client, you’d want to add chart rows for these things based on full access to the brand’s accounts and tools, and whatever data your tools can access about the competitor. For example, using a tool like SimilarWeb, I see that between May and June of this year, Yet Wah’s traffic rose from an average 150 monthly visits up to a peak of 500, while Ping’s saw a drop from 700 to 350 visits in that same period. Also, in a scenario in which one or both parties have a large or complex link profile, you might want additional rows for link metrics, taken from tools like Moz Pro, Ahrefs, or Majestic.

In this case, Ping’s has 7 total wins in my chart and Yet Wah has 9. The best I can do is look at which factors each business is winning at to try to identify a pattern of what Google is weighting most, both organically and locally. With both restaurants being so basic in their marketing, and with neither one absolutely running away with the game, what we have here is a close race. While I’d love to be able to declare a totally obvious winner, the best I could do as a consultant, in this case, would be to draw up a plan of defense or offense.

If my client were Ping’s:

Ping’s needs to defend its #1 local ranking if it doesn’t want to lose it. Its greatest weaknesses, which must be resolved, are:

  • The absence of on-page SEO
  • Thin content
  • Robots.txt issues

To remain strong, Ping’s should also work on:

  • Improving citation health
  • Redirecting the non-www version of the site to the www one
  • Undertaking a professional site redesign to improve conversions

Ping’s should accomplish these things to defend its current local rank and to try to move up organically.

If my client were Yet Wah:

Yet Wah needs to try to achieve victory over Ping’s in the local packs, as it has done in the organic results. To do that, Yet Wah should:

  • Earn links to the GMB landing page URL and the domain
  • Create strong text content on its high-level pages, including putting a complete dining menu in real text on the website
  • Deal with the second website featuring duplicate NAP

Yet Wah should also:

  • Complete work on its citation health
  • Work hard to get some new 5-star reviews by delighting customers with something special
  • Consider adding the word “Restaurant” to their signage, so that they can’t be reported for spamming the GMB name field
  • Consider a professional redesign of the website to improve conversions

Yet Wah should accomplish these things in an effort to surpass Ping’s.

And, with either client being mine, I’d then be taking a second pass to further investigate anything problematic that came up in the initial audit, so that I could make further technical or creative suggestions.

Big geo-industry picture analysis

Given that no competitor for this particular search term has been able to beat out Ping’s or Yet Wah in the local pack, and given the minimal efforts these two brands have thus far made, there’s a tremendous chance for any Chinese restaurant in San Rafael to become the dominant player. Any competitor that dedicates itself to running on all cylinders (professional, optimized website with great content, a healthy link profile, a competitive number of high-star reviews, healthy citations, etc.) could definitely surpass all other contestants. This is not a tough market and there are no players who can’t be bested.

My sample case has been, as I’ve said, a close race. You may be facing an audit where there are deeply entrenched dominant players whose statistics far surpass those of a business you’re hoping to assist. But the basic process is the same:

  1. Look at the top-ranking business.
  2. Fill out the chart (adding any other fields you feel are important).
  3. Then discover the strengths of the dominant company, as well as its potential weaknesses.
  4. Contrast these findings with those you’ve charted for the company you’re helping and you’ll be able to form a plan for improvement.

And don’t forget the user proximity factor. Any company’s most adjacent customers will see pack results that vary either slightly or significantly from what a user sees from 20, 50, or 1,000 miles away. In my specific study, it happened to be the third result in the pack that went haywire once a user got 50 miles away, while the top two remained dominant and statically ranked for searchers as far away as the East Coast.

Because of this phenomenon of distance, it’s vital for business owners to be educated about the fact that they are serving two user groups: one that is located in the neighborhood or city of the business, and another that could be anywhere in the country or the world. This doesn’t just matter for destinations like hotels or public amusements. In California (a big state), Internet users on a road trip from Palm Springs may be looking to end their 500-mile drive at a Chinese restaurant in San Rafael, so you can’t just think hyper-locally; you’ve got to see the bigger local picture. And you’ve got to do the analysis to find ways of winning as often as you can with both consumer groups.

You take it from here, auditor!

My local competitive audit chart is a basic one, looking at 30+ factors. What would you add? How would you improve it? Did I miss a GMB duplicate listing, or review spam? What’s working best for your agency in doing local audits these days? Do you use a chart, or just provide a high-level text summary of your internal findings? And, if you have any further theories as to how Ping’s is winning the local pack, I’d love for you to share them in the comments.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 2 years ago from tracking.feedpress.it

5 Spreadsheet Tips for Manual Link Audits

Posted by MarieHaynes

Link auditing is the part of my job that I love the most. I have audited a LOT of links over the last few years. While there are some programs out there that can be quite helpful to the avid link auditor, I still prefer to create a spreadsheet of my links in Excel and then to audit those links one-by-one from within Google Spreadsheets. Over the years I have learned a few tricks and formulas that have helped me in this process. In this article, I will share several of these with you.

Please know that while I am quite comfortable being labelled a link auditing expert, I am not an Excel wizard. I am betting that some of the things that I am doing could be improved upon if you’re an advanced user. As such, if you have any suggestions or tips of your own I’d love to hear them in the comments section!

1. Extract the domain or subdomain from a URL

OK. You’ve downloaded links from as many sources as possible and now you want to manually visit and evaluate one link from every domain. But, holy moly, some of these domains can have THOUSANDS of links pointing to the site. So, let’s break these down so that you are just seeing one link from each domain. The first step is to extract the domain or subdomain from each url.

I am going to show you examples from a Google spreadsheet as I find that these display nicer for demonstration purposes. However, if you’ve got a fairly large site, you’ll find that the spreadsheets are easier to create in Excel. If you’re confused about any of these steps, check out the animated gif at the end of each step to see the process in action.

Here is how you extract a domain or subdomain from a url:

  • Create a new column to the left of your url column.
  • Use this formula:

    =LEFT(B1,FIND("/",B1,9)-1)

    What this will do is remove everything from the first slash after the domain name onward. http://www.example.com/article.html will now become http://www.example.com and http://www.subdomain.example.com/article.html will now become http://www.subdomain.example.com.

  • Copy our new column A and paste it right back where it was using the “paste as values” function. If you don’t do this, you won’t be able to use the Find and Replace feature.
  • Use Find and Replace to replace each of the following with a blank (i.e. nothing):
    http://
    https://
    www.

And BOOM! We are left with a column that contains just domain names and subdomain names. This animated gif shows each of the steps we just outlined:
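If you’d rather script this step than use spreadsheet formulas, the same extraction is a few lines of Python with the standard library:

```python
from urllib.parse import urlparse

def extract_domain(url: str) -> str:
    """Return the domain or subdomain of a URL, minus any leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

print(extract_domain("http://www.example.com/article.html"))            # example.com
print(extract_domain("http://www.subdomain.example.com/article.html"))  # subdomain.example.com
```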

2. Just show one link from each domain

The next step is to filter this list so that we are just seeing one link from each domain. If you are manually reviewing links, there’s usually no point in reviewing every single link from every domain. I will throw in a word of caution here though. Sometimes a domain can have both a good link and a bad link pointing to you. Or in some cases, you may find that links from one page are followed and from another page on the same site they are nofollowed. You can miss some of these by just looking at one link from each domain. Personally, I have some checks built in to my process where I use Scrapebox and some internal tools that I have created to make sure that I’m not missing the odd link by just looking at one link from each domain. For most link audits, however, you are not going to miss very much by assessing one link from each domain.

Here’s how we do it:

  • Highlight our domains column and sort the column in alphabetical order.
  • Create a column to the left of our domains, so that the domains are in column B.
  • Use this formula:

    =IF(B1=B2,"duplicate","unique")

  • Copy that formula down the column.
  • Use the filter function so that you are just seeing the duplicates.
  • Delete those rows. Note: If you have tens of thousands of rows to delete, the spreadsheet may crash. A workaround here is to use “Clear Rows” instead of “Delete Rows” and then sort your domains column from A-Z once you are finished.

We’ve now got a list of one link from every domain linking to us.

Here’s the gif that shows each of these steps:

You may wonder why I didn’t use Excel’s dedupe function to simply deduplicate these entries. I have found that it doesn’t take much deduplication to crash Excel, which is why I do this step manually.
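The whole step can also be done in a short Python script, which sidesteps the crash-prone spreadsheet dedupe entirely. A sketch, with made-up sample links (in practice you’d read them from your export):

```python
from urllib.parse import urlparse

# Sample backlink URLs (made up); in practice, read these from your export.
links = [
    "http://www.example.com/article.html",
    "http://www.example.com/other.html",
    "http://blog.example.org/post",
]

seen = set()
one_per_domain = []
for url in links:
    host = urlparse(url).netloc.lower()
    domain = host[4:] if host.startswith("www.") else host
    if domain not in seen:  # keep only the first link seen per domain
        seen.add(domain)
        one_per_domain.append(url)

print(one_per_domain)
# ['http://www.example.com/article.html', 'http://blog.example.org/post']
```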

3. Finding patterns FTW!

Sometimes when you are auditing links, you’ll find that unnatural links have patterns. I LOVE when I see these, because sometimes I can quickly go through hundreds of links without having to check each one manually. Here is an example. Let’s say that your website has a bunch of spammy directory links. As you’re auditing you notice patterns such as one of these:

  • All of these directory links come from a url that contains …/computers/internet/item40682/
  • A whole bunch of spammy links that all come from a particular free subdomain like blogspot, wordpress, weebly, etc.
  • A lot of links that all contain a particular keyword for anchor text (this is assuming you’ve included anchor text in your spreadsheet when making it.)

You can quickly find all of these links and mark them as “disavow” or “keep” by doing the following:

  • Create a new column. In my example, I am going to create a new column in Column C and look for patterns in urls that are in Column B.
  • Use this formula:

    =FIND("/item40682",B1)
    (You would replace "item40682" with the phrase that you are looking for.)

  • Copy this formula down the column.
  • Filter your new column so that you are just seeing the rows that have a number in this column. If the phrase doesn’t exist in that url, you’ll see a “#VALUE!” error instead, and we can ignore those.
  • Now you can mark these all as disavow
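The same pattern check is a one-liner per URL in Python (“/item40682” is the example phrase from above; the sample URLs are made up):

```python
pattern = "/item40682"  # the footprint you spotted in the spammy URLs

urls = [
    "http://directory.example/computers/internet/item40682/widgets",
    "http://goodsite.example/honest-review",
]

# Mark anything containing the pattern as "disavow", everything else "keep".
verdicts = {url: ("disavow" if pattern in url else "keep") for url in urls}
for url, verdict in verdicts.items():
    print(f"{verdict}\t{url}")
```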

4. Check your disavow file

This next tip is one that you can use to check your disavow file across your list of domains that you want to audit. The goal here is to see which links you have disavowed so that you don’t waste time reassessing them. This particular tip only works for checking links that you have disavowed on the domain level.

The first thing you’ll want to do is download your current disavow file from Google. For some strange reason, Google gives you the disavow file in CSV format. I have never understood this because they want you to upload the file in .txt. Still, I guess this is what works best for Google. All of your entries will be in column A of the CSV:

What we are going to do now is add these to a new sheet on our current spreadsheet and use a VLOOKUP function to mark which of our domains we have disavowed.

Here are the steps:

  • Create a new sheet on your current spreadsheet workbook.
  • Copy and paste column A from your disavow spreadsheet onto this new sheet. Or, alternatively, use the import function to import the entire CSV onto this sheet.
  • In B1, write “previously disavowed” and copy this down the entire column.
  • Remove the “domain:” from each of the entries by doing a Find and Replace to replace domain: with a blank.
  • Now go back to your link audit spreadsheet. If your domains are in column A and if you had, say, 1500 domains in your disavow file, your formula would look like this:

    =VLOOKUP(A1,Sheet2!$A$1:$B$1500,2,FALSE)

When you copy this formula down the spreadsheet, it will check each of your domains, and if it finds the domain in Sheet 2, it will write “previously disavowed” on our link audit spreadsheet.

Here is a gif that shows the process:
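In Python, the whole VLOOKUP step reduces to a set-membership test. A sketch; the inline CSV and the domain names are stand-ins for your real disavow export:

```python
import csv
import io

# Stand-in for the CSV Google exports: one "domain:" entry per row.
disavow_csv = io.StringIO("domain:spamdir.example\ndomain:linkfarm.example\n")

# Strip the "domain:" prefix to get a set of disavowed domains.
disavowed = {
    row[0].removeprefix("domain:").strip()
    for row in csv.reader(disavow_csv)
    if row
}

# Flag each domain in the audit list, like the VLOOKUP column would.
audit_domains = ["spamdir.example", "goodsite.example"]
for domain in audit_domains:
    status = "previously disavowed" if domain in disavowed else ""
    print(f"{domain}\t{status}")
```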

5. Make monthly or quarterly disavow work easier

That same formula described above is a great one to use if you are doing regular, repeated link audits. In this case, the second sheet of your spreadsheet would contain domains that you have previously audited, and column B would say “previously audited” rather than “previously disavowed.”

Your tips?

These are just a few of the formulas that you can use to help make link auditing work easier. But there are lots of other things you can do with Excel or Google Sheets to help speed up the process as well. If you have some tips to add, leave a comment below. Also, if you need clarification on any of these tips, I’m happy to answer questions in the comments section.


Reblogged 4 years ago from tracking.feedpress.it

How to perform an SEO Audit for your website

Performing an SEO audit of your website is important for many reasons: you can make an accurate evaluation of the health of your website, identify the specific areas that need to be improved,…

Reblogged 4 years ago from www.youtube.com

Why the Links You’ve Built Aren’t Helping Your Page Rank Higher – Whiteboard Friday

Posted by randfish

Link building can be incredibly effective, but sometimes a lot of effort can go into earning links with absolutely no improvement in rankings. Why? In today’s Whiteboard Friday, Rand shows us four things we should look at in these cases, helping us hone our link building skills and make the process more effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about why link building sometimes fails.

So I’ve got an example here. I’m going to do a search for artificial sweeteners. Let’s say I’m working for these guys, ScienceMag.org. Well, this is actually in position 10. I put it in position 3 here, but I see that I’m position 10. I think to myself, “Man, if I could get higher up on this page, that would be excellent. I’ve already produced the content. It’s on my domain. Like, Google seems to have indexed it fine. It’s performing well enough to perform on page one, granted at the bottom of page one, for this competitive query. Now I want to move my rankings up.”

So a lot of SEOs, naturally and historically, for a long time have thought, “I need to build more links to that page. If I can get more links pointing to this page, I can move up the rankings.” Granted, there are some other ways to do that too, and we’ve discussed those in previous Whiteboard Fridays. But links are one of the big ones that people use.

I think one of the challenges that we encounter is sometimes we invest that effort. We go through the process of that outreach campaign, talking to bloggers and other news sites and looking at where our link sources are coming from and trying to get some more of those. It just doesn’t seem to do anything. The link building appears to fail. It’s like, man, I’ve got all these nice links and no new results. I didn’t move up at all. I am basically staying where I am, or maybe I’m even falling down. Why is that? Why does link building sometimes work so well and so clearly and obviously, and sometimes it seems to do nothing at all?

What are some possible reasons link acquisition efforts may not be effective?

Oftentimes if you get a fresh set of eyes on it, an outside SEO perspective, they can do this audit, and they’ll walk through a lot of this stuff and help you realize, “Oh yeah, that’s probably why.” These are things that you might need to change strategically or tactically as you approach this problem. But you can do this yourself as well by looking at why a link building campaign, why a link building effort, for a particular page, might not be working.

1) Not the right links

First one: it’s not the right links. By “not the right links” I mean a wide range of things, even broader than what I’ve listed here. But a lot of times that could mean low domain diversity. Yeah, you’re getting new links, but they’re coming from all the same places that you always get links from. Google, potentially, maybe views that as not particularly worthy of moving you up the rankings, especially around competitive queries.

It might be trustworthiness of source. So maybe they’re saying “Yeah, you got some links, but they’re not from particularly trustworthy places.” Tied into that maybe we don’t think or we’re sure that they’re not editorial. Maybe we think they’re paid, or we think they’re promotional in some way rather than being truly editorially given by this independent resource.

They might not come from a site or from a page that has the authority that’s necessary to move you up. Again, particularly for competitive queries, sometimes low-value links are just that. They’re not going to move the needle, especially not like they used to three, four, five or six years ago, where really just a large quantity of links, even from diverse domains, even if they were crappy links on crappy pages on relatively crappy or unknown websites would move the needle, not so much anymore. Google is seeing a lot more about these things.

Where else does the source link to? Is that source pointing to other stuff that is potentially looking manipulative to Google and so they discounted the outgoing links from that particular domain or those sites or those pages on those sites?

They might look at the relevance and say, “Hey, you know what? Yeah, you got linked to by some technology press articles. That doesn’t really have anything to do with artificial sweeteners, this topic, this realm, or this region.” So you’re not getting the same result. Now, we’ve shown that off-topic links can oftentimes move the rankings, but health, in fact, may be one of those areas where Google is more topically sensitive to where the links are coming from than in other places.

Location on page. So I’ve got a page here and maybe all of my links are coming from a bunch of different domains, but it’s always in the right sidebar and it’s always in this little feed section. So Google’s saying, “Hey, that’s not really an editorial endorsement. That’s just them showing all the links that come through your particular blog feed or a subscription that they’ve got to your content or whatever it is promotionally pushing out. So we’re not going to count it that way.” Same thing a lot of times with footer links. Doesn’t work quite as well. If you’re being honest with yourself, you really want those in content links. Generally speaking, those tend to perform the best.

Or uniqueness. So they might look and they might say, “Yeah, you’ve got a ton of links from people who are republishing your same article and then just linking back to it. That doesn’t feel to us like an editorial endorsement, and so we’re just going to treat those copies as if those links didn’t exist at all.” But the links themselves may not actually be the problem. I think this can be a really important topic if you’re doing link acquisition auditing, because sometimes people get too focused on, “Oh, it must be something about the links that we’re getting.” That’s not always the case actually.

2) Not the right content

Sometimes it’s not the right content. So that could mean things like it’s temporally focused versus evergreen. So for different kinds of queries, Google interprets the intent of the searchers to be different. So it could be that when they see a search like “artificial sweeteners,” they say, “Yeah, it’s great that you wrote this piece about this recent research that came out. But you know what, we’re actually thinking that searchers are going to want in the top few results something that’s evergreen, that contains all the broad information that a searcher might need around this particular topic.”

That speaks to the possibility that it might not answer searchers’ questions. You might think, “Well, I’m answering a great question here.” The problem is, yeah, you’re answering one. Searchers may have many questions they’re asking around a topic, and Google is looking for something comprehensive, something that doesn’t leave a searcher clicking your result and then saying, “Well, that was interesting, but I need more from a different result.” They’re looking for the one true result, the one true answer that tells them, “Hey, searchers are very happy with these types of results.”

It could be poor user experience causing people to bounce back. That could be speed issues, UI issues, layout issues, browser support issues, multi-device support issues. It might not use language, formatting, or text that people or engines can interpret as on topic. Perhaps it’s way over people’s heads, far too scientifically focused, so most searchers can’t understand the language; or the other way around, it’s a highly scientific, very advanced search query and your language is way dumbed down. Google isn’t interpreting that as on topic; all the Hummingbird and topic-modeling kinds of things they have say this isn’t for them.

Or it might not match the expectations of searchers. This is distinct from searchers’ questions. A searcher’s question is, “I want to know how artificial sweeteners might affect me.” An expectation might be, “I expect to learn this kind of information. I expect to find out these things.” For example, if you go down a rabbit hole of claiming artificial sweeteners will make your skin shiny, they’re like, “Well, that doesn’t meet my expectation. I don’t think that’s right.” Even if you have some data around that, it’s not what they were expecting to find. They might bounce back, engines might not interpret you as on topic, etc. So there are lots of content kinds of things.

3) Not the right domain

Then there are also domain issues. You might not have the right domain. Your domain might not be associated with the topic or content that Google and searchers are expecting. So they see Mayo Clinic, they see MedicineNet, and they go, “ScienceMag? Do they do health information? I don’t think they do. I’m not sure if that’s an appropriate one.” It might be perceived, even if you aren’t, as spammy or manipulative by Google, more probably than by searchers. Or searchers just won’t click your brand for that content. This is a very frustrating one, because we have seen a ton of times when search behavior is biased by the brand itself, by what’s in this green text here, the domain name or the brand name that Google might show there. That’s very frustrating, but it means that you need to build brand affinity between that topic, that keyword, and what’s in searchers’ heads.

4) Accessibility or technical issues

Then finally, there could be some accessibility or technical issues. Usually when that’s the case, you will notice pretty easily because the page will have an error. It won’t show the content properly. The cache will be an issue. That’s a rare one, but you might want to check for it as well.

But hopefully, using this kind of an audit system, you can figure out why a link building effort isn’t working to move the needle on your rankings.

With that, we will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

The Nifty Guide to Local Content Strategy and Marketing

Posted by NiftyMarketing

This is my Grandma.

She helped raise me and I love her dearly. That chunky baby with the Gerber cheeks is me. The scarlet letter “A” means nothing… I hope.

This is a rolled up newspaper. 

rolled up newspaper

When I was growing up, I was the king of mischief and had a hard time following parental guidelines. To ensure the lessons she wanted me to learn “sunk in” my grandma would give me a soft whack with a rolled up newspaper and would say,

“Mike, you like to learn the hard way.”

She was right. I have spent my life and career learning things the hard way.

Local content has been no different. I started out my career creating duplicate local doorway pages using “find and replace” with city names. After getting whacked by the figurative newspaper a few times, I decided there had to be a better way. To spare others the struggles I experienced, I hope the hard lessons I have learned about local content strategy and marketing will save you from fearing a rolled-up newspaper the way I do.

Lesson one: Local content doesn’t just mean the written word

local content ecosystem

Content is everything around you. It all tells a story. If you don’t have a plan for how that story is being told, then you might not like how it turns out. In the local world, even your brick and mortar building is a piece of content. It speaks about your brand, your values, your appreciation of customers and employees, and can be used to attract organic visitors if it is positioned well and provides a good user experience. If you just try to make the front of a building look good, but don’t back up the inside inch by inch with the same quality, people will literally say, “Hey man, this place sucks… let’s bounce.”

I had this experience proved to me recently while conducting an interview at Nifty for our law division. Our office is the beautifully designed brick, mustache, animal-on-the-wall, leg-lamp-in-the-center-of-the-room piece of work you would expect from a creative company.

nifty offices idaho

Anywho, for our little town of Burley, Idaho, it is a unique space and helps set our business apart in our community. But the conference room has a fluorescent ballast light system that can buzz so loudly you can’t carry on a proper conversation at times; in the recent interviews, I literally had to conduct them in the dark because it was so bad.

I’m cheap and slow to spend money, so I haven’t gotten it fixed yet. The problem is I have two more interviews this week, and I am so embarrassed by the experience in that room that I am thinking of holding them offsite to ensure we don’t produce a bad content experience. What I need to do is just fix the light, but I will end up spending weeks going back and forth with the landlord on whose responsibility it is.

Meanwhile, the content experience suffers. Like I said, I like to learn the hard way.

Start thinking about everything in the frame of content and you will find that you make better decisions and fewer costly mistakes.

Lesson two: Scalable does not mean fast and easy growth

In every sales conversation I have had about local content, the question of scalability comes up. Usually, people want two things:

  1. Extremely Fast Production 
  2. Extremely Low Cost

While these two things would be great for every project, I have come to find that quality is rarely achieved when you optimize for fast production and low cost. A better way to look at scale is as follows:

The rate of growth in revenue/traffic is greater than the cost of continued content creation.

A good local content strategy at scale will create a model that looks like this:

scaling content graph

Lesson three: You need a continuous local content strategy

This is where the difference between local content marketing and content strategy kicks in. Creating a single piece of content that does well is fairly easy to achieve. Building a true scalable machine that continually puts out great local content and consistently tells your story is not. This is a graph I created outlining the process behind creating and maintaining a local content strategy:

local content strategy

This process is not a one-time thing. It is not a box to be checked off. It is a structure that should become the foundation of your marketing program and will need to be revisited, re-tweaked, and replicated over and over again.

1. Identify your local audience

Most of you reading this will already have a service or product and hopefully local customers. Do you have personas developed for attracting and retaining more of them? Here are some helpful tools available to give you an idea of how many people fit your personas in any given market.

Facebook Insights

Pretend for a minute that you live in the unique market of Utah and have a custom wedding dress line. You focus on selling modest wedding dresses. It is a definite niche product, but one that shows the idea of personas very well.

You have interviewed your customer base and found a few interests your customers share. Taking that information and putting it into Facebook Insights will give you a plethora of data to help you build out your understanding of a local persona.

facebook insights data

We are able to see from the interests of our customers that there are roughly 6k-7k currently engaged women in Utah who have similar interests to our customer base.

The location tab gives us a breakdown of the specific cities and, understandably, Salt Lake City has the highest percentage, with Provo (home of BYU) in second place. You can also see pages this group likes, activity levels on Facebook, and household income and spending habits. If you want to find more potential locations for future growth, you can open up the search to a region or country.

localized facebook insights data

From this data it’s apparent that Arizona would be a great expansion opportunity after Utah.

Nielsen PRIZM

Nielsen offers a free and extremely useful tool for local persona research called Zip Code Lookup that allows you to identify pre-determined personas in a given market.

Here is a look at my hometown; the personas they have developed are dead on.

Nielsen PRIZM data

Each persona can be expanded to learn more about the traits, income level, and areas across the country with other high concentrations of the same persona group.

You can also use the segment explorer to get a better idea of pre-determined persona lists and can work backwards to determine the locations with the highest density of a given persona.

Google Keyword Planner Tool

The keyword tool is fantastic for local research. Using our same Facebook Insights data above, we can match keyword search volume against the audience size to determine how actively our persona researches and purchases the product. In the case of engaged women looking for dresses, it is a very active group, with a potential 20-30% actively searching online for a dress.

google keyword planner tool
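As a rough sketch of the arithmetic behind that estimate, here is a minimal Python version. All the figures below are illustrative stand-ins, not real Keyword Planner or Facebook Insights numbers.

```python
# Estimate what share of a local persona is actively searching,
# given monthly keyword search volume and the persona audience size.

def active_search_rate(monthly_searches: int, audience_size: int) -> float:
    """Share of the persona actively searching for the product."""
    if audience_size <= 0:
        return 0.0
    return monthly_searches / audience_size

# ~6,500 engaged women in Utah matching the persona (from Insights),
# ~1,600 hypothetical monthly searches for modest-wedding-dress terms.
rate = active_search_rate(1600, 6500)
print(f"{rate:.0%} of the persona is actively searching")
```

With these made-up inputs the rate lands around 25%, in line with the 20-30% range mentioned above; swap in your own keyword and audience data to run the same check for your market.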

2. Create goals and rules

I think the most important idea for creating the goals and rules around your local content is the following, from the must-read book Content Strategy for the Web.

You also need to ensure that everyone who will be working on things even remotely related to content has access to style and brand guides and, ultimately, understands the core purpose for what, why, and how everything is happening.

3. Audit and analyze your current local content

The point of this step is to determine how your current content stacks up against the goals and rules you established, and to determine the value of the current pages on your site. With tools like Siteliner (for finding duplicate content) and Screaming Frog (for identifying page titles, word counts, error codes, and many other things), you can gather a lot of information very fast. Beyond that, a few tools deserve a more in-depth look.
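To make the duplicate-content check concrete, here is a minimal Python sketch of the kind of comparison a tool like Siteliner automates: hash each page’s text and group URLs that share a hash. The URLs and page text below are made-up examples, and real tools do fuzzier near-duplicate matching than this exact-match version.

```python
# Group pages with byte-identical (after normalization) body text,
# the simplest form of duplicate-content detection.
import hashlib
from collections import defaultdict

pages = {
    "/plumber-boise": "We are the best local plumber. Call today.",
    "/plumber-provo": "We are the best local plumber. Call today.",
    "/about": "Our family-run shop opened in 1998.",
}

groups = defaultdict(list)
for url, text in pages.items():
    digest = hashlib.sha256(text.strip().lower().encode()).hexdigest()
    groups[digest].append(url)

# Any hash shared by more than one URL is a duplicate-content cluster.
duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # [['/plumber-boise', '/plumber-provo']]
```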

BuzzSumo

With BuzzSumo you can see the social data and incoming links behind the important pages on your site. This can give you a good idea of which locations or areas are getting more promotion than others, and help you identify some of the causes.

BuzzSumo also gives you access to competitors’ information, where you might find some new ideas. In the following example you can see that one of Airbnb.com’s most-shared pages was a motion graphic of its impact on Berlin.

Buzzsumo

urlProfiler

This is another great tool for scraping URLs on large sites, and it can return about every type of measurement you could want. For sites with thousands of pages, it can save hours of data gathering and spits out a nicely formatted CSV that lets you sort by word count, page authority, link counts, social shares, or about anything else you could imagine.

url profiler
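Once you have an export like that, even the Python standard library is enough to triage it, for example surfacing thin local pages first. A small sketch; the column names and rows are assumptions about the export format, not the tool’s actual schema:

```python
# Sort a crawl-export CSV by word count to surface thin pages first.
import csv
import io

# Stand-in for open("export.csv") with hypothetical columns.
export = io.StringIO(
    "URL,Word Count,Social Shares\n"
    "/locations/boise,120,4\n"
    "/locations/salt-lake,900,210\n"
    "/locations/provo,45,0\n"
)

rows = list(csv.DictReader(export))
thin_first = sorted(rows, key=lambda r: int(r["Word Count"]))
for row in thin_first:
    print(row["URL"], row["Word Count"])
```

The same `sorted(..., key=...)` pattern works for any numeric column in the export, so one loop over the file can feed several prioritized to-do lists.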

4. Develop local content marketing tactics

This is how most of you look when marketing tactics are brought up.

monkey

Let me remind you of something with a picture. 

rolled up newspaper

Do not start with tactics. Do the other things first. It will ensure your marketing tactics fall in line with a much bigger organizational movement and process. With the warning out of the way, here are a few tactics that could work for you.

Local landing page content

Our initial concept of local landing pages has stood the test of time. If you are scared to even think about local pages with the upcoming doorway page update then please read this analysis and don’t be too afraid. Here are local landing pages that are done right.

Marriott local content

Marriott’s Burley local page is great. They didn’t just think about ensuring they had 500 unique words. They have custom local imagery of the exterior and interior, detailed information about the area’s activities, and even their own review platform that showcases both positive and negative reviews, with responses from local management.

If you can’t build your own platform handling reviews like that, might I recommend looking at Get Five Stars as a platform that could help you integrate reviews as part of your continuous content strategy.

Airbnb Neighborhood Guides

I not-so-secretly have a big crush on Airbnb’s approach to local. These neighborhood guides started it. They have only roughly 21 guides thus far and handle one at a time, with Seoul being the most recent addition. The idea is simple: they looked at extremely hot markets and built out guides not just for the city, but down to the specific neighborhood.

air bnb neighborhood guides

Here is a look at Hell’s Kitchen in New York through imagery. They hire a local photographer to shoot the area, then take some of their currently popular listing data and reviews and integrate them into the page. This idea would never have flown if they only cared about creating content that was fast and easy for every market they serve.

Reverse infographicing

Every decently sized city has had a plethora of infographics made about it. People spent the time curating information and coming up with the concepts, but the majority just made the image and didn’t think about crawlability or the page title from an SEO standpoint.

Here is an example of an image search for Portland infographics.

image search results portland infographics

Take an infographic and repurpose it into crawlable content with a new twist or timely additions. Usually infographics share their data sources in the footer so you can easily find similar, new, or more information and create some seriously compelling data based content. You can even link to or share the infographic as part of it if you would like.

Become an Upworthy of local content

No one I know does this better than Movoto. Read the link for their own spin on how they did it and then look at these examples and share numbers from their local content.

60k shares in Boise by appealing to that hometown knowledge.

movoto boise content

65k shares in Salt Lake following the same formula.

movoto salt lake city content

It seems to work with video as well.

movoto video results

Think like a local directory

Directories understand where content should be housed. Not every local piece should be on the blog. Look at where Trip Advisor’s famous “Things to Do” page is listed. Right on the main city page.

trip advisor things to do in salt lake city

Or look at how many timely, fresh, quality pieces of content Yelp is showcasing from their main city page.

yelp main city page

The key point to understand is that local content isn’t just about being unique on a landing page. It is about BEING local and useful.

Ideas of things that are local:

  • Sports teams
  • Local celebrities or heroes 
  • Groups and events
  • Local pride points
  • Local pain points

Ideas of things that are useful:

  • Directions
  • Favorite local spots
  • Granular details only “locals” know

The other point to realize, looking back at our definition of scale, is that you don’t need to take shortcuts that un-localize the experience for users. Figure out and test one location at a time until you have a winning formula, then move forward at a speed that ensures a quality local experience.

5. Create a content calendar

I am not going to get into telling you exactly how or what your content calendar needs to include. That will largely be based on the size and organization of your team and every situation might call for a unique approach. What I will do is explain how we do things at Nifty.

  1. We follow the steps above.
  2. We schedule the big projects and timelines first. These could be months out or weeks out. 
  3. We determine the weekly deliverables, checkpoints, and publish times.
  4. We put all of the information as tasks assigned to individuals or teams in Asana.

asana content calendar

The information can then be viewed by individual, team, group of teams, due date, or any other way you wish to sort. Repeatable tasks can be scheduled, and our entire operation is visible to as many people as need access to the information, through desktop or mobile devices. That is what works for us.

6. Launch and promote content

My personal favorite way to promote local content (other than the obvious ideas of sharing with your current followers or outreaching to local influencers) is to use Facebook ads to target the specific local personas you are trying to reach. Here is an example:

I just wrapped up playing Harold Hill in our community’s production of The Music Man. When you live in a small town like Burley, Idaho, you get the opportunity to play a lead role without having too much talent or a glee-based upbringing. You also get the opportunity to do all of the advertising, set design, and costuming yourself, and sometimes you even get to pay for it.

For my advertising responsibilities, I decided to write a few blog posts and drive traffic to them. As any good Harold Hill would do, I used fear tactics.

music man blog post

I then created Facebook ads with the following stats: a cost of $0.06 per click, a 12.7% click-through rate, and natural organic sharing that led to thousands of visits in a small Idaho farming community where people still think a phone book is the only way to find local businesses.

facebook ads setup
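The math behind stats like those is easy to sanity-check: cost per click backs out clicks from spend, and click-through rate backs out impressions from clicks. A small sketch; the spend figure here is hypothetical, not the campaign’s actual budget:

```python
# Back out estimated clicks and impressions from spend, CPC, and CTR.

def campaign_estimates(spend: float, cpc: float, ctr: float):
    """Return (clicks, impressions) implied by the campaign stats."""
    clicks = spend / cpc          # e.g. $100 at $0.06/click
    impressions = clicks / ctr    # CTR as a fraction, e.g. 0.127
    return round(clicks), round(impressions)

clicks, impressions = campaign_estimates(spend=100.0, cpc=0.06, ctr=0.127)
print(clicks, "clicks from roughly", impressions, "impressions")
```

At the post’s $0.06 CPC and 12.7% CTR, even a modest hypothetical budget buys well over a thousand clicks, which is why hyper-local Facebook targeting punched so far above its weight here.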

Then we did it again.

There was a protestor in Burley for over a year who parked a red pickup with signs saying things like, “I wud not trust Da Mayor” or “Don’t Bank wid Zions”. Basically, you weren’t working hard enough if your name didn’t get on the truck during the year.

Everyone knew that ol’ red pickup as it was parked on the corner of Main and Overland, which is one of the few stoplights in town. Then one day it was gone. We came up with the idea to bring the red truck back, put signs on it that said, “I wud Not Trust Pool Tables” and “Resist Sins n’ Corruption” and other things that were part of The Music Man and wrote another blog complete with pictures.

facebook ads red truck

Then I created another Facebook Ad.

facebook ads set up

A little under $200 in ad spend resulted in thousands more visits to the site, which promoted the play and sold tickets to a generation that might not otherwise have been very familiar with the show.

All of it was local targeting, and there was no other way we could have driven that much traffic in a community like Burley without paying Facebook and creating click-bait ads in the hope that the promotion would lead to organic sharing.

7. Measure and report

This is another very personal step where everyone will have different needs. At Nifty we put together very custom weekly or monthly reports that cover all of the plan, execution, and relevant stats such as traffic to specific content or location, share data, revenue or lead data if available, analysis of what worked and what didn’t, and the plan for the following period.

There is no exact set of data that needs to be shared. Everyone will want something slightly different, which is why we moved away from automated reporting years ago (about when we moved away from auto link building… hehe) and built our reports around our clients, even if it took added time.

I have always said that the product of an SEO or content shop is the report. That is what people buy, because it is likely all they will see or understand.

8. In conclusion, you must refine and repeat the process

local content strategy - refine and repeat

From my point of view, this is by far the most important step, and it sums everything up nicely. This process model isn’t perfect. There will be things that are missed, things that need tweaking, and ways you will be able to improve your local content strategy and marketing all the time. The idea of the cycle is that it is never done. It never sleeps. It never quits. It never surrenders. You just keep perfecting the process until you reach the point that few locally focused companies ever achieve: your local content reaches and grows your target audience every time you click the publish button.



Check Your Local Business Listings in the UK

Posted by David-Mihm

One of the most consistent refrains from the Moz community as we’ve released features over the last two years has been the desire to see Moz Local expand to countries outside the U.S. Today I’m pleased to announce that we’re embarking on our journey to global expansion with support for U.K. business listing searches in our Check Listing tool.

Some of you may remember limited U.K. functionality as part of GetListed.org, but as a very small company we couldn’t keep up with the maintenance required to present reliable results. It’s taken us longer than we would have liked to get here, but now with more resources, the Moz Local team has the bandwidth and important experience from the past year of Moz Local in the U.S. to fully support U.K. businesses.

How It Works

We’ve updated our search feature to accept both U.S. and U.K. postal codes, so just head on over to moz.com/local/search to check it out!

After entering the name of your business and a U.K. postcode, we go out and ping Google and other important local search sites in the U.K., and return what we found. Simply select the closest-matching business and we’ll proceed to run a full audit of your listings across these sites.

You can click through and discover incomplete listings, inconsistent NAP information, duplicate listings, and more.

This check listing feature is free to all Moz community members.

You’ve no doubt noted in the screenshot above that we project a listing score improvement. We do plan to release a fully-featured U.K. version of Moz Local later this spring (with the same distribution, reporting, and duplicate-closure features that are available in the U.S.), and you can enter your email address—either on that page or right here—to be notified when we do!


U.K.-Specific Partners

As I’ve mentioned in previous blog comments, there are a certain number of global data platforms (Google, Facebook, Yelp, Bing, Foursquare, and Factual, among others) where it’s valuable to be listed correctly and completely no matter which country you’re in.

But every country has its own unique set of domestically relevant players as well, and we’re pleased to have worked with two of them on this release: Central Index and Thomson Local. (Head on over to the Moz Local Learning Center for more information about country-specific data providers.)

We’re continuing discussions with a handful of other prospective data partners in the U.K. If you’re interested in working with us, please let us know!

What’s Next?

Requests for further expansion, especially to Canada and Australia, will no doubt be loud and clear in the comments below! Further expansion is on our roadmap, but it’s balanced against building a more complete feature set in the (more populous) U.S. and U.K. markets. We’ll continue to use our experience in those markets as we prioritize when and where to expand next.

A few lucky members of the Moz Local team are already on their way to BrightonSEO. So if you’re attending that awesome event later this week, please stop by our booth and let us know what you’d like to see us work on next.



Website Optimization Company | Web Presence Group

Get a free SEO audit from the Web Presence Group, a professional and progressive website optimization company. http://www.webpresencegroup.net/website-optimi…

Reblogged 4 years ago from www.youtube.com

SEMrush Webinar Four Basic SEO Changes. BIG Impact.

Join SEMrush for this special guest webinar with Heather Lloyd Martin of SuccessWorks as she teaches you how to use SEMrush’s site audit tool to uncove…
