How to Find Your True Local Competitors

Posted by MiriamEllis

Who are your clients’ true competitors?

It’s a question that’s become harder to answer. What felt like a fairly simple triangulation between Google, brand, and searcher in the early days of the local web has multiplied into a geodesic dome of localization, personalization, intent matching, and other facets.

This evolution from a simple shape to a more complex one has the local SEO industry learning to talk about trends and patterns rather than fixed, absolute rankings.

For instance, you might notice that you just can’t deliver client reports that say, “Congratulations, you’re #1” anymore. And that’s because the new reality is that there is no #1 for all searchers. A user on the north side of town may see a completely different local pack of results if they go south, or if they modify their search language. An SEO may get a whole different SERP if they search on one rank checking tool vs. another — or even on the same tool, just five minutes later.

Despite all this, you still need to analyze and report — it remains a core task to audit a client’s competitive landscape.

 Today, let’s talk about how we can distill this dynamic, complex environment down to the simplest shapes to understand who your client’s true competitors are. I’ll be sharing a spreadsheet to help you and your clients see the trends and patterns that can create the basis for competitive strategy.

Why are competitive audits necessary…and challenging?

Before we dive into a demo, let’s sync up on what the basic point is of auditing local competitors. Essentially, you’re seeking contrast — you stack up two brands side-by-side to discover the metrics that appear to be making one of them dominant in the local or localized organic SERPs.

From there, you can develop a strategy to emulate the successes of the current winner with the goal of meeting and then surpassing them with superior efforts.

But before you start comparing your brand A to their brand B, you’ve got to know who brand B actually is. What obstacles do you face?

1. SERPs are incredibly diversified

A recent STAT whitepaper that looked at 1.2 million keywords says it all: every SERP is a local SERP. And since local packs and organic results are both subject to the whims of geo-location and geo-modification, incorporating both into your tracking strategy is a must.

To explain, imagine two searchers sitting on the same couch. One searches for “Mexican restaurant” and the other searches for “Mexican restaurant near me”. Then they try “Mexican restaurant near me” vs. “Mexican restaurant in San Jose”, and so on. The local packs they see are only about 80 percent similar, because Google reads different intent into each phrasing. That’s significant variability.

    The scenario gets even more interesting when one of the searchers gets up and travels across town to a different zip code. At that point, the two people making identical queries can see local packs that range from only about 26–65 percent similar. In other words, quite different.
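The whitepaper doesn’t spell out exactly how that similarity is computed, but a simple overlap measure makes the idea concrete. The sketch below is my own illustration (not STAT’s methodology), using business names that appear later in this post:

```python
def pack_similarity(pack_a, pack_b):
    """Rough overlap between two local packs, from 0.0 to 1.0.

    Treats each pack as a set of business names. This is an illustrative
    measure (Jaccard overlap), not STAT's actual methodology.
    """
    a, b = set(pack_a), set(pack_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two searchers on the same couch, slightly different queries:
pack_for_plain_query = ["Mi Casa", "El Juan's", "Plaza Azul"]
pack_for_near_me = ["Mi Casa", "El Juan's", "Taco Bell"]
print(pack_similarity(pack_for_plain_query, pack_for_near_me))  # 0.5
```

By this measure, two packs that share two of their three businesses are already down to 50 percent similarity, which is why a small change in wording or location can reshuffle who you appear to be competing against.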

Now, let’s say your client wants to rank for seven key phrases — like “Mexican restaurant,” “Mexican restaurant near me,” “Mexican restaurant San Jose,” “best Mexican restaurant,” “cheap Mexican restaurant,” and so on. Your client doesn’t have just three businesses to compete against in the local pack; across seven packs, they could be facing as many as 21 different competitors.

2. Even good rank tracking tools can be inconsistent

    There are many useful local rank tracking tools out there, and one of the most popular comes to us from BrightLocal. I really like the super easy interface of this tool, but there is a consistency issue with this and other tools I’ve tried, which I’ve captured in a screenshot, below.

Here I’m performing the same search at 5-minute intervals, showing how the reported localized organic ranking of a single business varies widely over time.

    The business above appears to move from position 5 to position 12. This illustrates the difficulty of answering the question of who is actually the top competitor when using a tool. My understanding is that this type of variability may result from the use of proxies. If you know of a local rank checker that doesn’t do this, please let our community know in the comments.

    In the meantime, what I’ve discovered in my own work is that it’s really hard to find a strong and consistent substitute for manually checking which competitors rank where, on the ground. So, let’s try something out together.

    The simplest solution for finding true competitors

    Your client owns a Mexican restaurant and has seven main keyword phrases they want to compete for. Follow these five easy steps:

    Step 1: Give the client a local pack crash course

If the client doesn’t already know, teach them how to perform a search on Google and recognize what a local pack is. Show them how businesses in the pack rank 1, 2, and 3. If they have more questions about local packs, how they show up in results, and how Google ranks content, they can check out our updated Beginner’s Guide to SEO.

    Step 2: Give the client a spreadsheet and a tiny bit of homework

    Give the client a copy of this free spreadsheet, filled out with their most desired keyword phrases. Have them conduct seven searches from a computer located at their place of business* and then fill out the spreadsheet with the names of the three competitors they see for each of the seven phrases. Tell them not to pay attention to any of the other fields of the spreadsheet.

    *Be sure the client does this task from their business’ physical location as this is the best way to see what searchers in their area will see in the local results. Why are we doing this? Because Google weights proximity of the searcher-to-the-business so heavily, we have to pretend we’re a searcher at or near the business to emulate Google’s “thought process”.

    Step 3: Roll up your sleeves for your part of the work

Now it’s your turn. In Google, search for “directions” to bring up the directions tool (or simply open Google Maps).

Enter your client’s business address and the address of their first competitor. Write down the distance in the spreadsheet. Repeat for every entry in each of the seven local packs. Covering all 21 locations should take you roughly 10–15 minutes, so make sure you’re doing it on company time.
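If you’d rather script those lookups than click through Google 21 times, a straight-line (haversine) calculation between coordinates gives a quick first approximation. This is a sketch of my own, not part of the workflow above, and it returns as-the-crow-flies miles rather than driving distance, so treat the output as a rough proxy:

```python
from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Straight-line (haversine) distance in miles between two points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # 3958.8 = Earth's radius in miles

# Hypothetical coordinates for the client and one competitor in San Jose:
client = (37.3382, -121.8863)
competitor = (37.3688, -121.9130)
print(round(miles_between(*client, *competitor), 1))
```

Driving distances from Google will usually come out a little longer, but the straight-line figures are typically close enough to reveal how far Google is reaching for each pack.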

    Step 4: Get measuring

    Now, in the 2nd column of the spreadsheet, note down the greatest distance Google appears to be going to fill out the results for each pack.

    Step 5: Identify competitors by strength

    Finally, rate the competitors by the number of times each one appears across all seven local packs. Your spreadsheet should now look something like this:

    Looking at the example sheet above, we’ve learned that:

    • Mi Casa and El Juan’s are the dominant competitors in your client’s market, ranking in 4/7 packs. Plaza Azul is also a strong competitor, with a place in 3/7 packs.
    • Don Pedro’s and Rubio’s are noteworthy with 2/7 pack appearances.
    • All the others make just one pack appearance, making them basic competitors.
    • The radius to which Google is willing to expand to find relevant businesses varies significantly, depending on the search term. While they’re having to go just a couple of miles to find competitors for “Mexican restaurant”, they’re forced to go more than 15 miles for a long tail term like “organic Mexican restaurant”.
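If you want to reproduce that tally programmatically, or sanity-check the spreadsheet math, here’s a minimal sketch. The pack contents and distances below are illustrative: the named competitors and the 4/7, 3/7, and 2/7 counts mirror the example above, while the seventh phrase and the one-off restaurant names are invented to fill out the data.

```python
from collections import Counter

# keyword phrase -> (businesses in its local pack, farthest pack member in miles)
packs = {
    "Mexican restaurant":          (["Mi Casa", "El Juan's", "Plaza Azul"], 2.1),
    "Mexican restaurant near me":  (["Mi Casa", "El Juan's", "Don Pedro's"], 3.4),
    "Mexican restaurant San Jose": (["Plaza Azul", "Rubio's", "Mi Casa"], 4.0),
    "best Mexican restaurant":     (["El Juan's", "Mi Casa", "Don Pedro's"], 5.2),
    "cheap Mexican restaurant":    (["Taco Bell", "Rubio's", "El Juan's"], 2.8),
    "Mexican restaurant open now": (["Plaza Azul", "La Fiesta", "El Grande"], 6.5),
    "organic Mexican restaurant":  (["Green Agave", "Casa Verde", "El Molino"], 15.3),
}

# Count how many of the seven packs each competitor appears in
appearances = Counter(name for businesses, _ in packs.values() for name in businesses)

print("Competitors by pack appearances:")
for name, count in appearances.most_common():
    print(f"  {name}: {count}/7 packs")

print("\nFarthest distance Google reached per phrase:")
for phrase, (_, radius) in packs.items():
    print(f"  {phrase}: {radius} miles")
```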

    You now know who the client’s direct competitors are for their most desired searches, and how far Google is willing to go to make up a local pack for each term. You have discovered a pattern of most dominant competition across your client’s top phrases, signaling which players need to be audited to yield clues about which elements are making them so strong.

    The pros and cons of the simple search shape

    The old song says that it’s a gift to be simple, but there are some drawbacks to my methodology, namely:

    • You’ll have to depend on the client to help you out for a few minutes, and some clients are not very good at participation, so you’ll need to convince them of the value of their doing the initial searches for you.
    • Manual work is sometimes tedious.
    • Scaling this for a multi-location enterprise would be time-consuming.
    • Some of your clients are going to be located in large cities and will want to know what competitors are showing up for users across town and in different zip codes. Sometimes, it will be possible to compete with these differently-located competitors, but not always. At any rate, our approach doesn’t cover this scenario and you will be stuck with either using tools (with their known inconsistencies), or sending the client across town to search from that locale. This could quickly become a large chore.

    Negatives aside, the positives of this very basic exercise are:

    • Instead of tying yourself to the limited vision of a single local pack and a single set of competitors, you are seeing a trend, a pattern of dominant market-wide competitors.
    • You will have swiftly arrived at a base set of dominant, strong, and noteworthy competitors to audit, with the above-stated goal of figuring out what’s helping them to win so that you can create a client strategy for emulating and surpassing them.
    • Your agency will have created a useful view of your client’s market, understanding the difference between businesses that seem very embedded (like Mi Casa) across multiple packs, vs. those (like Taco Bell) that are only one-offs and could possibly be easier to outpace.
    • You may discover some extremely valuable competitive intel for your client. For example, if Google is having to cast a 15-mile net to find an organic Mexican restaurant, what if your client started offering more organic items on their menu, writing more about this and getting more reviews that mention it? This will give Google a new option, right in town, to consider for local pack inclusion.
    • It’s really quite fast to do for a single-location business.
    • Client buy-in should be a snap for any research they’ve personally helped on, and the spreadsheet should be something they can intuitively and immediately understand.

    My questions for you

    I’d like to close by asking you some questions about your work doing competitive audits for local businesses. I’d be truly interested in your replies as we all work together to navigate the complex shape of Google’s SERPs:

    1. What percentage of your clients “get” that Google’s results have become so dynamic, with different competitors being shown for different queries and different packs being based on searcher location? What percentage of your clients are “there yet” with this concept vs. the old idea of just being #1, period?
    2. I’ve offered you a manual process for getting at trustworthy data on competitors, but as I’ve said, it does take some work. If something could automate this process for you, especially for multi-location clients, would you be interested in hearing more about it?
    3. How often do you do competitive audits for clients? Monthly? Every six months? Annually?

    Thanks for responding, and allow me to wish you and your clients a happy and empowering audit!


    5 local search tactics your competitors probably aren’t using

    When you and your competitors are all adhering to local SEO best practices, how can you differentiate your business from the rest? Columnist Sherry Bonelli has some ideas.


    Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

    Posted by randfish

    There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

    For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

    Video transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

    There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

    Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

    Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

    First off, some of the good points he did bring up.

    One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

    Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

    And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

    You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

    But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

    First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

    So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.
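As a deliberately simplified illustration of that kind of diagnosis, the sketch below fetches a page’s raw HTML and checks whether a given sentence is present before any JavaScript runs. The function name and URL are my own placeholders, and a crawler that renders JavaScript may still see content this check misses, so treat it as a first-pass signal only:

```python
import requests  # third-party: pip install requests

def sentence_in_raw_html(url, sentence):
    """Return True if `sentence` appears in the raw HTML response.

    If it's missing here but visible in the browser, the content is
    probably injected after load (remote file, Ajax, JS framework),
    which is exactly the situation described above.
    """
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return sentence.lower() in response.text.lower()

# Hypothetical usage:
# print(sentence_in_raw_html("https://example.com/widgets",
#                            "Our hand-built widgets ship worldwide"))
```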

    Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. He mentions two sources: one is things that Google tells us, and the other is SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

    So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

    And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

    Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

    The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

    Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

    I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

    There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five (and sort of a half) steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

    • Good on-site experience
    • Writing good content
    • Getting others to acknowledge you as an authority
    • Rising in social popularity
    • Earning local relevance
• Dealing with modern CMS systems (he notes that most modern CMS systems are SEO-friendly)

    The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, in the 10 minutes before we filmed this video, I brainstormed a list just off the top of my head. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and try to be absolutely complete.

    But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

    I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

    • Content rendering and indexability
    • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
• Disabling crawling and/or indexing of thin, incomplete, or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we block them in robots.txt? Should we noindex them with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the settings inside Google Webmaster Tools, which is now Google Search Console?
    • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
• Proper error codes, downtime procedures, and not-found pages. If your 404 pages all turn out to be 200 pages, you’ve made a big error there: Google could be crawling tons of 404 pages that it thinks are real pages, because you’ve returned a status code 200. Or you’ve used a 404 code when you should have used a 410, which signals permanently removed content and gets it completely out of the index, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. There’s a specific 5xx code for this: 503 Service Unavailable, which tells search engines, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than a 404, which tells them, “This page is now an error.”

    Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
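To make those status-code distinctions concrete, here’s a minimal sketch (using Flask purely as an illustration; the same logic applies to any server or CMS) of returning a real 404 for a missing page, a 410 for permanently removed content, and a 503 with a Retry-After header during planned downtime:

```python
from flask import Flask, abort

app = Flask(__name__)
MAINTENANCE_MODE = False          # flip to True during planned downtime
REMOVED_SLUGS = {"old-promo"}     # hypothetical content that is gone for good

@app.route("/<slug>")
def page(slug):
    if MAINTENANCE_MODE:
        # 503 tells crawlers "come back later" instead of "this is an error"
        return "Down for maintenance", 503, {"Retry-After": "3600"}
    if slug in REMOVED_SLUGS:
        abort(410)                # permanently removed: drop it from the index
    if slug not in {"home", "products"}:
        abort(404)                # genuinely not found: don't return a soft 200
    return f"Content for {slug}"
```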

• International and multi-language targeting issues. I won’t go into that, but you have to know the protocols there.
• Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content and put it on their site: what should we do? Someone’s scraping our content: what can we do? We have duplicate content on our own site: what should we do?
    • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics connecting those up and trying to see: Why did we go up or down? Did we have less pages being indexed, more pages being indexed, more pages getting traffic less, more keywords less?
• Understanding advanced search parameters. Today, just today, I was checking out the related: parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site (well, most other sites on the web) it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like inurl: and inanchor:, and so on. (inanchor: doesn’t work anymore, never mind about that one.)

I have to go faster, because we’re just going to run out of these. Like, come on.

• Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you that you have all sorts of errors, and you don’t know what they are.

    • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
    • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
    • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
    • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
• Using APIs from services like AdWords or MozScape, or data from tools like Ahrefs, Majestic, or SEMrush, or the Alchemy API. Those APIs can do powerful things for your site; there are some painful problems they can help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there (see the sketch after this list). There’s a bunch of good tutorials out there: Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
    • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
    • Diagnosing mobile friendliness issues
    • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
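On the API point in the list above: the mechanics of pulling data really are modest. The endpoint, parameters, and key below are placeholders rather than any provider’s real interface, but the overall shape of the script changes very little from one service to another:

```python
import requests  # third-party: pip install requests

# Placeholder endpoint and key: swap in the real provider's URL structure,
# parameters, and auth scheme from its own API documentation.
API_URL = "https://api.example-seo-provider.com/v1/url-metrics"
API_KEY = "YOUR_API_KEY"

def fetch_metrics(target_url):
    """Fetch metrics for one URL from a hypothetical JSON API."""
    response = requests.get(
        API_URL,
        params={"target": target_url, "key": API_KEY},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

for url in ["https://www.example.com", "https://www.competitor-example.com"]:
    print(url, fetch_metrics(url))
```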

    Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

    So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

    All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com


    Distance from Perfect

    Posted by wrttnwrd

    In spite of all the advice, the strategic discussions and the conference talks, we Internet marketers are still algorithmic thinkers. That’s obvious when you think of SEO.

    Even when we talk about content, we’re algorithmic thinkers. Ask yourself: How many times has a client asked you, “How much content do we need?” How often do you still hear “How unique does this page need to be?”

    That’s 100% algorithmic thinking: Produce a certain amount of content, move up a certain number of spaces.

    But you and I know it’s complete bullshit.

    I’m not suggesting you ignore the algorithm. You should definitely chase it. Understanding a little bit about what goes on in Google’s pointy little head helps. But it’s not enough.

    A tale of SEO woe that makes you go “whoa”

    I have this friend.

    He ranked #10 for “flibbergibbet.” He wanted to rank #1.

    He compared his site to the #1 site and realized the #1 site had five hundred blog posts.

    “That site has five hundred blog posts,” he said, “I must have more.”

So he hired a few writers and cranked out five thousand blog posts that melted Microsoft Word’s grammar check. He didn’t move up in the rankings. I’m shocked.

    “That guy’s spamming,” he decided, “I’ll just report him to Google and hope for the best.”

    What happened? Why didn’t adding five thousand blog posts work?

    It’s pretty obvious: My, uh, friend added nothing but crap content to a site that was already outranked. Bulk is no longer a ranking tactic. Google’s very aware of that tactic. Lots of smart engineers have put time into updates like Panda to compensate.

    He started like this:

    And ended up like this:
    more posts, no rankings

    Alright, yeah, I was Mr. Flood The Site With Content, way back in 2003. Don’t judge me, whippersnappers.

    Reality’s never that obvious. You’re scratching and clawing to move up two spots, you’ve got an overtasked IT team pushing back on changes, and you’ve got a boss who needs to know the implications of every recommendation.

    Why fix duplication if rel=canonical can address it? Fixing duplication will take more time and cost more money. It’s easier to paste in one line of code. You and I know it’s better to fix the duplication. But it’s a hard sell.

    Why deal with 302 versus 404 response codes and home page redirection? The basic user experience remains the same. Again, we just know that a server should return one home page without any redirects and that it should send a ‘not found’ 404 response if a page is missing. If it’s going to take 3 developer hours to reconfigure the server, though, how do we justify it? There’s no flashing sign reading “Your site has a problem!”

    Why change this thing and not that thing?

    At the same time, our boss/client sees that the site above theirs has five hundred blog posts and thousands of links from sites selling correspondence MBAs. So they want five thousand blog posts and cheap links as quickly as possible.

    Cue crazy music.

    SEO lacks clarity

    SEO is, in some ways, for the insane. It’s an absurd collection of technical tweaks, content thinking, link building and other little tactics that may or may not work. A novice gets exposed to one piece of crappy information after another, with an occasional bit of useful stuff mixed in. They create sites that repel search engines and piss off users. They get more awful advice. The cycle repeats. Every time it does, best practices get more muddled.

    SEO lacks clarity. We can’t easily weigh the value of one change or tactic over another. But we can look at our changes and tactics in context. When we examine the potential of several changes or tactics before we flip the switch, we get a closer balance between algorithm-thinking and actual strategy.

    Distance from perfect brings clarity to tactics and strategy

    At some point you have to turn that knowledge into practice. You have to take action based on recommendations, your knowledge of SEO, and business considerations.

    That’s hard when we can’t even agree on subdomains vs. subfolders.

    I know subfolders work better. Sorry, couldn’t resist. Let the flaming comments commence.

    To get clarity, take a deep breath and ask yourself:

    “All other things being equal, will this change, tactic, or strategy move my site closer to perfect than my competitors?”

    Breaking it down:

    “Change, tactic, or strategy”

    A change takes an existing component or policy and makes it something else. Replatforming is a massive change. Adding a new page is a smaller one. Adding ALT attributes to your images is another example. Changing the way your shopping cart works is yet another.

    A tactic is a specific, executable practice. In SEO, that might be fixing broken links, optimizing ALT attributes, optimizing title tags or producing a specific piece of content.

    A strategy is a broader decision that’ll cause change or drive tactics. A long-term content policy is the easiest example. Shifting away from asynchronous content and moving to server-generated content is another example.

    “Perfect”

    No one knows exactly what Google considers “perfect,” and “perfect” can’t really exist, but you can bet a perfect web page/site would have all of the following:

    1. Completely visible content that’s perfectly relevant to the audience and query
    2. A flawless user experience
    3. Instant load time
    4. Zero duplicate content
    5. Every page easily indexed and classified
    6. No mistakes, broken links, redirects or anything else generally yucky
7. Zero reported problems or suggestions in each search engine’s webmaster tools (sorry, “Search Consoles”)
    8. Complete authority through immaculate, organically-generated links

    These 8 categories (and any of the other bazillion that probably exist) give you a way to break down “perfect” and help you focus on what’s really going to move you forward. These different areas may involve different facets of your organization.

    Your IT team can work on load time and creating an error-free front- and back-end. Link building requires the time and effort of content and outreach teams.

    Tactics for relevant, visible content and current best practices in UX are going to be more involved, requiring research and real study of your audience.

    What you need and what resources you have are going to impact which tactics are most realistic for you.

    But there’s a basic rule: If a website would make Googlebot swoon and present zero obstacles to users, it’s close to perfect.

    “All other things being equal”

    Assume every competing website is optimized exactly as well as yours.

    Now ask: Will this [tactic, change or strategy] move you closer to perfect?

    That’s the “all other things being equal” rule. And it’s an incredibly powerful rubric for evaluating potential changes before you act. Pretend you’re in a tie with your competitors. Will this one thing be the tiebreaker? Will it put you ahead? Or will it cause you to fall behind?

    “Closer to perfect than my competitors”

    Perfect is great, but unattainable. What you really need is to be just a little perfect-er.

    Chasing perfect can be dangerous. Perfect is the enemy of the good (I love that quote. Hated Voltaire. But I love that quote). If you wait for the opportunity/resources to reach perfection, you’ll never do anything. And the only way to reduce distance from perfect is to execute.

    Instead of aiming for pure perfection, aim for more perfect than your competitors. Beat them feature-by-feature, tactic-by-tactic. Implement strategy that supports long-term superiority.

    Don’t slack off. But set priorities and measure your effort. If fixing server response codes will take one hour and fixing duplication will take ten, fix the response codes first. Both move you closer to perfect. Fixing response codes may not move the needle as much, but it’s a lot easier to do. Then move on to fixing duplicates.

    Do the 60% that gets you a 90% improvement. Then move on to the next thing and do it again. When you’re done, get to work on that last 40%. Repeat as necessary.

    Take advantage of quick wins. That gives you more time to focus on your bigger solutions.

    Sites that are “fine” are pretty far from perfect

    Google has lots of tweaks, tools and workarounds to help us mitigate sub-optimal sites:

    • Rel=canonical lets us guide Google past duplicate content rather than fix it
• HTML snapshots let us reveal content that’s delivered asynchronously via JavaScript frameworks
    • We can use rel=next and prev to guide search bots through outrageously long pagination tunnels
    • And we can use rel=nofollow to hide spammy links and banners

    Easy, right? All of these solutions may reduce distance from perfect (the search engines don’t guarantee it). But they don’t reduce it as much as fixing the problems.
    Just fine does not equal fixed

    The next time you set up rel=canonical, ask yourself:

    “All other things being equal, will using rel=canonical to make up for duplication move my site closer to perfect than my competitors?”

    Answer: Not if they’re using rel=canonical, too. You’re both using imperfect solutions that force search engines to crawl every page of your site, duplicates included. If you want to pass them on your way to perfect, you need to fix the duplicate content.

    When you use Angular.js to deliver regular content pages, ask yourself:

    “All other things being equal, will using HTML snapshots instead of actual, visible content move my site closer to perfect than my competitors?”

    Answer: No. Just no. Not in your wildest, code-addled dreams. If I’m Google, which site will I prefer? The one that renders for me the same way it renders for users? Or the one that has to deliver two separate versions of every page?

    When you spill banner ads all over your site, ask yourself…

    You get the idea. Nofollow is better than follow, but banner pollution is still pretty dang far from perfect.

    Mitigating SEO issues with search engine-specific tools is “fine.” But it’s far, far from perfect. If search engines are forced to choose, they’ll favor the site that just works.

    Not just SEO

    By the way, distance from perfect absolutely applies to other channels.

    I’m focusing on SEO, but think of other Internet marketing disciplines. I hear stuff like “How fast should my site be?” (Faster than it is right now.) Or “I’ve heard you shouldn’t have any content below the fold.” (Maybe in 2001.) Or “I need background video on my home page!” (Why? Do you have a reason?) Or, my favorite: “What’s a good bounce rate?” (Zero is pretty awesome.)

And Internet marketing venues are working to measure distance from perfect. Pay-per-click marketing has the quality score: a codified financial reward applied for reducing distance from perfect in as many elements of your advertising program as possible.

    Social media venues are aggressively building their own forms of graphing, scoring and ranking systems designed to separate the good from the bad.

    Really, all marketing includes some measure of distance from perfect. But no channel is more influenced by it than SEO. Instead of arguing one rule at a time, ask yourself and your boss or client: Will this move us closer to perfect?

    Hell, you might even please a customer or two.

    One last note for all of the SEOs in the crowd. Before you start pointing out edge cases, consider this: We spend our days combing Google for embarrassing rankings issues. Every now and then, we find one, point, and start yelling “SEE! SEE!!!! THE GOOGLES MADE MISTAKES!!!!” Google’s got lots of issues. Screwing up the rankings isn’t one of them.


Measure Your Mobile Rankings and Search Visibility in Moz Analytics

    Posted by jon.white

    We have launched a couple of new things in Moz Pro that we are excited to share with you all: Mobile Rankings and a Search Visibility score. If you want, you can jump right in by heading to a campaign and adding a mobile engine, or keep reading for more details!

    Track your mobile vs. desktop rankings in Moz Analytics

    Mobilegeddon came and went with slightly less fanfare than expected, somewhat due to the vast ‘Mobile Friendly’ updates we all did at super short notice (nice work everyone!). Nevertheless, mobile rankings visibility is now firmly on everyone’s radar, and will only become more important over time.

    Now you can track your campaigns’ mobile rankings for all of the same keywords and locations you are tracking on desktop.

    For this campaign my mobile visibility is almost 20% lower than my desktop visibility and falling;
    I can drill down to find out why

    Clicking on this will take you into a new Engines tab within your Keyword Rankings page where you can find a more detailed version of this chart as well as a tabular view by keyword for both desktop and mobile. Here you can also filter by label and location.

    Here I can see Search Visibility across engines including mobile;
    in this case, for my branded keywords.

    We have given an extra engine to all campaigns

    We’ve given customers an extra engine for each campaign, increasing the number from 3 to 4. Use the extra slot to add the mobile engine and unlock your mobile data!

    We will begin to track mobile rankings within 24 hours of adding to a campaign. Once you are set up, you will notice a new chart on your dashboard showing visibility for Desktop vs. Mobile Search Visibility.

    Measure your Search Visibility score vs. competitors

    The overall Search Visibility for my campaign

    Along with this change we have also added a Search Visibility score to your rankings data. Use your visibility score to track and report on your overall campaign ranking performance, compare to your competitors, and look for any large shifts that might indicate penalties or algorithm changes. For a deeper drill-down into your data you can also segment your visibility score by keyword labels or locations. Visit the rankings summary page on any campaign to get started.

    How is Search Visibility calculated?

    Good question!

    The Search Visibility score is the percentage of clicks we estimate you receive based on your rankings positions, across all of your keywords.

    We take each ranking position for each keyword, multiply by an estimated click-thru-rate, and then take the average of all of your keywords. You can think of it as the percentage of your SERPs that you own. The score is expressed as a percentage, though scores of 100% would be almost impossible unless you are tracking keywords using the “site:” modifier. It is probably more useful to measure yourself vs. your competitors rather than focus on the actual score, but, as a rule of thumb, mid-40s is probably the realistic maximum for non-branded keywords.
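In code, that calculation looks roughly like the sketch below. The click-through-rate estimates here are invented for illustration; Moz doesn’t publish the exact curve it uses:

```python
# Illustrative CTR estimates by ranking position (not Moz's actual figures).
ESTIMATED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                 6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.02}

def search_visibility(positions):
    """Average estimated CTR across all tracked keywords, as a percentage.

    `positions` maps each keyword to its ranking position, or None if
    the site doesn't rank for it (counted as zero visibility).
    """
    scores = [ESTIMATED_CTR.get(pos, 0.0) if pos else 0.0
              for pos in positions.values()]
    return round(100 * sum(scores) / len(scores), 1)

print(search_visibility({"mexican restaurant": 2,
                         "best mexican restaurant": 7,
                         "organic mexican restaurant": None}))  # -> 6.0
```

Note that a keyword you don’t rank for at all contributes zero, which is why adding a batch of new, unranked keywords to a campaign will drag the score down even though nothing about your performance has changed.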

    Jeremy, our Moz Analytics TPM, came up with this metaphor:

    Think of the SERPs for your keywords as villages. Each position on the SERP is a plot of land in SERP-village. The Search Visibility score is the average amount of plots you own in each SERP-village. Prime real estate plots (i.e., better ranking positions, like #1) are worth more. A complete monopoly of real estate in SERP-village would equate to a score of 100%. The Search Visibility score equates to how much total land you own in all SERP-villages.

    Some neat ways to use this feature

    • Label and group your keywords, particularly when you add them – As visibility score is an average of all of your keywords, when you add or remove keywords from your campaign you will likely see fluctuations in the score that are unrelated to performance. Solve this by getting in the habit of labeling keywords when you add them. Then segment your data by these labels to track performance of specific keyword groups over time.
    • See how location affects your mobile rankings – Using the Engines tab in Keyword Rankings, use the filters to select just local keywords. Look for big differences between Mobile and Desktop where Google might be assuming local intent for mobile searches but not for desktop. Check out how your competitors perform for these keywords. Can you use this data?


    A Vision for Brand Engagement Online, or "The Goal"

    Posted by EricEnge

    Today’s post focuses on a vision for your online presence. This vision outlines what it takes to be the best, both from an overall reputation and visibility standpoint, as well as an SEO point of view. The reason these are tied together is simple: Your overall online reputation and visibility is a huge factor in your SEO. Period. Let’s start by talking about why.

    Core ranking signals

    For purposes of this post, let’s define three cornerstone ranking signals that most everyone agrees on:

    Links

    Links remain a huge factor in overall ranking. Both Cyrus Shepard and Marcus Tober re-confirmed this on the Periodic Table of SEO Ranking Factors session at the SMX Advanced conference in Seattle this past June.

    On-page content

    On-page content remains a huge factor too, but with some subtleties now thrown in. I wrote about some of this in earlier posts I did on Moz about Term Frequency and Inverse Document Frequency. Suffice it to say that on-page content is about a lot more than pure words on the page, but also includes the supporting pages that you link to.

    User engagement with your site

    This is not one of the traditional SEO signals from the early days of SEO, but most advanced SEO pros that I know consider it a real factor these days. One of the most popular concepts people talk about is called pogo-sticking, which is illustrated here:

    You can learn more about the pogosticking concept by visiting this Whiteboard Friday video by a rookie SEO with a last name of Fishkin.

    New, lesser-known signals

    OK, so these are the more obvious signals, but now let’s look more broadly at the overall web ecosystem and talk about other types of ranking signals. Be warned that some of these signals may be indirect, but that just doesn’t matter. In fact, my first example below is an indirect factor which I will use to demonstrate why whether a signal is direct or indirect is not an issue at all.

Let me illustrate with an example. Say you spend $1 billion building a huge brand around a product that is massively useful to people. Included in this is a sizable $100 million campaign to support a highly popular charitable foundation, and your employees regularly donate time to help out in schools across your country. In short, the great majority of people love your brand.

Do you think this will impact the way people link to your site? Of course it does. Do you think it will impact how likely people are to be satisfied with the quality of your site’s pages? Consider this A/B test scenario of two pages from different “brands” (for the one on the left, imagine the image of Coca-Cola or Pepsi, whichever one you prefer):

Do you think that the huge brand will get the benefit of the doubt on their page that the no-name brand does not, even though the pages are identical? Of course they will. Now let’s look at some simpler scenarios that don’t involve a $1 billion investment.

    1. Cover major options related to a product or service on “money pages”

    Imagine that a user arrives on your auto parts site after searching on the phrase “oil filter” at Google or Bing. Chances are pretty good that they want an oil filter, but here are some other items they may also want:

    • A guide to picking the right filter for their car
    • Oil
    • An oil filter wrench
    • A drainage pan to drain the old oil into

This is just the basics, right? But you would be surprised by how many sites don’t include links or information on directly related products on their money pages. Providing this type of smart site and page design can have a major impact on user engagement with the money pages of your site.

    2. Include other related links on money pages

    In the prior item we covered the user’s most directly related needs, but they may have secondary needs as well. Someone who is changing a car’s oil is either a mechanic or a do-it-yourself-er. What else might they need? How about other parts, such as windshield wipers or air filters?

    These are other fairly easy maintenance steps for someone who is working on their car to complete. Presence of these supporting products could be one way to improve user engagement with your pages.

    3. Offer industry-leading non-commercial content on-site

    Publishing world-class content on your site is a great way to produce links to your site. Of course, if you do this on a blog on your site, it may not provide links directly to your money pages, but it will nonetheless lift overall site authority.

    In addition, if someone has consumed one or more pieces of great content on your site, the chance of their engaging in a more positive manner with your site overall go way up. Why? Because you’ve earned their trust and admiration.

    4. Be everywhere your audiences are with more high-quality, relevant, non-commercial content

Are there major media sites that cover your market space? Do they consider you to be an expert? Will they quote you in articles they write? Can you provide them with guest posts, or will they let you be a guest columnist? Will they collaborate on larger content projects with you?

    All of these activities put you in front of their audiences, and if those audiences overlap with yours, this provides a great way to build your overall reputation and visibility. This content that you publish, or collaborate on, that shows up on 3rd-party sites will get you mentions and links. In addition, once again, it will provide you with a boost to your branding. People are now more likely to consume your other content more readily, including on your money pages.

    5. Leverage social media

The concept here shares much in common with the prior point. Social media provides opportunities to get in front of relevant audiences. Every person who avidly follows you on a social media site is likely to behave very differently on your site than someone who doesn’t know you at all.

    Note that links from social media sites are nofollowed, but active social media behavior can lead to people implementing “real world” links to your site that are followed, from their blogs and media web sites.

    6. Be active in the offline world as well

    Think your offline activity doesn’t matter online? Think again. Relationships are still most easily built face-to-face. People you meet and spend time with can well become your most loyal fans online. This is particularly important when it comes to building relationships with influential people.

    One great way to do that is to go to public events related to your industry, such as conferences. Better still, obtain speaking engagements at those conferences. This can even impact people who weren’t there to hear you speak, as they become aware that you have been asked to do that. This concept can also work for a small local business. Get out in your community and engage with people at local events.

    The payoff here is similar to the payoff for other items: more engaged, highly loyal fans who engage with you across the web, sending more and more positive signals, both to other people and to search engines, that you are the real deal.

    7. Provide great customer service/support

    Whatever your business may be, you need to take care of your customers as best you can. No one can make everyone happy, that’s unrealistic, but striving for much better than average is a really sound idea. Having satisfied customers saying nice things about you online is a big impact item in the grand scheme of things.

    8. Actively build relationships with influencers too

    While this post is not about the value of influencer relationships, I include this in the list for illustration purposes, for two reasons:

    1. Some opportunities are worth extra effort. Know of someone who could have a major impact on your business? Know that they will be at a public event in the near future? Book your plane tickets and get your butt out there. No guarantee that you will get the result you are looking for, or that it will happen quickly, but your chances go WAY up if you get some face time with them.
    2. Influencers are worth special attention and focus, but your relationship-building approach to the web and SEO is not only about influencers. It’s about the entire ecosystem.

    It’s an integrated ecosystem

    The web provides a level of integrated, real-time connectivity of a kind that the world has never seen before. This is only going to increase. Do something bad to a customer in Hong Kong? Consumers in Boston will know within 5 minutes. That’s where it’s all headed.

Google and Bing (and any future search engine that may emerge) want to measure these types of signals because they tell them how to improve the quality of the experience on their platforms. There are many ways they can perform these measurements.

    One simple concept is covered by Rand in this recent Whiteboard Friday video. The discussion is about a recent patent granted to Google that shows how the company can use search queries to detect who is an authority on a topic.

    The example he provides is about people who search on “email finding tool”. If Google also finds that a number of people search on “voila norbert email tool”, Google may use that as an authority signal.

Think about that for a moment. How are you going to get people to search on your brand more while putting it together with a non-branded query like that? (OK, please leave Mechanical Turk and other services like that out of the discussion.)

    Now you can start to see the bigger picture. Measurements like pogosticking and this recent search behavior related patent are just the tip of the iceberg. Undoubtedly, there are many other ways that search engines can measure what people like and engage with the most.

This is all part of SEO now: UX, product breadth, problem solving, engaging in social media, getting face to face, creating great content that you publish in front of other people’s audiences, and more.

    For the small local business, you can still win at this game, as your focus just needs to be on doing it better than your competitors. The big brands will never be hyper-local like you are, so don’t think you can’t play the game, because you can.

    Whoever you are, get ready, because this new integrated ecosystem is already upon us, and you need to be a part of it.
