Geomodified Searches, Localized Results, and How to Track the Right Keywords and Locations for Your Business – Next Level

Posted by jocameron

Welcome to the newest installment of our educational Next Level series! In our last episode, our fearless writer Jo Cameron shared how to uncover low-value content that could hurt your rankings and turn it into something valuable. Today, she’s returned to share how to do effective keyword research and targeting for local queries. Read on and level up!


All around the world, people are searching: X sits at a computer high above the city and searches dreamily for the best beaches in Ko Samui. Y strides down a puddle-drenched street and hastily types good Japanese noodles into an expensive handheld computer. K takes up way too much space and bandwidth on the free wireless network in a chain coffee house, which could be located just about anywhere in the world, and hunts for the best price on a gadgety thing.

As we search, the engines are working hard to churn out relevant results based on what we’re searching for, our location, personalization, and just about anything else that can be jammed into an algorithm about our complex human lives. As a business owner or SEO, you’ll want to be able to identify the best opportunities for your online presence. Even if your business doesn’t have a physical location and you don’t have the pleasure of sweeping leaves off your welcome mat, understanding the local landscape can help you home in on keywords with more opportunity for your business.

In this Next Level post, we’ll go through the different types of geo-targeted searches, how to track the right keywords and locations for your business in Moz Pro, and how to distribute your physical local business details with Moz Local. If you’d like to follow along with this tutorial, get started with a free 30-day trial of Moz Pro:

Follow along with a free trial

Whether your customer is two streets away or gliding peacefully above us on the International Space Station, you must consider how the intertwining worlds of local and national search impact your online presence.


Geomodified searches vs. geolocated searches

First, so you can confidently stride into your next marketing meeting and effortlessly contribute to a related conversation on Slack, let’s take a quick look at the lingo.

Geomodified searches include the city/neighborhood in the search term itself to target the searcher’s area of interest.

You may have searched some of these examples yourself in a moment of escapism: “beaches in Ko Samui,” “ramen noodles in Seattle,” “solid state drive London,” or “life drawing classes London.”

Geomodified searches state explicit local intent for results related to a particular location. As a marketer or business owner, tracking geomodified keywords gives you insight into how you’re ranking for those searches specifically.

Geolocated searches are searches made while the searcher is physically located in a specific area — generally a city. You may hear the term “location targeting” thrown about, often in the high-roller realm of paid marketing. Rather than looking at keywords that contain certain areas, this type of geotargeting focuses on searches made within an area.

Examples might include: “Japanese noodles,” “Ramen,” “solid state drive,” or “coffee,” searched from the city of Seattle, or the city of London, or the city of Tokyo.
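
To make the distinction concrete, here’s a minimal Python sketch of how you might split a keyword list into geomodified terms (which contain a place name) and plain terms you’d track per city. The location list, keywords, and tracking cities are all illustrative, not pulled from any tool.

```python
# Minimal sketch: separate geomodified keywords from plain ones.
# The locations, keywords, and cities below are illustrative only.

LOCATIONS = {"london", "seattle", "tokyo", "ko samui"}

def is_geomodified(keyword):
    """True if the keyword itself names one of the places we care about."""
    kw = keyword.lower()
    return any(loc in kw for loc in LOCATIONS)

keywords = [
    "japanese noodles",
    "japanese noodles london",
    "solid state drive",
    "life drawing classes london",
]

geomodified = [kw for kw in keywords if is_geomodified(kw)]
plain = [kw for kw in keywords if not is_geomodified(kw)]

# Plain terms are candidates for geolocated tracking, i.e. ranked from a
# specific city rather than containing one in the query itself.
tracking_plan = {
    "geomodified (track nationally)": geomodified,
    "geolocated (track per city)": {kw: ["Seattle", "London"] for kw in plain},
}
print(tracking_plan)
```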

Of course, the above ways of searching and tracking are often intertwined with each other. Our speedy fingers type demands, algorithms buzz, and content providers hit publish and bite their collective nails as analytics charts populate displaying our progress. Smart SEOs will likely have a keyword strategy that accounts for both geomodified and geolocated searches.

Researching local keywords

Generally, the more specific your keywords and the location you’re targeting, the less data you’ll find. Check your favorite keyword research tool, like Keyword Explorer, and you’ll see what I’m talking about. In this example, I’m looking at search volume data for “japanese noodles” vs. “japanese noodles london.”

“Japanese noodles”

“Japanese noodles London”

So, do I toss this geomodified keyword? Hold on, buddy — while the Monthly Volume decreases, take a look at that Difficulty score — it increases. It’s an easy search term to dismiss, since the search volume is so low, but what this tells me is that there’s more to the story.

A search for “japanese noodles” is too broad to divine much of the searcher’s intent — do they want to make Japanese noodles? Learn what Japanese noodles are? Find an appetizing image?… and so on and so forth. The term itself doesn’t give us much context to work with.

So, while the search volume may be lower, a search for “japanese noodles london” means so much more — now we have some idea of the searcher’s intent. If your site’s content matches up with the searcher’s intent, and you can beat your competition in the SERPs, you could find that the lower search volume equates to a higher conversion rate, and you could be setting yourself up for a great return on investment.
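
To put rough numbers on that trade-off, here’s a back-of-the-envelope sketch. Every figure in it (volume, click-through rate, conversion rate) is invented purely for illustration, not pulled from Keyword Explorer.

```python
# Back-of-the-envelope sketch: lower volume plus clearer intent can still win.
# All numbers are made up for illustration.

def expected_conversions(monthly_volume, expected_ctr, conversion_rate):
    return monthly_volume * expected_ctr * conversion_rate

broad = expected_conversions(monthly_volume=9300, expected_ctr=0.02, conversion_rate=0.005)
geo = expected_conversions(monthly_volume=240, expected_ctr=0.25, conversion_rate=0.08)

print(f"'japanese noodles':        ~{broad:.1f} conversions/month")
print(f"'japanese noodles london': ~{geo:.1f} conversions/month")
```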

Digging into hyperlocal niches is a challenge. We’ve got some handy tips for investigating hyperlocal keywords, including using similar but slightly larger regions, digging into auto-suggest to gather keyword ideas, and using the grouping function in Keyword Explorer.
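
If you want to script the auto-suggest part of that research, here’s a rough sketch. It leans on Google’s long-standing but unofficial, undocumented suggest endpoint, which can change or disappear at any time, and the seed terms are just examples.

```python
# Sketch: gather hyperlocal keyword ideas from Google autosuggest.
# Uses the unofficial, undocumented suggestqueries endpoint; it may
# change, rate-limit, or disappear without notice.

import json
import urllib.parse
import urllib.request

def suggestions(seed):
    url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
           + urllib.parse.quote(seed))
    with urllib.request.urlopen(url) as resp:
        data = json.loads(resp.read().decode("utf-8", errors="replace"))
    return data[1]  # second element is the list of suggested queries

seeds = ["japanese noodles shoreditch", "japanese noodles east london"]
ideas = {seed: suggestions(seed) for seed in seeds}
print(json.dumps(ideas, indent=2))
```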

Testing will be your friend here. Build a lovely list, create some content, and then test, analyze, and as the shampoo bottle recommends, rinse and repeat.


Localized ranking signals and results

When search engines impress us all by displaying a gazillion results per point whatever of a second, they aren’t just looking inwards at their index. They’re looking outwards at the searcher, figuring out the ideal pairing of humans and results.

Local rankings factors take into consideration things like proximity between the searcher and the business, consistency of citations, and reviews, to name just a few. These are jumbled together with all the other signals we’re used to, like authority and relevancy. The full and glorious report is available here: https://moz.com/local-search-ranking-factors

I often find myself returning to the local search ranking factors report because there’s just so much to digest. So go ahead and bookmark it in a folder called “Local SEO” for easy reference, and delight in how organized you are.

While you may expect a search for “life drawing” to turn up mostly organic results, you can see the Local Pack is elbowing its way in there to serve up classes near me:

And likewise, you may expect a search for “life drawing london” to show only local results, but lookie here: we’ve also got some top organic results that have targeted “life drawing london” and the local results creep ever closer to the top:

From these examples you can see that localized results can have a big impact on your SEO strategy, particularly if you’re competing with Local Pack-heavy results. So let’s go ahead and assemble a good strategy into a format that you can follow for your business.


Tracking what’s right for your business

With your mind brimming with local lingo, let’s take a look at how you can track the right types of keywords and locations for your business using Moz Pro. I’ll also touch on Moz Local for the brick-and-mortar types.

1. Your business is rocking the online world

Quest: Track your target keywords nationally and keep your eye on keywords dominated by SERP features you can’t win, like Local Packs.

Hey there, w-w-w dot Your Great Site dot com! You’re the owner of a sweet, shiny website. You’re a member of the digital revolution, a content creator, a message deliverer, a gadgety thingy provider. Your customers are primarily online. I mean, they exist in real life too, but they are also totally and completely immersed in the online world. (Aren’t we all?)

Start by setting up a brand-new Moz Pro Campaign for your target location.

Select one of each search engine to track for your location. This is what I like to call the full deck:

Another personal favorite is what I call the “Google Special.” Select Google desktop and Google Mobile for two locations. This is especially handy if you want to track two national locations in a single Campaign. Here I’ve gone with the US and Canada:

I like to track Google Mobile along with Google desktop results. Ideally you want to be performing consistently in both. If the results are hugely disparate, you may need to check that your site is mobile friendly.

Pour all your lovely keywords into the Campaign creation wizard. Turn that keyword bucket upside-down and give the bottom a satisfying tap like a drum:

Where have we found all these lovely keywords? Don’t tell me you don’t know!

Head over to Keyword Explorer and enter your website. Yes, friend, that’s right. We can show you the keywords your site is already ranking for:

I’m going to leave you to have some fun with that, but when you’re done frolicking in keywords you’re ranking for, keywords your competitors are ranking for, and keywords your Mum’s blog is ranking for, pop back and we’ll continue on our quest.

Next: Onward to the SERP features!

SERP features are both a blessing and a curse. Yes, you could zip to the top of page 1 if you’re lucky enough to be present in those SERP features, but they’re also a minefield, as they squeeze out the organic results you’ve worked so hard to secure.

Luckily for you, we’ve got the map to this dastardly minefield. Keep your eye out for Local Packs and Local Teasers; these are your main threats.

If you have an online business and you’re seeing too many local-type SERP features, this may be an indication that you’re tracking the wrong keywords. You can also start to identify features that do apply to your business, like Image Packs and Featured Snippets.

When you’re done with your local quest, you can come back and try to own some of these features, just like we explored in a previous Next Level blog post: Hunting Down SERP Features to Understand Intent & Drive Traffic

2. Your business rocks customers in the real world

Quest: Track keywords locally and nationally and home in on local SERP features + the wonderful world of NAP.

What if you run a cozy little cupcake shop in your cozy little city?

Use the same search engine setup from above, and sprinkle locally tracked keywords into the mix.

If you’re setting up a new Campaign, you can add both national and local keywords like a boss.

You can see I’ve added a mouthwatering selection of keywords in both the National Keywords section and in the Local Keywords field. This is because I want to see if one of my cupcake shop’s landing pages is ranking in Google Desktop, Google Mobile, Yahoo, and Bing, both nationally and locally, in my immediate vicinity of Seattle. Along with gathering comparative national and local ranking data, the other reason to track keywords nationally is so you can see how you’re doing in terms of on-page optimization.

Your path to cupcake domination doesn’t stop there! You’re also going to want to be the big player rocking the Local Pack.

Filter by Local Pack or Local Teaser to see if your site is featured. Keep your eye out for any results marked with a red circle, as these are being dominated by your competitors.

The wonderful world of NAP

As a local business owner, you’ll probably have hours of operation, and maybe even one of those signs that you turn around to indicate whether you’re open or closed. You also have something that blogs and e-commerce sites don’t have: NAP, baby!

As a lingo learner, your lingo learning days are never over, especially in the world of digital marketing (actually, just make that digital anything). NAP is the acronym for business name, address, and phone number. In local SEO you’ll see this term float by more often than a crunchy brown leaf on a cold November morning.

NAP details are your lifeblood: You want people to know them, you want them to be correct, and you want them to be correct everywhere — for the very simple reason that humans and Google will trust you if your data is consistent.
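
If you’re auditing citations by hand, even a tiny normalization pass helps you spot real inconsistencies instead of formatting noise. Here’s a minimal sketch; the abbreviation rules and sample listings are illustrative only.

```python
# Minimal sketch: normalize NAP records so trivial formatting differences
# don't register as inconsistencies. Rules and sample data are illustrative.

import re

ABBREVIATIONS = {"street": "st", "avenue": "ave", "road": "rd", "suite": "ste"}

def normalize(value):
    value = re.sub(r"[^\w\s]", "", value.lower().strip())   # drop punctuation
    return " ".join(ABBREVIATIONS.get(w, w) for w in value.split())

def normalize_phone(value):
    return re.sub(r"\D", "", value)                          # digits only

def normalized_record(listing):
    return {
        "name": normalize(listing["name"]),
        "address": normalize(listing["address"]),
        "phone": normalize_phone(listing["phone"]),
    }

citations = [
    {"name": "Cozy Cupcakes", "address": "123 Main Street, Suite 4", "phone": "(206) 555-0100"},
    {"name": "Cozy Cupcakes", "address": "123 Main St Ste 4", "phone": "206-555-0100"},
    {"name": "Cozy Cupcakes Bakery", "address": "123 Main St", "phone": "2065550100"},
]

canonical = normalized_record(citations[0])
for listing in citations[1:]:
    record = normalized_record(listing)
    mismatches = [field for field in record if record[field] != canonical[field]]
    print(listing["name"], "->", mismatches or "consistent")
```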

If you manage a single location and decide to go down the manual listing management route, kudos to you, my friend. There are plenty of resources out there to guide you.

3. You manage multiple local businesses with multiple locations

Quest: Bulk-distribute business NAP, fix consistency issues, and stamp out duplicates.

If you are juggling a bunch of locations for your own business, or a client’s, you’ll know that in the world of citation building things can get out of hand pretty gosh-darn quick. Any number of acts can result in your business listing details splitting into multiple fragments, whether you moved locations, inherited a phone number that has an online past, or someone in-house set up your listings incorrectly.

While a single business operating out of a single location may have the choice to manually manage their listing distribution, with every location you add to your list your task becomes exponentially more complex.

Remember earlier, when we talked about those all-important local search ranking factors? The factors that determine local results, like proximity, citation signals, reviews, and so on? Well, now you’ll be really glad you bookmarked that link.

You can do all sorts of things to send appealing local signals to Google. While there isn’t a great deal we can do about proximity right now — people have a tendency to travel where they want to — the foundational act of consistently distributing your NAP details is within your power.

That’s where Moz Local steps in. The main purpose of Moz Local is to help you publish and maintain NAP consistency in bulk.

First, enter your business name and postcode in the free Check Listing tool. Bounce, bounce…

After a few bounces, you’ll get the results:

Moz Local will only manage listings that have been “verified” to prevent spam submissions.

If you’re not seeing what you’d expect in the Check Listing tool, you’ll want to dig up your Google Maps and Facebook Places pages and check them against these requirements on our Help Hub.

When you’re ready to start distributing your business details to our partners, you can select and purchase your listing. You can find out more about purchasing your listing, again on our Help Hub.

Pro Tip: If you have lots of local clients, you’ll probably want to purchase via CSV upload. Follow our documentation to get your CSV all spruced up and formatted correctly.

If tracking your visibility and reputation is high on your to-do list, then you’ll want to look at purchasing your listings at the Professional or Premium level.

We’ll track your local and organic rankings for your Google My Business categories by default, but you can enter your own group of target keywords here. We account for the geographic location of your listings, so be sure to add keywords without any geomodifiers!

If you want to track more keywords, we’ve got you covered. Hop on over to Moz Pro and set up a Campaign like we did in the section above.

4. You’re a dog trainer who services your local area without a storefront

Quest: Help owners of aspiring good dogs find your awesome training skills, even though you don’t have a brick-and-mortar storefront.

At Moz HQ, we love our pooches: they are the sunshine of our lives (as our Instagram feed delightfully confirms). While they’re all good doggos, well-trained pooches have a special place in our hearts.

But back to business. If you train dogs, or run another location-specific business without a shop front, this is called a service-area business (or SAB, another term to add to the new lingo pile).

Start by tracking searches for “dog trainer seattle,” and all the other keywords you discovered in your research, both nationally and locally.

I’ve got my Campaign pulled up, so I’m going to add some keywords and track them nationally and locally.

You may find that some keywords on a national level are just too competitive for your local business. That’s okay! You can refine your list as you go. If you’re happy with your local tracking, then you can remove the nationally tracked keywords from your Campaign and just track your keywords at the local level.

Pro Tip: Remember that if you want to improve your Page Optimization with Moz Pro, you’ll have to have the keyword tracked nationally in your Campaign.

In terms of Moz Local, since accuracy, completeness, and consistency are key factors, the tool pushes your complete address to our partners in order to improve your search ranking. It’s possible to use Moz Local with a service-area business, but it’s worth noting that some partners don’t support hidden addresses. Miriam Ellis describes how Moz Local works with SABs in her recent blog post.

Basically, if your business is okay with your address being visible in multiple places, then we can work with your Facebook page, provided it’s showing your address. You won’t achieve a 100% visibility score, but chances are your direct local competitors are in the same boat.


Wrapping up

Whether you’re reaching every corner of the globe with your online presence, or putting cupcakes into the hands of Seattleites, the local SEO landscape has an impact on how your site is represented in search results.

The key is identifying the right opportunities for your business and delivering the most accurate and consistent information to search engines, directories, and your human visitors, too.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and others are SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes are mostly SEO-friendly these days)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, in the 10 minutes before we filmed this video, I brainstormed a list off the top of my head. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which is a permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
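
To make the status-code points above concrete, here’s a hedged sketch using Flask; the routes and messages are invented for illustration. The idea: 410 for content you want permanently dropped from the index, 503 with a Retry-After header during planned downtime, and a real 404 status for genuinely missing pages rather than a soft 404 that returns 200.

```python
# Sketch (Flask): return status codes that tell crawlers the truth.
# Routes and copy are illustrative, not from any particular site.

from flask import Flask

app = Flask(__name__)

@app.route("/discontinued-product")
def gone_forever():
    # 410 says "permanently removed" -- stronger than 404 for content
    # you want dropped from the index rather than revisited.
    return "This product has been permanently removed.", 410

@app.route("/maintenance")
def planned_downtime():
    # 503 plus Retry-After tells crawlers to come back later instead of
    # treating the outage (or a sitewide 404) as missing pages.
    return "Back shortly.", 503, {"Retry-After": "3600"}

@app.errorhandler(404)
def not_found(error):
    # Make sure genuinely missing URLs actually return a 404 status,
    # not a "soft 404" page that responds with 200.
    return "Page not found.", 404
```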

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there.
  • Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics connecting those up and trying to see: Why did we go up or down? Did we have less pages being indexed, more pages being indexed, more pages getting traffic less, more keywords less?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. Inanchor: doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of these. Like, come on.

  • Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you you have all sorts of errors, and you don’t know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or Mozscape, or Ahrefs, Majestic, SEMrush, or the Alchemy API. Those APIs can have powerful things that they can do for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Moz Local Officially Launches in the UK

Posted by David-Mihm

To all Moz Local fans in the UK, I’m excited to announce that your wait is over. As the sun rises “across the pond” this morning, Moz Local is officially live in the United Kingdom!

A bit of background

As many of you know, we released the US version of Moz Local in March 2014. After 12 months of terrific growth in the US, and a boatload of technical improvements and feature releases (especially for Enterprise customers), we released the Check Listing feature for a limited set of partner search engines and directories in the UK in April of this year.

Over 20,000 of you have checked your listings (or your clients’ listings) in the last 3-1/2 months. Those lookups have helped us refine and improve the background technology immensely (more on that below). We’ve been just as eager to release the fully-featured product as you’ve been to use it, and the technical pieces have finally fallen into place for us to do so.

How does it work?

The concept is the same as the US version of Moz Local: show you how accurately and completely your business is listed on the most important local search platforms and directories, and optimize and perfect as many of those business listings as we can on your behalf.

For customers specifically looking for you, accurate business listings are obviously important. For customers who might not know about you yet, they’re also among the most important factors for ranking in local searches on Google. Basically, the more times Google sees your name, address, phone, and website listed the same way on quality local websites, the more trust they have in your business, and the higher you’re likely to rank.

Moz Local is designed to help on both these fronts.

To use the product, you simply need to type a name and postcode at moz.com/local. We’ll then show you a list of the closest matching listings we found. We prioritize verified listing information that we find on Google or Facebook, and selecting one of those verified listings means we’ll be able to distribute it on your behalf.

Clicking on a result brings you to a full details report for that listing. We’ll show you how accurate and complete your listings are now, and where they could be after using our product.

Clicking the tabs beneath the Listing Score graphic will show you some of the incompletions and inconsistencies that publishing your listing with Moz Local will address.

For customers with hundreds or thousands of locations, bulk upload is also available using a modified version of your data from Google My Business; feel free to e-mail enterpriselocal@moz.com for more details.

Where do we distribute your data?

We’ve prioritized the most important commercial sites in the UK local search ecosystem, and made them the centerpieces of Moz Local. We’ll update your data directly on globally important players Factual and Foursquare, and the UK-specific players CentralIndex, Thomson Local, and the Scoot network, which includes key directories like TouchLocal, The Independent, The Sun, The Mirror, The Daily Scotsman, and Wales Online.

We’ll be adding two more major destinations shortly, and for those of you who sign up before that time, your listings will be automatically distributed to the additional destinations when the integrations are complete.

How much does it cost?

The cost per listing is £84/year, which includes distribution to the sites mentioned above with unlimited updates throughout the year, monitoring of your progress over time, geographically focused reporting, and the ability to find and close duplicate listings right from your Moz Local dashboard: all the great upgrades that my colleague Noam Chitayat blogged about here.

What’s next?

Well, as I mentioned just a couple paragraphs ago, we’ve got two additional destinations to which we’ll be sending your data in very short order. Once those integrations are complete, we’ll be just a few weeks away from releasing our biggest set of features since we launched. I look forward to sharing more about these features at BrightonSEO at the end of the summer!

For those of you around the world in Canada, Australia, and other countries, we know there’s plenty of demand for Moz Local overseas, and we’re working as quickly as we can to build additional relationships abroad. And to our friends in the UK, please let us know how we can continue to make the product even better!


Measure Your Mobile Rankings and Search Visibility in Moz Analytics

Posted by jon.white

We have launched a couple of new things in Moz Pro that we are excited to share with you all: Mobile Rankings and a Search Visibility score. If you want, you can jump right in by heading to a campaign and adding a mobile engine, or keep reading for more details!

Track your mobile vs. desktop rankings in Moz Analytics

Mobilegeddon came and went with slightly less fanfare than expected, somewhat due to the vast ‘Mobile Friendly’ updates we all did at super short notice (nice work everyone!). Nevertheless, mobile rankings visibility is now firmly on everyone’s radar, and will only become more important over time.

Now you can track your campaigns’ mobile rankings for all of the same keywords and locations you are tracking on desktop.

For this campaign, my mobile visibility is almost 20% lower than my desktop visibility and falling; I can drill down to find out why.

Clicking on this will take you into a new Engines tab within your Keyword Rankings page where you can find a more detailed version of this chart as well as a tabular view by keyword for both desktop and mobile. Here you can also filter by label and location.

Here I can see Search Visibility across engines, including mobile; in this case, for my branded keywords.

We have given an extra engine to all campaigns

We’ve given customers an extra engine for each campaign, increasing the number from 3 to 4. Use the extra slot to add the mobile engine and unlock your mobile data!

We will begin to track mobile rankings within 24 hours of adding to a campaign. Once you are set up, you will notice a new chart on your dashboard showing visibility for Desktop vs. Mobile Search Visibility.

Measure your Search Visibility score vs. competitors

The overall Search Visibility for my campaign

Along with this change we have also added a Search Visibility score to your rankings data. Use your visibility score to track and report on your overall campaign ranking performance, compare to your competitors, and look for any large shifts that might indicate penalties or algorithm changes. For a deeper drill-down into your data you can also segment your visibility score by keyword labels or locations. Visit the rankings summary page on any campaign to get started.

How is Search Visibility calculated?

Good question!

The Search Visibility score is the percentage of clicks we estimate you receive based on your rankings positions, across all of your keywords.

We take each ranking position for each keyword, multiply by an estimated click-through rate, and then take the average across all of your keywords. You can think of it as the percentage of your SERPs that you own. The score is expressed as a percentage, though scores of 100% would be almost impossible unless you are tracking keywords using the “site:” modifier. It is probably more useful to measure yourself vs. your competitors rather than focus on the actual score, but, as a rule of thumb, mid-40s is probably the realistic maximum for non-branded keywords.
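
In code, the calculation described above might look something like the sketch below. The click-through-rate-by-position values are assumptions for the sake of the example, not Moz’s actual model.

```python
# Sketch of the Search Visibility idea: estimated CTR for each keyword's
# ranking position, averaged across all keywords. The CTR curve here is
# an assumption, not the curve Moz uses.

ESTIMATED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                 6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.02}

def search_visibility(rankings):
    """rankings maps keyword -> ranking position (None = not ranking)."""
    scores = [ESTIMATED_CTR.get(pos, 0.0) if pos else 0.0
              for pos in rankings.values()]
    return 100 * sum(scores) / len(scores)

rankings = {"cupcakes seattle": 2, "best cupcakes": 9, "gluten free cupcakes": None}
print(f"Search Visibility: {search_visibility(rankings):.1f}%")
```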

Jeremy, our Moz Analytics TPM, came up with this metaphor:

Think of the SERPs for your keywords as villages. Each position on the SERP is a plot of land in SERP-village. The Search Visibility score is the average amount of plots you own in each SERP-village. Prime real estate plots (i.e., better ranking positions, like #1) are worth more. A complete monopoly of real estate in SERP-village would equate to a score of 100%. The Search Visibility score equates to how much total land you own in all SERP-villages.

Some neat ways to use this feature

  • Label and group your keywords, particularly when you add them – As visibility score is an average of all of your keywords, when you add or remove keywords from your campaign you will likely see fluctuations in the score that are unrelated to performance. Solve this by getting in the habit of labeling keywords when you add them. Then segment your data by these labels to track performance of specific keyword groups over time.
  • See how location affects your mobile rankings – Using the Engines tab in Keyword Rankings, use the filters to select just local keywords. Look for big differences between Mobile and Desktop where Google might be assuming local intent for mobile searches but not for desktop. Check out how your competitors perform for these keywords. Can you use this data?
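
To act on that second bullet at scale, you could flag the keywords where mobile and desktop diverge most. A minimal sketch, with invented rankings data (in practice you’d export this from your rank-tracking tool):

```python
# Sketch: flag keywords where mobile and desktop rankings diverge sharply.
# Rankings data and the gap threshold are invented for illustration.

rankings = [
    {"keyword": "cupcakes seattle", "desktop": 3, "mobile": 2},
    {"keyword": "best cupcakes", "desktop": 8, "mobile": 21},
    {"keyword": "gluten free cupcakes", "desktop": 12, "mobile": 4},
]

GAP_THRESHOLD = 5  # positions; an arbitrary cutoff for "worth investigating"

for row in rankings:
    gap = row["desktop"] - row["mobile"]
    if abs(gap) >= GAP_THRESHOLD:
        direction = "better on mobile" if gap > 0 else "better on desktop"
        print(f"{row['keyword']}: desktop #{row['desktop']}, "
              f"mobile #{row['mobile']} ({direction})")
```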


Pinpoint vs. Floodlight Content and Keyword Research Strategies – Whiteboard Friday

Posted by randfish

When we’re doing keyword research and targeting, we have a choice to make: Are we targeting broader keywords with multiple potential searcher intents, or are we targeting very narrow keywords where it’s pretty clear what the searchers were looking for? Those different approaches, it turns out, apply to content creation and site architecture, as well. In today’s Whiteboard Friday, Rand illustrates that connection.

Pinpoint vs Floodlight Content and Keyword Research Strategy Whiteboard

For reference, here are stills of this week’s whiteboards. Click on them to open high resolution images in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about pinpoint versus floodlight tactics for content targeting, content strategy, and keyword research, keyword targeting strategy. This is also called the shotgun versus sniper approach, but I’m not a big gun fan. So I’m going to stick with my floodlight versus pinpoint, plus, you know, for the opening shot we don’t have a whole lot of weaponry here at Moz, but we do have lighting.

So let’s talk through this at first. You’re going through and doing some keyword research. You’re trying to figure out which terms and phrases to target. You might look down a list like this.

Well, maybe, I’m using an example here around antique science equipment. So you see these various terms and phrases. You’ve got your volume numbers. You probably have lots of other columns. Hopefully, you’ve watched the Whiteboard Friday on how to do keyword research like it’s 2015 and not 2010.

So you know you have all these other columns to choose from, but I’m simplifying here for the purpose of this experiment. So you might choose some of these different terms. Now, they’re going to have different kinds of tactics and a different strategic approach, depending on the breadth and depth of the topic that you’re targeting. That’s going to determine what types of content you want to create and where you place it in your information architecture. So I’ll show you what I mean.

The floodlight approach

For antique science equipment, this is a relatively broad phrase. I’m going to do my floodlight analysis on this, and floodlight analysis is basically saying like, “Okay, are there multiple potential searcher intents?” Yeah, absolutely. That’s a fairly broad phrase. People could be looking to transact around it. They might be looking for research information, historical information, different types of scientific equipment that they’re looking for.


Are there four or more approximately unique keyword terms and phrases to target? Well, absolutely, in fact, there’s probably more than that. So antique science equipment, antique scientific equipment, 18th century scientific equipment, all these different terms and phrases that you might explore there.

Is this a broad content topic with many potential subtopics? Again, yes is the answer to this. Are we talking about generally larger search volume? Again, yes, this is going to have a much larger search volume than some of the narrower terms and phrases. That’s not always the case, but it is here.

The pinpoint approach

For pinpoint analysis, we kind of go the opposite direction. So we might look at a term like antique test tubes, which is a very specific kind of search, and that has a clear single searcher intent or maybe two. Someone might be looking for actually purchasing one of those, or they might be looking to research them and see what kinds there are. Not a ton of additional intents behind that. One to three unique keywords, yeah, probably. It’s pretty specific. Antique test tubes, maybe 19th century test tubes, maybe old science test tubes, but you’re talking about a limited set of keywords that you’re targeting. It’s a narrow content topic, typically smaller search volume.
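
If you wanted to rough this classification out over a whole keyword list, a simple scoring pass against the criteria above might look like the sketch below. The threshold and the sample numbers are illustrative, not a formula from the video.

```python
# Sketch: score a keyword against the floodlight criteria described above.
# The 3-of-4 threshold and the sample data are illustrative only.

def classify(kw):
    checks = [
        kw["searcher_intents"] >= 2,   # multiple potential searcher intents
        kw["related_terms"] >= 4,      # four or more unique terms/phrases to target
        kw["subtopics"] >= 3,          # broad topic with many potential subtopics
        kw["monthly_volume"] >= 1000,  # generally larger search volume
    ]
    return "floodlight" if sum(checks) >= 3 else "pinpoint"

examples = {
    "antique science equipment": {"searcher_intents": 4, "related_terms": 6,
                                  "subtopics": 8, "monthly_volume": 2900},
    "antique test tubes": {"searcher_intents": 1, "related_terms": 2,
                           "subtopics": 1, "monthly_volume": 70},
}

for keyword, data in examples.items():
    print(keyword, "->", classify(data))
```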


Now, these are going to feed into your IA, your information architecture, and your site structure in this way. So floodlight content generally sits higher up. It’s the category or the subcategory, those broad topic terms and phrases. Those are going to turn into those broad topic category pages. Then you might have multiple, narrower subtopics. So we could go into lab equipment versus astronomical equipment versus chemistry equipment, and then we’d get into those individual pinpoints from the pinpoint analysis.

How do I decide which approach is best for my keywords?

Why are we doing this? Well, generally speaking, if you can take your terms and phrases and categorize them like this and then target them differently, you’re going to provide a better, more logical user experience. Someone who searches for antique scientific equipment, they’re going to really expect to see that category and then to be able to drill down into things. So you’re providing them the experience they predict, the one that they want, the one that they expect.

It’s better for topic modeling analysis and for all of the algorithms around things like Hummingbird, where Google looks at: Are you using the types of terms and phrases, do you have the type of architecture that we expect to find for this keyword?

It’s better for search intent targeting, because the searcher intent is going to be fulfilled if you provide the multiple paths versus the narrow focus. It’s easier keyword targeting for you. You’re going to be able to know, “Hey, I need to target a lot of different terms and phrases and variations in floodlight and one very specific one in pinpoint.”

There’s usually higher searcher satisfaction, which means you get lower bounce rate. You get more engagement. You usually get a higher conversion rate. So it’s good for all those things.

For example…

I’ll actually create pages for each of antique scientific equipment and antique test tubes to illustrate this. So I’ve got two different types of pages here. One is my antique scientific equipment page.


This is that floodlight, shotgun approach, and what we’re doing here is going to be very different from a pinpoint approach. It’s looking at like, okay, you’ve landed on antique scientific equipment. Now, where do you want to go? What do you want to specifically explore? So we’re going to have a little bit of content specifically about this topic, and how robust that is depends on the type of topic and the type of site you are.

If this is an e-commerce site or a site that’s showing information about various antiques, well maybe we don’t need very much content here. You can see the filtration that we’ve got is going to be pretty broad. So I can go into different centuries. I can go into chemistry, astronomy, physics. Maybe I have a safe for kids type of stuff if you want to buy your kids antique lab equipment, which you might be. Who knows? Maybe you’re awesome and your kids are too. Then different types of stuff at a very broad level. So I can go to microscopes or test tubes, lab searches.

This is great because it’s got broad intent foci, serving many different kinds of searchers with the same page because we don’t know exactly what they want. It’s got multiple keyword targets so that we can go after broad phrases like antique or old or historical or 13th, 14th, whatever century, science and scientific equipment, materials, labs, etc., etc., etc. This is a broad page that could reach any and all of those. Then there’s lots of navigational and refinement options once you get there.

Total opposite of pinpoint content.


Pinpoint content, like this antique test tubes page, we’re still going to have some filtration options, but one of the important things to note is note how these are links that take you deeper. Depending on how deep the search volume goes in terms of the types of queries that people are performing, you might want to make a specific page for 17th century antique test tubes. You might not, and if you don’t want to do that, you can have these be filters that are simply clickable and change the content of the page here, narrowing the options rather than creating completely separate pages.

So if there’s no search volume for these different things and you don’t think you need to separately target them, go ahead and just make them filters on the data that already appears on this page or the results that are already in here as opposed to links that are going to take you deeper into specific content and create a new page, a new experience.

You can also see I’ve got my individual content here. I probably would go ahead and add some content specifically to this page that is just unique here and that describes antique test tubes and the things that your searchers need. They might want to know things about price. They might want to know things about make and model. They might want to know things about what they were used for. Great. You can have that information broadly, and then individual pieces of content that someone might dig into.

This is narrower intent foci obviously, serving maybe one or two searcher intents. This is really talking about targeting maybe one to two separate keywords. So antique test tubes, maybe lab tubes or test tube sets, but not much beyond that.

Then we’re going to have fewer navigational paths, fewer distractions. We want to keep the searcher. Because we know their intent, we want to guide them along the path that we know they probably want to take and that we want them to take.

So when you’re considering your content, choose wisely between the shotgun/floodlight approach and the sniper/pinpoint approach. Your searchers will be better served. You’ll probably rank better. You’ll be more likely to earn links and amplification. You’re going to be more successful.

Looking forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com
