Featured snippets are now used in the related searches section in Google’s mobile results.
Please visit Search Engine Land for the full article.
Posted by jocameron
Welcome to the newest installment of our educational Next Level series! In our last episode, our fearless writer Jo Cameron shared how to uncover low-value content that could hurt your rankings and turn it into something valuable. Today, she’s returned to share how to do effective keyword research and targeting for local queries. Read on and level up!
All around the world, people are searching: X sits at a computer high above the city and searches dreamily for the best beaches in Ko Samui. Y strides down a puddle-drenched street and hastily types good Japanese noodles into an expensive handheld computer. K takes up way too much space and bandwidth on the free wireless network in a chain coffee house, which could be located just about anywhere in the world, and hunts for the best price on a gadgety thing.
As we search, the engines are working hard to churn out relevant results based on what we’re searching for, our location, personalized results, and just about anything else that can be jammed into an algorithm about our complex human lives. As a business owner or SEO, you’ll want to be able to identify the best opportunities for your online presence. Even if your business doesn’t have a physical location and you don’t have the pleasure of sweeping leaves off your welcome mat, understanding the local landscape can help you home in on keywords with more opportunity for your business.
In this Next Level post, we’ll go through the different types of geo-targeted searches, how to track the right keywords and locations for your business in Moz Pro, and how to distribute your physical local business details with Moz Local. If you’d like to follow along with this tutorial, get started with a free 30-day trial of Moz Pro:
Whether your customer is two streets away or gliding peacefully above us on the International Space Station, you must consider how the intertwining worlds of local and national search impact your online presence.
First, so you can confidently stride into your next marketing meeting and effortlessly contribute to a related conversation on Slack, let’s take a quick look at the lingo.
Geomodified searches include the city/neighborhood in the search term itself to target the searcher’s area of interest.
You may have searched some of these examples yourself in a moment of escapism: “beaches in Ko Samui,” “ramen noodles in Seattle,” “solid state drive London,” or “life drawing classes London.”
Geomodified searches state explicit local intent for results related to a particular location. As a marketer or business owner, tracking geomodified keywords gives you insight into how you’re ranking for those searches specifically.
Geolocated searches are searches made while the searcher is physically located in a specific area — generally a city. You may hear the term “location targeting” thrown about, often in the high-roller realm of paid marketing. Rather than looking at keywords that contain certain areas, this type of geotargeting focuses on searches made within an area.
Examples might include: “Japanese noodles,” “Ramen,” “solid state drive,” or “coffee,” searched from the city of Seattle, or the city of London, or the city of Tokyo.
Of course, the above ways of searching and tracking are often intertwined with each other. Our speedy fingers type demands, algorithms buzz, and content providers hit publish and bite their collective nails as analytics charts populate displaying our progress. Smart SEOs will likely have a keyword strategy that accounts for both geomodified and geolocated searches.
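To make the distinction concrete, here’s a minimal Python sketch that splits a keyword list into geomodified terms (the location appears in the query itself) and plain terms you’d track geolocated, city by city. The location list and keywords are invented for illustration.

```python
# Hypothetical target markets; swap in whatever cities you actually serve.
TARGET_LOCATIONS = {"london", "seattle", "tokyo"}

def is_geomodified(keyword: str) -> bool:
    """True if the keyword itself names one of our target locations."""
    tokens = keyword.lower().split()
    return any(loc in tokens for loc in TARGET_LOCATIONS)

keywords = ["japanese noodles", "japanese noodles london", "ramen", "coffee seattle"]
geomodified = [k for k in keywords if is_geomodified(k)]     # explicit local intent
geolocated = [k for k in keywords if not is_geomodified(k)]  # track per searcher city
```

Geomodified terms get tracked as-is; the rest become candidates for location-targeted tracking from each city you care about.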
Generally, the more specific your keywords and the location you’re targeting, the less data you’ll find. Check your favorite keyword research tool, like Keyword Explorer, and you’ll see what I’m talking about. In this example, I’m looking at search volume data for “japanese noodles” vs. “japanese noodles london.”
So, do I toss this geomodified keyword? Hold on, buddy — while the Monthly Volume decreases, take a look at that Difficulty score — it increases. It’s an easy search term to dismiss, since the search volume is so low, but what this tells me is that there’s more to the story.
A search for “japanese noodles” is too broad to divine much of the searcher’s intent — do they want to make Japanese noodles? Learn what Japanese noodles are? Find an appetizing image?… and so on and so forth. The term itself doesn’t give us much context to work with.
So, while the search volume may be lower, a search for “japanese noodles london” means so much more — now we have some idea of the searcher’s intent. If your site’s content matches up with the searcher’s intent, and you can beat your competition in the SERPs, you could find that the lower search volume equates to a higher conversion rate, and you could be setting yourself up for a great return on investment.
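As a rough way to reason about this trade-off, you could weigh volume against difficulty and an intent-based conversion guess. A toy sketch, with every number invented (real Keyword Explorer figures will differ):

```python
def expected_value(volume: int, difficulty: int, est_conversion_rate: float) -> float:
    """Discount monthly demand by SERP difficulty, then apply a conversion guess."""
    return volume * est_conversion_rate * (1 - difficulty / 100)

# Invented figures: the broad term has volume but vague intent; the geomodified
# term is tiny but converts far better because intent is clear.
broad = expected_value(9300, 61, 0.0005)  # "japanese noodles"
local = expected_value(70, 68, 0.15)      # "japanese noodles london"
```

With these made-up inputs, the low-volume local term comes out ahead, which is the point of this section: volume alone doesn’t tell the whole story.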
Digging into hyperlocal niches is a challenge. We’ve got some handy tips for investigating hyperlocal keywords, including using similar but slightly larger regions, digging into auto-suggest to gather keyword ideas, and using the grouping function in Keyword Explorer.
Testing will be your friend here. Build a lovely list, create some content, and then test, analyze, and as the shampoo bottle recommends, rinse and repeat.
When search engines impress us all by displaying a gazillion results per point whatever of a second, they aren’t just looking inwards at their index. They’re looking outwards at the searcher, figuring out the ideal pairing of humans and results.
Local rankings factors take into consideration things like proximity between the searcher and the business, consistency of citations, and reviews, to name just a few. These are jumbled together with all the other signals we’re used to, like authority and relevancy. The full and glorious report is available here: https://moz.com/local-search-ranking-factors
I often find myself returning to the local search ranking factors report because there’s just so much to digest. So go ahead and bookmark it in a folder called “Local SEO” for easy reference, and delight in how organized you are.
While you may expect a search for “life drawing” to turn up mostly organic results, you can see the Local Pack is elbowing its way in there to serve up classes near me:
And likewise, you may expect a search for “life drawing london” to show only local results, but lookie here: we’ve also got some top organic results that have targeted “life drawing london” and the local results creep ever closer to the top:
From these examples you can see that localized results can have a big impact on your SEO strategy, particularly if you’re competing with Local Pack-heavy results. So let’s go ahead and assemble a good strategy into a format that you can follow for your business.
With your mind brimming with local lingo, let’s take a look at how you can track the right types of keywords and locations for your business using Moz Pro. I’ll also touch on Moz Local for the brick-and-mortar types.
Quest: Track your target keywords nationally and keep your eye on keywords dominated by SERP features you can’t win, like Local Packs.
Hey there, w-w-w dot Your Great Site dot com! You’re the owner of a sweet, shiny website. You’re a member of the digital revolution, a content creator, a message deliverer, a gadgety thingy provider. Your customers are primarily online. I mean, they exist in real life too, but they are also totally and completely immersed in the online world. (Aren’t we all?)
Select one of each search engine to track for your location. This is what I like to call the full deck:
Another personal favorite is what I call the “Google Special.” Select Google desktop and Google Mobile for two locations. This is especially handy if you want to track two national locations in a single Campaign. Here I’ve gone with the US and Canada:
I like to track Google Mobile along with Google desktop results. Ideally you want to be performing consistently in both. If the results are hugely disparate, you may need to check that your site is mobile friendly.
Pour all your lovely keywords into the Campaign creation wizard. Turn that keyword bucket upside-down and give the bottom a satisfying tap like a drum:
Where have we found all these lovely keywords? Don’t tell me you don’t know!
Head over to Keyword Explorer and enter your website. Yes, friend, that’s right. We can show you the keywords your site is already ranking for:
I’m going to leave you to have some fun with that, but when you’re done frolicking in keywords you’re ranking for, keywords your competitors are ranking for, and keywords your Mum’s blog is ranking for, pop back and we’ll continue on our quest.
SERP features are both a blessing and a curse. Yes, you could zip to the top of page 1 if you’re lucky enough to be present in those SERP features, but they’re also a minefield, as they squeeze out the organic results you’ve worked so hard to secure.
Luckily for you, we’ve got the map to this dastardly minefield. Keep your eye out for Local Packs and Local Teasers; these are your main threats.
If you have an online business and you’re seeing too many local-type SERP features, this may be an indication that you’re tracking the wrong keywords. You can also start to identify features that do apply to your business, like Image Packs and Featured Snippets.
When you’re done with your local quest, you can come back and try to own some of these features, just like we explored in a previous Next Level blog post: Hunting Down SERP Features to Understand Intent & Drive Traffic
Quest: Track keywords locally and nationally, home in on local SERP features, and dive into the wonderful world of NAP.
What if you run a cozy little cupcake shop in your cozy little city?
Use the same search engine setup from above, and sprinkle locally tracked keywords into the mix.
If you’re setting up a new Campaign, you can add both national and local keywords like a boss.
You can see I’ve added a mouthwatering selection of keywords in both the National Keywords section and in the Local Keywords field. This is because I want to see if one of my cupcake shop’s landing pages is ranking in Google Desktop, Google Mobile, and Yahoo and Bing, both nationally and locally, in my immediate vicinity of Seattle. Along with gathering comparative national and local ranking data, the other reason to track keywords nationally is so you can see how you’re doing in terms of on-page optimization.
Your path to cupcake domination doesn’t stop there! You’re also going to want to be the big player rocking the Local Pack.
Filter by Local Pack or Local Teaser to see if your site is featured. Keep your eye out for any results marked with a red circle, as these are being dominated by your competitors.
As a local business owner, you’ll probably have hours of operation, and maybe even one of those signs that you turn around to indicate whether you’re open or closed. You also have something that blogs and e-commerce sites don’t have: NAP, baby!
As a lingo learner, your lingo learning days are never over, especially in the world of digital marketing (actually, just make that digital anything). NAP is the acronym for business name, address, and phone number. In local SEO you’ll see this term float by more often than a crunchy brown leaf on a cold November morning.
NAP details are your lifeblood: You want people to know them, you want them to be correct, and you want them to be correct everywhere — for the very simple reason that humans and Google will trust you if your data is consistent.
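If you wanted to audit this yourself, the core of a consistency check is just normalizing each directory’s copy of your NAP and diffing it against a canonical record. A sketch with made-up directory names and listings:

```python
def normalize(nap: dict) -> dict:
    """Lowercase, strip periods, and collapse whitespace so trivial formatting
    differences (St. vs St) don't count as inconsistencies."""
    return {k: " ".join(v.lower().replace(".", "").split()) for k, v in nap.items()}

# Invented canonical record and directory copies for illustration.
canonical = {"name": "Cozy Cupcakes", "address": "12 Pike St, Seattle", "phone": "206-555-0100"}
listings = {
    "directory_a": {"name": "Cozy Cupcakes", "address": "12 Pike St., Seattle", "phone": "206-555-0100"},
    "directory_b": {"name": "Cozy Cupcake Shop", "address": "12 Pike St, Seattle", "phone": "206-555-0100"},
}

# For each directory, list the NAP fields that disagree with the canonical record.
issues = {
    site: [field for field in canonical
           if normalize(listing)[field] != normalize(canonical)[field]]
    for site, listing in listings.items()
}
```

Here directory_a passes (the trailing period is normalized away) while directory_b gets flagged for an inconsistent business name.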
If you manage a single location and decide to go down the manual listing management route, kudos to you, my friend. I’m going to offer some resources to guide you:
Quest: Bulk-distribute business NAP, fix consistency issues, and stamp out duplicates.
If you are juggling a bunch of locations for your own business, or a client’s, you’ll know that in the world of citation building things can get out of hand pretty gosh-darn quick. Any number of acts can result in your business listing details splitting into multiple fragments, whether you moved locations, inherited a phone number that has an online past, or someone in-house set up your listings incorrectly.
While a single business operating out of a single location may have the choice to manually manage their listing distribution, with every location you add to your list your task becomes exponentially more complex.
Remember earlier, when we talked about those all-important local search ranking factors? The factors that determine local results, like proximity, citation signals, reviews, and so on? Well, now you’ll be really glad you bookmarked that link.
You can do all sorts of things to send appealing local signals to Google. While there isn’t a great deal we can do about proximity right now — people have a tendency to travel where they want to — the foundational act of consistently distributing your NAP details is within your power.
That’s where Moz Local steps in. The main purpose of Moz Local is to help you publish and maintain NAP consistency in bulk.
First, enter your business name and postcode in the free Check Listing tool. Bounce, bounce…
After a few bounces, you’ll get the results:
Moz Local will only manage listings that have been “verified” to prevent spam submissions.
If you’re not seeing what you’d expect in the Check Listing tool, you’ll want to dig up your Google Maps and Facebook Places pages and check them against these requirements on our Help Hub.
When you’re ready to start distributing your business details to our partners, you can select and purchase your listing. You can find out more about purchasing your listing, again on our Help Hub.
Pro Tip: If you have lots of local clients, you’ll probably want to purchase via CSV upload. Follow our documentation to get your CSV all spruced up and formatted correctly.
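The shape of such a file is one row per location with consistent NAP columns. The headers below are placeholders to show the idea; the actual columns Moz Local expects are spelled out in its CSV documentation.

```python
import csv
import io

# Hypothetical multi-location data; every value here is invented.
locations = [
    {"name": "Cozy Cupcakes", "address": "12 Pike St", "city": "Seattle", "phone": "206-555-0100"},
    {"name": "Cozy Cupcakes", "address": "34 Main St", "city": "Tacoma", "phone": "253-555-0101"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "address", "city", "phone"])
writer.writeheader()
writer.writerows(locations)
csv_text = buf.getvalue()  # one header row plus one row per location
```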
If tracking your visibility and reputation is high on your to-do list, then you’ll want to look at purchasing your listings at the Professional or Premium level.
We’ll track your local and organic rankings for your Google My Business categories by default, but you can enter your own group of target keywords here. We account for the geographic location of your listings, so be sure to add keywords without any geomodifiers!
If you want to track more keywords, we’ve got you covered. Hop on over to Moz Pro and set up a Campaign like we did in the section above.
Quest: Help owners of aspiring good dogs find your awesome training skills, even though you don’t have a brick-and-mortar storefront.
At Moz HQ, we love our pooches: they are the sunshine of our lives (as our Instagram feed delightfully confirms). While they’re all good doggos, well-trained pooches have a special place in our hearts.
But back to business. If you train dogs, or run another location-specific business without a shop front, this is called a service-area business (or SAB, another term to add to the new lingo pile).
Start by tracking searches for “dog trainer seattle,” and all the other keywords you discovered in your research, both nationally and locally.
I’ve got my Campaign pulled up, so I’m going to add some keywords and track them nationally and locally.
You may find that some keywords on a national level are just too competitive for your local business. That’s okay! You can refine your list as you go. If you’re happy with your local tracking, then you can remove the nationally tracked keywords from your Campaign and just track your keywords at the local level.
Pro Tip: Remember that if you want to improve your Page Optimization with Moz Pro, you’ll have to have the keyword tracked nationally in your Campaign.
In terms of Moz Local, since accuracy, completeness, and consistency are key factors, the tool pushes your complete address to our partners in order to improve your search ranking. It’s possible to use Moz Local with a service-area business, but it’s worth noting that some partners do not support hidden addresses. Miriam Ellis describes how Moz Local works with SABs in her recent blog post.
Basically, if your business is okay with your address being visible in multiple places, then we can work with your Facebook page, provided it’s showing your address. You won’t achieve a 100% visibility score, but chances are your direct local competitors are in the same boat.
Whether you’re reaching every corner of the globe with your online presence, or putting cupcakes into the hands of Seattleites, the local SEO landscape has an impact on how your site is represented in search results.
The key is identifying the right opportunities for your business and delivering the most accurate and consistent information to search engines, directories, and your human visitors, too.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
It’s hard to imagine a world without local search. Columnist Lydia Jorden delves into four different industries that must optimize for local search, paired with a specific strategy to help optimize for streamlined customer searches. Does your local search strategy encompass these techniques?
Posted by David-Mihm
To all Moz Local fans in the UK, I’m excited to announce that your wait is over. As the sun rises “across the pond” this morning, Moz Local is officially live in the United Kingdom!
As many of you know, we released the US version of Moz Local in March 2014. After 12 months of terrific growth in the US, and a boatload of technical improvements and feature releases, especially for Enterprise customers, we released the Check Listing feature for a limited set of partner search engines and directories in the UK in April of this year.
Over 20,000 of you have checked your listings (or your clients’ listings) in the last 3-1/2 months. Those lookups have helped us refine and improve the background technology immensely (more on that below). We’ve been just as eager to release the fully-featured product as you’ve been to use it, and the technical pieces have finally fallen into place for us to do so.
The concept is the same as the US version of Moz Local: show you how accurately and completely your business is listed on the most important local search platforms and directories, and optimize and perfect as many of those business listings as we can on your behalf.
For customers specifically looking for you, accurate business listings are obviously important. For customers who might not know about you yet, they’re also among the most important factors for ranking in local searches on Google. Basically, the more times Google sees your name, address, phone, and website listed the same way on quality local websites, the more trust they have in your business, and the higher you’re likely to rank.
Moz Local is designed to help on both these fronts.
To use the product, you simply need to type a name and postcode at moz.com/local. We’ll then show you a list of the closest matching listings we found. We prioritize verified listing information that we find on Google or Facebook, and selecting one of those verified listings means we’ll be able to distribute it on your behalf.
Clicking on a result brings you to a full details report for that listing. We’ll show you how accurate and complete your listings are now, and where they could be after using our product.
Clicking the tabs beneath the Listing Score graphic will show you some of the incompletions and inconsistencies that publishing your listing with Moz Local will address.
For customers with hundreds or thousands of locations, bulk upload is also available using a modified version of your data from Google My Business; feel free to email firstname.lastname@example.org for more details.
We’ve prioritized the most important commercial sites in the UK local search ecosystem and made them the centerpieces of Moz Local. We’ll update your data directly on the globally important players Factual and Foursquare, and the UK-specific players CentralIndex, Thomson Local, and the Scoot network, which includes key directories like TouchLocal, The Independent, The Sun, The Mirror, The Daily Scotsman, and Wales Online.
We’ll be adding two more major destinations shortly, and for those of you who sign up before that time, your listings will be automatically distributed to the additional destinations when the integrations are complete.
The cost per listing is £84/year, which includes distribution to the sites mentioned above with unlimited updates throughout the year, monitoring of your progress over time, geographically focused reporting, and the ability to find and close duplicate listings right from your Moz Local dashboard: all the great upgrades that my colleague Noam Chitayat blogged about here.
Well, as I mentioned just a couple paragraphs ago, we’ve got two additional destinations to which we’ll be sending your data in very short order. Once those integrations are complete, we’ll be just a few weeks away from releasing our biggest set of features since we launched. I look forward to sharing more about these features at BrightonSEO at the end of the summer!
For those of you around the world in Canada, Australia, and other countries, we know there’s plenty of demand for Moz Local overseas, and we’re working as quickly as we can to build additional relationships abroad. And to our friends in the UK, please let us know how we can continue to make the product even better!
Posted by jon.white
We have launched a couple of new things in Moz Pro that we are excited to share with you all: Mobile Rankings and a Search Visibility score. If you want, you can jump right in by heading to a campaign and adding a mobile engine, or keep reading for more details!
Mobilegeddon came and went with slightly less fanfare than expected, somewhat due to the vast ‘Mobile Friendly’ updates we all did at super short notice (nice work everyone!). Nevertheless, mobile rankings visibility is now firmly on everyone’s radar, and will only become more important over time.
Now you can track your campaigns’ mobile rankings for all of the same keywords and locations you are tracking on desktop.
Clicking on this will take you into a new Engines tab within your Keyword Rankings page where you can find a more detailed version of this chart as well as a tabular view by keyword for both desktop and mobile. Here you can also filter by label and location.
We’ve given customers an extra engine for each campaign, increasing the number from 3 to 4. Use the extra slot to add the mobile engine and unlock your mobile data!
We will begin tracking mobile rankings within 24 hours of your adding a mobile engine to a campaign. Once you’re set up, you’ll notice a new chart on your dashboard comparing Desktop vs. Mobile Search Visibility.
Along with this change we have also added a Search Visibility score to your rankings data. Use your visibility score to track and report on your overall campaign ranking performance, compare to your competitors, and look for any large shifts that might indicate penalties or algorithm changes. For a deeper drill-down into your data you can also segment your visibility score by keyword labels or locations. Visit the rankings summary page on any campaign to get started.
The Search Visibility score is the percentage of clicks we estimate you receive based on your rankings positions, across all of your keywords.
We take each ranking position for each keyword, multiply it by an estimated click-through rate, and then take the average across all of your keywords. You can think of it as the percentage of your SERPs that you own. The score is expressed as a percentage, though a score of 100% would be almost impossible unless you are tracking keywords using the “site:” modifier. It’s probably more useful to measure yourself against your competitors than to focus on the actual score, but, as a rule of thumb, mid-40s is probably the realistic maximum for non-branded keywords.
Jeremy, our Moz Analytics TPM, came up with this metaphor:
Think of the SERPs for your keywords as villages. Each position on the SERP is a plot of land in SERP-village. Prime real estate plots (i.e., better ranking positions, like #1) are worth more, and a complete monopoly of real estate in SERP-village would equate to a score of 100%. The Search Visibility score equates to how much total land you own across all of your SERP-villages.
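In code, the calculation described above is a position-to-CTR lookup averaged over keywords. The CTR curve below is a made-up illustration, not Moz’s actual click model:

```python
# Hypothetical click-through rates by ranking position (invented numbers).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def search_visibility(rankings: dict) -> float:
    """rankings maps keyword -> SERP position (None if not ranking).
    Returns the estimated share of clicks as a percentage."""
    ctrs = [CTR_BY_POSITION.get(pos, 0.0) if pos else 0.0 for pos in rankings.values()]
    return 100 * sum(ctrs) / len(ctrs)

score = search_visibility({"cupcakes seattle": 1, "best cupcakes": 4, "bakery": None})
```

One #1 ranking, one #4, and one unranked keyword average out to a score in the low teens, which is why mid-40s already counts as strong in practice.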
Posted by randfish
When we’re doing keyword research and targeting, we have a choice to make: Are we targeting broader keywords with multiple potential searcher intents, or are we targeting very narrow keywords where it’s pretty clear what the searchers were looking for? Those different approaches, it turns out, apply to content creation and site architecture, as well. In today’s Whiteboard Friday, Rand illustrates that connection.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about pinpoint versus floodlight tactics for content targeting, content strategy, and keyword research, keyword targeting strategy. This is also called the shotgun versus sniper approach, but I’m not a big gun fan. So I’m going to stick with my floodlight versus pinpoint, plus, you know, for the opening shot we don’t have a whole lot of weaponry here at Moz, but we do have lighting.
So let’s talk through this at first. You’re going through and doing some keyword research. You’re trying to figure out which terms and phrases to target. You might look down a list like this.
Well, maybe, I’m using an example here around antique science equipment. So you see these various terms and phrases. You’ve got your volume numbers. You probably have lots of other columns. Hopefully, you’ve watched the Whiteboard Friday on how to do keyword research like it’s 2015 and not 2010.
So you know you have all these other columns to choose from, but I’m simplifying here for the purpose of this experiment. So you might choose some of these different terms. Now, they’re going to have different kinds of tactics and a different strategic approach, depending on the breadth and depth of the topic that you’re targeting. That’s going to determine what types of content you want to create and where you place it in your information architecture. So I’ll show you what I mean.
For antique science equipment, this is a relatively broad phrase. I’m going to do my floodlight analysis on this, and floodlight analysis is basically saying like, “Okay, are there multiple potential searcher intents?” Yeah, absolutely. That’s a fairly broad phrase. People could be looking to transact around it. They might be looking for research information, historical information, different types of scientific equipment that they’re looking for.
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b15fc96679b8.73854740.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
Are there four or more approximately unique keyword terms and phrases to target? Well, absolutely, in fact, there’s probably more than that. So antique science equipment, antique scientific equipment, 18th century scientific equipment, all these different terms and phrases that you might explore there.
Is this a broad content topic with many potential subtopics? Again, yes is the answer to this. Are we talking about generally larger search volume? Again, yes, this is going to have a much larger search volume than some of the narrower terms and phrases. That’s not always the case, but it is here.
For pinpoint analysis, we kind of go the opposite direction. So we might look at a term like antique test tubes, which is a very specific kind of search, and that has a clear single searcher intent or maybe two. Someone might be looking for actually purchasing one of those, or they might be looking to research them and see what kinds there are. Not a ton of additional intents behind that. One to three unique keywords, yeah, probably. It’s pretty specific. Antique test tubes, maybe 19th century test tubes, maybe old science test tubes, but you’re talking about a limited set of keywords that you’re targeting. It’s a narrow content topic, typically smaller search volume.
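The three signals Rand walks through can be turned into a rough classifier. The thresholds here are invented; the point is only that two or more “broad” signals push a term into floodlight territory:

```python
def classify(unique_variants: int, subtopics: int, monthly_volume: int) -> str:
    """Label a keyword floodlight or pinpoint from the three signals above."""
    broad_signals = sum([
        unique_variants >= 4,    # four or more unique terms/phrases to target
        subtopics >= 4,          # broad topic with many potential subtopics
        monthly_volume >= 1000,  # generally larger search volume (invented cutoff)
    ])
    return "floodlight" if broad_signals >= 2 else "pinpoint"

broad_term = classify(6, 8, 5000)   # e.g. "antique science equipment"
narrow_term = classify(2, 1, 120)   # e.g. "antique test tubes"
```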
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b160069eb6b1.12473448.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
Now, these are going to feed into your IA, your information architecture, and your site structure in this way. So floodlight content generally sits higher up. It’s the category or the subcategory, those broad topic terms and phrases. Those are going to turn into those broad topic category pages. Then you might have multiple, narrower subtopics. So we could go into lab equipment versus astronomical equipment versus chemistry equipment, and then we’d get into those individual pinpoints from the pinpoint analysis.
Why are we doing this? Well, generally speaking, if you can take your terms and phrases and categorize them like this and then target them differently, you’re going to provide a better, more logical user experience. Someone who searches for antique scientific equipment, they’re going to really expect to see that category and then to be able to drill down into things. So you’re providing them the experience they predict, the one that they want, the one that they expect.
It’s better for topic modeling analysis and for all of the algorithms around things like Hummingbird, where Google looks at: Are you using the types of terms and phrases, do you have the type of architecture that we expect to find for this keyword?
It’s better for search intent targeting, because the searcher intent is going to be fulfilled if you provide the multiple paths versus the narrow focus. It’s easier keyword targeting for you. You’re going to be able to know, “Hey, I need to target a lot of different terms and phrases and variations in floodlight and one very specific one in pinpoint.”
There’s usually higher searcher satisfaction, which means you get lower bounce rate. You get more engagement. You usually get a higher conversion rate. So it’s good for all those things.
I’ll actually create pages for each of antique scientific equipment and antique test tubes to illustrate this. So I’ve got two different types of pages here. One is my antique scientific equipment page.
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b161fa871e32.54731215.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
This is that floodlight, shotgun approach, and what we’re doing here is going to be very different from a pinpoint approach. It’s looking at like, okay, you’ve landed on antique scientific equipment. Now, where do you want to go? What do you want to specifically explore? So we’re going to have a little bit of content specifically about this topic, and how robust that is depends on the type of topic and the type of site you are.
If this is an e-commerce site or a site that’s showing information about various antiques, well maybe we don’t need very much content here. You can see the filtration that we’ve got is going to be pretty broad. So I can go into different centuries. I can go into chemistry, astronomy, physics. Maybe I have a safe-for-kids type of filter if you want to buy your kids antique lab equipment, which you might. Who knows? Maybe you’re awesome and your kids are too. Then different types of stuff at a very broad level. So I can go to microscopes or test tubes, lab searches.
This is great because it’s got broad intent foci, serving many different kinds of searchers with the same page because we don’t know exactly what they want. It’s got multiple keyword targets so that we can go after broad phrases like antique or old or historical or 13th, 14th, whatever century, science and scientific equipment, materials, labs, etc., etc. This is a broad page that could reach any and all of those. Then there’s lots of navigational and refinement options once you get there.
Total opposite of pinpoint content.
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b1622740f0b5.73477500.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
Pinpoint content, like this antique test tubes page, we’re still going to have some filtration options, but one of the important things to note is how these are links that take you deeper. Depending on how deep the search volume goes in terms of the types of queries that people are performing, you might want to make a specific page for 17th century antique test tubes. You might not, and if you don’t want to do that, you can have these be filters that are simply clickable and change the content of the page here, narrowing the options rather than creating completely separate pages.
So if there’s no search volume for these different things and you don’t think you need to separately target them, go ahead and just make them filters on the data that already appears on this page or the results that are already in here as opposed to links that are going to take you deeper into specific content and create a new page, a new experience.
You can also see I’ve got my individual content here. I probably would go ahead and add some content specifically to this page that is just unique here and that describes antique test tubes and the things that your searchers need. They might want to know things about price. They might want to know things about make and model. They might want to know things about what they were used for. Great. You can have that information broadly, and then individual pieces of content that someone might dig into.
This is narrower intent foci obviously, serving maybe one or two searcher intents. This is really talking about targeting maybe one to two separate keywords. So antique test tubes, maybe lab tubes or test tube sets, but not much beyond that.
Then we’re going to have fewer navigational paths, fewer distractions. We want to keep the searcher. Because we know their intent, we want to guide them along the path that we know they probably want to take and that we want them to take.
So when you’re considering your content, choose wisely between shotgun/floodlight approach or sniper/pinpoint approach. Your searchers will be better served. You’ll probably rank better. You’ll be more likely to earn links and amplification. You’re going to be more successful.
Looking forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.
Posted by ryanwashere
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.
I was furious when keyword data disappeared from Google Analytics (GA). I mean, how could I possibly optimize a website without keyword data?!?!
It didn’t take me long to realize I was overreacting. In fact, I quickly realized how trivial keyword data was.
Search engines are pretty damn good at what they do. If you properly optimize your content, people will find it with the keywords you intended. (You should set up an SEO dashboard in GA to verify your results.)
The truly valuable keywords are the ones visitors use within your site. When mined correctly, internal search terms uncover why users engage with content. These insights provide clear direction to improve content, SEO, and the user journey (resulting in increased conversions, leads, and sales).
In this post, I’ll cover three things:
Before I get into the details, make sure you have the following set up in your GA account:
Supplemental reading: How to set up Google Analytics on your website
Standard GA implementation doesn’t have internal search reporting configured. In order to get the data, we need to input some information into GA manually.
Follow these steps to get it up and running:
In order to complete the tracking, you’ll need to locate your site’s query parameter.
Search query: seo
Landing URL: http://webris.org/?s=seo
What to enter in GA: s
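If you’re ever unsure which parameter a site uses, you can pull it out of any search landing URL with a few lines of Python. A sketch; the function name and the list of common parameter names are my own assumptions:

```python
from urllib.parse import urlparse, parse_qs

def search_query_parameter(landing_url,
                           known_params=("s", "q", "query", "search")):
    """Return the (parameter, term) pair for the first recognized
    site-search parameter found in a landing-page URL, else None."""
    params = parse_qs(urlparse(landing_url).query)
    for name in known_params:
        if name in params:
            return name, params[name][0]
    return None

# A WordPress-style search URL like the one above uses "s":
print(search_query_parameter("http://webris.org/?s=seo"))  # ('s', 'seo')
```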
Note that GA will not backdate searches. In other words, searches that took place before you set up reporting won’t populate; you will only get data from searches that occur going forward.
For this reason, you’ll need to wait about 30 days after setting up site search tracking in GA before analyzing the site search data. Otherwise, you won’t have sufficient data to conduct meaningful analysis.
To access your site search data, navigate to
Behavior > Site Search in GA.
There are five reports under Site Search:
How to get there: Behavior > Site Search > Overview
What the report tells us:
Lists the high-level metrics related to your site’s internal search
How to get there: Behavior > Site Search > Usage
What the report tells us:
Compares user journeys that used site search with those that didn’t
How to get there: Behavior > Site Search > Search Terms
What the report tells us:
Lists the most used search terms with corresponding engagement metrics
How to get there: Behavior > Site Search > Pages
What the report tells us: The pages users made their queries on
How to get there: Behavior > Site Search > Any/All Reports
What the report tells us: Segments add additional depth and value. I often use the following segments to drive more insights:
The internal site search reports described above are high-level. Sometimes it takes seeing them in action to understand how to truly apply them.
As such, I’ve included two case studies that show exactly how I’ve used internal search data to drive meaningful action.
Site: Pop culture publisher (online only)
Marketing channels: SEO, social, and content
After launch of the strategy, the site saw amazing results:
Up and to the right!
Site: Online travel site
Marketing channels: SEO, PPC, email, social, content, display, TV, radio, and print
When mined properly, internal search data will give you the information you need to greatly improve your web content, design, and search engine optimization efforts.
Posted by Casey_Meraz
Competition in local search is fierce. While it’s typical to do some surface level research on your competitors before entering a market, you can go much further down the SEO rabbit hole. In this article we will look at how you can find more competitors, pull their data, and use it to beat them in the search game.
Since there are plenty of resources out there on best practices, this guide will assume that you have already followed the best practices for your own listing and are looking for the little things that might make a big difference in putting you over your competition. So if you haven’t already read how to perform the Ultimate Local SEO Audit or how to Find and Build Citations then you should probably start there.
Disclaimer: While it’s important to mention that correlation does not mean causation, we can learn a lot by seeing what the competition has done.
Some of the benefits of conducting competitive research are:
Once you isolate trends that seem to make a positive difference, you can create a hypothesis and test. This allows you to constantly be testing, finding out what works, and growing those positive elements while eliminating the things that don’t produce results. Instead of making final decisions off of emotion, make your decisions off of the conversion data.
A good competition analysis will give you a strong insight into the market and allow you to test, succeed, or fail fast. The idea behind this process is to really get a strong snapshot of your competition at a glance to isolate factors you may be missing in your company’s online presence.
Disclaimer 2: It’s good to use competitors’ ideas if they work, but don’t make that your only strategy.
Below I will cover a process I commonly use for competition analysis. I have also created this Google Docs spreadsheet for you to follow along with and use for yourself. To make your own copy simply go to File > Make A Copy. (Don’t ask me to add you as an owner, please. 🙂)
Whether you work internally or were hired as an outside resource to help with your client’s SEO campaign, you probably have some idea of who the competition is in your space. Some companies may have good offline marketing but poor online marketing. If you’re looking to be the best, it’s a good idea to do your own research and see who you’re up against.
In my experience it’s always good to find and verify 5-10 online competitors in your space from a variety of sources. You can use tools for this or take the manual approach. Keep in mind that you have to screen the data tools give you with your own eye for accuracy.
We’re going to look at some tools you can use to find competitors here in a second, but keep in mind you want to record everything you find.
Make sure to capture the basic information for each competitor including their company name, location, and website. These tools will be useful at a later time. Record these in the “competitor research” tab of the spreadsheet.
This is pointing out the obvious, but if you have a set of keywords you want to rank for, you can look for trends and see who is already ranking where you want to be. Don’t limit this to just one or two keywords, instead get a broader list of the competitors out there.
To do this, simply come up with a list of several keywords you want to rank for and search for them in your geographic area. Make sure your Geographic preference is set correctly so you get accurate data.
To start we’re just going to collect the data and enter it into the spreadsheet. We will revisit this data shortly.
Outside of the basics, I always find it’s good to see who else is out there. Since organic and local rankings are more closely tied together than ever, it’s a good idea to use 3rd party tools to get some insight as to what else your website could be considered related to.
This can help provide hidden opportunities outside of the normal competition you likely look at most frequently.
SEMRush is a pretty neat competitive analysis tool. While it is a paid program, they do in fact have a few free visits a day you can check out. It’s limited but it will show you 10 competitors based on keyword ranking data. It’s also useful for recording paid competition as well.
To use the tool, visit www.SEMRush.com and enter your website in the provided search box and hit search. Once the page loads, you simply have to scroll down to the area that says “main competitors”. If you click the “view full report” option you’ll be taken to a page with 10 competition URLs.
Put these URLs into the spreadsheet so we can track them later.
This is a cool tool that will show your top 5 competitors in paid and organic search. Just like SEMRush, it’s a paid tool that’s easy to use. On the home page, you will see a box that loads where you can enter your URL. Once you hit search, a list of 5 websites will populate for free.
Enter these competitors into your spreadsheet for tracking.
This website is a goldmine of data if you’re trying to learn about a startup. In addition to the basic information we’re looking for, you can also find out things like how much money they’ve raised, staff members, past employee history, and so much more.
Crunchbase also works pretty similarly to the prior tools in the sense that you just enter your website URL and hit the search button. Once the page loads, you can scroll down the page to the competitors section for some data.
While Crunchbase is cool, it’s not too useful for smaller companies as it doesn’t seem to have too much data outside of the startup world.
This tool seems to have limited data for smaller websites but it’s worth a shot. It can also be a little bit more high-level than I prefer, but you should still check it out.
To use the tool visit www.compete.com and enter the URL you want to examine in the box provided then hit search.
Click the “Find more sites like” box to get a list of three related sites. Enter these in the provided spreadsheet.
SimilarWeb provides a cool tool with a bunch of data to check out websites. After entering your information, you can scroll down to the similar sites section which will show websites it believes to be related.
The good news about SimilarWeb is that it seems to have data no matter how big or small your site is.
Now that we have a list of competitors, we can really do a deep dive to see who is ranking and what factors might be contributing to their success. To start, make sure to pick your top competitors from the spreadsheet and then look for and record the information below about each business on the Competitor Analysis tab.
You will want to pull this information from their Google My Business page.
If you know the company’s name, it’s pretty easy to find them just by searching the brand. You can add the geographic location if it’s a multi-location business.
For example if I was searching for a Wendy’s in Parker, Colorado, I could simply search this: “Wendy’s Parker, CO” and it will pull up the location(s).
Make sure to take and record the following information from their local listings. Get the data from their Google My Business (Google + Page) and record it in the spreadsheet!
** Record this information on the spreadsheet. A sample is below.
Since you’ve already optimized your own listing for best practices, we want to see if there are any particular trends that seem to be working better in a certain area. We can then create a hypothesis and test it to see if any gains or losses are made. While we can’t isolate factors, we can get some insight as to what’s working the more you change it.
In my experience, examining trends is much easier when the data is side by side. You can easily pick out data that stands out from the rest.
You already know the ins and outs of your landing page. Now let’s look at each competitor’s landing page individually. Let’s look at the factors that carry the most weight and see if anything sticks out.
Record the following information into the spreadsheet and compare side by side with your company vs. the successful ones.
Page title of landing page
City present? – Is the city present in the landing page meta title?
State present? – Is the state present in the landing page meta title?
Major KW in title? – Is there a major keyword in the landing page meta title?
Content length on landing page – Possibly minor, but worth examining. Copy/paste into MS Word.
H1 present? – Is the H1 tag present?
City in H1? – Does the H1 contain the city name?
State in H1? – Does the H1 have the state or abbreviation in the heading?
Keyword in H1? – Do they use a keyword in the H1?
Local business schema present? – Are they using schema? Find out using the Google Structured Data Testing Tool.
Embedded map present? – Are they embedding a Google map?
GPS coordinates present? – Are they using GPS coordinates via schema or text?
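If schema turns out to be one of the gaps in your own landing page, a minimal LocalBusiness JSON-LD block is easy to generate. A small sketch, with every business detail below invented for illustration; swap in the NAP data you recorded in the spreadsheet:

```python
import json

# Hypothetical business details; replace with your own NAP data.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Dental Office",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Parker",
        "addressRegion": "CO",
        "postalCode": "80134",
    },
    "telephone": "+1-303-555-0100",
    # Covers the "GPS coordinates present?" row from the checklist.
    "geo": {
        "@type": "GeoCoordinates",
        "latitude": 39.5186,
        "longitude": -104.7614,
    },
}

# Paste the output into a <script type="application/ld+json"> tag,
# then validate it with Google's structured data testing tool.
print(json.dumps(local_business, indent=2))
```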
Recently, I was having a conversation with a client who was super-excited about the efforts his staff was making. He proudly proclaimed that his office was building 10 new citations a day and added over 500 within the past couple of months!
His excitement freaked me out. As I suspected, when I asked to see his list, I saw a bunch of low quality directory sites that were passing little or no value. One way I could tell they were not really helping (besides the fact that some were NSFW websites) was that the citations or listings were not even indexed in Google.
I think it’s a reasonable assumption that you should test to see what Google knows about your business. Whatever Google delivers about your brand, it’s serving because it has the most relevance or authority in its eyes.
It’s actually pretty simple. Just do a Google Search. One of the ways that I try to evaluate and see whether or not a citation website is authoritative enough is to take the competition’s NAP and Google it. While you’ve probably done this many times before for citation earning, you can prioritize your efforts based off of what’s recurring between top ranked competitor websites.
As you can see in the example below where I did a quick search for a competitor’s dental office (by pasting his NAP in the search bar), I see that Google is associating this particular brand with websites like:
Pro Tip: Amazon Local is relatively new, but you can see that it’s going to carry a citation benefit in local search. If your clients are willing, you should sign up for this.
Don’t want to copy and paste the NAP in a variety of formats? Use Andrew Shotland’s NAP Hunter to get your competitor’s variants. This tool will easily open multiple window tabs in your browser and search for combinations of your competitor’s NAP listings. It makes it easy and it’s kind of fun.
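If you want to see what those combinations look like without the tool, here’s a rough sketch of the idea behind it: generate quoted queries for the common NAP permutations and phone formats, then paste them into Google one by one. The function name and the example business are my own inventions:

```python
def nap_query_variants(name, address, phone):
    """Build quoted search queries for common NAP combinations,
    similar in spirit to what NAP Hunter automates."""
    digits = "".join(c for c in phone if c.isdigit())[-10:]
    # Common US phone formats; the raw input is kept as one variant.
    phones = {
        phone,
        f"({digits[:3]}) {digits[3:6]}-{digits[6:]}",
        f"{digits[:3]}-{digits[3:6]}-{digits[6:]}",
    }
    combos = [(name, address), (name,), (address,)]
    queries = []
    for p in sorted(phones):
        for combo in combos:
            queries.append(" ".join(f'"{x}"' for x in (*combo, p)))
    return queries

for q in nap_query_variants("Example Dental",
                            "123 Main St, Parker, CO",
                            "303-555-0100"):
    print(q)
```

Running each query and noting which citation sites recur across top-ranked competitors gives you the prioritized list described above.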
With citations, I’m generally in the ballpark of quality over quantity. That being said, if you’re just getting the same citations that everyone else has, that doesn’t really set you apart does it? I like to tell clients that the top citation sources are a must, but it’s good to seek out opportunities and monitor what your competition does so you can keep up and stay ahead of the game.
You need to check the top citations and see where you’re listed vs. your competition. Tools like Whitespark’s local citation finder make this much easier to get an easy snapshot.
If you’re looking to see which citations you should find and check, use these two resources below:
Just like in the example in the section above, you can find powerful hidden gems and also new website opportunities that arise from time to time.
A common mistake I see is businesses thinking it’s OK to just turn things off when they get to the top. That’s a bad idea. If you’re serious about online marketing, you know that someone is always out to get you. So in addition to tracking your brand mentions through the Fresh Web Explorer, you also need to be tracking your competition at least once a month! The good news is that you can do this easily with Fresh Web Explorer from Moz.
Plus track anything else you can think of related to your brand. This will help the on-going efforts get a bit easier.
Did you know some citation sources have dofollow links, which means they pass link juice to your website? Now while these by themselves likely won’t pass a lot of juice, it adds an incentive for you to be proactive with recording and promoting these listings.
When reviewing my competition’s citations and links I use a simple Chrome plugin called NoFollow which simply highlights nofollow links on pages. It makes it super easy to see what’s a follow vs. a nofollow link.
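If you’d rather check a page’s links in bulk than eyeball them one by one, the same rel check the plugin performs can be done in a few lines of stdlib Python. A sketch, with a toy HTML snippet standing in for a real citation page:

```python
from html.parser import HTMLParser

class LinkRelParser(HTMLParser):
    """Collect (href, is_nofollow) pairs from anchor tags --
    the same distinction the NoFollow plugin highlights visually."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            rel = (a.get("rel") or "").lower()
            self.links.append((a.get("href"), "nofollow" in rel))

sample = (
    '<a href="/about">About</a>'
    '<a rel="nofollow" href="http://example.com/dir">Directory</a>'
)
parser = LinkRelParser()
parser.feed(sample)
for href, nofollow in parser.links:
    print(href, "nofollow" if nofollow else "follow")
```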
But what’s the benefit of this? Let’s say that I have a link on a city website that’s a follow link and a citation. If it’s an authority page that talks highly about my business, it would make sense for me to link to it from time to time. If you’re getting links from websites other than your own and linking to these high quality citations you will pass link juice to your page. It’s a pretty simple way of increasing the authority of your local landing pages.
Since the Pigeon update almost a year ago, links started to make a bigger impact in local search. You have to be earning links and you have to earn high quality links to your website and especially your Google My Business Landing page.
If the factors show you’re on the same playing field as your competition except in domain authority or page authority, you know your primary focus needs to be links.
Now here is where the research gets interesting. Remember the data sources we pulled earlier like compete, spyfu.com, etc? We are now going to get a bigger picture on the link profile because we did this extra work. Not only are we just going to look at the links that our competition in the pack has, we’ve started to branch out of that for more ideas which will potentially pay off big in the long run.
Now we want to take every domain we looked at when we started and run Open Site Explorer on each and every domain. Once we have these lists of links, we can then sort them out and go after the high quality ones that you don’t already have.
Typically, when I’m doing this research I will export everything into Excel or Google Docs, combine them into one spreadsheet and then sort from highest authority to least authority. This way you can prioritize your road map and focus on the bigger fish.
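The combine-deduplicate-sort step is simple enough to script instead of doing it by hand in a spreadsheet. A sketch under the assumption that your exports have `URL` and `Domain Authority` columns (match these to whatever headers your exports actually contain):

```python
def prioritize_links(rows, authority_col="Domain Authority"):
    """Combine rows from several link exports, de-duplicate by URL,
    and sort by authority, highest first."""
    best = {}
    for r in rows:
        url = r["URL"]
        da = float(r.get(authority_col, 0))
        # Keep only the highest-authority copy of each URL.
        if url not in best or da > float(best[url].get(authority_col, 0)):
            best[url] = r
    return sorted(best.values(),
                  key=lambda r: float(r.get(authority_col, 0)),
                  reverse=True)

# Toy rows standing in for merged CSV exports:
exported = [
    {"URL": "citypage.example/biz", "Domain Authority": "72"},
    {"URL": "lowdir.example/listing", "Domain Authority": "12"},
    {"URL": "citypage.example/biz", "Domain Authority": "70"},
]
print([r["URL"] for r in prioritize_links(exported)])
```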
Keep in mind that citations usually have links and some links have citations. If they have a lot of authority you should make sure you add both.
If you feel like you’ve gone above and beyond your competition and yet you’re not seeing the gains you want, there is more you have to look at. Sometimes as an SEO it’s easy to get in a paradigm of just the technical or link side of things. But what about user behavior?
It’s no secret, and even some recent tests are showing promising data: if your users visit your site and then click back to the search results, it indicates that they didn’t find what they were looking for. Through our own experiments we have seen listings in the SERPs jump a few positions in hours just based off of user behavior.
You need to make sure your pages are answering the users’ queries as they land on your page, preferably above the fold. For example, if I’m looking for a haircut place and I land on your page, I might want to know the hours, pricing, or directions to your store. Making that information prominent is essential.
Make sure that if you’re going to make these changes you test them. Come up with a hypothesis, test the results, and come to a conclusion or run another test based off of the data. If you want to know more about your users, I say that you need to find out as much about them as humanly possible. Some services you can use for that are:
1. Inspectlet – Record user sessions and watch how they navigate your website. This awesome tool literally allows you to watch recorded user sessions. Check out their site.
2. LinkedIn Tracking Script – Although I admit it’s a bit creepy, did you know that you can see the actual visitors to your website if they’re logged into LinkedIn while browsing your website? You sure can. To do this complete the following steps:
1. Sign up for a LinkedIn Premium Account
2. Enter this code into the body of your website pages:
<img src="https://www.linkedin.com/profile/view?authToken=zRgB&authType=name&id=XXXXX" />
3. Replace the XXXXX with your account number of your profile. You can get this by logging into your profile page and getting the number present after viewid?=
4. Wait for the visitors to start showing up under “who’s viewed your profile”
3. Google Analytics – Watch user behavior and gain insights as to what they were doing on your website.
Speaking of user behavior, is your listing the only one without reviews? Does it have fewer or less favorable reviews? All of these are negative signals for user experience. Do your competitors have more positive reviews? If so, you need to work on getting more.
While this post was mainly geared towards local SEO as in Google My Business rankings, you have to consider that there are a lot of localized search queries that do not generate pack results. In these cases they’re just standard organic listings.
If you’ve been deterred from optimizing these by Google picking its own meta descriptions or by their lack of ranking benefit, you need to check yourself before you wreck yourself. Seriously. Customers will make a decision on which listing to click on based on this information. If you’re not thinking about optimizing these for user intent on the corresponding page then you’re just being lazy. Spend the time, increase CTR, and increase your rankings if you’re serving great content.
The key to success here is realizing that this is a marathon and not a sprint. If you examine the competition in the top areas mentioned above and create a plan to overcome, you will win long term. This of course also assumes you’re not doing anything shady and staying above board.
While there were many more things I could add to this article, I believe that if you put your focus on what’s mentioned here you’ll have the greatest success. Since I didn’t talk too much about geo-tagged media in this article, I also included some other items to check in the spreadsheet under the competitor analysis tab.
Remember to actively monitor what those around you are doing and develop a pro-active plan to be successful for your clients.
What’s the most creative thing you have seen a competitor do successfully in local search? I would love to hear about it in the comments below.
Posted by randfish
A recent patent from Google suggests a new kind of influence in the rankings that has immense implications for marketers. In today’s Whiteboard Friday, Rand discusses what it says, what that means, and adds a twist of his own to get us thinking about where Google might be heading.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week let’s chat about some things that Google is learning about web searchers and web surfers that may be impacting the rankings.
I was pretty psyched to see a patent a few weeks ago that had actually been granted to Google, so filed a while before that. That patent came from Navneet Panda who, as many in the SEO space may remember, is also the engineer after whom Google’s Panda update is named. Bill Slawski did a great analysis of the patent on his website, and you can check that out, along with some of the patent diagrams themselves. Patents can be a little confusing and weird, especially the language, but this one had some surprising clarity to it and some potentially obvious applications for web marketers too.
So, in this case, Googlebot here — I’ve anthropomorphized him, my Googlebot there, nicely — is thinking about the queries that are being performed in Google search engine and basically saying, “Huh, if I see lots of people searching for things like ‘find email address,’ ’email address tool,’ ’email finder,’ and then I also see a lot of search queries similar to those but with an additional branded element, like ‘VoilaNorbert email tool’ or ‘Norbert email finder’ or ‘how to find email Norbert,’ or even things like ’email site:voilanorbert.com,'” Googlebot might actually say, “Hmm, lots of searchers who look for these kinds of queries seem to be also looking for this particular brand.”
You can imagine this in tons and tons of ways. Lots of people searching for restaurants also search for Yelp. Lots of people searching for hotels also add in queries like “Trip Advisor.” Lots of people searching for homes to buy also add in Zillow. These brands that essentially get known and combined and perform very well in these non-branded searches, one of the ways that Google might be thinking about that is because they see a lot of branded search that includes the unbranded words around that site.
In Panda’s site quality patent — and Navneet Panda wasn’t the only author on this patent, but one of the ones we recognize — what’s described is essentially this very simplistic equation. Well, not really an algorithm, and I’m sure it’s much more simplistic than whatever Google’s actually using, if they are actually using this. Remember, when it comes to patents, they usually way oversimplify this type of stuff because they don’t want to get exactly what they’re doing out there in the public. But they have an equation that looks like this: the number of unique searchers for the brand plus keyword X — essentially, they’re trying to count unique people rather than raw searches, looking at things like IP address and device and location and all of that to try and identify just the unique people who are performing this — divided by the number of unique searchers for the non-branded version.
So branded divided by non-branded equals some sort of site quality score for keyword X. If a lot more people are performing a search for “Trip Advisor + California vacations” than are performing searches for just “California vacations,” then the site quality score for Trip Advisor when it comes to the keyword “California vacations” might be quite high.
You can imagine that if we take another brand — let’s say a brand that folks are less familiar with, WhereToGoInTheWorld.com — and there’s very, very few searches for that brand plus “California vacations,” and there’s lots of searches for the unbranded version, the site quality score for WhereToGoInTheWorld.com is going to be much lower. I don’t even think that’s a real website, but regardless.
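In code, the equation the patent describes is trivial; the searcher counts below are invented purely to illustrate the contrast between the two brands:

```python
def site_quality_score(branded_searchers, unbranded_searchers):
    """The patent's (oversimplified) ratio: unique searchers for
    'brand + keyword X' divided by unique searchers for keyword X alone."""
    return branded_searchers / unbranded_searchers

# Hypothetical counts for the keyword "california vacations":
tripadvisor_score = site_quality_score(40_000, 100_000)
unknown_site_score = site_quality_score(50, 100_000)
print(tripadvisor_score, unknown_site_score)  # 0.4 0.0005
```

The well-known brand ends up with a score orders of magnitude higher for the same unbranded keyword, which is exactly the dynamic described above.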
Now, I want to add one more wrinkle on to this. I think one of the things that struck me as being almost obvious but not literally mentioned in this specific patent was my theory that this also applies to clickstream data. You can see this happening obviously already in personalization, personalized search, but I think it might be happening in non-personalized search as well, and that is essentially through Android and through Chrome, which I’ve drawn these lovely logos just for you. Google knows basically where everyone goes on the web and what everyone does on the web. They see this performance.
So they can look and see the clickstream for a lot of people’s process is a searcher goes and searches for “find email address tool,” and then they find this resource from Distilled and Distilled mentions Rob Ousbey’s account — I think it was from Rob Ousbey that that original resource came out — and they follow him and then they follow me and they see that I tweeted about VoilaNorbert. Voila, they make it to VoilaNorbert.com’s website, where their search ends. They’re no longer looking for this information. They’ve now found a source that sort of answers their desire, their intent. Google might go, “Huh, you know, why not just rank this? Why rank this one when we could just put this there? Because this seems to be the thing that is answering the searcher’s problem. It’s taking care of their issue.”
This is tough for marketers. I think both of these, the query formatting and the potential clickstream uses, suggest a world in which building up your brand association and building up the stream of traffic to your website that’s solving a problem not just for searchers, but for potential searchers and people with that issue, whether they search or not, is part of SEO. I think that’s going to mean that things like branding and things like attracting traffic from other sources, from social, from email, from content, from direct, from offline, and word-of-mouth, that all of those things are going to become part of the SEO equation. If we don’t do those things well, in the long term, we might do great SEO, kind of classic, old-school keywords and links and crawl and rankings SEO and miss out on this important piece that’s on the rise.
I’m looking forward to some great comments and your theories as well. We’ll see you again next week for another edition of Whiteboard Friday. Take care.