I Want To Rank Beyond My Location: A Guide to How This Works

Posted by MiriamEllis

Staff at your agency get asked this question just about every day, and it’s a local SEO forum FAQ, too:

“I’m located in ‘x’, but how do I rank beyond that?”

In fact, this query is so popular, it deserves a good and thorough answer. I’ve written this article in the simplest terms possible so that you can instantly share it with even your least-technical clients.

We’ll break rankings down into five easy-to-grasp groups, and make sense out of how Google appears to bucket rankings for different types of users and queries. Your clients will come away with an understanding of what’s appropriate, what’s possible, and what’s typically impossible. It’s my hope that shooting this link over to all relevant clients will save your team a ton of time, and ensure that the brands you’re serving are standing on steady ground with some good education.

There’s nothing quite like education as a sturdy baseline for creating achievable goals, is there?

One hypothetical client’s story

We’ll illustrate our story by focusing on a single fictitious business. La Tortilleria is a tortilla bakery located at 197 Fifth Avenue in San Rafael, Marin County, California, USA. San Rafael is a small city with a population of about 60,000. La Tortilleria sells directly to consumers (B2C) and also distributes its handmade tortillas to a variety of B2B clients, like restaurants and grocery stores throughout Marin County.

La Tortilleria’s organic white corn tortillas are so delicious, the bakery recently got featured on a Food Network TV show. Then, they started getting calls from San Francisco, Sacramento, and even Los Angeles asking about their product. This business, which started out as a mom-and-pop shop, is now hoping to expand distribution beyond county borders.

When it comes to Google visibility, what is La Tortilleria eligible for, and is there some strategy they can employ to show up in many places for many kinds of searches? Let’s begin:

Group I: Hyperlocal rankings

Scenario

Your best chance of ranking in Google’s local pack results is typically in the neighborhood surrounding your business. For example, with the right strategy, La Tortilleria could expect to rank very well in the downtown San Rafael area surrounding its bakery. When searchers are physically located in this area or using search language like “tortilleria near me,” Google can hyper-localize the radius of the search to just a few city blocks when there are enough nearby options to make up a local pack.

Ask the client to consider:

  • What is my locale like? Am I in a big city, a small town, a rural area?
  • What is the competitive level of my market? Am I one of many businesses offering the same goods/services in my neighborhood, or am I one of the only businesses in my industry here?

Google’s local pack radius will vary greatly based on the answers to those two questions. For example, if there are 100 tortilla bakeries in San Rafael, Google doesn’t have to go very far to make up a local pack for a searcher standing on Fifth Avenue with their mobile phone. But, if La Tortilleria is one of only three such businesses in town, Google will have to reach further across the map to make up the pack. Meanwhile, in a truly rural area with few such businesses, Google’s smallest radius could span several towns, or, if there simply aren’t enough options, Google may show no local pack in the results at all.

Strategy

To do well in the hyperlocal packs, tell your client their business should:

  • Create and claim a Google My Business listing, filling out as many fields as possible. Earn some reviews and respond to them.
  • Build out local business listings on top local business information platforms, either manually or via a service like Moz Local.
  • Mention neighborhood names or other hyperlocal terms on the company website, including on whichever page of the site the Google listing points to.
  • If competition is strong in the neighborhood, invest in more advanced tactics like earning local “linktations” (links that also serve as citations), developing more targeted hyperlocal content, using Google Posts to highlight neighborhood-oriented content, and managing Google Q&A to outdistance more sluggish competitors.

*Note that if you are marketing a multi-location enterprise, you’ll need to undertake this work for each location to get it ranking well at a hyperlocal level.
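One common way to reinforce these on-site location signals is schema.org LocalBusiness structured data on the page your Google listing points to. Below is a minimal sketch in Python that builds such a JSON-LD block for our fictitious La Tortilleria. The address comes from the article’s example; the postal code and the choice of the `Bakery` subtype are illustrative assumptions, and the exact fields you include should follow schema.org’s LocalBusiness documentation.

```python
import json

# Minimal schema.org LocalBusiness JSON-LD for the (fictitious) La Tortilleria.
# Extend with openingHours, geo, telephone, etc. as appropriate for a real site.
local_business = {
    "@context": "https://schema.org",
    "@type": "Bakery",  # a more specific subtype of LocalBusiness
    "name": "La Tortilleria",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "197 Fifth Avenue",
        "addressLocality": "San Rafael",
        "addressRegion": "CA",
        "postalCode": "94901",  # hypothetical ZIP for illustration
        "addressCountry": "US",
    },
    "areaServed": "Marin County",
}

# Wrap the data in the script tag that would be embedded in the page's <head>.
jsonld_snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(jsonld_snippet)
```

The point isn’t the Python itself, of course; it’s the JSON-LD payload, which a developer would paste or template into the location page.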

Group II: Local rankings

Scenario

These rankings are quite similar to the above but encompass an entire city. In fact, when we talk about local rankings, we are most often thinking about how a business ranks within its city of location. For example, how does La Tortilleria rank for searches like “tortilleria,” “tortilla shop,” or “tortillas san rafael” when a searcher is anywhere in that city, or traveling to that city from another locale?

If Google believes the intent of such searches is local (meaning that the searcher wants to find some tortillas to buy near them rather than just seeking general information about baked goods), they will make up a local pack of results. As we’ve covered, Google will customize these packs based on the searcher’s physical location in many instances, but a business that becomes authoritative enough can often rank across an entire city for multiple search phrases and searcher locales.

For instance, La Tortilleria might always rank #1 for “tortilla shop” when searchers on Fifth Avenue perform that search, but they could also rank #1 for “organic tortillas San Rafael” when locals in any part of that city or even out-of-towners do this lookup, if the business has built up enough authority surrounding this topic.

With the right strategy, every business has a very good chance of ranking locally in its city of physical location for some portion of its most desired search phrases.

Ask the client to consider:

  • Does my location + Google’s results behavior create small or large hurdles in my quest for city-wide rankings? When I look at the local packs I want to rank for, does Google appear to be clustering them too tightly in some part of the city to include my location in a different part of town? If so, can I overcome this?
  • What can I specialize in to set me apart? Is there some product, service, or desirable attribute my business can become particularly known for in my city over all other competitors? If I can’t compete for the biggest terms I’d like to rank for, are there smaller terms I could become dominant for city-wide?
  • How can I build my authority surrounding this special offering? What will be the most effective methodologies for becoming a household name in my community when people need the services I offer?

Your agency will face challenges surrounding this area of work. I was recently speaking with a business owner in Los Angeles who was disappointed that he wasn’t appearing for the large, lucrative search term “car service to LAX.” When we looked at the results together from various locations, we saw that Google’s radius for that term was tightly clustered around the airport. This company’s location was in a different neighborhood many miles away. In fact, it was only when we zoomed out on Google Maps to enlarge the search radius, or zoomed in on this company’s neighborhood, that we were able to see their listing appear in the local results.

This was a classic example of a big city with tons of brands offering nearly identical services; the result is very stiff competition and a tight local pack radius.
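The radius intuition here can be made concrete with a little geometry. As a rough sketch (the coordinates and the observed radius below are approximate and purely illustrative), you can compute the great-circle distance between a business’s location and the center of the pack cluster it wants to appear in:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # mean Earth radius is about 3958.8 miles

# Approximate coordinates, for illustration only.
lax = (33.9416, -118.4085)     # LAX airport: center of the observed pack cluster
office = (34.0736, -118.4004)  # a hypothetical office in a distant neighborhood

distance = haversine_miles(*lax, *office)
observed_pack_radius = 5.0     # hypothetical radius inferred from checking results

print(f"{distance:.1f} miles from the cluster center")
if distance > observed_pack_radius:
    print("Location falls outside the observed pack radius for this term.")
```

If the computed distance sits well outside the radius you observe when checking results from various locations, that’s a strong signal to pursue one of the three alternatives above rather than chasing the head term.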

My advice in a tough scenario like this would revolve around one of these three things:

  • Becoming such a famous brand that the business could overcome Google’s proximity bias
  • Specializing in some attribute that would enable them to seek rankings for less competitive keywords
  • Moving to an office near that “centroid” of business instead of in a distant neighborhood of the large city.

Your specific scenario may be easier, equal to, or even harder than this. Needless to say, a tortilla shop in a modestly-sized town does not face the same challenges as a car service in a metropolis. Your strategy will be based on your study of your market.

Strategy

Depending on the level of competition in the client’s market, tell them they will need to invest in some or all of the following:

  • Identify the keyword phrases you’re hoping to rank for using tools like Moz Keyword Explorer, Answer the Public, and Google Trends combined with organized collection and analysis of the real-world FAQs customers ask your staff.
  • Observe Google’s local pack behavior surrounding these phrases to discover how they are clustering results. Perform searches from devices in your own neighborhood and from other places around your city, as described in my recent post How to Find Your True Local Competitors. You can also experiment with tools like BrightLocal’s Local Search Results Checker.
  • Identify the top competitors in your city for your targeted phrases and then do a competitive audit of them.
  • Stack these discovered competitors up side-by-side with your business to see how their local search ranking factors may be stronger than yours. Improve your metrics so that they surpass those of the competitors, whether this surrounds Google My Business signals, Domain Authority, reputation, citation factors, website quality, or other elements.
  • If Google’s radius is tight for the most lucrative terms and your efforts to build authority so far aren’t enabling you to overcome it due to your location falling outside their reach, consider specialization in other smaller, but still valuable, search phrases. For instance, La Tortilleria could be the only bakery in San Rafael offering organic tortillas. A local business might significantly narrow the competition by being pet-friendly, open later, cheaper, faster, more staffed, women-led, serving specific dietary restrictions or other special needs, selling rarities, or bundling goods with expert advice. There are many ways to set yourself apart.
  • Finally, publicize your unique selling proposition. Highlight it on your website with great content. If it’s a big deal, make connections with local journalists and bloggers to try to make news. Use Google My Business attributes to feature it on your listing. Cross-sell with related local businesses and promote one another online. Talk it up on social media. Structure review requests to nudge customers towards mentioning your special offering in their reviews. Do everything you can to help your community and Google associate your brand name with your specialty.
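The side-by-side competitive comparison described above doesn’t require anything fancy. Even a small script can make the gaps obvious; the metric names and numbers below are hypothetical stand-ins for data you’d pull from your own tools (Google My Business, Moz, review platforms):

```python
# Hypothetical audit metrics for La Tortilleria vs. two discovered competitors.
audit = {
    "La Tortilleria": {"reviews": 42,  "avg_rating": 4.8, "domain_authority": 18, "citations": 25},
    "Competitor A":   {"reviews": 130, "avg_rating": 4.5, "domain_authority": 34, "citations": 60},
    "Competitor B":   {"reviews": 75,  "avg_rating": 4.2, "domain_authority": 22, "citations": 41},
}

us = "La Tortilleria"
gaps = {}
for metric in audit[us]:
    # Compare our value against the strongest competitor on each metric.
    best_rival = max(v[metric] for name, v in audit.items() if name != us)
    if audit[us][metric] < best_rival:
        gaps[metric] = best_rival - audit[us][metric]

print("Metrics where at least one competitor leads us:")
for metric, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"  {metric}: behind the best competitor by {gap}")
```

A table like this turns “improve your metrics so that they surpass those of the competitors” into a concrete, prioritized to-do list.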

Group III: Regional rankings

Scenario

This is where we typically hit our first really big hurdle, and where the real questions begin. La Tortilleria is located in San Rafael and has very good chances of ranking in relation to that city. But what if they want to expand to selling their product throughout Marin County, or even throughout several surrounding counties? Unless competition is very low, they are unlikely to rank in the local packs for searchers in neighboring cities like Novato, Mill Valley, or Corte Madera. What paths are open to them to increase their visibility beyond their city of location?

It’s at this juncture that agencies start hearing clients ask, “What can I do if I want to rank outside my city?” And it’s here that it’s most appropriate to respond with some questions clients need to be asking themselves.

Ask the client to consider:

  • Does my business model legitimately lend itself to transactions in multiple cities or counties? Or am I just hoping that if my business in City A could rank in City B, people from that second location would travel to me? The fact that a dentist has some patients who come to the practice from other towns isn’t really something to build a strategy on; neither consumers nor Google will be excited by it. So, ask yourself: “Do I genuinely have a model that delivers goods/services to City B, or some other strong relationship with neighbors in those locales?”
  • Is there something I can do to build a physical footprint in cities where I lack a physical location? Short of opening additional branches, is there anything my business can do to build relationships with neighboring communities?

Strategy

  • First, know that it’s sometimes possible for a business in a less-competitive market to rank in nearby neighboring cities. If La Tortilleria is one of just 10 such businesses in Marin County, Google may well surface them in a local pack or the expanded local finder view for searchers in multiple neighboring towns because there is a paucity of options. However, as competition becomes denser, purely local rankings beyond city borders become increasingly rare. Google does not need to go outside of the city of San Francisco, for example, to make up complete local results sets for pizza, clothing, automotive services, attorneys, banks, dentists, etc. Assess the density of competition in your desired regional market.
  • If you determine that your business is something of a rarity in your county or similar geographical region, follow the strategy described above in the “Local Rankings” section and give it everything you’ve got so that you can become a dominant result in packs across nearby multiple cities. If competition is too high for this, keep reading.
  • If you determine that what you offer isn’t rare in your region, local pack rankings beyond your city borders may not be feasible. In this case, don’t waste money or time on unachievable goals. Rather, move the goalposts so that your marketing efforts outside of your city are targeting organic, social, paid, and offline visibility.
  • Determine whether your brand lends itself to growing face-to-face relationships with neighboring cities. La Tortilleria can send delivery persons to restaurants and grocery stores throughout its county. They can send their bakers to workshops, culinary schools, public schools, food festivals, expos, fairs, farmers markets, and a variety of events in multiple cities throughout their targeted region. They can sponsor regional events, teams, and organizations. They can cross-sell with a local salsa company, a chocolatier, a caterer. Determine what your brand’s resources are for expanding a real-world footprint within a specific region.
  • Once you’ve begun investing in building this footprint, publicize it. Write content, guest blog, make the news, share socially, advertise online, advertise in local print, radio, and TV media. Earn links, citations and social mentions online for what you are doing offline and grow your regional authority in Google’s eyes while you’re doing it.
  • If your brand is a traditional service area business, like a residential painting company with a single location that serves multiple cities, develop a website landing page for each city you serve. Make each page a showcase of your work in that city, with project features, customer reviews, localized tips, staff interviews, videos, photos, FAQs, and more. As with brick-and-mortar models, your level of rarity will determine whether your single physical office can show up in the local packs for more than one city. If your geo-market is densely competitive, the main goal of your service-city landing pages will be organic rankings, not local ones.

Group IV: State-wide rankings

Scenario

This is where our desired consumer base can no longer be considered truly local, though local packs may still occasionally come into play. In our continuing story, revenue significantly increased after La Tortilleria appeared on a popular TV show. Now they’ve scaled up their small kitchen to industrial strength in hopes of increasing trade across the state of California. Other examples might be an architectural firm that sends staff state-wide to design buildings or a photographer who accepts event engagements across the state.

What we’re not talking about here is a multi-location business. Any time you have a physical location, you can simply refer back to Groups I–III for strategy, because you are truly in the local running any place you have a branch. But for the single-location client with a state-wide offering, the quest for broad visibility raises some questions.

Ask the client to consider:

  • Are state-wide local pack results in evidence for my queries, or is that simply not the reality for my industry? For example, when I do a non-modified search just for “sports arena” in California, it’s interesting to see that Google is willing to make up a local pack of three famous venues spanning Sonora to San Diego (about 500 miles apart). Does Google return state-wide packs for my search terms, and is what I offer so rare that I might be included in them?
  • Does my business model genuinely lend itself to non-local queries and clients willing to travel far to transact with me or hire me from anywhere in the state? For example, it would be a matter of pure vanity for me to want my vacuum cleaner repair shop to rank state-wide, as people can easily access services like mine in their own towns. But, what if I’m marketing a true rara avis, like a famous performing arts company, a landmark museum, a world-class interior design consultancy, or a vintage electronics restoration business?
  • Whether Google returns state-wide local packs or only organic results for my targeted search terms, what can I do to be visible? What are my resources for setting myself apart?

Strategy

  • First, let’s take it for granted that you’ve got your basic local search strategy in place. You’re already doing everything we’ve covered above to build a strong hyperlocal, local, and regional digital and offline footprint.
  • If Google does return state-wide local packs for your search phrases, simply continue to amp up the known local pack signals we’ve already discussed, in hopes of becoming authoritative enough to be included.
  • If your phrases don’t return state-wide local packs, you will be competing against a big field for organic results visibility. In this case, you are likely to be best served by three things. Firstly, take publication on your website seriously. The more you can write about your offerings, the more of an authoritative resource you will become. Delve deeply into your company’s internal talent for developing magazine-quality content and bring in outside experts where necessary. Secondly, invest in link research tools like Moz Link Explorer to analyze which links are helping competitors to rank highly in the organic results for your desired terms and to discover where you need to get links to grow your visibility. Thirdly, seek out your state’s most trusted media sources and create a strategy for seeking publicity from them. Whether this comes down to radio, newspapers, TV shows, blogs, social platforms, or organizational publications, build your state-wide fame via inclusion.
  • If all else fails and you need to increase multi-regional visibility throughout your state, you will need to consider your resources for opening additional staffed offices in new locales.

Group V: National rankings & beyond

Scenario

Here, we encounter two common themes, neither of which fall within our concept of local search.

In the first instance, La Tortilleria is ready to go multi-state or nation-wide with its product, distributing goods outside of California as a national brand. The second is the commonly-encountered digital brand that is vending to a multi-state or national audience and is often frustrated by the fact that they are being outranked both in the local and organic results by physical, local companies in a variety of locations. In either case, the goals of both models can sometimes extend beyond country borders when businesses go multinational.

Ask the client to consider:

  • What is my business model? Am I selling B2B, B2C, or both?
  • Which marketing strategies will generate the brand recognition I need? Is my most critical asset my brand’s website, or other forms of off-and-online advertising? Am I like Wayfair, where my e-commerce sales are almost everything, bolstered by TV advertising? Or, am I like Pace Foods with a website offering little more than branding because distribution to other businesses is where my consumers find me?
  • Does my offering need to be regionalized to succeed? Perhaps La Tortilleria will need to start producing super-sized white flour tortillas to become a hit in Texas. McDonald’s offers SPAM in Hawaii and green chile cheeseburgers in New Mexico. Regional language variants, seasonality, and customs may require fine-tuning of campaigns.

Strategy

  • If your national brand hinges on B2C online sales, let me put the e-commerce SEO column of the Moz blog at your fingertips. Also highly recommended, E-commerce SEO: The Definitive Guide.
  • If your national brand revolves around getting your product on shelves, delve into Nielsen’s manufacturer/distributor resources; I’ve also found some good reading at MrCheckout.
  • If you are expanding beyond your country, read Moz’s basic definition of International SEO, then move on to An In-Depth Look at International SEO and The Ultimate Guide to International SEO.
  • This article can’t begin to cover all of the steps involved in growing a brand from local to an international scale, but in all scenarios, a unifying question will revolve around how to cope with the reality that Google will frequently rank local brands above or alongside your business for queries that matter to you. If your business has a single physical headquarters, then content, links, social, and paid advertising will be the tools at your disposal to compete as best you can. Rarity may be your greatest strength, as seen in the case of America’s sole organic tulip bulb grower, or authority, as in the case of this men’s grooming site ranking for all kinds of queries related to beards.
  • You’ll be wanting to rank for every user nationwide, but you’ll also need to be aware of who your competitors are at a local and regional level. This is why even national/international brands need some awareness of how local search works so that they can identify and audit strong local brands in target markets in order to compete with them in the organic SERPs, sometimes fine-tuning their offerings to appeal to regional needs and customs.
  • I often hear from digital-only brands that want to rank in every city in the nation for a virtual service. While this may be possible for a business with overwhelming authority and brand recognition (think Amazon), a company just starting out can set a more reasonable goal of analyzing a handful of major cities instead of thousands of them to see what it would take to get in the running with entrenched local and digital brands.
  • Finally, I want to mention one interesting and common national business model with its own challenges. In this category are tutoring businesses, nanny services, dog walking services, and other brands that have a national headquarters but whose employees or contractors are the ones providing face-to-face services. Owners ask if it’s possible to create multiple Google listings based on the home addresses of their workers so that they can achieve local pack rankings for what is, in fact, a locally-rendered service. The answer is that Google doesn’t approve of this tactic. So, where a local pack presence is essential, the brand must find a way to staff an office in each target region. Avoid virtual offices, which are explicitly forbidden, but there could be some leeway in exploring inexpensive co-working spaces staffed during stated business hours and where no other business in the same Google category is operating. A business that determines this model could work for them can then pop back up to Groups I-IV to see how far local search can take them.

Summing up

There may be no more important task in client-onboarding than setting correct expectations. Basing a strategy on what’s possible for each client’s business model will be the best guardian of your time and your client’s budget. To recap:

  1. Identify the client’s model.
  2. Investigate Google’s search behavior for the client’s important search phrases.
  3. Gauge the density of competition/rarity of the client’s offerings in the targeted area.
  4. Audit competitors to discover their strengths and weaknesses.
  5. Create a strategy for local, organic, social, paid, and offline marketing based on the above four factors.

For each client who asks you how to rank beyond their physical location, there will be a unique answer. The work your agency puts into finding that answer will make you an expert in their markets and a powerful ally in reaching achievable goals.


Reblogged 1 month ago from tracking.feedpress.it


Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.


Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.
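A quick way to sanity-check the scenario Rand describes (content that is injected after page load and so isn’t in the initial HTML) is simply to search the raw source for the exact sentence you can see in the rendered page. A minimal sketch, where the HTML and the sentence are made-up examples:

```python
# Raw HTML as a crawler's first fetch would see it: the article body is
# loaded later by JavaScript, so only a placeholder div is present.
raw_html = """
<html>
  <body>
    <div id="content">Loading...</div>
    <script src="/load-article.js"></script>
  </body>
</html>
"""

# The sentence visible in the rendered page (the one you'd quote into Google):
target_sentence = "Our handmade organic tortillas are baked fresh daily."

in_initial_html = target_sentence in raw_html
print("Present in initial HTML:", in_initial_html)
if not in_initial_html:
    print("Content appears to be injected client-side; "
          "indexing may be delayed or missed entirely.")
```

In practice you’d fetch the live page source (view-source or an HTTP client) instead of a hardcoded string, but the diagnostic logic is the same.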

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. He mentions two sources. One is things that Google tells us, and the other is SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank are broadly correlated, but it’s not one-to-one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes are mostly SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed a list just off the top of my head in the 10 minutes before we filmed this video. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which is a permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
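To make the status-code discussion concrete, here's an illustrative sketch (my own simplification, not Google's actual logic) of how an audit might flag the mistakes described above:

```python
# Illustrative classifier for the status-code mistakes discussed above:
# soft 404s, 404 vs 410, and 503 downtime with/without Retry-After.

def audit_status(page_exists: bool, status: int, retry_after: bool = False) -> str:
    if not page_exists and status == 200:
        return "soft 404: missing page returns 200, so it may stay indexed"
    if not page_exists and status == 404:
        return "404: not found for now; Google will revisit and re-check"
    if not page_exists and status == 410:
        return "410: permanently removed; drops out of the index faster"
    if status == 503:
        return ("503 with Retry-After: planned downtime; crawler comes back later"
                if retry_after else "503: downtime, but add a Retry-After header")
    return "ok"

print(audit_status(False, 200))                    # the big error described above
print(audit_status(False, 410))                    # permanent removal
print(audit_status(True, 503, retry_after=True))   # downtime, done right
```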

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics connecting those up and trying to see: Why did we go up or down? Did we have less pages being indexed, more pages being indexed, more pages getting traffic less, more keywords less?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like in URL and in anchor, and da, da, da, da. In anchor doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of these. Like, come on.

  • Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you, you have all sorts of errors, and you don’t know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or MozScape, or Ahrefs, or Majestic, or SEMrush, or the Alchemy API. Those APIs can have powerful things that they can do for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
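Several of the protocols in that list can be checked programmatically. As one example, Python's standard-library robots.txt parser can verify what a robots.txt file actually blocks before you ship it (the file and URLs here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an example site: block internal search
# results (thin, non-search-targeted pages) while leaving content open.
robots_txt = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/search?q=test"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))      # True
```

Running this kind of check against a staging robots.txt is a cheap way to catch an accidental site-wide Disallow before it costs you rankings.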

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

Distance from Perfect

Posted by wrttnwrd

In spite of all the advice, the strategic discussions and the conference talks, we Internet marketers are still algorithmic thinkers. That’s obvious when you think of SEO.

Even when we talk about content, we’re algorithmic thinkers. Ask yourself: How many times has a client asked you, “How much content do we need?” How often do you still hear “How unique does this page need to be?”

That’s 100% algorithmic thinking: Produce a certain amount of content, move up a certain number of spaces.

But you and I know it’s complete bullshit.

I’m not suggesting you ignore the algorithm. You should definitely chase it. Understanding a little bit about what goes on in Google’s pointy little head helps. But it’s not enough.

A tale of SEO woe that makes you go “whoa”

I have this friend.

He ranked #10 for “flibbergibbet.” He wanted to rank #1.

He compared his site to the #1 site and realized the #1 site had five hundred blog posts.

“That site has five hundred blog posts,” he said, “I must have more.”

So he hired a few writers and cranked out five thousand blog posts that melted Microsoft Word’s grammar check. He didn’t move up in the rankings. I’m shocked.

“That guy’s spamming,” he decided, “I’ll just report him to Google and hope for the best.”

What happened? Why didn’t adding five thousand blog posts work?

It’s pretty obvious: My, uh, friend added nothing but crap content to a site that was already outranked. Bulk is no longer a ranking tactic. Google’s very aware of that tactic. Lots of smart engineers have put time into updates like Panda to compensate.

He started like this:

And ended up like this:
more posts, no rankings

Alright, yeah, I was Mr. Flood The Site With Content, way back in 2003. Don’t judge me, whippersnappers.

Reality’s never that obvious. You’re scratching and clawing to move up two spots, you’ve got an overtasked IT team pushing back on changes, and you’ve got a boss who needs to know the implications of every recommendation.

Why fix duplication if rel=canonical can address it? Fixing duplication will take more time and cost more money. It’s easier to paste in one line of code. You and I know it’s better to fix the duplication. But it’s a hard sell.

Why deal with 302 versus 404 response codes and home page redirection? The basic user experience remains the same. Again, we just know that a server should return one home page without any redirects and that it should send a ‘not found’ 404 response if a page is missing. If it’s going to take 3 developer hours to reconfigure the server, though, how do we justify it? There’s no flashing sign reading “Your site has a problem!”
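As a rough sketch of that rule, a home-page check might look like this in Python (the response chains are supplied by hand here; a real audit would fetch them):

```python
# Sketch of the rule above: the home page should answer directly (a
# single 200, no hops), and anything else is worth flagging.

def check_home(chain: list[int]) -> str:
    """chain is the sequence of status codes observed, final response last."""
    if chain == [200]:
        return "ok: home page answers directly"
    if any(status in (301, 302) for status in chain[:-1]):
        return f"warn: {len(chain) - 1} redirect hop(s) before the home page"
    return "warn: unexpected response chain"

print(check_home([200]))            # the ideal case
print(check_home([302, 301, 200]))  # the hard-to-justify 3-developer-hour fix
```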

Why change this thing and not that thing?

At the same time, our boss/client sees that the site above theirs has five hundred blog posts and thousands of links from sites selling correspondence MBAs. So they want five thousand blog posts and cheap links as quickly as possible.

Cue crazy music.

SEO lacks clarity

SEO is, in some ways, for the insane. It’s an absurd collection of technical tweaks, content thinking, link building and other little tactics that may or may not work. A novice gets exposed to one piece of crappy information after another, with an occasional bit of useful stuff mixed in. They create sites that repel search engines and piss off users. They get more awful advice. The cycle repeats. Every time it does, best practices get more muddled.

SEO lacks clarity. We can’t easily weigh the value of one change or tactic over another. But we can look at our changes and tactics in context. When we examine the potential of several changes or tactics before we flip the switch, we get a closer balance between algorithm-thinking and actual strategy.

Distance from perfect brings clarity to tactics and strategy

At some point you have to turn that knowledge into practice. You have to take action based on recommendations, your knowledge of SEO, and business considerations.

That’s hard when we can’t even agree on subdomains vs. subfolders.

I know subfolders work better. Sorry, couldn’t resist. Let the flaming comments commence.

To get clarity, take a deep breath and ask yourself:

“All other things being equal, will this change, tactic, or strategy move my site closer to perfect than my competitors?”

Breaking it down:

“Change, tactic, or strategy”

A change takes an existing component or policy and makes it something else. Replatforming is a massive change. Adding a new page is a smaller one. Adding ALT attributes to your images is another example. Changing the way your shopping cart works is yet another.

A tactic is a specific, executable practice. In SEO, that might be fixing broken links, optimizing ALT attributes, optimizing title tags or producing a specific piece of content.

A strategy is a broader decision that’ll cause change or drive tactics. A long-term content policy is the easiest example. Shifting away from asynchronous content and moving to server-generated content is another example.

“Perfect”

No one knows exactly what Google considers “perfect,” and “perfect” can’t really exist, but you can bet a perfect web page/site would have all of the following:

  1. Completely visible content that’s perfectly relevant to the audience and query
  2. A flawless user experience
  3. Instant load time
  4. Zero duplicate content
  5. Every page easily indexed and classified
  6. No mistakes, broken links, redirects or anything else generally yucky
  7. Zero reported problems or suggestions in each search engine’s webmaster tools, sorry, “Search Console”
  8. Complete authority through immaculate, organically-generated links

These 8 categories (and any of the other bazillion that probably exist) give you a way to break down “perfect” and help you focus on what’s really going to move you forward. These different areas may involve different facets of your organization.

Your IT team can work on load time and creating an error-free front- and back-end. Link building requires the time and effort of content and outreach teams.

Tactics for relevant, visible content and current best practices in UX are going to be more involved, requiring research and real study of your audience.

What you need and what resources you have are going to impact which tactics are most realistic for you.

But there’s a basic rule: If a website would make Googlebot swoon and present zero obstacles to users, it’s close to perfect.

“All other things being equal”

Assume every competing website is optimized exactly as well as yours.

Now ask: Will this [tactic, change or strategy] move you closer to perfect?

That’s the “all other things being equal” rule. And it’s an incredibly powerful rubric for evaluating potential changes before you act. Pretend you’re in a tie with your competitors. Will this one thing be the tiebreaker? Will it put you ahead? Or will it cause you to fall behind?

“Closer to perfect than my competitors”

Perfect is great, but unattainable. What you really need is to be just a little perfect-er.

Chasing perfect can be dangerous. Perfect is the enemy of the good (I love that quote. Hated Voltaire. But I love that quote). If you wait for the opportunity/resources to reach perfection, you’ll never do anything. And the only way to reduce distance from perfect is to execute.

Instead of aiming for pure perfection, aim for more perfect than your competitors. Beat them feature-by-feature, tactic-by-tactic. Implement strategy that supports long-term superiority.

Don’t slack off. But set priorities and measure your effort. If fixing server response codes will take one hour and fixing duplication will take ten, fix the response codes first. Both move you closer to perfect. Fixing response codes may not move the needle as much, but it’s a lot easier to do. Then move on to fixing duplicates.

Do the 60% that gets you a 90% improvement. Then move on to the next thing and do it again. When you’re done, get to work on that last 40%. Repeat as necessary.

Take advantage of quick wins. That gives you more time to focus on your bigger solutions.

Sites that are “fine” are pretty far from perfect

Google has lots of tweaks, tools and workarounds to help us mitigate sub-optimal sites:

  • Rel=canonical lets us guide Google past duplicate content rather than fix it
  • HTML snapshots let us reveal content that’s delivered using asynchronous content and JavaScript frameworks
  • We can use rel=next and prev to guide search bots through outrageously long pagination tunnels
  • And we can use rel=nofollow to hide spammy links and banners

Easy, right? All of these solutions may reduce distance from perfect (the search engines don’t guarantee it). But they don’t reduce it as much as fixing the problems.

Just fine does not equal fixed

The next time you set up rel=canonical, ask yourself:

“All other things being equal, will using rel=canonical to make up for duplication move my site closer to perfect than my competitors?”

Answer: Not if they’re using rel=canonical, too. You’re both using imperfect solutions that force search engines to crawl every page of your site, duplicates included. If you want to pass them on your way to perfect, you need to fix the duplicate content.
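Finding those duplicates is straightforward to sketch. This illustrative Python snippet (hypothetical URLs and content) hashes page bodies to surface exact duplicates worth consolidating rather than canonicalizing:

```python
import hashlib

# Sketch: detect exact-duplicate pages by hashing their main content,
# so duplicates can be consolidated instead of papered over with
# rel=canonical. (Hypothetical page data for illustration.)

pages = {
    "/product?color=red":  "Handmade widget, 12-pack, free shipping.",
    "/product?color=blue": "Handmade widget, 12-pack, free shipping.",
    "/about":              "Family-run shop since 1998.",
}

seen: dict[str, str] = {}
for url, body in pages.items():
    digest = hashlib.sha256(body.encode()).hexdigest()
    if digest in seen:
        print(f"duplicate: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```

A real crawl would also want near-duplicate detection (shingling, simhash), but even an exact-match pass like this turns "fix the duplicate content" into a concrete to-do list.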

When you use Angular.js to deliver regular content pages, ask yourself:

“All other things being equal, will using HTML snapshots instead of actual, visible content move my site closer to perfect than my competitors?”

Answer: No. Just no. Not in your wildest, code-addled dreams. If I’m Google, which site will I prefer? The one that renders for me the same way it renders for users? Or the one that has to deliver two separate versions of every page?

When you spill banner ads all over your site, ask yourself…

You get the idea. Nofollow is better than follow, but banner pollution is still pretty dang far from perfect.

Mitigating SEO issues with search engine-specific tools is “fine.” But it’s far, far from perfect. If search engines are forced to choose, they’ll favor the site that just works.

Not just SEO

By the way, distance from perfect absolutely applies to other channels.

I’m focusing on SEO, but think of other Internet marketing disciplines. I hear stuff like “How fast should my site be?” (Faster than it is right now.) Or “I’ve heard you shouldn’t have any content below the fold.” (Maybe in 2001.) Or “I need background video on my home page!” (Why? Do you have a reason?) Or, my favorite: “What’s a good bounce rate?” (Zero is pretty awesome.)

And Internet marketing venues are working to measure distance from perfect. Pay-per-click marketing has the quality score: a codified financial reward for reducing distance from perfect in as many elements of your advertising program as possible.

Social media venues are aggressively building their own forms of graphing, scoring and ranking systems designed to separate the good from the bad.

Really, all marketing includes some measure of distance from perfect. But no channel is more influenced by it than SEO. Instead of arguing one rule at a time, ask yourself and your boss or client: Will this move us closer to perfect?

Hell, you might even please a customer or two.

One last note for all of the SEOs in the crowd. Before you start pointing out edge cases, consider this: We spend our days combing Google for embarrassing rankings issues. Every now and then, we find one, point, and start yelling “SEE! SEE!!!! THE GOOGLES MADE MISTAKES!!!!” Google’s got lots of issues. Screwing up the rankings isn’t one of them.


Moz Local Officially Launches in the UK

Posted by David-Mihm

To all Moz Local fans in the UK, I’m excited to announce that your wait is over. As the sun rises “across the pond” this morning, Moz Local is officially live in the United Kingdom!

A bit of background

As many of you know, we released the US version of Moz Local in March 2014. After 12 months of terrific growth in the US, and a boatload of technical improvements and feature releases–especially for Enterprise customers–we released the Check Listing feature for a limited set of partner search engines and directories in the UK in April of this year.

Over 20,000 of you have checked your listings (or your clients’ listings) in the last 3-1/2 months. Those lookups have helped us refine and improve the background technology immensely (more on that below). We’ve been just as eager to release the fully-featured product as you’ve been to use it, and the technical pieces have finally fallen into place for us to do so.

How does it work?

The concept is the same as the US version of Moz Local: show you how accurately and completely your business is listed on the most important local search platforms and directories, and optimize and perfect as many of those business listings as we can on your behalf.

For customers specifically looking for you, accurate business listings are obviously important. For customers who might not know about you yet, they’re also among the most important factors for ranking in local searches on Google. Basically, the more times Google sees your name, address, phone, and website listed the same way on quality local websites, the more trust they have in your business, and the higher you’re likely to rank.

Moz Local is designed to help on both these fronts.

To use the product, you simply need to type a name and postcode at moz.com/local. We’ll then show you a list of the closest matching listings we found. We prioritize verified listing information that we find on Google or Facebook, and selecting one of those verified listings means we’ll be able to distribute it on your behalf.

Clicking on a result brings you to a full details report for that listing. We’ll show you how accurate and complete your listings are now, and where they could be after using our product.

Clicking the tabs beneath the Listing Score graphic will show you some of the incompletions and inconsistencies that publishing your listing with Moz Local will address.

For customers with hundreds or thousands of locations, bulk upload is also available using a modified version of your data from Google My Business–feel free to e-mail enterpriselocal@moz.com for more details.

Where do we distribute your data?

We’ve prioritized the most important commercial sites in the UK local search ecosystem, and made them the centerpieces of Moz Local. We’ll update your data directly on globally-important players Factual and Foursquare, and the UK-specific players CentralIndex, Thomson Local, and the Scoot network–which includes key directories like TouchLocal, The Independent, The Sun, The Mirror, The Daily Scotsman, and Wales Online.

We’ll be adding two more major destinations shortly, and for those of you who sign up before that time, your listings will be automatically distributed to the additional destinations when the integrations are complete.

How much does it cost?

The cost per listing is £84/year, which includes distribution to the sites mentioned above with unlimited updates throughout the year, monitoring of your progress over time, geographically-focused reporting, and the ability to find and close duplicate listings right from your Moz Local dashboard–all the great upgrades that my colleague Noam Chitayat blogged about here.

What’s next?

Well, as I mentioned just a couple paragraphs ago, we’ve got two additional destinations to which we’ll be sending your data in very short order. Once those integrations are complete, we’ll be just a few weeks away from releasing our biggest set of features since we launched. I look forward to sharing more about these features at BrightonSEO at the end of the summer!

For those of you around the world in Canada, Australia, and other countries, we know there’s plenty of demand for Moz Local overseas, and we’re working as quickly as we can to build additional relationships abroad. And to our friends in the UK, please let us know how we can continue to make the product even better!


Pinpoint vs. Floodlight Content and Keyword Research Strategies – Whiteboard Friday

Posted by randfish

When we’re doing keyword research and targeting, we have a choice to make: Are we targeting broader keywords with multiple potential searcher intents, or are we targeting very narrow keywords where it’s pretty clear what the searchers were looking for? Those different approaches, it turns out, apply to content creation and site architecture, as well. In today’s Whiteboard Friday, Rand illustrates that connection.

Pinpoint vs Floodlight Content and Keyword Research Strategy Whiteboard

For reference, here are stills of this week’s whiteboards. Click on them to open a high-resolution image in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about pinpoint versus floodlight tactics for content targeting, content strategy, and keyword research, keyword targeting strategy. This is also called the shotgun versus sniper approach, but I’m not a big gun fan. So I’m going to stick with my floodlight versus pinpoint, plus, you know, for the opening shot we don’t have a whole lot of weaponry here at Moz, but we do have lighting.

So let’s talk through this at first. You’re going through and doing some keyword research. You’re trying to figure out which terms and phrases to target. You might look down a list like this.

Well, maybe, I’m using an example here around antique science equipment. So you see these various terms and phrases. You’ve got your volume numbers. You probably have lots of other columns. Hopefully, you’ve watched the Whiteboard Friday on how to do keyword research like it’s 2015 and not 2010.

So you know you have all these other columns to choose from, but I’m simplifying here for the purpose of this experiment. So you might choose some of these different terms. Now, they’re going to have different kinds of tactics and a different strategic approach, depending on the breadth and depth of the topic that you’re targeting. That’s going to determine what types of content you want to create and where you place it in your information architecture. So I’ll show you what I mean.

The floodlight approach

For antique science equipment, this is a relatively broad phrase. I’m going to do my floodlight analysis on this, and floodlight analysis is basically saying like, “Okay, are there multiple potential searcher intents?” Yeah, absolutely. That’s a fairly broad phrase. People could be looking to transact around it. They might be looking for research information, historical information, different types of scientific equipment that they’re looking for.

<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b15fc96679b8.73854740.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">

Are there four or more approximately unique keyword terms and phrases to target? Well, absolutely, in fact, there’s probably more than that. So antique science equipment, antique scientific equipment, 18th century scientific equipment, all these different terms and phrases that you might explore there.

Is this a broad content topic with many potential subtopics? Again, yes is the answer to this. Are we talking about generally larger search volume? Again, yes, this is going to have a much larger search volume than some of the narrower terms and phrases. That’s not always the case, but it is here.

The pinpoint approach

For pinpoint analysis, we kind of go the opposite direction. So we might look at a term like antique test tubes, which is a very specific kind of search, and that has a clear single searcher intent or maybe two. Someone might be looking for actually purchasing one of those, or they might be looking to research them and see what kinds there are. Not a ton of additional intents behind that. One to three unique keywords, yeah, probably. It’s pretty specific. Antique test tubes, maybe 19th century test tubes, maybe old science test tubes, but you’re talking about a limited set of keywords that you’re targeting. It’s a narrow content topic, typically smaller search volume.

<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b160069eb6b1.12473448.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
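The criteria above can be boiled down to a simple decision rule. Here’s a minimal sketch in Python, purely illustrative: the field names, thresholds, and example numbers are my own assumptions, not data from any keyword tool.

```python
# Hypothetical sketch of the floodlight/pinpoint tests from the transcript.
# Thresholds (2 intents, 4 variants, 1,000 searches) are illustrative guesses.
from dataclasses import dataclass

@dataclass
class KeywordTopic:
    phrase: str
    searcher_intents: int   # distinct intents you can identify
    keyword_variants: int   # approximately unique terms/phrases to target
    monthly_volume: int     # combined search volume estimate

def classify(topic: KeywordTopic) -> str:
    """Label a topic 'floodlight' (broad) or 'pinpoint' (narrow)."""
    broad_signals = sum([
        topic.searcher_intents >= 2,   # multiple potential intents?
        topic.keyword_variants >= 4,   # four or more unique phrases?
        topic.monthly_volume >= 1000,  # generally larger search volume?
    ])
    # A majority of broad signals means we treat the topic as floodlight.
    return "floodlight" if broad_signals >= 2 else "pinpoint"

print(classify(KeywordTopic("antique science equipment", 3, 6, 2400)))  # floodlight
print(classify(KeywordTopic("antique test tubes", 1, 3, 150)))          # pinpoint
```

In practice you’d feed this from whatever keyword research export you have; the point is just that each topic gets sorted by the same three questions asked in the video.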

Now, these are going to feed into your IA, your information architecture, and your site structure in this way. So floodlight content generally sits higher up. It’s the category or the subcategory, those broad topic terms and phrases. Those are going to turn into those broad topic category pages. Then you might have multiple, narrower subtopics. So we could go into lab equipment versus astronomical equipment versus chemistry equipment, and then we’d get into those individual pinpoints from the pinpoint analysis.
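To make the hierarchy concrete, here’s a small sketch of that architecture as a nested structure, with floodlight categories at the top and pinpoint pages deeper down. The URLs and subcategory names are invented for illustration.

```python
# Assumed site structure for the example site: floodlight topics become
# category pages near the top, pinpoint topics become deeper pages.
site_structure = {
    "/antique-scientific-equipment/": {              # floodlight: broad category
        "/antique-scientific-equipment/lab/": {      # narrower subtopic
            "/antique-scientific-equipment/lab/test-tubes/": {},  # pinpoint page
        },
        "/antique-scientific-equipment/astronomy/": {},
        "/antique-scientific-equipment/chemistry/": {},
    },
}

def depth(tree: dict, url: str, level: int = 1):
    """Return how deep a URL sits in the architecture (1 = top category)."""
    for path, children in tree.items():
        if path == url:
            return level
        found = depth(children, url, level + 1)
        if found:
            return found
    return None

print(depth(site_structure, "/antique-scientific-equipment/"))                # 1
print(depth(site_structure, "/antique-scientific-equipment/lab/test-tubes/")) # 3
```

The takeaway is that the breadth of the topic, not its raw volume, decides how high in the tree the page lives.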

How do I decide which approach is best for my keywords?

Why are we doing this? Well, generally speaking, if you can take your terms and phrases and categorize them like this and then target them differently, you’re going to provide a better, more logical user experience. Someone who searches for antique scientific equipment, they’re going to really expect to see that category and then to be able to drill down into things. So you’re providing them the experience they predict, the one that they want, the one that they expect.

It’s better for topic modeling analysis and for all of the algorithms around things like Hummingbird, where Google looks at: Are you using the types of terms and phrases, do you have the type of architecture that we expect to find for this keyword?

It’s better for search intent targeting, because the searcher intent is going to be fulfilled if you provide the multiple paths versus the narrow focus. It’s easier keyword targeting for you. You’re going to be able to know, “Hey, I need to target a lot of different terms and phrases and variations in floodlight and one very specific one in pinpoint.”

There’s usually higher searcher satisfaction, which means you get lower bounce rate. You get more engagement. You usually get a higher conversion rate. So it’s good for all those things.

For example…

I’ll actually create a page for each of these, antique scientific equipment and antique test tubes, to illustrate this. So I’ve got two different types of pages here. One is my antique scientific equipment page.

<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b161fa871e32.54731215.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">

This is that floodlight, shotgun approach, and what we’re doing here is going to be very different from a pinpoint approach. It’s looking at like, okay, you’ve landed on antique scientific equipment. Now, where do you want to go? What do you want to specifically explore? So we’re going to have a little bit of content specifically about this topic, and how robust that is depends on the type of topic and the type of site you are.

If this is an e-commerce site or a site that’s showing information about various antiques, well maybe we don’t need very much content here. You can see the filtration that we’ve got is going to be pretty broad. So I can go into different centuries. I can go into chemistry, astronomy, physics. Maybe I have a safe for kids type of stuff if you want to buy your kids antique lab equipment, which you might be. Who knows? Maybe you’re awesome and your kids are too. Then different types of stuff at a very broad level. So I can go to microscopes or test tubes, lab searches.

This is great because it’s got broad intent foci, serving many different kinds of searchers with the same page because we don’t know exactly what they want. It’s got multiple keyword targets so that we can go after broad phrases like antique or old or historical or 13th, 14th, whatever century, science and scientific equipment, materials, labs, etc. This is a broad page that could reach any and all of those. Then there’s lots of navigational and refinement options once you get there.

Total opposite of pinpoint content.

<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b1622740f0b5.73477500.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">

Pinpoint content, like this antique test tubes page, is still going to have some filtration options, but one of the important things to note is how these are links that take you deeper. Depending on how deep the search volume goes in terms of the types of queries that people are performing, you might want to make a specific page for 17th century antique test tubes. You might not, and if you don’t want to do that, you can have these be filters that are simply clickable and change the content of the page here, narrowing the options rather than creating completely separate pages.

So if there’s no search volume for these different things and you don’t think you need to separately target them, go ahead and just make them filters on the data that already appears on this page or the results that are already in here as opposed to links that are going to take you deeper into specific content and create a new page, a new experience.
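That “filter or new page” decision can be sketched as a tiny function. This is a hedged illustration only: the URL pattern, the markup, and the 50-searches-a-month threshold are all assumptions I’ve made up for the example.

```python
# Illustrative sketch: if a refinement has meaningful search demand of its
# own, emit a crawlable link to a dedicated page; otherwise render it as an
# on-page filter. The threshold and URL pattern are invented assumptions.
def refinement_ui(refinement: str, monthly_searches: int,
                  threshold: int = 50) -> str:
    """Decide whether a refinement deserves its own crawlable page."""
    if monthly_searches >= threshold:
        # Enough demand: a separate page can target the query directly.
        return f'<a href="/antique-test-tubes/{refinement}/">{refinement}</a>'
    # No real demand: filter the results already on this page instead.
    return f'<button data-filter="{refinement}">{refinement}</button>'

print(refinement_ui("17th-century", 90))  # crawlable link to a dedicated page
print(refinement_ui("blue-glass", 5))     # client-side filter button
```

However you implement it, the principle is the one from the video: only mint a new page, and a new experience, when there’s search demand to justify it.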

You can also see I’ve got my individual content here. I probably would go ahead and add some content specifically to this page that is just unique here and that describes antique test tubes and the things that your searchers need. They might want to know things about price. They might want to know things about make and model. They might want to know things about what they were used for. Great. You can have that information broadly, and then individual pieces of content that someone might dig into.

This is narrower intent foci obviously, serving maybe one or two searcher intents. This is really talking about targeting maybe one to two separate keywords. So antique test tubes, maybe lab tubes or test tube sets, but not much beyond that.

Then we’re going to have fewer navigational paths, fewer distractions. We want to keep the searcher. Because we know their intent, we want to guide them along the path that we know they probably want to take and that we want them to take.

So when you’re considering your content, choose wisely between shotgun/floodlight approach or sniper/pinpoint approach. Your searchers will be better served. You’ll probably rank better. You’ll be more likely to earn links and amplification. You’re going to be more successful.

Looking forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

