Distance from Perfect

Posted by wrttnwrd

In spite of all the advice, the strategic discussions and the conference talks, we Internet marketers are still algorithmic thinkers. That’s obvious when you think of SEO.

Even when we talk about content, we’re algorithmic thinkers. Ask yourself: How many times has a client asked you, “How much content do we need?” How often do you still hear “How unique does this page need to be?”

That’s 100% algorithmic thinking: Produce a certain amount of content, move up a certain number of spaces.

But you and I know it’s complete bullshit.

I’m not suggesting you ignore the algorithm. You should definitely chase it. Understanding a little bit about what goes on in Google’s pointy little head helps. But it’s not enough.

A tale of SEO woe that makes you go “whoa”

I have this friend.

He ranked #10 for “flibbergibbet.” He wanted to rank #1.

He compared his site to the #1 site and realized the #1 site had five hundred blog posts.

“That site has five hundred blog posts,” he said, “I must have more.”

So he hired a few writers and cranked out five thousand blog posts that melted Microsoft Word’s grammar check. He didn’t move up in the rankings. I’m shocked.

“That guy’s spamming,” he decided, “I’ll just report him to Google and hope for the best.”

What happened? Why didn’t adding five thousand blog posts work?

It’s pretty obvious: My, uh, friend added nothing but crap content to a site that was already outranked. Bulk is no longer a ranking tactic. Google’s very aware of that tactic. Lots of smart engineers have put time into updates like Panda to compensate.

He started with a modest blog and a #10 ranking, and ended up with five thousand more posts and the same ranking: more posts, no rankings.

Alright, yeah, I was Mr. Flood The Site With Content, way back in 2003. Don’t judge me, whippersnappers.

Reality’s never that obvious. You’re scratching and clawing to move up two spots, you’ve got an overtasked IT team pushing back on changes, and you’ve got a boss who needs to know the implications of every recommendation.

Why fix duplication if rel=canonical can address it? Fixing duplication will take more time and cost more money. It’s easier to paste in one line of code. You and I know it’s better to fix the duplication. But it’s a hard sell.

Why deal with 302 versus 404 response codes and home page redirection? The basic user experience remains the same. Again, we just know that a server should return one home page without any redirects and that it should send a ‘not found’ 404 response if a page is missing. If it’s going to take 3 developer hours to reconfigure the server, though, how do we justify it? There’s no flashing sign reading “Your site has a problem!”
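As a sketch, those rules (plus the common 301-for-a-permanently-moved-page case, which is my addition) can be checked against crawl data. The URLs and the data shape below are invented for illustration, not output from any real crawler:

```python
# A toy audit of the response-code rules above. A real check would pull
# (url, status) pairs from your crawler or server logs.

def expected_status(page_exists, moved_permanently):
    """Map the cases discussed above to the status a server should send."""
    if page_exists:
        return 200  # serve the page directly, no redirect hop
    if moved_permanently:
        return 301  # a permanent redirect, not a temporary 302
    return 404      # honestly report 'not found' instead of a soft 404

# Hypothetical crawl results: (url, actual_status, page_exists, moved_permanently)
crawl = [
    ("https://example.com/", 200, True, False),           # home page: fine
    ("https://example.com/old-page", 302, False, True),   # permanent move sent as 302
    ("https://example.com/gone", 200, False, False),      # soft 404
]
problems = [url for url, actual, exists, moved in crawl
            if actual != expected_status(exists, moved)]  # flags the last two
```
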

Why change this thing and not that thing?

At the same time, our boss/client sees that the site above theirs has five hundred blog posts and thousands of links from sites selling correspondence MBAs. So they want five thousand blog posts and cheap links as quickly as possible.

Cue crazy music.

SEO lacks clarity

SEO is, in some ways, for the insane. It’s an absurd collection of technical tweaks, content thinking, link building and other little tactics that may or may not work. A novice gets exposed to one piece of crappy information after another, with an occasional bit of useful stuff mixed in. They create sites that repel search engines and piss off users. They get more awful advice. The cycle repeats. Every time it does, best practices get more muddled.

SEO lacks clarity. We can’t easily weigh the value of one change or tactic over another. But we can look at our changes and tactics in context. When we examine the potential of several changes or tactics before we flip the switch, we get a closer balance between algorithm-thinking and actual strategy.

Distance from perfect brings clarity to tactics and strategy

At some point you have to turn that knowledge into practice. You have to take action based on recommendations, your knowledge of SEO, and business considerations.

That’s hard when we can’t even agree on subdomains vs. subfolders.

I know subfolders work better. Sorry, couldn’t resist. Let the flaming comments commence.

To get clarity, take a deep breath and ask yourself:

“All other things being equal, will this change, tactic, or strategy move my site closer to perfect than my competitors?”

Breaking it down:

“Change, tactic, or strategy”

A change takes an existing component or policy and makes it something else. Replatforming is a massive change. Adding a new page is a smaller one. Adding ALT attributes to your images is another example. Changing the way your shopping cart works is yet another.

A tactic is a specific, executable practice. In SEO, that might be fixing broken links, optimizing ALT attributes, optimizing title tags or producing a specific piece of content.

A strategy is a broader decision that’ll cause change or drive tactics. A long-term content policy is the easiest example. Shifting away from asynchronous content and moving to server-generated content is another example.

“Perfect”

No one knows exactly what Google considers “perfect,” and “perfect” can’t really exist, but you can bet a perfect web page/site would have all of the following:

  1. Completely visible content that’s perfectly relevant to the audience and query
  2. A flawless user experience
  3. Instant load time
  4. Zero duplicate content
  5. Every page easily indexed and classified
  6. No mistakes, broken links, redirects or anything else generally yucky
  7. Zero reported problems or suggestions in each search engine’s webmaster tools, sorry, “Search Consoles”
  8. Complete authority through immaculate, organically-generated links

These 8 categories (and any of the other bazillion that probably exist) give you a way to break down “perfect” and help you focus on what’s really going to move you forward. These different areas may involve different facets of your organization.

Your IT team can work on load time and creating an error-free front- and back-end. Link building requires the time and effort of content and outreach teams.

Tactics for relevant, visible content and current best practices in UX are going to be more involved, requiring research and real study of your audience.

What you need and what resources you have are going to impact which tactics are most realistic for you.

But there’s a basic rule: If a website would make Googlebot swoon and present zero obstacles to users, it’s close to perfect.

“All other things being equal”

Assume every competing website is optimized exactly as well as yours.

Now ask: Will this [tactic, change or strategy] move you closer to perfect?

That’s the “all other things being equal” rule. And it’s an incredibly powerful rubric for evaluating potential changes before you act. Pretend you’re in a tie with your competitors. Will this one thing be the tiebreaker? Will it put you ahead? Or will it cause you to fall behind?

“Closer to perfect than my competitors”

Perfect is great, but unattainable. What you really need is to be just a little perfect-er.

Chasing perfect can be dangerous. Perfect is the enemy of the good (I love that quote. Hated Voltaire. But I love that quote). If you wait for the opportunity/resources to reach perfection, you’ll never do anything. And the only way to reduce distance from perfect is to execute.

Instead of aiming for pure perfection, aim for more perfect than your competitors. Beat them feature-by-feature, tactic-by-tactic. Implement strategy that supports long-term superiority.

Don’t slack off. But set priorities and measure your effort. If fixing server response codes will take one hour and fixing duplication will take ten, fix the response codes first. Both move you closer to perfect. Fixing response codes may not move the needle as much, but it’s a lot easier to do. Then move on to fixing duplicates.

Do the 60% that gets you a 90% improvement. Then move on to the next thing and do it again. When you’re done, get to work on that last 40%. Repeat as necessary.

Take advantage of quick wins. That gives you more time to focus on your bigger solutions.
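That triage can be as simple as sorting by impact per hour. The hours and impact scores below are invented to mirror the response-codes-versus-duplication example above:

```python
# Invented numbers: both fixes reduce distance from perfect, but one is a
# far cheaper win per developer hour.
tasks = [
    {"name": "fix duplicate content", "hours": 10, "impact": 8},
    {"name": "fix server response codes", "hours": 1, "impact": 3},
]

def quick_wins_first(tasks):
    # Sort by impact per hour, highest first: grab the quick win, then
    # schedule the bigger fix.
    return sorted(tasks, key=lambda t: t["impact"] / t["hours"], reverse=True)
```
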

Sites that are “fine” are pretty far from perfect

Google has lots of tweaks, tools and workarounds to help us mitigate sub-optimal sites:

  • Rel=canonical lets us guide Google past duplicate content rather than fix it
  • HTML snapshots let us reveal content that’s delivered asynchronously through JavaScript frameworks
  • We can use rel=next and prev to guide search bots through outrageously long pagination tunnels
  • And we can use rel=nofollow to hide spammy links and banners

Easy, right? All of these solutions may reduce distance from perfect (the search engines don’t guarantee it). But they don’t reduce it as much as fixing the problems.
Just fine does not equal fixed
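For reference, the workarounds listed above usually take these markup forms (the URLs are placeholders, and each tag is a hint to search engines, not a guarantee):

```html
<!-- rel=canonical: point search engines past a duplicate to the preferred URL -->
<link rel="canonical" href="https://example.com/preferred-page/">

<!-- rel=prev / rel=next: mark a page's place in a long pagination series -->
<link rel="prev" href="https://example.com/widgets?page=2">
<link rel="next" href="https://example.com/widgets?page=4">

<!-- rel=nofollow: keep a paid or untrusted link from passing authority -->
<a href="https://example.com/sponsor" rel="nofollow">Sponsor</a>
```
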

The next time you set up rel=canonical, ask yourself:

“All other things being equal, will using rel=canonical to make up for duplication move my site closer to perfect than my competitors?”

Answer: Not if they’re using rel=canonical, too. You’re both using imperfect solutions that force search engines to crawl every page of your site, duplicates included. If you want to pass them on your way to perfect, you need to fix the duplicate content.

When you use Angular.js to deliver regular content pages, ask yourself:

“All other things being equal, will using HTML snapshots instead of actual, visible content move my site closer to perfect than my competitors?”

Answer: No. Just no. Not in your wildest, code-addled dreams. If I’m Google, which site will I prefer? The one that renders for me the same way it renders for users? Or the one that has to deliver two separate versions of every page?

When you spill banner ads all over your site, ask yourself…

You get the idea. Nofollow is better than follow, but banner pollution is still pretty dang far from perfect.

Mitigating SEO issues with search engine-specific tools is “fine.” But it’s far, far from perfect. If search engines are forced to choose, they’ll favor the site that just works.

Not just SEO

By the way, distance from perfect absolutely applies to other channels.

I’m focusing on SEO, but think of other Internet marketing disciplines. I hear stuff like “How fast should my site be?” (Faster than it is right now.) Or “I’ve heard you shouldn’t have any content below the fold.” (Maybe in 2001.) Or “I need background video on my home page!” (Why? Do you have a reason?) Or, my favorite: “What’s a good bounce rate?” (Zero is pretty awesome.)

And Internet marketing venues are working to measure distance from perfect. Pay-per-click marketing has the quality score: a codified financial reward for reducing distance from perfect across as many elements of your advertising program as possible.

Social media venues are aggressively building their own forms of graphing, scoring and ranking systems designed to separate the good from the bad.

Really, all marketing includes some measure of distance from perfect. But no channel is more influenced by it than SEO. Instead of arguing one rule at a time, ask yourself and your boss or client: Will this move us closer to perfect?

Hell, you might even please a customer or two.

One last note for all of the SEOs in the crowd. Before you start pointing out edge cases, consider this: We spend our days combing Google for embarrassing rankings issues. Every now and then, we find one, point, and start yelling “SEE! SEE!!!! THE GOOGLES MADE MISTAKES!!!!” Google’s got lots of issues. Screwing up the rankings isn’t one of them.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

Why We Can’t Do Keyword Research Like It’s 2010 – Whiteboard Friday

Posted by randfish

Keyword Research is a very different field than it was just five years ago, and if we don’t keep up with the times we might end up doing more harm than good. From the research itself to the selection and targeting process, in today’s Whiteboard Friday Rand explains what has changed and what we all need to do to conduct effective keyword research today.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

What do we need to change to keep up with the changing world of keyword research?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about keyword research, why it’s changed from the last five, six years and what we need to do differently now that things have changed. So I want to talk about changing up not just the research but also the selection and targeting process.

There are three big areas that I’ll cover here. There’s lots more in-depth stuff, but I think we should start with these three.

1) The Adwords keyword tool hides data!

This is where almost all of us in the SEO world start and oftentimes end with our keyword research. We go to AdWords Keyword Tool, what used to be the external keyword tool and now is inside AdWords Ad Planner. We go inside that tool, and we look at the volume that’s reported and we sort of record that as, well, it’s not good, but it’s the best we’re going to do.

However, I think there are a few things to consider here. First off, that tool is hiding data. What I mean by that is not that they’re lying; they’re just not telling the whole truth, and nothing but the truth, because those rounded-off numbers that you always see, you know those are inaccurate. Anytime you’ve bought keywords, you’ve seen that the impression count never matches the count that you see in the AdWords tool. It’s not usually massively off, but it’s often off by a good degree, and the only thing it’s great for is telling relative volume from one term to another.

But because AdWords hides data essentially by saying like, “Hey, you’re going to type in . . .” Let’s say I’m going to type in “college tuition,” and Google knows that a lot of people search for how to reduce college tuition, but that doesn’t come up in the suggestions because it’s not a commercial term, or they don’t think that an advertiser who bids on that is going to do particularly well and so they don’t show it in there. I’m giving an example. They might indeed show that one.

But because that data is hidden, we need to go deeper. We need to go beyond and look at things like Google Suggest and related searches, which are down at the bottom. We need to start conducting customer interviews and staff interviews, which hopefully has always been part of your brainstorming process but really needs to be now. Then you can apply that to AdWords. You can apply that to suggest and related.

The beautiful thing is once you get these terms from places like visiting forums or communities, discussion boards and seeing what terms and phrases people are using, you can collect all this stuff up, plug it back into AdWords, and now they will tell you how much volume they’ve got. So you take that “how to reduce college tuition” term, you plug it into AdWords, they will show you a number, a non-zero number. They were just hiding it in the suggestions because they thought, “Hey, you probably don’t want to bid on that. That won’t bring you a good ROI.” So you’ve got to be careful with that, especially when it comes to SEO kinds of keyword research.

2) Building separate pages for each term or phrase doesn’t make sense

It used to be the case that we built separate pages for every single term and phrase that was in there, because we wanted to have the maximum keyword targeting that we could. So it didn’t matter to us that college scholarship and university scholarships were essentially people looking for exactly the same thing, just using different terminology. We would make one page for one and one page for the other. That’s not the case anymore.

Today, we need to group by the same searcher intent. If two searchers are searching for two different terms or phrases but both of them have exactly the same intent, they want the same information, they’re looking for the same answers, their query is going to be resolved by the same content, we want one page to serve those, and that’s changed up a little bit of how we’ve done keyword research and how we do selection and targeting as well.

3) Build your keyword consideration and prioritization spreadsheet with the right metrics

Everybody’s got an Excel version of this, because I think there’s just no awesome tool out there that everyone loves yet that kind of solves this problem for us, and Excel is very, very flexible. So we go into Excel, we put in our keyword, the volume, and then a lot of times we almost stop there. We did keyword volume and then like value to the business and then we prioritize.

What are all these new columns you’re showing me, Rand? Well, here’s what I’m seeing sophisticated, modern SEOs at the more advanced agencies and among the more advanced in-house practitioners add to the keyword process.

Difficulty

A lot of folks have done this, but difficulty helps us say, “Hey, this has a lot of volume, but it’s going to be tremendously hard to rank.”

The difficulty score that Moz uses and attempts to calculate is a weighted average of the top 10 domain authorities. It also uses page authority, so it’s kind of a weighted stack out of the two. If you’re seeing very, very challenging pages, very challenging domains to get in there, it’s going to be super hard to rank against them. The difficulty is high. For all of these ones it’s going to be high because college and university terms are just incredibly lucrative.
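As a rough sketch of that idea (not Moz’s actual formula; the 60/40 weighting between page and domain authority is an assumption for illustration):

```python
def difficulty(serp_top10):
    """serp_top10: list of (page_authority, domain_authority) for positions 1-10."""
    # Blend PA and DA for each ranking page, then average across the SERP.
    # The 0.6/0.4 split is invented; Moz's real weighting isn't public here.
    blended = [0.6 * pa + 0.4 * da for pa, da in serp_top10]
    return sum(blended) / len(blended)
```

A SERP full of strong pages on strong domains scores near 100; weaker results pull the score, and the barrier to entry, down.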

That difficulty can help bias you against chasing after terms and phrases for which you are very unlikely to rank, at least early on. If you feel like, “Hey, I already have a powerful domain. I can rank for everything I want. I am the thousand-pound gorilla in my space,” great. Go after the difficulty of your choice, but this helps prioritize.

Opportunity

This is actually very rarely used, but I think sophisticated marketers are using it extremely intelligently. Essentially what they’re saying is, “Hey, if you look at a set of search results, sometimes there are two or three ads at the top instead of just the ones on the sidebar, and that’s biasing some of the click-through rate curve.” Sometimes there’s an instant answer or a Knowledge Graph or a news box or images or video, or all these kinds of things that search results can be marked up with, that are not just the classic 10 web results. Unfortunately, if you’re building a spreadsheet like this and treating every single search result like it’s just 10 blue links, well you’re going to lose out. You’re missing the potential opportunity and the opportunity cost that comes with ads at the top or all of these kinds of features that will bias the click-through rate curve.

So what I’ve seen some really smart marketers do is essentially build some kind of a framework to say, “Hey, you know what? When we see that there’s a top ad and an instant answer, we’re saying the opportunity if I was ranking number 1 is not 10 out of 10. I don’t expect to get whatever the average traffic for the number 1 position is. I expect to get something considerably less than that. Maybe something around 60% of that, because of this instant answer and these top ads.” So I’m going to mark this opportunity as a 6 out of 10.

There are 2 top ads here, so I’m giving this a 7 out of 10. This has two top ads and then it has a news block below the first position. So again, I’m going to reduce that click-through rate. I think that’s going down to a 6 out of 10.

You can get more and less scientific and specific with this. Click-through rate curves are imperfect by nature because we truly can’t measure exactly how those things change. However, I think smart marketers can make some good assumptions from general click-through rate data, which there are several resources out there on that to build a model like this and then include it in their keyword research.
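One way to sketch that model is with per-feature click-through “haircuts.” The penalty values below are invented purely so the scores match the examples above (top ad plus instant answer = 6, two top ads = 7, two top ads plus a news block = 6):

```python
# Invented per-feature penalties; tune these against your own CTR data.
PENALTY = {"top_ad": 1.5, "instant_answer": 2.5, "news_block": 1.0}

def opportunity(serp_features):
    """Start from a perfect 10 and subtract a penalty per CTR-stealing feature."""
    score = 10.0
    for feature in serp_features:
        score -= PENALTY.get(feature, 0.0)
    return max(score, 1.0)  # floor the score so it stays on a 1-10 scale
```
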

This does mean that you have to run a query for every keyword you’re thinking about, but you should be doing that anyway. You want to get a good look at who’s ranking in those search results and what kind of content they’re building. If you’re running a keyword difficulty tool, you are already getting something like that.

Business value

This is a classic one. Business value is essentially saying, “What’s it worth to us if visitors come through with this search term?” You can get that from bidding through AdWords. That’s the most sort of scientific, mathematically sound way to get it. Then, of course, you can also get it through your own intuition. It’s better to start with your intuition than nothing if you don’t already have AdWords data or you haven’t started bidding, and then you can refine your sort of estimate over time as you see search visitors visit the pages that are ranking, as you potentially buy those ads, and those kinds of things.

You can get more sophisticated around this. I think a 10 point scale is just fine. You could also use a one, two, or three there, that’s also fine.

Requirements or Options

Then I don’t exactly know what to call this column. I can’t remember the person who showed me theirs that had it in there. I think they called it Optional Data or Additional SERPs Data, but I’m going to call it Requirements or Options. Requirements because this is essentially saying, “Hey, if I want to rank in these search results, am I seeing that the top two or three are all video? Oh, they’re all video. They’re all coming from YouTube. If I want to be in there, I’ve got to be video.”

Or something like, “Hey, I’m seeing that most of the top results have been produced or updated in the last six months. Google appears to be biasing to very fresh information here.” So, for example, if I were searching for “university scholarships Cambridge 2015,” well, guess what? Google probably wants to bias to show results that are either from the official page on Cambridge’s website or articles from this year about getting into that university and the scholarships that are available or offered. I saw that in two of these examples: both the college and university scholarship terms had a significant number of SERPs where a freshness bump appeared to be required. You can see that a lot because the date will be shown ahead of the description, and the date will be very fresh, sometime in the last six months or a year.

Prioritization

Then finally I can build my prioritization. So based on all the data I had here, I essentially said, “Hey, you know what? These are not 1 and 2. This is actually 1A and 1B, because these are the same concepts. I’m going to build a single page to target both of those keyword phrases.” I think that makes good sense. Someone who is looking for college scholarships, university scholarships, same intent.

I am giving it a slight prioritization, 1A versus 1B, and the reason I do this is because I always have one keyword phrase that I’m leaning on a little more heavily. Because Google isn’t perfect around this, the search results will be a little different. I want to bias to one versus the other. In this case, since I’m targeting university more heavily than college, my title tag might say something like college and university scholarships so that university and scholarships are nicely together, near the front of the title, that kind of thing. Then 1B, 2, 3.
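One hypothetical way to fold those columns into a single sortable score; there’s no canonical formula, and the weights and numbers below are invented for illustration:

```python
def priority_score(volume, difficulty, opportunity, business_value):
    # Reward volume, opportunity (1-10) and value (1-10); discount by
    # difficulty (0-100). Purely illustrative weighting.
    return volume * (opportunity / 10) * (business_value / 10) * (1 - difficulty / 100)

keywords = [  # (keyword, monthly volume, difficulty, opportunity, business value)
    ("university scholarships", 12000, 85, 6, 9),
    ("college scholarships", 11000, 85, 7, 9),
    ("scholarship essay tips", 3000, 40, 8, 5),
]
ranked = sorted(keywords, key=lambda k: priority_score(*k[1:]), reverse=True)
```

With these made-up inputs, the two scholarship terms stay on top (they’d share one page, per the intent-grouping point above), with the easier, lower-value term behind them.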

This is kind of the way that modern SEOs are building a more sophisticated process with better data, more inclusive data that helps them select the right kinds of keywords and prioritize to the right ones. I’m sure you guys have built some awesome stuff. The Moz community is filled with very advanced marketers, probably plenty of you who’ve done even more than this.

I look forward to hearing from you in the comments. I would love to chat more about this topic, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



How to Combat 5 of the SEO World’s Most Infuriating Problems – Whiteboard Friday

Posted by randfish

These days, most of us have learned that spammy techniques aren’t the way to go, and we have a solid sense for the things we should be doing to rank higher, and ahead of our often spammier competitors. Sometimes, maddeningly, it just doesn’t work. In today’s Whiteboard Friday, Rand talks about five things that can infuriate SEOs with the best of intentions, why those problems exist, and what we can do about them.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

What SEO problems make you angry?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about some of the most infuriating things in the SEO world, specifically five problems that I think plague a lot of folks and some of the ways that we can combat and address those.

I’m going to start with one of the things that really infuriates a lot of new folks to the field, especially folks who are building new and emerging sites and are doing SEO on them. You have all of these best practices lists. You might look at a web developer’s cheat sheet or sort of a guide to on-page and on-site SEO. You go, “Hey, I’m doing it. I’ve got my clean URLs, my good, unique content, my solid keyword targeting, schema markup, useful internal links, my XML sitemap, and my fast load speed. I’m mobile friendly, and I don’t have manipulative links.”

Great. “Where are my results? What benefit am I getting from doing all these things, because I don’t see one?” I took a site that was not particularly SEO friendly, maybe it’s a new site, one I just launched or an emerging site, one that’s sort of slowly growing but not yet a power player. I do all this right stuff, and I don’t get SEO results.

This makes a lot of people stop investing in SEO, stop believing in SEO, and stop wanting to do it. I can understand where you’re coming from. The challenge is not that you’ve done something wrong. It’s that this stuff, all of these things that you do right, especially things that you do right on your own site or from a best practices perspective, they don’t increase rankings. They don’t. That’s not what they’re designed to do.

1) Following best practices often does nothing for new and emerging sites

This stuff, all of these best practices are designed to protect you from potential problems. They’re designed to make sure that your site is properly optimized so that you can perform to the highest degree that you are able. But this is not actually rank boosting stuff unfortunately. That is very frustrating for many folks. So following a best practices list, the idea is not, “Hey, I’m going to grow my rankings by doing this.”

On the flip side, many folks do these things on larger, more well-established sites, sites that have a lot of ranking signals already in place. They’re bigger brands, they have lots of links to them, and they have lots of users and usage engagement signals. You fix this stuff. You fix stuff that’s already broken, and boom, rankings pop up. Things are going well, and more of your pages are indexed. You’re getting more search traffic, and it feels great. This is a challenge, on our part, of understanding what this stuff does, not a challenge on the search engine’s part of not ranking us properly for having done all of these right things.

2) My competition seems to be ranking on the back of spammy or manipulative links

What’s going on? I thought Google had introduced all these algorithms to kind of shut this stuff down. This seems very frustrating. How are they pulling this off? I look at their link profile, and I see a bunch of the directories, Web 2.0 sites — I love that the spam world decided that that’s Web 2.0 sites — article sites, private blog networks, and dofollow blogs.

You look at this stuff and you go, “What is this junk? It’s terrible. Why isn’t Google penalizing them for this?” The answer, the right way to think about this and to come at this is: Are these really the reason that they rank? I think we need to ask ourselves that question.

One thing that we don’t know, that we can never know, is: Have these links been disavowed by our competitor here?

I’ve got my HulksIncredibleStore.com and their evil competitor Hulk-tastrophe.com. Hulk-tastrophe has got all of these terrible links, but maybe they disavowed those links and you would have no idea. Maybe they didn’t build those links. Perhaps those links came in from some other place. They are not responsible. Google is not treating them as responsible for it. They’re not actually what’s helping them.

If they are helping, and it’s possible they are, there are still instances where we’ve seen spam propping up sites. No doubt about it.

I think the next logical question is: Are you willing to lose your site or brand? What we almost never see anymore is sites like this, ranking on the back of these things with generally less legitimate and good links, holding those rankings for two or three or four years. You can see it for a few months, maybe even a year, but this stuff is getting hit hard and getting hit frequently. So unless you’re willing to lose your site, pursuing their links is probably not a strategy.

Then ask: What other signals could be helping them rank? Potentially links you haven’t considered, but also non-link signals. I think a lot of us get blinded in the SEO world by link signals, and we forget to look at things like: Do they have a phenomenal user experience? Are they growing their brand? Are they doing offline kinds of things that are influencing online? Are they gaining engagement from other channels that’s then influencing their SEO? Do they have things coming in that I can’t see? If you don’t ask those questions, you can’t really learn from your competitors, and you just feel the frustration.

3) I have no visibility or understanding of why my rankings go up vs down

On my HulksIncredibleStore.com, I’ve got my infinite stretch shorts, which I don’t know why he never wears — he should really buy those — my soothing herbal tea, and my anger management books. I look at my rankings and they jump all over the place all the time. Actually, this is pretty normal. I think we’ve done some analyses here, and the average page-one search result shifts 1.5 or 2 positions daily. That’s sort of the MozCast dataset, if I’m recalling correctly. That means that, over the course of a week, it’s not uncommon or unnatural for you to be bouncing around four, five, or six positions up or down, those kinds of things.

I think we should understand what can be behind these things. That’s a very simple list. You made changes, Google made changes, your competitors made changes, or searcher behavior has changed in terms of volume, in terms of what they were engaging with, what they’re clicking on, what their intent behind searches are. Maybe there was just a new movie that came out and in one of the scenes Hulk talks about soothing herbal tea. So now people are searching for very different things than they were before. They want to see the scene. They’re looking for the YouTube video clip and those kind of things. Suddenly Hulk’s soothing herbal tea is no longer directing as well to your site.

So changes like these things can happen. We can’t understand all of them. I think what’s up to us to determine is the degree of analysis and action that’s actually going to provide a return on investment. Looking at these day over day or week over week and throwing up our hands and getting frustrated probably provides very little return on investment. Looking over the long term and saying, “Hey, over the last 6 months, we can observe 26 weeks of ranking change data, and we can see that in aggregate we are now ranking higher and for more keywords than we were previously, and so we’re going to continue pursuing this strategy. This is the set of keywords that we’ve fallen most on, and here are the factors that we’ve identified that are consistent across that group.” I think looking at rankings in aggregate can give us some real positive ROI. Looking at one or two keywords, or one week versus the next, probably provides very little ROI.
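The aggregate analysis described here can be sketched in a few lines. This is a minimal illustration, assuming you export weekly rank-tracking data as a mapping from keyword to a list of weekly positions (lower is better); the keywords, positions, and function names are all invented for the example, not from any particular rank-tracking tool.

```python
# A hedged sketch of aggregate rank analysis over a tracking period.
# Input: {keyword: [rank_week1, ..., rank_weekN]}, where rank 1 is best.

def aggregate_rank_change(history):
    """Compare first-week vs. last-week ranks across all tracked keywords."""
    improved, declined, flat = [], [], []
    for kw, ranks in history.items():
        delta = ranks[0] - ranks[-1]   # positive = moved up over the period
        if delta > 0:
            improved.append((kw, delta))
        elif delta < 0:
            declined.append((kw, delta))
        else:
            flat.append(kw)
    # Sort the decliners worst-first, for the "set of keywords we've fallen
    # most on" part of the analysis.
    declined.sort(key=lambda pair: pair[1])
    return {"improved": improved, "declined": declined, "flat": flat}

# Hypothetical 4-week histories for three keywords:
history = {
    "soothing herbal tea": [14, 12, 11, 9],
    "infinite stretch shorts": [3, 4, 6, 7],
    "anger management books": [8, 8, 9, 8],
}
summary = aggregate_rank_change(history)
```

Looking at `summary` in aggregate (how many keywords improved vs. declined, and by how much) is the long-term view the transcript recommends, as opposed to reacting to any single week's bounce.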

4) I cannot influence or affect change in my organization because I cannot accurately quantify, predict, or control SEO

That’s true, especially with things like keyword not provided and certainly with the inaccuracy of data that’s provided to us through Google’s Keyword Planner inside of AdWords, for example, and the fact that no one can really control SEO, not fully anyway.

You get up in front of your team, your board, your manager, your client and you say, “Hey, if we don’t do these things, traffic will suffer,” and they go, “Well, you can’t be sure about that, and you can’t perfectly predict it. Last time you told us something, something else happened. So because the data is imperfect, we’d rather spend money on channels that we can perfectly predict, that we can very effectively quantify, and that we can very effectively control.” That is understandable. I think that businesses have a lot of risk aversion naturally, and so wanting to spend time and energy and effort in areas that you can control feels a lot safer.

Some ways to get around this are, first off, know your audience. If you know who you’re talking to in the room, you can often determine the things that will move the needle for them. For example, I find that many managers, many boards, many executives are much more influenced by competitive pressures than they are by, “We won’t do as well as we did before, or we’re losing out on this potential opportunity.” Saying that is less powerful than saying, “This competitor, who I know we care about and we track ourselves against, is capturing this traffic and here’s how they’re doing it.”

Show multiple scenarios. Many of the SEO presentations that I see and have seen and still see from consultants and from in-house folks come with kind of a single, “Hey, here’s what we predict will happen if we do this or what we predict will happen if we don’t do this.” You’ve got to show multiple scenarios, especially when you know you have error bars because you can’t accurately quantify and predict. You need to show ranges.

So instead of this, I want to see: What happens if we do it a little bit? What happens if we really overinvest? What happens if Google makes a much bigger change on this particular factor than we expect or our competitors do a much bigger investment than we expect? How might those change the numbers?
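The multi-scenario projection described above can be sketched as a simple range table. This is purely illustrative: the baseline traffic figure and the scenario multipliers are invented numbers standing in for whatever estimates your own data supports, not a prediction method.

```python
# A hedged sketch of presenting ranges instead of a single prediction.
# Baseline and multipliers are hypothetical placeholders.

def scenario_range(baseline_visits, scenarios):
    """Return projected monthly visits for each named scenario multiplier."""
    return {name: round(baseline_visits * mult) for name, mult in scenarios.items()}

projections = scenario_range(10_000, {
    "do nothing": 0.85,               # assumed gradual decline without investment
    "modest investment": 1.10,
    "heavy investment": 1.40,
    "Google shifts against us": 0.70, # the "bigger change than we expect" case
})
```

Presenting all four numbers side by side makes the error bars explicit, which is exactly the point: you are showing a range of outcomes, not promising one.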

Then I really do like bringing case studies, especially if you’re a consultant, but even in-house there are so many case studies in SEO on the Web today, you can almost always find someone who’s analogous or nearly analogous and show some of their data, some of the results that they’ve seen. Places like SEMrush, a tool that offers competitive intelligence around rankings, can be great for that. You can show, hey, this media site in our sector made these changes. Look at the delta of keywords they were ranking for versus ours over the next six months. Correlation is not causation, but that can be a powerful influencer showing those kind of things.

Then last, but not least, any time you’re going to get up like this and present to a group around these topics, if you possibly can, try to talk one-on-one with the participants before the meeting actually happens. I have found it almost universally the case that if you haven’t had the discussions beforehand (“What are your concerns? What do you think is not valid about this data? Hey, I want to run this by you and get your thoughts before we go to the meeting.”), then when you get into a group setting, people can gang up and pile on. One person says, “Hey, I don’t think this is right,” and everybody in the room kind of looks around and goes, “Yeah, I also don’t think that’s right.” Then it just turns into warfare and conflict that you don’t want or need. If you address those things beforehand, then you can include the data, the presentations, and the “I don’t know the answer to this and I know this is important to so-and-so” in that presentation or in that discussion. It can be hugely helpful. Big difference between winning and losing with that.

5) Google is biasing to big brands. It feels hopeless to compete against them

A lot of people are feeling this hopelessness, hopelessness in SEO about competing against them. I get that pain. In fact, I’ve felt that very strongly for a long time in the SEO world, and I think the trend has only increased. This comes from all sorts of stuff. Brands now have the little dropdown next to their search result listing. There are these brand and entity connections. As Google is using answers and knowledge graph more and more, it’s feeling like those entities are having a bigger influence on where things rank and where they’re visible and where they’re pulling from.

User and usage behavior signals on the rise means that big brands, who have more of those signals, tend to perform better. Brands in the knowledge graph, brands growing links without any effort, they’re just growing links because they’re brands and people point to them naturally. Well, that is all really tough and can be very frustrating.

I think you have a few choices on the table. First off, you can choose to compete with brands where they can’t or won’t. So this is areas like we’re going after these keywords that we know these big brands are not chasing. We’re going after social channels or people on social media that we know big brands aren’t. We’re going after user generated content because they have all these corporate requirements and they won’t invest in that stuff. We’re going after content that they refuse to pursue for one reason or another. That can be very effective.

You better be building, growing, and leveraging your competitive advantage. Whenever you build an organization, you’ve got to say, “Hey, here’s who is out there. This is why we are uniquely better or a uniquely better choice for this set of customers than these other ones.” If you can leverage that, you can generally find opportunities to compete and even to win against big brands. But those things have to become obvious, they have to become well-known, and you need to essentially build some of your brand around those advantages, or they’re not going to give you help in search. That includes media, that includes content, that includes any sort of press and PR you’re doing. That includes how you do your own messaging, all of these things.

Third, you can choose to serve a market or a customer that they don’t or won’t. That can be a powerful way to go about search, because usually search is bifurcated by the customer type. There will be slightly different forms of search queries that are entered by different kinds of customers, and you can pursue one of those that isn’t pursued by the competition.

Last, but not least, I think for everyone in SEO we all realize we’re going to have to become brands ourselves. That means building the signals that are typically associated with brands — authority, recognition from an industry, recognition from a customer set, awareness of our brand even before a search has happened. I talked about this in a previous Whiteboard Friday, but I think because of these things, SEO is becoming a channel that you benefit from as you grow your brand rather than the channel you use to initially build your brand.

All right, everyone. Hope these have been helpful in combating some of these infuriating, frustrating problems and that we’ll see some great comments from you guys. I hope to participate in those as well, and we’ll catch you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

How to Have a Successful Local SEO Campaign in 2015

Posted by Casey_Meraz

Another year in search has passed. It’s now 2015, and we have seen some major changes in local ranking factors since 2014, which I also expect to change greatly throughout 2015. For some, a new year means a fresh starting point; for others, it’s a time of reflection to analyze how successful your campaign has been. Whatever boat you’re in, make sure to sit down and read this guide. 

In this guide we will cover how you can have a successful local SEO campaign in 2015 starting with the basics and getting down to five action items you should focus on now. This is not limited to Google My Business and also includes localized organic results. 

Now the question is where do you start?

Since Pigeon has now rolled out to the US, UK, Australia, and Canada, it’s important to make sure your strategies are in line with it no matter what part of the world you’re in. Your local SEO campaign will be much more successful in 2015 if you put more work into it. Don’t be fooled, though. More work by itself isn’t going to get you where you need to be. You need to work smarter toward the goals which are going to fuel your conversions.

For some industries that might mean more localized content, for others it may mean more social interaction in your local area. Whatever it ends up being, the root of it should be the same for most. You need to get more conversions for your website or your client’s website. So with this in mind let’s make sure we’re on the same page as far as our goals are concerned.

Things you need to know first

Focus on the right goals

Recently I had a conversation with a client who wanted to really drive home the point that he was not interested in traffic. He was interested in the conversions he could track. He was also interested to see how all of these content resource pieces I recommended would help. He was tired of the silly graphs from other agencies that showed great rankings on a variety of keywords when he was more interested to see which efforts brought him the most value. Instead, he wanted to see how his campaign was bringing him conversions or meaningful traffic. I really appreciated this statement and I felt like he really got it.

Still, however, far too often I have to talk to potential clients and explain to them why their sexy looking traffic reports aren’t actually helping them. You can have all of the traffic in the world but if it doesn’t meet one of your goals of conversions or education then it’s probably not helping. Even if you make the client happy with your snazzy reports for a few months, eventually they’re going to want to know their return on investment (ROI).

It’s 2015. If your clients aren’t tracking conversions properly, give them the help they need. Record their contacts in a CRM and track the source of each of these contacts. Track them all the way through the sales funnel. 

That’s a simple and basic marketing example, but as SEOs our roles have transformed. If you can show this type of actual value and develop a plan accordingly, you will be unstoppable.

Second, don’t get tunnel vision

You may wonder why I started a little more basic than normal in this post. The fact is that in this industry there is not a full formal training program that covers all aspects of what we do. 

We all come from different walks of life and experience which makes it easy for us to get tunnel vision. You probably opened this article with the idea of “How Can I Dominate My Google Local Rankings?” While we cover some actionable tips you should be using, you need to think outside of the box as well. Your website is not the only online property you need to be concerned about.

Mike Ramsey from Nifty Marketing put out a great study on measuring the click-through rates from the new local stack. In this study he measured click-through rates of users conducting several different searches like “Salt Lake City Hotel” in the example below. With so many different options, look where the users are clicking:

They’re really clicking all over the place! While it’s cool to be number one, it’s much better if you get clicks from your paid ad, organic result, local result, and barnacle SEO efforts (which we’ll talk about a little later). 

If you combine your conversion marketing data with your biggest priorities, you can put together a plan to tackle the most important areas for your industry. Don’t assume it’s a one-size-fits-all approach. 

Third, some spam still works. Don’t do it and rise above it.

There’s no doubt that some spammy tactics are still working. Google gets better every day, but you still see crap like this example below show up in the SERPs.

While it sucks to see that kind of stuff, remember that in time it disappears (just as it did before this article was published). If you take shortcuts, you’re going to get caught, and it’s not worth it for the client or the heartache on your end. Maintain the course and do things the right way. 

Now let’s get tactical and prepare for 2015

Now it’s time for some practical and tactical takeaways you can use to dominate your local search campaign in 2015.

Practical tip 1: start with an audit

Over the years, one of the best lessons I have learned is that it’s OK to say “I don’t know” when you don’t have the answer. Consulting with industry experts or people with more experience than you is not a bad thing and will likely only enhance your knowledge and give you a different perspective. It can be humbling, but the experience is amazing. It can open your mind.

Last year, I had the opportunity to work with over ten of the industry’s best minds and retained them for site audits on different matters. 

The perspective this gives is absolutely incredible and I believe it’s a great way to learn. Everyone in this industry has come from a different background and seen different things over the years. Combining that knowledge is invaluable to the success of your clients’ projects. Don’t be afraid to do it and learn from it. This is also a good idea if you feel like your project has reached a stalemate. Getting more advice, identifying potential problems, and having a fresh perspective will do wonders for your success.

As many of the experts have confirmed, ever since the Pigeon update, organic and local ranking factors have been more tied together than ever. Since they started going this direction in a big way, I would not expect it to stop. 

This means that you really do need to worry about things like site speed, content, penalties, mobile compatibility, site structure, and more. On a side note, guess what will happen to your organic results if you keep this as a top priority? They will flourish and you will thank me.

If you don’t have the budget or resources to get a third party opinion, you can also conduct an independent audit. 

Do it yourself local SEO audit resources:

Do it yourself organic SEO audit resources:

Alternatively, if you’re more in the organic boat, you should also check out this guide by Steve Webb on How To Perform The World’s Greatest SEO Audit.

Whatever your situation is, it’s worth the time to have this perspective yearly or even a couple times a year if possible.

Practical tip 2: consider behavioral signals and optimize accordingly

I remember having a conversation with Darren Shaw, the founder of Whitespark, at MozCon 2013 about his thoughts on user behavior affecting local results. At the time I didn’t do too much testing around it. However, just this year, Darren had a mind-blowing presentation at the Dallas State of Search where he threw in the behavioral signals curve ball. Phil Rozek also spoke about behavioral signals and provided a great slide deck with actionable items (included below). 

We have always speculated on behavioral signals but between his tests and some of Rand’s IMEC Lab tests, I became more of a believer last year. Now, before we go too deep on this remember that your local campaign is NOT only focused on just your local pack results. If user behavior can have an impact on search results, we should definitely be optimizing for our users.


You can view Phil Rozek’s presentation below: 

Don’t just optimize for the engines, optimize for the humans. One day when Skynet is around this may not be an issue, but for now you need to do it.

So how can you optimize for behavioral signals?

There is a dark side and a light side path to this question. If you ask me I will always say follow the light side as it will be effective and you don’t have to worry about being penalized. That’s a serious issue and it’s unethical for you to put your clients in that position.

Local SEO: how to optimize for behavioral signals

Do you remember the click-through study we looked at a bit earlier from Nifty Marketing? Do you remember where the users clicked? If you look again or just analyze user and shopper behavior, you might notice that many of the results with the most reviews got clicks. We know that reviews are hard to get so here are two quick ways that I use and recommend to my clients:


1. Solicit your Gmail clients for reviews

If you have a list of happy Gmail clients, you can simply send them an email with a direct link to your Google My Business page. Just pull up your local page and copy its URL. The URL will look like the one below:

Once you have this URL, simply remove the /posts and replace it with: 

 /?hl=en&review=1


It will look like this:

If your clients click on this link via their logged-in Gmail, they will automatically be taken to the review page which will open up the box to leave a review which looks like the example below. It doesn’t get much more simple than that. 
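The URL edit described above is simple enough to script if you manage many locations. This is a minimal sketch of exactly the transformation the post describes (strip the trailing /posts and append /?hl=en&review=1); the example business URL itself is made up, and the URL format reflects the Google+ pages of the time.

```python
# A small sketch of the review-link transformation described above.
# The example URL is hypothetical.

def review_link(gmb_url):
    """Turn a Google My Business /posts URL into a direct leave-a-review link."""
    if gmb_url.endswith("/posts"):
        gmb_url = gmb_url[:-len("/posts")]  # remove the /posts suffix
    return gmb_url + "/?hl=en&review=1"    # suffix given in the post

link = review_link("https://plus.google.com/+ExampleBusiness/posts")
```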

2. Check out a service like Mike Blumenthal’s Get Five Stars for reviews

I recently used this with a client and got a lot of great feedback and several reviews.

Remember that these reviews will also help on third-party sites and can help your Google My Business ranking positions as well as click-through rates. You can check out Get Five Stars here.

Another way outside of getting reviews is to optimize the appearance of your Google My Business Page. 


3. Optimize your local photos

Your Google My Business page includes photos. Don’t use generic photos. Use high quality photos so when the users hover over your listing they get an accurate representation of what they’re looking for. Doing this will increase your click-through rate. 

Organic SEO: how to optimize for behavioral signals

Optimization for click-through rates on organic results typically focuses on three areas. You’re likely very familiar with the first two, but you should not ignore them.


1. Title tags: optimize them for the user and engine

Optimize your meta title tags to increase click-through rates. Each page should have a unique title tag that helps the viewer with their query. The fabricated example below shows what NOT to do. 


2. Meta descriptions: optimize them for the user

Optimize your meta description to get the user to click on the search result. If you’re not doing this just because Google may or may not pull it, you’re missing opportunities and clicks. 


3. Review Schema markup: add this to appropriate pages

Review Schema markup is still a very overlooked opportunity. As we talked about above in the local section, if you don’t have reviews coded in Schema, you could be missing out on getting the orange stars in organic results. 

Practical tip 3: don’t ignore barnacle SEO

I firmly believe that most people are not taking advantage of barnacle SEO still to this day and I’m a big fan. When I first heard Will Scott introduce this term at Pubcon I thought it was spot on. According to Will Scott’s website Search Influence, barnacle SEO is “attaching oneself to a large fixed object and waiting for the customers to float by in the current.” In a nutshell, we know that if you’re trying to rank on page one of Google you will find others that you may be able to attach to. If Yelp results come up for a lot of your search terms you might identify that as an opportunity. But there are three main ways you can take advantage of this.


1. You can try to have the most visible profile on that third party page

If Yelp is ranking for LA Personal Injury Attorneys, it would suit you to figure out how the top users are showing up there. Maybe your customers are headed there and then doing some shopping and making a selection. Or maybe they’re using it for a research platform and then will visit your website. If your profile looks great and shows up high on the list, you just gave yourself a better chance at getting a conversion.


2. You can try to get your page to rank

Hey, just because you don’t own Yelp.com or whatever similar site you’ve found, doesn’t mean you shouldn’t put in the effort to have it rank. If Google is already showing you that they trust a third party site by ranking it, you can use similar organic ranking techniques that you would use on your own site to make your profile page stronger. Over time you might add this to your bio on interviews or other websites to earn links. If you increase the visibility of your profile on search engines and they see your website on the same page you might increase conversions.


3. You can help your Google My Business rankings

If the site you’re using passes link juice and you earn links to the third party profile page, you will start to see some strong results. Links are a big factor in local since Pigeon this year and it’s an opportunity that should not be missed.


So how can you use this advice?

Start by finding a list of potential barnacle SEO partners for your industry. As an example, I did a search for “Personal Injury Attorneys” in Los Angeles. In addition to the law firms that showed up in the results on the first page, I also identified four additional places I may be able to show up on.

  1. Yelp
  2. Thumbtack
  3. Avvo
  4. Wikipedia

If you were an attorney, it would be worth your while to explore these and see if any make sense for you to contribute to.

Practical tip 4: earn some good links

Most people get too carried away with link building. I know because I used to do it. The key with link building is to change your approach and understand that it’s always better to get a few high-quality links than hundreds or thousands of low-quality links.

For example, a link like this one that one of our clients earned is what I’m talking about. 

If you want to increase your local rankings you can do so by earning these links to your associated Google My Business landing page.

Do you know the URL you entered in your Google My Business page when you set it up? That’s the one I’m talking about. In most cases this will be linked to either a local landing page for that location or the home page. It’s essential to your success that you earn solid links to this page.


Simple resources for link building

Practical tip 5: have consistent citations and remove duplicates

Identifying and correcting incorrect or duplicate citations has been getting easier and easier over the years. Even if you don’t want to pay someone to do it, you can sign up for some great do-it-yourself tools. Your goal with any citation cleanup program is this:

  1. Ensure there are no duplicate citations
  2. Ensure there are no incorrect citations with wrong phone numbers, old addresses, etc. 

You can ignore small differences and inconsistencies like St vs. Street. I believe the importance of citations has been greatly reduced over the past year. At the same time, you still want to be the least imperfect and provide your customers with accurate information if they’re looking on third party websites.  
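The "ignore small differences like St vs. Street" rule above lends itself to a simple normalization pass before flagging mismatches. This is a rough do-it-yourself sketch, not any citation tool's actual logic; the abbreviation table is a tiny illustrative subset you would extend for real use.

```python
# A rough sketch of a DIY citation-consistency check: normalize harmless
# differences (abbreviations, case, punctuation) before comparing listings.
import re

# Illustrative subset only; a real cleanup would use a fuller table.
ABBREVIATIONS = {"st": "street", "ave": "avenue", "rd": "road", "ste": "suite"}

def normalize(citation):
    """Lowercase, strip punctuation, and expand common abbreviations."""
    words = re.sub(r"[^\w\s]", "", citation.lower()).split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)

def same_citation(a, b):
    """True when two citations match after ignoring harmless differences."""
    return normalize(a) == normalize(b)
```

With this, "123 Main St." and "123 Main Street" compare as consistent, while a genuinely wrong street number still gets flagged as a mismatch worth correcting.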

Let’s do good things in 2015

2014 was a tough year in search altogether. We had ups like Penguin refreshes and we had downs like the removal of authorship. I’m guessing 2015 will be no different. Staying on the roller coaster and keeping with the idea of having the “least imperfect” site is the best way to ring in the new year and march forward. If you had a tough year in local search, keep your head up high, fix any existing issues, and sprint through this year by making positive changes to your site. 


Reblogged 4 years ago from moz.com

Back to Fundamentals: 6 Untapped Keyword Sources that Will Boost Organic Traffic

Posted by neilpatel

I used to perform keyword research in the typical, perfunctory way—go to the Keyword Tool, type in some words, and punch out a list of terms.

Easy. Quick. Simple.

Today, things are different. The much-loved keyword tool has been replaced, long-tail keywords have the ascendancy, and it’s harder to figure out what users are actually searching for.

The rules have changed, and so have the ways of playing the game. I still use the Keyword Planner, but I’ve also discovered a medley of not-so-obvious ways to get keywords that improve my organic traffic.

1. Wikipedia

Do you think of Wikipedia as just a massive encyclopedia? Think again. I use Wikipedia for keyword research.

Image from Search Engine Journal.

My process is pretty simple.

Step 1: Google inurl:Wikipedia and my topic. Or just Google the topic or head term. Wikipedia is often the first organic result.

Step 2: Look at the SERP to identify the most relevant terms and possible keywords within a Wikipedia entry.

Step 3: Open the entry in Wikipedia and identify the most relevant terms from the first few paragraphs, morphing them into longtail iterations.

Step 4: Identify other relevant terms from Wikipedia’s table of contents on the topic.

Step 5: Follow links to other associated Wikipedia entries to see related subjects, and identify even more keywords.

Wikipedia is the world’s sixth most popular website, and sits at #4 on Google’s list. It boasts 310,000,000 unique visitors (20% of its traffic) and 7,900,000,000 pageviews. All of this with absolutely no advertising.

In other words, Wikipedia has one of the best organic SEO strategies on the planet. Obviously, these are keywords that matter. Wikipedia’s popularity shows us that people want information. It’s like the greatest content marketing strategy ever, combining user-generated content with prolific publishing on a grand scale.

Do what Wikipedia does. Use the terms that people search for. You won’t outrank Wikipedia, but you will start to rank organically for the longtail varieties that you discern from Wikipedia.
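Turning the terms you pull from a Wikipedia entry into longtail variations is largely mechanical, so here is a minimal sketch of that last step. The head term, the related terms, and the combination patterns are all invented examples; real inputs would come from the entry's intro paragraphs and table of contents as described above.

```python
# A minimal sketch of morphing Wikipedia-derived terms into longtail queries.
# Inputs are hypothetical examples, not scraped data.

def longtail_variations(head_term, related_terms):
    """Combine a head term with related terms into candidate longtail queries."""
    variations = []
    for term in related_terms:
        variations.append(f"{head_term} {term}")   # e.g. "lawn mower maintenance"
        variations.append(f"{term} for {head_term}")
    return variations

queries = longtail_variations("lawn mower", ["blade sharpening", "maintenance"])
```

Each candidate would still need a sanity check against search volume data before you build content around it.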

2. Google autocomplete

When you type stuff into Google’s search bar, Google predicts your query and types it out for you. The feature has been around for a long time, and the more time that goes by, the more intelligent the autocomplete algorithm becomes.

These autocomplete suggestions are all based on real user queries. They vary based on geographic location and language. However, in spite of the variation, autocomplete provides a fairly accurate representation of what people are looking for.

Here is why autocomplete is a killer source of keywords:

  • It indicates some of the most popular keywords.
  • It provides longtail suggestions.
  • The keywords are ranked by the “freshness layer” algorithm, which means that currently popular search terms rank higher in the autocomplete list.

How do you use autocomplete for keyword research? Well, you can go about this the good old-fashioned spade and shovel way, like this:

Google 2014-08-11 13-50-24

Step 1: Open Google. To prevent Google from autocompleting previously searched-for terms, log out of Google or open an “incognito” window (Chrome: Cmd + Shift + N).

Step 2: Type in your main keyword or longtail keyword, e.g. “lawnmower.”

Step 3: Write down the suggestions that appear in autocomplete.

Step 4: After you type in your main keyword or head term, type in “a” and write down the autocomplete suggestions.

Step 5: Repeat Step 4 for the rest of the alphabet.
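The alphabet pass above is easy to pre-generate so you only have to do the pasting, not the typing. This sketch just builds the seed queries; you would still enter each one into an incognito search box (or feed the list to a suggest-scraping tool), since querying Google's suggest service itself is outside the scope of this example.

```python
# A sketch of generating the head-term-plus-letter seed queries used in the
# spade-and-shovel alphabet pass described above.
import string

def alphabet_queries(head_term):
    """Return 'lawnmower a' ... 'lawnmower z' style seed queries."""
    return [f"{head_term} {letter}" for letter in string.ascii_lowercase]

seeds = alphabet_queries("lawnmower")
```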

Or, you can do it the easy way, with Übersuggest. It’s been called “suggest on steroids.” It will do all the work for you. The only downside is that it doesn’t suggest keyword extensions based on search popularity.

Keyword suggestion tool — Google suggest scraper — Übersuggest 2014-08-11 13-53-48

If you can get past the eye-popping UI, Übersuggest is a pretty awesome tool.

Keep in mind that Google is not going to provide suggestions for everything. As quoted in Search Engine Land, here is what the algorithm will filter out:

  • Hate- or violence-related suggestions
  • Personally identifiable information in suggestions
  • Porn & adult content-related suggestions
  • Legally mandated removals
  • Piracy-related suggestions

3. Google related searches

Since Google is the biggest search engine, we’ve got to take our cues from its mighty algorithm, imperfect and agonizing though it may be.

Google’s related searches is a really easy way to snag some instant keyword research.


Step 1: Search for your keyword in Google.

Step 2: Scroll to the bottom, and ignore everything in between.

There, at the bottom, is a harvest of keywords, ripe for the picking:

lawn mower - Google Search 2014-08-11 14-05-22

The idea is similar to Google suggest. However, instead of providing autocomplete suggestions, Google takes the keyword and mixes it up with other words. These other words may be at the end, at the beginning, or sprinkled throughout. These related searches might not even include the actual keyword, but are simply connected in a tangential way.

Whatever the case, you will undoubtedly find some keyword ideas from this list.

4. MetaGlossary.com

Not a whole lot of people know about MetaGlossary.com. You won’t find a lot of information about the company itself, but you will find a ton of keyword ideas.

Here are the instructions. Not too hard.

MetaGlossary.com 2014-08-11 14-53-43

The whole point of the glossary is to provide definitions. But along with the many definitions, you’ll get “related terms.” That’s what we’re looking for.

When I type in “Search Engine Optimization,” my head term, here’s what I get:

Metaglossary.com - Definitions for "search engine optimization" 2014-08-11 14-56-26

All of those are potential keywords.

I can take this a step further by looking through the definitions. These can provide even more keyword fodder:

Metaglossary.com - Definitions for "search engine optimization" 2014-08-11 14-57-28

For this particular term, I found 117 definitions. That’s enough to keep me busy for a while.

5. Competitor keywords

Another great way to get keyword ideas is to snag them from the competition.

Not only are you going to identify some great keywords, but you’ll be able to gain these keyword ideas from the top-ranking organic sites in the SERPs.

Here’s how to do it.

Step 1: Google your top keyword.

Step 2: Click the first organic result.

Step 3: View the page source (Chrome: Ctrl + U on Windows, Cmd + Option + U on Mac).

Step 4: Search for “<title>”. Identify any non-branded terms as possible keywords.

Step 5: Search for “<h1>”. Identify any potential keywords in the H1 text.

Step 6: Search for “keywords”. If the page has a <meta name="keywords"> tag, note any terms the site has identified as keywords there. Some websites have this (certain WordPress themes, or WP sites using an SEO plugin), but most don’t.

Step 7: Look at all the content and locate any additional longtail keywords or keyword variations.

The competitors that are first in the SERP for a given head term or longtail query are ranking high for a variety of reasons. One of those reasons is their keyword selection. Sure, they may have good link profiles, but you can’t rank for a keyword unless you actually have that keyword (or some variation thereof) on your page.
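Steps 3 through 6 can be automated with a short script once you have the page source in hand. Here's a minimal sketch using Python's standard library; the regexes are deliberately simple and assume tidy markup (the sample HTML is hypothetical, and real-world pages may order the meta attributes differently, so a proper HTML parser is safer in practice):

```python
import re

def page_keywords(html):
    """Pull candidate keywords from a competitor page's title, h1 tags,
    and (rarely present) keywords meta tag."""
    found = {}

    # <title> — the strongest signal of the page's target keyword.
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    found["title"] = m.group(1).strip() if m else None

    # All <h1> headings.
    found["h1"] = [h.strip()
                   for h in re.findall(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)]

    # <meta name="keywords"> — assumes name comes before content.
    m = re.search(r'<meta\s+name=["\']keywords["\']\s+content=["\'](.*?)["\']',
                  html, re.I)
    found["meta_keywords"] = ([k.strip() for k in m.group(1).split(",")]
                              if m else [])
    return found

# Hypothetical competitor page source for illustration.
sample = """<html><head>
<title>Best Lawn Mowers of 2014 | Acme Reviews</title>
<meta name="keywords" content="lawn mower, riding mower, push mower">
</head><body><h1>Best Lawn Mowers</h1></body></html>"""

print(page_keywords(sample))
```

Anything non-branded that comes back ("Best Lawn Mowers of 2014", "riding mower", "push mower") is a candidate keyword.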

6. Amazon.com

Amazon.com is king of the ecommerce jungle, no questions asked.

Part of their power is that they dominate the organic search results for just about any purchase-related keyword. The closer your audience gets to a transactional search query, the more likely Amazon is ranking somewhere for it.

Why? They’ve got keywords—lots of them. And they have reviews—lots of them. This means one thing for you: lots of keyword ideas.

Let me make a quick clarification. Not everyone is going to find keyword ideas on Amazon. This works best if you have a physical product, and obviously only if Amazon sells it.

Here’s how to skim the cream off of Amazon’s great keywords.

Step 1: Google your keyword.

Step 2: Locate the Amazon entry in the SERP.

Step 3: Click on the result to see the product/landing page on Amazon.

Step 4: Locate keywords in the following places.

-“Show results for” menu

-Main header

-Text underneath main header

-“## Results for” text.

-Breadcrumb

-Items listed

Here’s a quick survey of where you can find these keywords. Notice the highlighted text.

[Screenshot: Amazon category page for bags & cases, keyword locations highlighted]

You’ll find even more keywords once you dive into individual products.

Pay special attention to these areas on product pages:

-“Customers Who Bought This Item Also Bought”

-“Product Description”

-“Product Ads from External Websites”

-“Customer Questions & Answers.” You’ll find some nice query-like longtail keywords here.

-“Customer Reviews.” Again, this is a great source of longtails.
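The review-mining idea in those last two bullets can be roughed out in code: count recurring phrases across review text and the longtails customers actually use rise to the top. This is a sketch with hypothetical review snippets, not a definitive tool; real use would feed in scraped or exported review text.

```python
import re
from collections import Counter

def longtail_phrases(reviews, n=3, min_count=2):
    """Count recurring n-word phrases across review text — a rough way
    to surface the longtail wording customers actually use."""
    counts = Counter()
    for review in reviews:
        words = re.findall(r"[a-z0-9']+", review.lower())
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    # Keep only phrases that recur across the corpus.
    return [(p, c) for p, c in counts.most_common() if c >= min_count]

# Hypothetical review snippets for illustration.
reviews = [
    "Great laptop sleeve for a 13 inch macbook, snug fit.",
    "Bought this sleeve for a 13 inch macbook and it fits perfectly.",
]

print(longtail_phrases(reviews))
```

Phrases like "13 inch macbook" that show up again and again are exactly the query-like longtails the Q&A and review sections are full of.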

Let Amazon be your guide. They’re the biggest e-retailer around, and they have some great keyword clout going for them.

Conclusion

Keyword research is a basic skill for any SEO. But the actual process of finding those keywords doesn’t require expensive tools or formula-driven methods, and you’re not limited to a tiny pool of options.

I’ve used each of these methods for myself and my clients with incredible success.


What is your favorite source for finding great keywords? 

