Posted by EricEnge
How do the SERPs for commercial queries vary from the treatment of informational queries? Moz is about to publish its new Search Engine Ranking Factors, and was kind enough to provide me with access to their raw ranking data. Today I am going to share some of what I found.
In addition, I am going to compare it against raw ranking data pulled by my company, Stone Temple Consulting (STC). What makes this so interesting is that the Moz data is based on commercial queries across 165,177 pages and the STC data is based on informational queries over 182,340 pages (for a total of 347,517 result pages). Let’s dive in!
Google rolled out their Mobile-Friendly Update on April 21 to much fanfare. We published our study results on how big that impact was here, and in that test, we tracked a set of 15,235 SERPs both before and after the update.
The following chart shows the percentage of the top 10 results in the SERPs that are mobile friendly for the Moz (commercial) queries, and the STC informational queries before and after the mobile update:
Clearly, the commercial queries are returning a much larger percentage of mobile friendly results than the informational queries. Much of this may be due to it being more important to people running E-commerce sites to have a mobile-friendly site.
What this suggests to us is that publishers of E-commerce sites have been faster to adopt mobile friendliness than publishers of informational sites. That makes sense. Of course, our friends at Google know this is more important for commercial queries, too.
Regardless of query type, you can see that more than 60% of the results meet Google’s current definition for mobile friendliness. For commercial queries, it’s nearly 3/4 of them. Obviously, if you are not currently mobile friendly, then solve that, but that’s not the whole story.
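If you want a quick, rough check of your own pages, one easy signal is whether a page declares a viewport meta tag. This is not Google's full definition of mobile friendliness, just a common prerequisite for a responsive design; a minimal sketch using only Python's standard library:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Looks for a <meta name="viewport"> tag -- one rough signal of a responsive page."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "viewport":
            self.has_viewport = True

def has_viewport_meta(html: str) -> bool:
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport

# A responsive page typically declares a viewport; a desktop-only page often doesn't.
mobile_page = '<html><head><meta name="viewport" content="width=device-width, initial-scale=1"></head></html>'
desktop_only = '<html><head><title>No viewport here</title></head></html>'
print(has_viewport_meta(mobile_page))   # True
print(has_viewport_meta(desktop_only))  # False
```

Google's actual test covers much more (tap target sizes, font sizes, content width), so treat this as a first-pass filter, not a verdict.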
Over time, I believe that what is considered mobile friendly is going to change. The mobile world will become much more than just viewing your current desktop site with a smaller screen and a crappier keyboard. What are some more things you can expect in the long term?
My third point is an item that is already in progress, and the first two are really not for most people at this time. However, I put them out there to stimulate some thinking that much more is going to happen in this space than meets the eye. In the short term, what can you do?
My suggestion is that you start looking at the mobile version of your site as more than a different rendering of your desktop site. What are the different use cases between mobile and desktop? Consider running two surveys of your users, one of desktop users and one of smartphone users, and ask them what they are looking for, and what they would like to see. My bet is that you will quickly see that the use cases are different in material ways.
In the near term, you can leverage this information to make your mobile site optimization work better for users, probably without re-architecting it entirely. In the longer term, collecting this type of data will prepare you for considering more radical design differences between your desktop and mobile sites.
Another one of the newer ranking factors is whether or not a site uses HTTPS. Just this past July 22, Google’s Gary Illyes clarified again that this is a minor ranking signal that acts like a tiebreaker when two competing pages are otherwise “more or less equal.”
How has that played out in the SERPs? Let’s take a look:
As with mobile-friendliness, we once again see the commercial queries placing significantly more emphasis on this factor than the informational queries. Yet the penetration levels are clearly far lower than they are for mobile friendliness. So should you care about this, then?
Yes, it matters. Here are three reasons why:
Yes, I know there is much debate about whether or not you need to have HTTPS if all you are doing is running a content site. But a lot of big players out there are taking a simple stance: that it’s time for the plain text web to come to an end.
The big thing that HTTPS helps prevent is man-in-the-middle attacks. Do read the linked article if you don’t know what that is. Basically, though, when you communicate with a non-secure website, it’s pretty trivial for someone to intercept the communication and monitor or alter the information flowing between you and the site.
The most trivial form of this can occur any time you connect to a third party Wifi connection. People can inject ads you don’t want, or simply monitor everything you do and build a profile about you. Is that what you want?
Let me offer a simple example: Have you ever connected to Wifi in a hotel? What’s the first thing that happens? You try to go to a website, but instead you get a login screen asking for your room number and last name to sign in – and most times they charge you some fee.
That’s the concept – you tried to go to a website, and instead got served different content (the Wifi login screen). The hotel can do this at any time. Even after you log in and pay their fee, they can intercept your communication with other websites and modify the content. A simple application of this is to inject ads. They can also monitor and keep a record of every site you visit. They can do this because they are in the middle.
In an HTTPS world, they will still be able to intercept the initial connection, but once you are connected, they will no longer be able to see inside the content going back and forth between you and the https websites you choose to access.
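To make the audit side of this concrete, here is a minimal Python sketch (the example.com URLs are placeholders) that flags plain-HTTP URLs in a list and naively rewrites them to HTTPS. In reality the server must actually serve TLS at the rewritten address and redirect accordingly; this only shows the bookkeeping:

```python
from urllib.parse import urlparse, urlunparse

def audit_urls(urls):
    """Split URLs into secure (https) and insecure (plain http) buckets."""
    insecure = [u for u in urls if urlparse(u).scheme == "http"]
    secure = [u for u in urls if urlparse(u).scheme == "https"]
    return secure, insecure

def upgrade_to_https(url):
    """Naively rewrite an http:// URL to https:// (the site must actually serve TLS)."""
    parts = urlparse(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunparse(parts)

secure, insecure = audit_urls([
    "https://example.com/",
    "http://example.com/blog",  # readable and modifiable by anyone in the middle
])
print(insecure)                                      # ['http://example.com/blog']
print(upgrade_to_https("http://example.com/blog"))   # https://example.com/blog
```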
Eventually, the plain text web will come to an end. As this movement grows, more and more publishers will make the switch to HTTPS, and Google will dial up the strength of this signal as a ranking factor. If you have not made the switch, then get it into your longer term plans.
Both mobile-friendliness and HTTPS support appear to matter more to commercial sites today. I tend to think that this is more a result of e-commerce site publishers having made these conversions faster than informational site publishers, rather than the direct impact of the related Google algorithms. Regardless of that, the importance of both of these factors will grow, and it would be wise to aggressively prepare for the future.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Posted by Isla_McKetta
Quick note: This article is meant to apply to teams of all sizes, from the sole proprietor who spends all night writing their copy (because they’re doing business during the day) to the copy team who occupies an entire floor and produces thousands of pieces of content per week. So if you run into a section that you feel requires more resources than you can devote just now, that’s okay. Bookmark it and revisit when you can, or scale the step down to a more appropriate size for your team. We believe all the information here is important, but that does not mean you have to do everything right now.
If you thought ideation was fun, get ready for content creation. Sure, we’ve all written some things before, but the creation phase of content marketing is where you get to watch that beloved idea start to take shape.
Before you start creating, though, you want to get (at least a little) organized, and an editorial calendar is the perfect first step.
Creativity and organization are not mutually exclusive. In fact, they can feed each other. A solid schedule gives you and your writers the time and space to be wild and creative. If you’re just starting out, this document may be sparse, but it’s no less important. Starting early with your editorial calendar also saves you from creating content willy-nilly and then finding out months later that no one ever finished that pesky (but crucial) “About” page.
There’s no wrong way to set up your editorial calendar, as long as it’s meeting your needs. Remember that an editorial calendar is a living document, and it will need to change as a hot topic comes up or an author drops out.
There are a lot of different types of documents that pass for editorial calendars. You get to pick the one that’s right for your team. The simplest version is a straight-up calendar with post titles written out on each day. You could even use a wall calendar and a Sharpie.
| Monday | Tuesday | Wednesday | Thursday | Friday |
|---|---|---|---|---|
| The Five Colors of Oscar Fashion | 12 Fabrics We’re Watching for Fall | Is Charmeuse the New Corduroy? | Hot Right Now: Matching Your Handbag to Your Hatpin | Tea-length and Other Fab Vocab You Need to Know |
Teams who are balancing content for different brands at agencies or other more complex content environments will want to add categories, author information, content type, social promo, and more to their calendars.
Truly complex editorial calendars are more like hybrid content creation/editorial calendars, where each step to create and publish the content is indicated and someone has planned how long all of that takes. These can be very helpful if the content you’re responsible for crosses a lot of teams and can take a long time to complete. It doesn’t matter whether you’re using Excel or a Google Doc, as long as the people who need the calendar can easily access it. Gantt charts can be excellent for this. Here’s a favorite template for creating a Gantt chart in Google Docs (and they only get more sophisticated).
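If a wall calendar or spreadsheet feels too loose, even a tiny script can serve as an editorial calendar. A minimal sketch (the titles, dates, and statuses are invented) that groups planned posts by ISO week, so you can spot empty or overloaded weeks at a glance:

```python
from datetime import date

# A minimal editorial calendar: one dict per post, easy to sort and easy to
# extend with columns like author, brand, content type, or review deadline.
calendar = [
    {"publish": date(2015, 8, 3),  "title": "The Five Colors of Oscar Fashion",  "status": "draft"},
    {"publish": date(2015, 8, 4),  "title": "12 Fabrics We're Watching for Fall", "status": "idea"},
    {"publish": date(2015, 8, 10), "title": "Is Charmeuse the New Corduroy?",     "status": "idea"},
]

def by_week(entries):
    """Group entries by ISO week number so gaps and pile-ups stand out."""
    weeks = {}
    for e in sorted(entries, key=lambda e: e["publish"]):
        week = e["publish"].isocalendar()[1]
        weeks.setdefault(week, []).append(e["title"])
    return weeks

print(by_week(calendar))  # two weeks: one with two posts, one with one
```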
Complex calendars can encompass everything from ideation through writing, legal review, and publishing. You might even add content localization if your empire spans more than one continent to make sure you have the currency, date formatting, and even slang right.
Governance outlines who is taking responsibility for your content. Who evaluates your content performance? What about freshness? Who decides to update (or kill) an older post? Who designs and optimizes workflows for your team or chooses and manages your CMS?
All these individual concerns fall into two overarching components to governance: daily maintenance and overall strategy. In the long run it helps if one person has oversight of the whole process, but the smaller steps can easily be split among many team members. Read this to take your governance to the next level.
The scale of your writing enterprise doesn’t have to be limited to the number of authors you have on your team. It’s also important to consider the possibility of working with freelancers and guest authors. Here’s a look at the pros and cons of outsourced versus in-house talent.
|  | In-house authors | Guest authors and freelancers |
|---|---|---|
| Who pays them | You (as part of their salary) | You (on a per-piece basis) |
| Subject matter expertise | Broad but shallow | Deep but narrow |
| Capacity for extra work | As you wish | Show me the Benjamins |
| Turnaround |  | On a dime |
From that table, it might look like in-house authors have a lot more advantages. That’s somewhat true, but do not underestimate the value of occasionally working with a true industry expert who has name recognition and a huge following. Whichever route you take (and there are plenty of hybrid options), it’s always okay to ask that the writers you work with be professional about communication, payment, and deadlines. In some industries, guest writers will write in exchange for links. Consider yourself lucky if that’s true. Remember, though, that the final paycheck can be great leverage for getting a writer to do exactly what you need them to do (such as making their deadlines).
So those are some things you need to have in place before you create content. Now’s the fun part: getting started. One of the beautiful things about the Internet is that new and exciting tools crop up every day to help make our jobs easier and more efficient. Here are a few of our favorites.
You can always use Excel or a Google Doc to set up your editorial calendar, but we really like Trello for the ability to gather a lot of information in one card and then drag and drop it into place. Once there are actual dates attached to your content, you might be happier with something like a Google Calendar.
If you need a quick fix for ideation, turn your keywords into wacky ideas with Portent’s Title Maker. You probably won’t want to write to the exact title you’re given (although “True Facts about Justin Bieber’s Love of Pickles” does sound pretty fascinating…), but it’s a good way to get loose and look at your topic from a new angle.
Once you’ve got that idea solidified, find out what your audience thinks about it by gathering information with Survey Monkey or your favorite survey tool. Or, use Storify to listen to what people are saying about your topic across a wide variety of platforms. You can also use Storify to save those references and turn them into a piece of content or an illustration for one. Don’t forget that a simple social ask can also do wonders.
Content doesn’t have to be all about the words. Screencasts, Google+ Hangouts, and presentations are all interesting ways to approach content. Remember that not everyone’s a reader. Some of your audience will be more interested in visual or interactive content. Make something for everyone.
Don’t forget to make your content pretty. It’s not that hard to find free stock images online (just make sure you aren’t violating someone’s copyright). We like Morgue File, Free Images, and Flickr’s Creative Commons. If you aren’t into stock images and don’t have access to in-house graphic design, it’s still relatively easy to add images to your content. Pull a screenshot with Skitch or dress up an existing image with Pixlr. You can also use something like Canva to create custom graphics.
Don’t stop with static graphics, though. There are so many tools out there to help you create gifs, quizzes and polls, maps, and even interactive timelines. Dream it, then search for it. Chances are whatever you’re thinking of is doable.
Less is more. That’s not an excuse to pare your blog down to one post per month (check out our publishing cadence experiment), but it is an important reminder that if you’re writing “How to Properly Install a Toilet Seat” two days after publishing “Toilet Seat Installation for Dummies,” you might want to rethink your strategy.
The thing is, and I’m going to use another cliché here to drive home the point, you never get a second chance to make a first impression. Potential customers are roving the Internet right now looking for exactly what you’re selling. And if what they find is a thin, keyword-stuffed article riddled with spelling and grammar mistakes… well, you don’t want that. Oh, and search engines think it’s spammy, too.
We’re not copyright lawyers, so we can’t give you the ins and outs on all the technicalities. What we can tell you (and you already know this) is that it’s not okay to steal someone else’s work. You wouldn’t want them to do it to you. This includes images. So whenever you can, make your own images or find images that you can either purchase the rights to (stock imagery) or license under Creative Commons.
It’s usually okay to quote short portions of text, as long as you attribute the original source (and a link is nice). In general, titles and ideas can’t be copyrighted (though they might be trademarked or patented). When in doubt, asking for permission is smart.
That said, part of the fun of the Internet is the remixing culture which includes using things like memes and gifs. Just know that if you go that route, there is a certain amount of risk involved.
Your content needs to go through at least one editing cycle by someone other than the original author. There are two types of editing: developmental editing (which looks at the underlying structure of a piece and happens earlier in the writing cycle) and copy editing (which makes sure all the words are there and spelled correctly in the final draft).
If you have a very small team or are in a rush (and are working with writers that have some skill), you can often skip the developmental editing phase. But know that an investment in that close read of an early draft is often beneficial to the piece and to the writer’s overall growth.
Many content teams peer-edit work, which can be great. Other organizations prefer to run their work by a dedicated editor. There’s no wrong answer, as long as the work gets edited.
The good news is that search engines are doing their best to get closer and closer to understanding and processing natural language. So good writing (including the natural use of synonyms rather than repeating those keywords over and over and…) will take you a long way towards SEO mastery.
For that reason (and because it’s easy to get trapped in keyword thinking and veer into keyword stuffing), it’s often nice to think of your SEO check as a further edit of the post rather than something you should think about as you’re writing.
But there are still a few things you can do to help cover those SEO bets. Once you have that draft, do a dedicated SEO pass over elements like the title tag, headings, meta description, keyword use, and internal links.
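Part of that SEO pass can even be automated. Here is a rough sketch of a length check for titles and meta descriptions; the character limits below are only rules of thumb (search engines truncate snippets by pixel width, and the cutoffs change over time), so treat them as a guide, not gospel:

```python
def seo_length_checks(title, meta_description):
    """Rough length heuristics for how snippets display in search results.
    The ranges are conventional guidance, not official limits."""
    issues = []
    if not 10 <= len(title) <= 60:
        issues.append(f"title is {len(title)} chars; aim for roughly 10-60")
    if not 50 <= len(meta_description) <= 160:
        issues.append(f"meta description is {len(meta_description)} chars; aim for roughly 50-160")
    return issues

# A title and description that fit comfortably produce no issues:
print(seo_length_checks(
    "Toilet Seat Installation, Step by Step",
    "A walkthrough of removing the old seat, choosing the right "
    "replacement, and bolting the new one down without cracking the bowl.",
))  # []
```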
Writing (assuming you’re the one doing the writing) can require a lot of energy—especially if you want to do it well. The best way to find time to write is to break each project down into little tasks. For example, writing a blog post actually breaks down into these steps (though not always in this order):
So if you only have random chunks of time, set aside 15-30 minutes one day (when your research is complete) to write a really great outline. Then find an hour the next to fill that outline in. After an additional hour the following day, (unless you’re dealing with a research-heavy post) you should have a solid draft by the end of day three.
The magic of working this way is that you engage your brain and then give it time to work in the background while you accomplish other tasks. Hemingway used to stop mid-sentence at the end of his writing days for the same reason.
Once you have that draft nailed, the rest of the steps are relatively easy (even the headline, which often takes longer to write than any other sentence, is easier after you’ve immersed yourself in the post over a few days).
Every designer and developer is a little different, so we can’t give you any blanket cure-alls for inter-departmental workarounds (aka “smashing silos”). But here are some suggestions to help you convey your vision while capitalizing on the expertise of your coworkers to make your content truly excellent.
From the initial brainstorm to general questions about how to work together, asking your team members what they think and prefer can go a long way. Communicate all the details you have (especially the unspoken expectations) and then listen.
If your designer tells you up front that your color scheme is years out of date, you’re saving time. And if your developer tells you that the interactive version of that timeline will require four times the resources, you have the info you need to fight for more budget (or reassess the project).
Things change in the design and development process. If you have interim check-ins already set up with everyone who’s working on the project, you’ll avoid the potential for nasty surprises at the end. Like finding out that no one has experience working with that hot new coding language you just read about and they’re trying to do a workaround that isn’t working.
Your job isn’t done when you hand over the copy to your designer or developer. Not only might they need help rewriting some of your text so that it fits in certain areas, they will also need you to proofread the final version. Accidents happen in the copy-and-paste process and there’s nothing sadder than a really beautiful (and expensive) piece of content that wraps up with a typo:
Conflict isn’t fun, but sometimes it’s necessary. The more people involved in your content, the more watered down the original idea can get and the more roadblocks and conflicting ideas you’ll run into. Some of that is very useful. But sometimes you’ll get pulled off track. Always remember who owns the final product (this may not be you) and be ready to stand up for the idea if it’s starting to get off track.
We’re confident this list will set you on the right path to creating some really awesome content, but is there more you’d like to know? Ask us your questions in the comments.
Posted by Isla_McKetta
“How can I learn SEO?” is a deceptively simple question. The standard approach is to attempt to appeal to anyone who’s interested in SEO without any idea of your previous experience or the actual reasons you want to learn SEO. That’s fun. Especially the part about weeding through tons of information that might not even apply to what you want to learn.
So let’s fix that. This guide is written to help you choose your own SEO adventure. If you know very little about SEO and just want to learn enough to impress your CMO, start at the beginning and stop when you feel like you understand enough concepts. Or if you’ve been doing SEO for years but need a brush up on the latest tips and tricks before impressing a potential client or employer, there’s a path for you too. Be sure to follow the links. They refer you to resources that are much more in-depth than we could reproduce in one post.
You may know what a title tag is, but you aren’t quite sure how to use it or why. The SEO Newbie could be a web developing hobbyist on the verge of a new obsession or someone looking for the next growing career path. Regardless, you have the most to learn (and the most to gain) from this adventure.
Start at the very beginning with What is SEO? and explore as many paths as you can. You might be surprised at the bits of information you pick up along the way. For a guided tour, follow the teal boxes. Don’t forget to bookmark this page so you can come back and learn more once you’ve absorbed each batch of info.
You were doing SEO back in the days of AltaVista, so you know all the things to know. Except maybe you took a break for a few years or decided to swap that black hat for a gray (or even white) one and need to know what’s the what with the major changes in the past few years.
Make a quick stop at the Algorithm Change History to catch up on the latest updates and penalties. After that, we’ll guide you through some of the topics that are more likely to have changed since you last checked. Just look for the purple boxes.
You’ve heard of SEO. You might even have worked with a few SEOs. Now you’re ready to dig in and understand what everyone’s talking about and how you can use all that new info to improve your marketing (and maybe level up your career at the same time).
Start with What is SEO? and look for shortcuts in orange boxes along the path to gather highlights. You can always dig deeper into any topic you find especially interesting.
Whichever path you choose, don’t worry, we’ll keep weaving you in and out of the sections that are relevant to your learning needs; just look for the color that’s relevant to your chosen character.
For you table of contents types who like to read straight through rather than have someone set the path for you, here’s a quick look at what we’ll be covering:
First things first. It’s hard to learn the ins and outs of SEO (search engine optimization) before you even know what it is. In the following short video, Rand Fishkin (a.k.a. the Wizard of Moz) defines SEO as “The practice of increasing the quantity and quality of the traffic that you earn through the organic results in search engines like Google, Yahoo, and Bing.”
Watch it to understand the difference between paid search and organic search and a few basic things about improving click-throughs from search pages.
A lot of different factors, from site speed to content quality, are important in SEO. These are, as far as anyone can tell, the factors that search engines use in determining whether or not to show your page to searchers. For a great intro to those elements and how they interact to affect your site’s overall ranking, check out Search Engine Land’s Periodic Table of SEO Success Factors.
That’s all nice, but if SEO is starting to seem like a lot of work, you probably want to understand whether SEO is even worth it. The short answer is that yes, SEO is worth it, but only if you want potential customers to be able to find your site when they’re searching on Google (or any other search engine).
Yes, search engines are crawling your site, but those crawlers aren’t as sophisticated as you might like. SEO gives you more control over how your site is represented in those search engine results pages. Good SEO can also improve how users experience your site. Learn more with Why Search Engine Marketing is Necessary.
Who are these search engines anyway and why do we spend so much time worrying about how they see our sites? To get the best answer, let’s look at that question from two points of view: search engines and searchers.
First, it’s important to understand how search engines crawl sites, build their indexes, and ultimately determine what’s relevant to a user’s query. Some of the specifics are trade secrets, but this section of the Beginner’s Guide to SEO offers a solid overview. And for an introduction to how Google ranks pages, watch this video:
As you’re learning about SEO, remember that not everything you read on the Internet should be treated as gospel. Here are some common myths and misconceptions about search engines.
Understanding how people use search engines is as crucial to SEO as understanding their needs is to marketing. Learn about classic search query patterns and how people scan search results here.
So far we’ve dropped a lot of phrases like “search results” and “search pages,” but what does any of that really mean? Search Engine Land does a great job of decoding the standard search engine results page (SERP). It’s a strong foundation for understanding why everyone is shooting to be in the top ten search results. But one thing you’ll find the more you get into SEO is that SERPs are rapidly evolving. Ads move, knowledge graphs appear (and disappear), and sometimes local search results invade. Dr. Pete takes you on a tour of how SERPs have changed and why ten blue links are probably a thing of the past in this article.
And then there’s the darker side of SEO, because once there’s a system, there’s someone trying to game that system. Spend more than a few minutes talking to anyone about SEO and you’ll hear something or other about black hat tactics like keyword stuffing and unnatural linking.
If you decide to use these tactics, you might soon become acquainted with search engine penalties. These algorithm updates, like Hummingbird and Penguin, are implemented by search engines at various intervals. The official word is that these updates improve user experience, but they can also be effective ways to penalize SEOs using spammy tactics. Learn more about Google’s algorithm updates. That page includes not only a full history of prior penalties, but it’s consistently refreshed when a new algorithm update is confirmed.
SEO veterans, you get to skip ahead of the class now to learn about the current state of page speed, mobile web development, and competitive research along with info on the best tools available today.
As you can see, a lot of work can go into SEO, but the results can be pretty incredible, too. To track your progress in topping the SERPs, make sure you’re using an analytics platform like Google Analytics or Omniture. You can get by with something like Rank Tracker to track rankings on keywords as a start, but eventually you’re going to want some of the data those more sophisticated tools offer.
Brain full? You’ve just learned everything a beginner needs to know about what SEO is. Go take a walk or get some coffee and let all that info soak in.
Before you go, save this bookmark.
First of all, don’t freak out: you don’t have to build a totally new site to get something out of this section. But if you’re an SEO Newbie intent on making a career of this, you might want to set up a practice site to really get your hands dirty and learn everything you can.
Before you start worrying about site content and structure (aka the fun stuff), you have a real chance to set your site up for success by using a strong domain name and developing a URL structure that’s SEO and user friendly. This stuff can be hard to change later when you have hundreds (or thousands) of pages in place, so you’ll be glad you started out on the right foot.
While you’re decades too late to score “buy.com,” it’s never too late to find the right domain name for you. This resource will help you sort through the SEO dos and SEO don’ts of selecting a root domain and TLD (don’t worry, all is explained) that are memorable without being spammy. There’s even info on what to consider if you have to change your domain name.
Don’t skip the section on subdomains—it could save you from making some rookie duplicate content errors.
Oh the SEO havoc that can ensue when your URLs aren’t set up quite right. Learn what not to do.
Things to think about at this point are that your content is indexable (that the crawlers can actually find it) and that you don’t have any orphaned pages. Learn more about those issues here.
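One way to think about the orphaned-page check: model your site as a graph of internal links and see which pages can’t be reached from the homepage. A minimal sketch (the paths and link graph are invented):

```python
from collections import deque

def find_orphans(pages, links, start="/"):
    """Return pages unreachable by following internal links from `start`.
    `links` maps each page to the list of pages it links to."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return sorted(set(pages) - seen)

site = ["/", "/about", "/blog", "/blog/old-post"]
links = {"/": ["/about", "/blog"], "/blog": []}
print(find_orphans(site, links))  # ['/blog/old-post'] -- nothing links to it
```

A real crawler would build `links` by fetching pages and extracting hrefs, but the reachability logic is the same.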
And then you’re going to need a sitemap. Sitemaps help search engines index your content and understand the relationships between pages. So where better to get advice on how to build and implement a sitemap than straight from Google.
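The sitemap format itself is simple XML. A minimal sketch of generating one with Python’s standard library, following the sitemaps.org protocol (the URLs and dates are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol.
    `urls` is a list of (location, last-modified-date) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://example.com/", "2015-08-01"),
    ("https://example.com/about", "2015-07-15"),
])
print(sitemap)
```

Real sitemaps can also carry optional `changefreq` and `priority` elements, and large sites split them into multiple files listed in a sitemap index.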
Another vital way to show search engines which pages are most important and related (and to help humans navigate your content) is through internal links. You want enough links to guide users, but not so many that it’s impossible to tell what really matters. Read more about optimal link structure and passing ranking power.
How long it takes a page on your site to load (page speed) mattered back when we were all using desktops, but it’s crucial now that so much Internet traffic comes from mobile devices; it’s also a factor in how pages get ranked. So whether you’re new to SEO or looking for new tricks, page speed might be a good place to start.
Use Google’s PageSpeed Insights to get specific recommendations on how to speed up your site and then get crackin’.
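PageSpeed Insights tells you what to fix; a complementary habit is enforcing a self-imposed performance budget on your own assets. A minimal sketch (the budget numbers and asset sizes below are invented for illustration, not recommendations):

```python
# Budget: maximum kilobytes allowed per asset category on a page.
BUDGET_KB = {"js": 300, "css": 100, "images": 500}

def over_budget(assets):
    """Sum asset sizes per category and return the categories that exceed
    their budget, mapped to their total size in KB."""
    totals = {}
    for category, size_kb in assets:
        totals[category] = totals.get(category, 0) + size_kb
    return {c: t for c, t in totals.items() if t > BUDGET_KB.get(c, float("inf"))}

assets = [("js", 220), ("js", 150), ("css", 40), ("images", 480)]
print(over_budget(assets))  # {'js': 370} -- 370 KB of JavaScript over a 300 KB budget
```

Running a check like this in your build pipeline catches slow pages before they ship, instead of after rankings slip.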
Speaking of mobile traffic, is your site mobile friendly? Learn about the difference between responsive designs and device-specific solutions on our mobile optimization page. You’ll also see a list of don’ts for mobile design (ever tried to close a pop-up on your iPhone?). This only gets more important the more mobile traffic you get (and want).
Phew! That was a lot of information, but once you’ve absorbed it all, you’ll have an excellent handle on site structure (which will save you a lot of trouble down the line). Bookmark this spot, then take a well-deserved break. We’ll start back here together when you’re ready.
Now that you have that site framework all set up, it’s time to get to the good stuff—populating it with content!
Before you write or post too much of your own content, you might want to see what’s working (and what isn’t) for your competitors. This analysis helps you identify those competitors and then understand what their links, rankings, and keywords look like. It’s important to update this research occasionally because your competition might change over time.
Veteran SEOs, you can skip straight ahead to Schema structured data unless you want a refresh on any other topics related to content.
Marketers, this is your chance to learn all the basics for SEO-friendly content, so stick with us for a spell. You won’t need the same depth of understanding as someone who plans to do SEO for a living, so let your curiosity guide you as deep into any of these topics as you want to go.
You may feel like you just did keyword research in the last step, but it’s crucial enough that we’re going to dive a little deeper here. Understand the value of a particular keyword and see what kind of shot you have at ranking for it by reading Chapter 5 of the Beginner’s Guide to SEO.
We promised you’d get to actually create content and that time is finally here! Now that you have an understanding of the competitive landscape and the keywords you want to (and can) rank for, write away. Remember that while you’re really writing content for users, a few simple tips can help your content stand out to search engines too. Isn’t it nice when something does double duty?
Go the extra mile by incorporating Schema structured data into your content. This additional info gives search engines the data they need to include rich snippets (like review boxes) below your search results.
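As a minimal sketch (the product name and rating values here are invented purely for illustration), review-style structured data is typically added as a JSON-LD block inside the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

With markup like this in place, search engines have the data they need to consider showing a star-rating rich snippet for the page.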
Veteran SEOs, it’s a good idea to skip ahead to on-site related topics now.
Duplicate content is the bane of a website. Even if you think you’ve done everything right with your content, there’s a chance that a dynamic URL or something else is surfacing that same content to crawlers more than once. Not only does Google fail to see the logic in “twice as much is twice as nice” but they might also penalize you for it. Navigate around the most common pitfalls.
Content doesn’t just mean words, but unfortunately, the crawlers aren’t (yet) sophisticated enough to parse things like images and video. If your alt attributes are in good shape, you’re covered for images, but there are some SEO tactics you need to incorporate if you’re using video on your site. The good news is that once your video SEO is in good shape, video content often gets better rankings than text.
So you’ve got all that content on your site, but how do you know if it’s actually helping your SEO? At the beginning is a good time to set yourself up to measure your success so you can establish a baseline. Learn more about what metrics you should be tracking and how.
Time for yet another well-earned break. Grab a nap if you can and then spend a day or so observing how these issues are handled by other sites on the web. For maximum learning, try practicing some of your newfound knowledge on a site you have access to.
Set your bookmarks before you go.
When you’re ready to continue learning SEO, newbies should make a stop at on-site related topics to get familiar with Robots.txt and HTTPS.
Any veterans still hanging about might want to take a quick read through on-site related topics to see what might have changed with Robots.txt and to take in the latest wisdom on HTTPS.
Marketers, you get to sit that one out and head straight on over to link-related topics.
For the true SEO aficionado, there are some technical details that you must get right. We’ve all heard stories of people accidentally blocking their site from being crawled and then wondering where all the traffic is. To keep from being one of these, learn about Robots.txt: how it helps you get found and when blocking robots is not actually effective.
The other technical on-site topic you’ll want to master is the switching of your site from HTTP to HTTPS without slowing down your site or losing traffic. This is especially important since Google announced that HTTPS is a ranking factor.
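For illustration, one common server-level way to handle the HTTP-to-HTTPS move is a single permanent redirect; this sketch assumes an nginx server and a placeholder domain:

```nginx
server {
    listen 80;
    server_name example.com www.example.com;
    # A 301 (permanent) redirect passes most link equity to the HTTPS version
    return 301 https://example.com$request_uri;
}
```

The details vary by server and host, but the principle is the same everywhere: one permanent redirect from every HTTP URL to its exact HTTPS counterpart, not to the homepage.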
See how far you’ve leveled up already by getting current on just those two topics? Bet you aren’t even tired yet.
Newbies, it’s time to dive straight into link-related topics.
Veterans, go check out guest blogging for a look at how that practice has changed.
You now know a lot about how to make your site SEO friendly. Now it’s time to look at how to bend the rest of the Internet to your SEO will. Don’t worry, this’ll be TAGFEE.
External links are a fantastic way to show search engines that your site is credible and useful. They’re also a great way for users to find you by navigating from sites they already use. In short, they build your authority with humans and bots.
There are two effective ways to get more links from external sources: you can either earn them or build them. Chances are that you’ll get the best results by focusing on some combination of those two tactics.
Notice how we didn’t say “buy them”? Don’t buy links.
One tried and true way to build external links is through guest blogging, although this tactic has evolved a lot in the past few years. What used to be an “I give you content, you give me a link” sort of exchange has given way to guest blogging with a purpose.
Veterans, go ahead and pop on over to conversion rate optimization unless you want a refresh on link-related topics like link nofollow and canonicalization.
When you’re out there on the Internet trying to build links, be sure you’re looking for good quality links. Those are links that come from sites that are trustworthy, popular, and relevant to your content. For more information on factors search engines use to determine link value, read this page.
Anchor text is simply the text that’s used in a link whether it’s a link to a site or within that site. The implications of anchor text, though, reach farther because while keywords in anchor text can help your site rank for those words, it’s easy for keyword-stuffed anchor text to look spammy. Learn more about best practices for anchor text.
“Nofollow” is a designation you can apply to a link to keep it from passing any link equity (that’s kind of like the SEO equivalent of an up-vote). What might surprise you is that links don’t need to be “followed” to pass human authority. Even nofollowed links can help you build awareness and get more links. So when you’re linking to a site (or to other content on your site) think about whether that link leads to something you’re proud to be associated with.
Every Internet user eventually encounters a 404 error page, but that’s just one of the many HTTP status codes found on the web. Learn the difference between a 500 and a 503 along with some best practices for 404 pages here.
One of the most useful HTTP status codes for SEOs is the 301 redirect which is used to tell search engines a page has permanently moved elsewhere (and passes a good share of link equity). Gather all the in-depth info you ever needed about 301s and other redirects.
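As a hedged sketch of the mechanics (assuming an Apache server with mod_alias enabled; the paths and domain are placeholders), a 301 and a 302 each take one line of configuration:

```apache
# Permanent move: search engines transfer link equity to the new URL
Redirect 301 /old-page.html https://www.example.com/new-page.html

# Temporary move: engines keep the original URL in the index
Redirect 302 /sale.html https://www.example.com/holiday-sale.html
```

Reach for the 301 whenever the move is for keeps; the 302 tells engines the original URL will be back.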
Perhaps because it’s one of the hardest SEO words to pronounce, canonicalization has a reputation for being complex. But the basic concept is simple: you have two (or more) pages that have similar content and canonicalization allows you to either combine those pages (using redirects) or indicate which version of the page you want search engines to treat as paramount. Read up on the details of using canonicalization to handle duplicate content.
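In practice, the "indicate" option is a single link element in the head of each duplicate page, pointing at the version you want treated as primary (the URL here is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/preferred-page/">
```

Every near-duplicate carries the same tag pointing at the one preferred URL, which also carries a self-referencing copy of the tag.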
You’ve now mastered so much SEO knowledge that you could teach the stuff (at least on a 101 level). If you’ve read and digested all the links along the way, you now know so much more about SEO than when you started.
But you’re so self-motivated that you want to know even more, don’t you?
Newbies, read closely through other optimization to refine your knowledge and apply those newly-minted optimization skills to even more aspects of the sites you’re working on.
Marketers, you’ve done a fabulous job powering through all these topics and there’s no doubt you can hold your own in the next SEO team meeting. To take your understanding of optimization even further, skim other optimization.
Or scoot on ahead and test your skills with the SEO Expert Quiz.
There are many ways (beyond the basic SEO knowledge you’ve been accruing here) to give your site an optimization boost. Find (and fix) what’s keeping potential customers from converting with conversion rate optimization, get your storefronts found on the web with local SEO, and find out how to prep your site to show up in international SERPs with international SEO.
If shoppers are abandoning their carts so fast you’re looking around for the tornado, your marketing funnel is acting more like a sieve and it’s time to plug some holes. Stop the bleeding with Paddy Moogan’s five-step framework for CRO. And keep on learning by keeping up with the latest CRO posts from the Moz Blog.
Even if you do most of your business in person at a local shop, customers are still trying to use the Internet to find you (and your hours, phone number, menu, etc.). Make sure they’re getting the right info (and finding you before they find your competitor across the street) by investing some time learning about local SEO. On that page you can also sign up for the Local 7-Pack, a monthly newsletter highlighting the top local SEO news you need to know. Or, watch for the latest local SEO developments on the Moz Blog.
A global customer base is a good thing to have, but you want to use international SEO to make sure potential customers in the UK are finding your British shipping policies instead of your American ones. Master hreflang to direct Chinese customers to content using simplified Chinese characters while you send Taiwanese customers to content that uses the traditional characters they’re used to. And find out how your site structure and whether you’re using a country code top-level domain (ccTLD) (like “.uk”) affects your SEO and potential ranking in international SERPs.
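For example (placeholder URLs; zh-Hans and zh-Hant are the language codes for simplified and traditional Chinese scripts), the alternate versions of a page can be declared in its head like this:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/us/">
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/">
<link rel="alternate" hreflang="zh-Hans" href="https://example.com/cn/">
<link rel="alternate" hreflang="zh-Hant" href="https://example.com/tw/">
```

Each regional version should list the full set of alternates, including itself, so the annotations confirm each other in both directions.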
SEO newbies, we really can’t call you newbies anymore. Congratulations! No one has read deeper into this blog post or learned more along the way than you have.
SEO veterans, you knew a lot of this already, but now you’re up to date on the latest tips, tricks, and techniques.
And SEO-curious marketers, if you’re still hanging around, bravo! You can safely add “speaks SEO” as a feather in your cap.
You’re all ready to test your skills against the experts and prove just how much you’ve learned. Take the SEO Expert Quiz and brag about your score.
Congratulations! You’re well on your way to SEO mastery. Bask in that glow for a moment or two before moving on to your next project.
The fun thing about a developing field like SEO is that the learning and adventure never end. Whether you’re looking for more advanced knowledge or just to learn in a different format, try Distilled U’s interactive modules or Market Motive’s web-based classes. If you’re looking for a job in SEO, Carl Hendy might just have your roadmap.
Thanks for following along with this choose your own adventure version of how to learn SEO. Share your favorite resources and ask us about any topics we might have missed in the comments.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Posted by randfish
When we’re doing keyword research and targeting, we have a choice to make: Are we targeting broader keywords with multiple potential searcher intents, or are we targeting very narrow keywords where it’s pretty clear what the searchers were looking for? Those different approaches, it turns out, apply to content creation and site architecture, as well. In today’s Whiteboard Friday, Rand illustrates that connection.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about pinpoint versus floodlight tactics for content targeting, content strategy, and keyword research and targeting. This is also called the shotgun versus sniper approach, but I’m not a big gun fan, so I’m going to stick with floodlight versus pinpoint. Plus, you know, for the opening shot we don’t have a whole lot of weaponry here at Moz, but we do have lighting.
So let’s talk through this at first. You’re going through and doing some keyword research. You’re trying to figure out which terms and phrases to target. You might look down a list like this.
Well, maybe, I’m using an example here around antique science equipment. So you see these various terms and phrases. You’ve got your volume numbers. You probably have lots of other columns. Hopefully, you’ve watched the Whiteboard Friday on how to do keyword research like it’s 2015 and not 2010.
So you know you have all these other columns to choose from, but I’m simplifying here for the purpose of this experiment. So you might choose some of these different terms. Now, they’re going to have different kinds of tactics and a different strategic approach, depending on the breadth and depth of the topic that you’re targeting. That’s going to determine what types of content you want to create and where you place it in your information architecture. So I’ll show you what I mean.
For antique science equipment, this is a relatively broad phrase. I’m going to do my floodlight analysis on this, and floodlight analysis is basically saying like, “Okay, are there multiple potential searcher intents?” Yeah, absolutely. That’s a fairly broad phase. People could be looking to transact around it. They might be looking for research information, historical information, different types of scientific equipment that they’re looking for.
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b15fc96679b8.73854740.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
Are there four or more approximately unique keyword terms and phrases to target? Well, absolutely, in fact, there’s probably more than that. So antique science equipment, antique scientific equipment, 18th century scientific equipment, all these different terms and phrases that you might explore there.
Is this a broad content topic with many potential subtopics? Again, yes is the answer to this. Are we talking about generally larger search volume? Again, yes, this is going to have a much larger search volume than some of the narrower terms and phrases. That’s not always the case, but it is here.
For pinpoint analysis, we kind of go the opposite direction. So we might look at a term like antique test tubes, which is a very specific kind of search, and that has a clear single searcher intent or maybe two. Someone might be looking for actually purchasing one of those, or they might be looking to research them and see what kinds there are. Not a ton of additional intents behind that. One to three unique keywords, yeah, probably. It’s pretty specific. Antique test tubes, maybe 19th century test tubes, maybe old science test tubes, but you’re talking about a limited set of keywords that you’re targeting. It’s a narrow content topic, typically smaller search volume.
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b160069eb6b1.12473448.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
Now, these are going to feed into your IA, your information architecture, and your site structure in this way. So floodlight content generally sits higher up. It’s the category or the subcategory, those broad topic terms and phrases. Those are going to turn into those broad topic category pages. Then you might have multiple, narrower subtopics. So we could go into lab equipment versus astronomical equipment versus chemistry equipment, and then we’d get into those individual pinpoints from the pinpoint analysis.
Why are we doing this? Well, generally speaking, if you can take your terms and phrases and categorize them like this and then target them differently, you’re going to provide a better, more logical user experience. Someone who searches for antique scientific equipment, they’re going to really expect to see that category and then to be able to drill down into things. So you’re providing them the experience they predict, the one that they want, the one that they expect.
It’s better for topic modeling analysis and for all of the algorithms around things like Hummingbird, where Google looks at: Are you using the types of terms and phrases, do you have the type of architecture that we expect to find for this keyword?
It’s better for search intent targeting, because the searcher intent is going to be fulfilled if you provide the multiple paths versus the narrow focus. It’s easier keyword targeting for you. You’re going to be able to know, “Hey, I need to target a lot of different terms and phrases and variations in floodlight and one very specific one in pinpoint.”
There’s usually higher searcher satisfaction, which means you get lower bounce rate. You get more engagement. You usually get a higher conversion rate. So it’s good for all those things.
I’ll actually create pages for each of antique scientific equipment and antique test tubes to illustrate this. So I’ve got two different types of pages here. One is my antique scientific equipment page.
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b161fa871e32.54731215.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
This is that floodlight, shotgun approach, and what we’re doing here is going to be very different from a pinpoint approach. It’s looking at like, okay, you’ve landed on antique scientific equipment. Now, where do you want to go? What do you want to specifically explore? So we’re going to have a little bit of content specifically about this topic, and how robust that is depends on the type of topic and the type of site you are.
If this is an e-commerce site or a site that’s showing information about various antiques, well, maybe we don’t need very much content here. You can see the filtration that we’ve got is going to be pretty broad. So I can go into different centuries. I can go into chemistry, astronomy, physics. Maybe I have a safe-for-kids filter if you want to buy your kids antique lab equipment, which you might. Who knows? Maybe you’re awesome and your kids are too. Then different types of stuff at a very broad level. So I can go to microscopes or test tubes, lab searches.
This is great because it’s got broad intent foci, serving many different kinds of searchers with the same page because we don’t know exactly what they want. It’s got multiple keyword targets, so we can go after broad phrases like antique or old or historical or 13th, 14th, whatever century, science and scientific equipment, materials, labs, etc. This is a broad page that could reach any and all of those. Then there’s lots of navigational and refinement options once you get there.
Total opposite of pinpoint content.
<img src="http://d1avok0lzls2w.cloudfront.net/uploads/blog/55b1622740f0b5.73477500.jpg" style="box-shadow: 0 0 10px 0 #999; border-radius: 20px;">
Pinpoint content, like this antique test tubes page, we’re still going to have some filtration options, but one of the important things to note is that these are links that take you deeper. Depending on how deep the search volume goes in terms of the types of queries that people are performing, you might want to make a specific page for 17th century antique test tubes. You might not, and if you don’t want to do that, you can have these be filters that are simply clickable and change the content of the page here, narrowing the options rather than creating completely separate pages.
So if there’s no search volume for these different things and you don’t think you need to separately target them, go ahead and just make them filters on the data that already appears on this page or the results that are already in here as opposed to links that are going to take you deeper into specific content and create a new page, a new experience.
You can also see I’ve got my individual content here. I probably would go ahead and add some content specifically to this page that is just unique here and that describes antique test tubes and the things that your searchers need. They might want to know things about price. They might want to know things about make and model. They might want to know things about what they were used for. Great. You can have that information broadly, and then individual pieces of content that someone might dig into.
This is narrower intent foci obviously, serving maybe one or two searcher intents. This is really talking about targeting maybe one to two separate keywords. So antique test tubes, maybe lab tubes or test tube sets, but not much beyond that.
Then we’re going to have fewer navigational paths, fewer distractions. We want to keep the searcher. Because we know their intent, we want to guide them along the path that we know they probably want to take and that we want them to take.
So when you’re considering your content, choose wisely between shotgun/floodlight approach or sniper/pinpoint approach. Your searchers will be better served. You’ll probably rank better. You’ll be more likely to earn links and amplification. You’re going to be more successful.
Looking forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.
Posted by randfish
When should you disallow search engines in your robots.txt file, and when should you use meta robots tags in a page header? What about nofollowing links? In today’s Whiteboard Friday, Rand covers these tools and their appropriate use in four situations that SEOs commonly find themselves facing.
Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about controlling search engine crawlers, blocking bots, sending bots where we want, restricting them from where we don’t want them to go. We’re going to talk a little bit about crawl budget and what you should and shouldn’t have indexed.
As a start, what I want to do is discuss the ways in which we can control robots. Those include the three primary ones: robots.txt, meta robots, and—well, the nofollow tag is a little bit less about controlling bots.
There are a few others that we’re going to discuss as well, including Webmaster Tools (Search Console) and URL status codes. But let’s dive into those first few first.
Robots.txt lives at yoursite.com/robots.txt. It tells crawlers what they should and shouldn’t access, but it doesn’t always get respected by Google and Bing. So a lot of folks, when you say, “Hey, disallow this,” and then you suddenly see those URLs popping up in the results, wonder what’s going on. Look, Google and Bing oftentimes think they just know better. They think that maybe you’ve made a mistake. They think, “Hey, there’s a lot of links pointing to this content, there’s a lot of people visiting and caring about this content, maybe you didn’t intend for us to block it.” The more specific you get about an individual URL, the better they usually are about respecting it. The less specific, meaning the more you use wildcards or say “everything behind this entire big directory,” the worse they are about necessarily believing you.
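If you want to sanity-check how a given rule set is interpreted, Python’s standard-library robots.txt parser is handy; this sketch (the rules and URLs are invented for illustration) shows a simple directory disallow in action:

```python
import urllib.robotparser

# Parse an in-memory robots.txt instead of fetching a live one
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Path under the disallowed directory: blocked
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False

# Everything else: allowed
print(rp.can_fetch("*", "https://example.com/index.html"))  # True
```

Note this only tells you what a well-behaved parser concludes from the file; as discussed above, it can’t tell you whether an engine will choose to honor it.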
Meta robots—a little different—that lives in the headers of individual pages, so you can only control a single page with a meta robots tag. That tells the engines whether or not they should keep a page in the index, and whether they should follow the links on that page, and it’s usually a lot more respected, because it’s at an individual-page level; Google and Bing tend to believe you about the meta robots tag.
And then the nofollow tag, that lives on an individual link on a page. It doesn’t tell engines where to crawl or not to crawl. All it’s saying is whether you editorially vouch for a page that is being linked to, and whether you want to pass the PageRank and link equity metrics to that page.
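To make that concrete, here is what a plain link and a nofollowed link look like in markup (the URLs are invented for the example):

```html
<!-- Followed link: an editorial vouch that passes link equity -->
<a href="https://example.com/great-resource/">Great resource</a>

<!-- Nofollowed link: asks engines not to pass link equity -->
<a href="https://example.com/untrusted/" rel="nofollow">Untrusted source</a>
```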
Interesting point about meta robots and robots.txt working together (or not working together so well)—many, many folks in the SEO world do this and then get frustrated.
What if, for example, we take a page like “blogtest.html” on our domain and we say, “All user agents, you are not allowed to crawl blogtest.html”? Okay, that’s a good way to keep that page from being crawled, but just because something is not crawled doesn’t necessarily mean it won’t be in the search results.
So then we have our SEO folks go, “you know what, let’s make doubly sure that doesn’t show up in search results; we’ll put in the meta robots tag:”
<meta name="robots" content="noindex, follow">
So, “noindex, follow” tells the search engine crawler they can follow the links on the page, but they shouldn’t index this particular one.
Then, you go and run a search for “blog test” in this case, and everybody on the team’s like “What the heck!? WTF? Why am I seeing this page show up in search results?”
The answer is, you told the engines that they couldn’t crawl the page, so they didn’t. But they are still putting it in the results. They’re actually probably not going to include a meta description; they might have something like “we can’t include a meta description because of this site’s robots.txt file.” The reason it’s showing up is because they can’t see the noindex; all they see is the disallow.
So, if you want something truly removed, unable to be seen in search results, you can’t just disallow a crawler. You have to say meta “noindex” and you have to let them crawl it.
So this creates some complications. Robots.txt can be great if we’re trying to save crawl bandwidth, but it isn’t necessarily ideal for preventing a page from being shown in the search results. I would not recommend, by the way, that you do what we think Twitter recently tried to do, where they tried to canonicalize www and non-www by saying “Google, don’t crawl the www version of twitter.com.” What you should be doing is rel canonical-ing or using a 301.
Meta robots can allow crawling and link-following while disallowing indexation, which is great, but it still consumes crawl budget, because the page has to be crawled for the tag to be seen. What it conserves is space in the index.
The nofollow tag, generally speaking, is not particularly useful for controlling bots or conserving indexation.
Webmaster Tools (now Google Search Console) has some special things that allow you to restrict access or remove a result from the search results. For example, if you have 404’d something or if you’ve told them not to crawl something but it’s still showing up in there, you can manually say “don’t do that.” There are a few other crawl protocol things that you can do.
And then URL status codes—these are a valid way to do things, but they’re going to obviously change what’s going on on your pages, too.
If you’re not having a lot of luck using a 404 to remove something, you can use a 410 to permanently remove something from the index. Just be aware that once you use a 410, it can take a long time if you want to get that page re-crawled or re-indexed, and you want to tell the search engines “it’s back!” 410 is permanent removal.
301—permanent redirect, we’ve talked about those here—and 302, temporary redirect.
Now let’s jump into a few specific use cases of “what kinds of content should and shouldn’t I allow engines to crawl and index” in this next version…
[Rand moves at superhuman speed to erase the board and draw part two of this Whiteboard Friday. Seriously, we showed Roger how fast it was, and even he was impressed.]
So we’ve got these four big problems that I want to talk about as they relate to crawling and indexing.
The first one here is around, “If I have content of quality I’m still trying to improve—it’s not yet ready for primetime, it’s not ready for Google, maybe I have a bunch of products and I only have the descriptions from the manufacturer and I need people to be able to access them, so I’m rewriting the content and creating unique value on those pages… they’re just not ready yet—what should I do with those?”
My options around crawling and indexing? If I have a large quantity of those—maybe thousands, tens of thousands, hundreds of thousands—I would probably go the robots.txt route. I’d disallow those pages from being crawled, and then eventually as I get (folder by folder) those sets of URLs ready, I can then allow crawling and maybe even submit them to Google via an XML sitemap.
If I’m talking about a small quantity—a few dozen, a few hundred pages—well, I’d probably just use the meta robots noindex, and then I’d pull that noindex off of those pages as they are made ready for Google’s consumption. And then again, I would probably use the XML sitemap and start submitting those once they’re ready.
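A sketch of that folder-by-folder approach (the directory names are invented): disallow the not-yet-ready sections in robots.txt, then remove each rule as that folder’s content is finished:

```text
# robots.txt at yoursite.com/robots.txt
User-agent: *
Disallow: /rewrites-in-progress/
Disallow: /manufacturer-copy/

# As each folder's content becomes unique and ready,
# delete its Disallow line and submit those URLs in an XML sitemap.
```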
What about, “Should I noindex, nofollow, or potentially disallow crawling on largely duplicate URLs or thin content?” I’ve got an example. Let’s say I’m an ecommerce shop, I’m selling this nice Star Wars t-shirt which I think is kind of hilarious, so I’ve got starwarsshirt.html, and it links out to a larger version of an image, and that’s an individual HTML page. It links out to different colors, which change the URL of the page, so I have a gray, blue, and black version. Well, these four pages are really all part of this same one, so I wouldn’t recommend disallowing crawling on these, and I wouldn’t recommend noindexing them. What I would do there is a rel canonical.
Remember, rel canonical is one of those things that can be precluded by disallowing. So, if I were to disallow these from being crawled, Google couldn’t see the rel canonical back, so if someone linked to the blue version instead of the default version, now I potentially don’t get link credit for that. So what I really want to do is use the rel canonical, allow the indexing, and allow it to be crawled. If you really feel like it, you could also put a meta “noindex, follow” on these pages, but I don’t really think that’s necessary, and again that might interfere with the rel canonical.
Number three: “If I want to pass link equity (or at least crawling) through a set of pages without those pages actually appearing in search results—so maybe I have navigational stuff, ways that humans are going to navigate through my pages, but I don’t need those appearing in search results—what should I use then?”
What I would say here is, you can use the meta robots to say “don’t index the page, but do follow the links that are on that page.” That’s a pretty nice, handy use case for that.
Do NOT, however, disallow those in robots.txt; many, many folks make this mistake. If you disallow crawling on those pages, Google can’t see the noindex, and they don’t know that they can follow the links. Granted, as we talked about before, sometimes Google doesn’t obey robots.txt, but you can’t rely on that behavior. Assume the disallow will prevent them from crawling, which means they will never see your meta tag at all. So I would say the meta robots “noindex, follow” is the way to do this.
Finally, fourth, “What should I do with search results-type pages?” Google has said many times that they don’t like your search results from your own internal engine appearing in their search results, and so this can be a tricky use case.
Sometimes a search result page—a page that lists many types of results that might come from a database of types of content that you’ve got on your site—could actually be a very good result for a searcher who is looking for a wide variety of content, or who wants to see what you have on offer. Yelp does this: When you say, “I’m looking for restaurants in Seattle, WA,” they’ll give you what is essentially a list of search results, and Google does want those to appear because that page provides a great result. But you should be doing what Yelp does there, and make the most common or popular individual sets of those search results into category-style pages. A page that provides real, unique value, that’s not just a list of search results, that is more of a landing page than a search results page.
However, that being said, if you’ve got a long tail of these, or if you’d say “hey, our internal search engine, that’s really for internal visitors only—it’s not useful to have those pages show up in search results, and we don’t think we need to make the effort to make those into category landing pages.” Then you can use the disallow in robots.txt to prevent those.
Just be cautious here, because I have sometimes seen an over-swinging of the pendulum toward blocking all types of search results, and sometimes that can actually hurt your SEO and your traffic. Sometimes those pages can be really useful to people. So check your analytics, and make sure those aren’t valuable pages that should be served up and turned into landing pages. If you’re sure, then go ahead and disallow all your search results-style pages. You’ll see a lot of sites doing this in their robots.txt file.
That being said, I hope you have some great questions about crawling and indexing, controlling robots, blocking robots, allowing robots, and I’ll try and tackle those in the comments below.
We’ll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care!
Posted by EricEnge
Today’s post focuses on a vision for your online presence. This vision outlines what it takes to be the best, both from an overall reputation and visibility standpoint, as well as an SEO point of view. The reason these are tied together is simple: Your overall online reputation and visibility is a huge factor in your SEO. Period. Let’s start by talking about why.
For purposes of this post, let’s define three cornerstone ranking signals that most everyone agrees on:
Links remain a huge factor in overall ranking. Both Cyrus Shepard and Marcus Tober re-confirmed this on the Periodic Table of SEO Ranking Factors session at the SMX Advanced conference in Seattle this past June.
On-page content remains a huge factor too, but with some subtleties now thrown in. I wrote about some of this in earlier posts I did on Moz about Term Frequency and Inverse Document Frequency. Suffice it to say that on-page content is about a lot more than pure words on the page, but also includes the supporting pages that you link to.
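As a rough sketch of the underlying idea (not Google’s actual implementation), TF-IDF scores a term highly when it is frequent within one document but rare across the corpus; the toy corpus below is invented for illustration:

```python
import math

def tf_idf(term, doc, corpus):
    # Term frequency: share of this document's words that are the term
    tf = doc.count(term) / len(doc)
    # Inverse document frequency: rarer terms across the corpus weigh more
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / docs_with_term)
    return tf * idf

corpus = [
    ["oil", "filter", "wrench", "oil"],
    ["oil", "change", "guide"],
    ["windshield", "wipers"],
]

# "filter" appears in only one document, so it carries more signal for
# that page than "oil", which appears in two.
print(tf_idf("filter", corpus[0], corpus))
print(tf_idf("oil", corpus[0], corpus))
```

The takeaway for on-page content is the same as in the text: covering a topic with related, supporting terms matters more than repeating one keyword.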
User engagement is not one of the traditional SEO signals from the early days of SEO, but most advanced SEO pros that I know consider it a real factor these days. One of the most popular concepts people talk about is called pogo-sticking, which is illustrated here:
You can learn more about the pogo-sticking concept by visiting this Whiteboard Friday video by a rookie SEO with a last name of Fishkin.
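To make the concept concrete, here is a toy sketch of how one might quantify pogo-sticking from click logs; the 10-second threshold and the data are invented for illustration, not anything the search engines have published:

```python
def pogo_stick_rate(dwell_times, threshold_seconds=10):
    """Fraction of visits where the user bounced back to the search
    results within threshold_seconds (a 'pogo-stick')."""
    quick_returns = sum(1 for t in dwell_times if t < threshold_seconds)
    return quick_returns / len(dwell_times)

# Seconds each visitor spent on the page before returning to the SERP
dwell_times = [3, 45, 5, 120, 8, 200]
print(pogo_stick_rate(dwell_times))  # 3 of the 6 visits were quick returns
```

A page with a high rate like this is telling the engine that searchers did not find what they wanted there.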
OK, so these are the more obvious signals, but now let’s look more broadly at the overall web ecosystem and talk about other types of ranking signals. Be warned that some of these signals may be indirect, but that just doesn’t matter. In fact, my first example below is an indirect factor which I will use to demonstrate why whether a signal is direct or indirect is not an issue at all.
Let me illustrate with an example. Say you spend $1 billion building a huge brand around a product that is massively useful to people. Included in this is a sizable $100 million campaign to support a highly popular charitable foundation, and your employees regularly donate time to help out in schools across your country. In short, the great majority of people love your brand.
Do you think this will impact the way people link to your site? Of course it does. Do you think it will impact how likely people are to be satisfied with the quality of the pages of your site? Consider this A/B test scenario of two pages from different “brands” (for the one on the left, imagine the image of Coca Cola or Pepsi Cola, whichever one you prefer):
Do you think that the huge brand will get the benefit of the doubt on their page that the no-name brand does not, even though the pages are identical? Of course they will. Now let’s look at some simpler scenarios that don’t involve a $1 billion investment.
Imagine that a user arrives on your auto parts site after searching on the phrase “oil filter” at Google or Bing. Chances are pretty good that they want an oil filter, but here are some other items they may also want:
This is just the basics, right? But you’d be surprised how many sites don’t include links or information on directly related products on their money pages. Providing this type of smart site and page design can have a major impact on user engagement with the money pages of your site.
In the prior item we covered the user’s most directly related needs, but they may have secondary needs as well. Someone who is changing a car’s oil is either a mechanic or a do-it-yourselfer. What else might they need? How about other parts, such as windshield wipers or air filters?
These are other fairly easy maintenance steps for someone who is working on their car to complete. Presence of these supporting products could be one way to improve user engagement with your pages.
Publishing world-class content on your site is a great way to produce links to your site. Of course, if you do this on a blog on your site, it may not provide links directly to your money pages, but it will nonetheless lift overall site authority.
In addition, if someone has consumed one or more pieces of great content on your site, the chance of their engaging in a more positive manner with your site overall goes way up. Why? Because you’ve earned their trust and admiration.
Are there major media sites that cover your market space? Do they consider you to be an expert? Will they quote you in articles they write? Can you provide them with guest posts, or will they let you be a guest columnist? Will they collaborate on larger content projects with you?
All of these activities put you in front of their audiences, and if those audiences overlap with yours, this provides a great way to build your overall reputation and visibility. The content that you publish or collaborate on that shows up on third-party sites will get you mentions and links. In addition, once again, it will provide you with a boost to your branding. People are now more likely to consume your other content, including on your money pages.
The concept here shares much in common with the prior point. Social media provides opportunities to get in front of relevant audiences. Every person who is an avid follower of yours on a social media site is likely to behave very differently when interacting with your site than someone who does not know you at all.
Note that links from social media sites are nofollowed, but active social media behavior can lead to people implementing “real world” links to your site that are followed, from their blogs and media websites.
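For reference, the difference between the two is just a rel attribute on the link (example.com is a placeholder):

```html
<!-- A link from a social media profile: typically nofollowed -->
<a href="https://example.com/" rel="nofollow">My site</a>

<!-- An earned editorial link from a blog: followed by default -->
<a href="https://example.com/">My site</a>
```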
Think your offline activity doesn’t matter online? Think again. Relationships are still most easily built face-to-face. People you meet and spend time with can well become your most loyal fans online. This is particularly important when it comes to building relationships with influential people.
One great way to do that is to go to public events related to your industry, such as conferences. Better still, obtain speaking engagements at those conferences. This can even impact people who weren’t there to hear you speak, as they become aware that you have been asked to do that. This concept can also work for a small local business. Get out in your community and engage with people at local events.
The payoff here is similar to the payoff for other items: more engaged, highly loyal fans who engage with you across the web, sending more and more positive signals, both to other people and to search engines, that you are the real deal.
Whatever your business may be, you need to take care of your customers as best you can. No one can make everyone happy; that’s unrealistic. But striving for much better than average is a really sound idea. Having satisfied customers saying nice things about you online is a big-impact item in the grand scheme of things.
While this post is not about the value of influencer relationships, I include this in the list for illustration purposes, for two reasons:
The web provides a level of integrated, real-time connectivity of a kind that the world has never seen before. This is only going to increase. Do something bad to a customer in Hong Kong? Consumers in Boston will know within 5 minutes. That’s where it’s all headed.
Google and Bing (and any future search engine that may emerge) want to measure these types of signals because they show them how to improve the quality of the experience on their platforms. There are many ways they can perform these measurements.
One simple concept is covered by Rand in this recent Whiteboard Friday video. The discussion is about a recent patent granted to Google that shows how the company can use search queries to detect who is an authority on a topic.
The example he provides is about people who search on “email finding tool”. If Google also finds that a number of people search on “voila norbert email tool”, Google may use that as an authority signal.
Think about that for a moment. How are you going to get people to search on your brand more while putting it together with a non-branded query like that? (OK, please leave Mechanical Turk and other services like that out of the discussion.)
Now you can start to see the bigger picture. Measurements like pogosticking and this recent search behavior related patent are just the tip of the iceberg. Undoubtedly, there are many other ways that search engines can measure what people like and engage with the most.
This is all part of SEO now: UX, product breadth, problem solving, engaging in social media, getting face to face, creating great content that you publish in front of other people’s audiences, and more.
For the small local business, you can still win at this game, as your focus just needs to be on doing it better than your competitors. The big brands will never be hyper-local like you are, so don’t think you can’t play the game, because you can.
Whoever you are, get ready, because this new integrated ecosystem is already upon us, and you need to be a part of it.
Posted by EricEnge
Editor’s note: Today we’re featuring back-to-back episodes of Whiteboard Friday from our friends at Stone Temple Consulting. Make sure to also check out the first episode, “Becoming Better SEO Scientists” from Mark Traphagen.
User experience and the quality of your content have an incredibly broad impact on your SEO efforts. In this episode of Whiteboard Friday, Stone Temple’s Eric Enge shows you how paying attention to your users can benefit your position in the SERPs.
Hi, Mozzers. I’m Eric Enge, CEO of Stone Temple Consulting. Today I want to talk to you about one of the most underappreciated aspects of SEO, and that is the interaction between user experience, content quality, and your SEO rankings and traffic.
I’m going to take you through a little history first. You know, we all know about the Panda algorithm update that came out on February 23, 2011, and of course more recently we have the search quality update that came out on May 19, 2015. Our Panda friend had 27 different updates that we know of along the way. So a lot of stuff has gone on, but we need to realize that that is not where it all started.
The link algorithm from the very beginning was about search quality. Links allowed Google to have an algorithm that gave better results than the other search engines of their day, which were dependent on keywords. These things I’ve just talked about, however, are still just the tip of the iceberg. Google goes a lot deeper than that, and I want to walk you through the different things that it does.
So consider for a moment, you have someone search on the phrase “men’s shoes” and they come to your website.
What is it that they want when they come to your website? Do they want sneakers, sandals, dress shoes? Well, those are sort of the obvious things that they might want. But you need to think a little bit more about what the user really wants to know before they buy from you.
So as we think about this, what is it that we can do to do a better job with our websites? Well, first of all, lose the focus on keywords. Don’t get me wrong, keywords haven’t gone entirely away. But the pages where we overemphasize one particular keyword over another or related phrases are long gone, and you need to have a broader focus on how you approach things.
User experience is now a big deal. You really need to think about how users are interacting with your page and how that reflects your overall page quality. Think about the percent satisfaction: If I send a hundred users to your page from my search engine, how many of those users are going to be happy with the content, or the products, or everything that they see on your page? You need to think through the big picture. So at the end of the day, this impacts the content on your page to be sure, but more than that, it impacts the design and the related items that you have on the page.
So let me just give you an example of that. I looked at one page recently that was for a flower site. It was a page about annuals on that site, and that page had no link to their perennials page. Well, okay, a fairly good percentage of people who arrive on a page about annuals are also going to want to have perennials as something they might consider buying. So that page was probably coming across as a poor user experience. So these related items concepts are incredibly important.
Then the links on your page are actually a way to get to some of those related items, and so those are really important as well. What are the related products that you link to?
Finally, it really impacts everything you do with your page design. You need to move past the old-fashioned way of thinking about SEO and into the era of: How am I doing at satisfying all the people who come to the pages of my site?
Thank you, Mozzers. Have a great day.