Posted by MiriamEllis
When America’s first star TV chef, Julia Child, demonstrated the use of a wire whisk on her 1960s cooking show, the city of Pittsburgh sold out of them. Pennsylvanians may well have owned a few of these implements prior to the show’s air date, but probably didn’t spend a lot of time thinking about them. After the show, however, wire whisks were on everyone’s mind and they simply had to have one. Call it a retro micro-moment, and imagine consumers jamming the lines of rotary phones or hoofing it around town in quest of this gleaming gadget … then zoom up to the present and see us all on our mobile devices.
I like this anecdote from the pages of culinary history because it encapsulates all four of Google’s stated core micro-moments:
I want to know – Consumers were watching a local broadcast of this show in Pittsburgh because they wanted to know how to make an omelet.
I want to go – Consumers then scoured the city in search of the proper whisk.
I want to buy – Consumers then purchased the implement at a chosen retailer.
I want to do – And finally, consumers either referred to the notes they had taken during the show (no DVRs back then) or might have turned to Julia Child’s cookbook to actually whip up their first-ever omelet.
Not only does the wire whisk story foreshadow the modern micro-moment, it also provides a roadmap for tying each of the four stages to local SEO via current technology. I’ve seen other bloggers pointing to the ‘I want to go’ phase as inherently local, but in this post, I want to demonstrate how your local business can decisively claim all four of these micro-moments as your own, and capture the desirable transactions that result!
Google whisked up some excitement of their own with the publication of Micro-Moments: Your Guide to Winning the Shift to Mobile. Some of the statistics in the piece are stunning:
Google defines micro-moments as “critical touch points within today’s consumer journey, and when added together, they ultimately determine how that journey ends,” and goes on to identify mobile as the great facilitator of all this activity. It’s simple to think of micro-moments as a series of points in time that culminate in a consumer arriving at a transactional decision. For local business owners and their marketers, the goal is to ‘be there’ for the consumer at each of these critical points with the resources you have developed on the web.
Let’s reverse-engineer the famous tale of the wire whisk and put it into a modern technological context, demonstrating how a hypothetical cooking supply store in Pittsburgh, PA could become a major micro-moments winner in 2017.
I want to be sure to preface this with one very important proviso about the order in which micro-moments happen: it varies.
For example, a consumer might decide she wants to patch cracks in her ceiling so she watches a video on YouTube demoing this >>> looks up the name of the putty the YouTube personality was using >>> looks up where to buy that putty locally >>> buys it. Or, the consumer could already be inside a home improvement store, see putty, realize she’d like to patch cracks, then look up reviews of various putty brands, look at a video to see how difficult the task is, and finally, purchase.
There is no set order in which micro-moments occur, and though there may be patterns specific to auto body shops or insurance firms, the idea is to be present at every possible moment in time so that the consumer is assisted, regardless of the order in which they discover and act. What I’m presenting here is just one possible path.
Our consumer is a 30-year-old man named Walter who loves the fluffy omelets served at a fancy bistro in Pittsburgh. One morning while at the restaurant, Walter asks himself,
“I wonder why I can’t make omelets as fluffy as these at home. I’m not a bad cook. There must be some secret to it. Hey — I challenge myself to find out what that secret is!”
While walking back to his car, Walter pulls out his smartphone and begins his micro-moment journey with his I-want-to-know query: how to make a fluffier omelet.
Across town, Patricia, the owner of a franchise location of Soup’s On Cooking Supply, has anticipated Walter’s defining moment because she has been studying her website analytics, exploring question research tools like Answer The Public, watching Google Trends, and looking at Q&A sites like this one where people are already searching for answers to the secret of fluffy omelets. She also has her staff actively cataloging common in-store questions. The data gathered has convinced her to make these efforts:
Walking down the street, Walter discovers and watches the video on YouTube. He notices the Soup’s On Cooking Supply branding on the video, even though there was no hard-sell in its content — just really good tips for omelet fluffiness.
“Soup’s On near me,” Walter asks his mobile phone, not 100% sure this chain has an outlet in Pittsburgh. He’s having his I-Want-To-Go moment.
Again, Patricia has anticipated this need and prevented customer loss by:
Walter keys the ignition.
Walter arrives safely at the retail location. You’d think he might put his phone away, but being like 87% of millennials, he keeps it at his side day and night and, like 91% of his compadres, he turns to it mid-task. The store clerk has shown him where the wire whisks and pans are stocked, but Walter is not convinced that he can trust what the video claimed about their quality. He’d like to see a comparison.
Fortunately, Patricia is a Moz Whiteboard Friday fan and took Rand’s advice about comprehensive content and 10x content to heart. Her website’s product comparison charts go to great lengths, weighing USA-made kitchen products against German ones, Lodgeware vs. Le Creuset, in terms of price, performance for specific cooking tasks, and quality. They’re ranking very well.
Walter is feeling more informed now, while being kept inside the company’s own website, but the I-Want-To-Buy micro-moment is cemented when he sees:
The next day, Walter is ready to make his first fluffier omelet. Because he’s already been exposed to Patricia’s article on the Soup’s On Cooking Supply website, he can easily return to it now to re-watch the video and follow the recipe provided. Even in the I-want-to-do phase, Walter is being assisted by the brand, and this multi-part experience he’s now had with the company should go far towards cementing it in his memory as a go-to resource for all of his future culinary needs.
It would be excellent if the website’s page on fluffy omelets also challenged Walter to use his new whisk for creating other dishes — perhaps soufflés (for which he’ll need a ceramic ramekin) or chantilly cream (a nice glass bowl set over ice water helps). Walter may find himself wanting to do all kinds of new things, and he now knows exactly where he can find helpful tutorials and purchase the necessary equipment.
As we’ve seen, it’s completely possible for a local business to own all four of Google’s attested micro-moments. What I can’t cover with a single scenario is all of the variables that might apply to a given geography or industry, but I do want to at least make mention of these three points that should be applicable to most local businesses:
The origins of both I-want-to-do and I-want-to-know moments are incredibly varied. A consumer need can arise from something really practical, as in, it’s winter again and I need to buy snow tires. Or, there can be public/cultural happenings (like Julia Child’s cooking program) to which consumers’ ultimate transactions can be directly traced. To discover the sparks that ignite your specific customers’ micro-moment fires, I recommend delving further into the topic of barnacle local SEO — the process of latching onto existing influences in your community in order to speak to existing wishes and needs.
Google states that 29% of smartphone users will immediately navigate away from any website or app that doesn’t satisfy them. 70% of these cite slow loading and 67% cite too many steps to reach information or purchase as reasons for dissatisfaction. On November 4, 2016, Google announced its major shift toward mobile-first indexing, signaling to all website publishers that Google sees mobile, rather than desktop, as the primary platform now.
Google’s statistics and policies make it irrefutable that every competitive local business which hasn’t yet done so must now devote appropriate funds to creating the best possible mobile user experience. Failure to do so risks reputation, rankings, and revenue.
Though my story of Walter touches briefly on the resources Patricia had built for his in-store experience, I didn’t delve into the skyrocketing technology constantly being pioneered around this micro-moment phase. This would include beacons, though they have so far failed to live up to earlier hype in some ways. It could involve the development of in-store apps. And, at the highest echelons of commerce, it could include kiosks, augmented reality, and virtual reality.
KFC may strive to master I-want-to-buy moments with chicken-serving robots, Amazon Go may see micro-moments in checkout-free shopping, and Google Home’s giant, listening ear may be turning whole lives into a series of documented micro-moments, but what makes sense for your local business?
The answer to this is going to be dictated by the competitiveness of your industry and the needs of your consumer base. Does a rural, independently owned hardware store really need a 6-foot-high in-store touch screen enabling customers to virtually paint their houses? Probably not, but a well-written comparison of non-toxic paint brands the shop carries and why they’re desirable for health reasons could transform a small town’s decorating habits. Meanwhile, in more competitive markets, each local brand would be wise to invest in new technology only where it really makes proven sense, and not just because it’s the next big thing.
Our industry loves new technology to a degree that can verge on the overwhelming for striving local business owners, and while it can genuinely be a bit daunting to sink your teeth into all of the variables of winning the micro-moment journey, take heart. Julia Child sold Pittsburgh out of wire whisks with a shoestring, black-and-white PBS program on which she frequently dropped implements on the floor and sent egg beaters flying across rooms.
With our modern capabilities of surveying and mining consumers’ needs and presenting useful solutions via the instant medium of the web, what can’t you do? The steps in the micro-moments funnel are as old as commerce itself. Simply seize the current available technology … and get cooking!
Google has said that it will begin to punish sites that display interstitials or pop-ups that obscure indexed content. The change isn’t due to come into play until January 2017, but we wanted to take the opportunity to explore the extent of the new rules and their possible impact.
We know from Google’s past algorithm updates that the focus has turned to the ever-increasing number of mobile users. One change Google said it was “experimenting” with as a ranking signal was mobile-friendly design. The company added a ‘Mobile-friendly’ label, which appeared in the search results when a site conformed to its criteria – such as using text that’s readable without zooming, sizing content to the screen, or avoiding software like Flash.
It’s clear, then, that multiple factors feed into the way Google rates websites on mobile experience – so how much weighting will it be applying to those using pop-ups or interstitials? We won’t know until it happens, but we can speculate.
How do people browse the web on mobile?
Let’s think about people’s usage and habits when it comes to browsing the web on a mobile device.
Unlike on a laptop or a desktop, those searching the web on their mobile tend to be picking up the device to look up something specific. These searches will often be long-tail keywords which draw up deeper pages from a site, and this is where brands need to be careful. Popovers featured on these detail pages, which distract from the main content, can be a barrier to conversion and lead to bounces.
Rather, marketers need to be selective about how they use pop-ups and take a more considered approach when it comes to the user experience.
What constitutes a bad UX?
No one wants to create a bad user experience, because it can be detrimental to credibility, performance, and conversions. However, if companies achieve good response rates to newsletter sign-up popovers, you could argue that they aren’t providing a negative web experience and, in fact, that it would be wrong to penalize them.
With the right tool, brands can also be cleverer about when, where and how popovers appear. If a company is trying to collect a steady stream of email addresses from new website visitors, it might make sense to host the popover somewhere on the homepage. After all, the homepage is your company’s shop window and its purpose is to lure people in.
It would also be wise to consider when it pops up. In order not to disrupt the journey and experience, you would want to prevent the popover from appearing immediately. And, of course, you would also want to prevent the pop-up from appearing on the next visit if the user had either signed up or dismissed it.
Will it or will it not?
Let’s remember that the new signal in Google’s algorithm is just one of hundreds of signals that are used to determine rankings – so popovers could make up a small percentage of the overall score. What we take from it all is: if a page’s content is relevant, gets lots of clicks and has a decent dwell time, it may still rank highly (in fact, read the official Google blog post). If a popover is enhancing the experience by giving users another way to consume similar content, and there is positive uptake, we don’t see the harm.
Wondering what’s up with local search rankings lately? Columnist Joy Hawkins has the scoop on a recent local algorithm update that local SEO experts are calling ‘Possum.’
Posted by Cyrus-Shepard
Recently, Moz announced the results of our biennial Ranking Factors study. Today, we’d like to explore one of the most vital elements of the study: the Ranking Factors survey.
2015 Ranking Factors Expert Survey
Every two years, Moz surveys the brightest minds in SEO and search marketing with a comprehensive set of questions meant to gauge the current workings of Google’s search algorithm. This year’s panel of experts possesses a truly unique set of knowledge and perspectives. We’re thankful on behalf of the entire community for their contribution.
In addition to asking the participants about what does and doesn’t work in Google’s ranking algorithm today, one of the most illuminating groups of questions asks the panel to predict the future of search – how the features of Google’s algorithm are expected to change over the next 12 months.
Amazingly, almost all of the factors expected to increase in influence revolve around user experience, including:
The experts predicted that more traditional ranking signals, such as those around links and URL structures, would largely remain the same, while the more manipulative aspects of SEO, like paid links and anchor text (which is subject to manipulation), would largely decrease in influence.
The survey also asks respondents to weight the importance of various factors within Google’s current ranking algorithm (on a scale of 1-10). Understanding these areas of importance helps to inform webmasters and marketers where to invest time and energy in working to improve the search presence of their websites.
These features describe use of the keyword term/phrase in particular parts of the HTML code on the page (title element, H1s, alt attributes, etc).
Highest influence: Keyword present in title element, 8.34
Lowest influence: Keyword present in specific HTML elements (bold/italic/li/a/etc), 4.16
> Titles are still very powerful. Overall, it’s about focus and matching query syntax. If your post is about airplane propellers but you go on a three paragraph rant about gorillas, you’re going to have a problem ranking for airplane propellers.

> Keyword usage is vital to making the cut, but we don’t always see it correlate with ranking, because we’re only looking at what already made the cut. The page has to be relevant to appear for a query, IMO, but when it comes to how high the page ranks once it’s relevant, I think keywords have less impact than they once did. So, it’s a necessary but not sufficient condition to ranking.

> In my experience, most problems with organic visibility are related to on-page factors. When I look for an opportunity, I try to check for two strong things: presence of the keyword in the title and in the main content. Having both can speed up your visibility, especially on long-tail queries.
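Since the panel weights keyword presence in the title element and main content so heavily, a quick programmatic spot-check can be useful. Here is a minimal sketch, assuming the third-party `requests` and `beautifulsoup4` packages are installed; the URL and keyword are placeholders, not part of the survey.

```python
# Minimal on-page keyword check: does the keyword appear in the
# title element, the first H1, and the body text of a page?
import requests
from bs4 import BeautifulSoup

def keyword_audit(url, keyword):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True).lower() if soup.title else ""
    h1 = soup.h1.get_text(strip=True).lower() if soup.h1 else ""
    body = soup.get_text(" ", strip=True).lower()

    kw = keyword.lower()
    return {
        "keyword_in_title": kw in title,
        "keyword_in_h1": kw in h1,
        "keyword_in_body": kw in body,
    }

# Placeholder URL and keyword for illustration only:
print(keyword_audit("https://example.com/airplane-propellers", "airplane propellers"))
```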
These features cover how keywords are used in the root or subdomain name, and how much impact this might have on search engine rankings.
Highest influence: Keyword is the exact match root domain name, 5.83
Lowest influence: Keyword is the domain extension, 2.55
> The only domain/keyword factor I’ve seen really influence rankings is an exact match. Subdomains, partial match, and others appear to have little or no effect.

> There’s no direct influence, but an exact match root domain name can definitely lead to a higher CTR within the SERPs and therefore a better ranking in the long term.

> It’s very easy to link keyword-rich domains with their success in Google’s results for the given keyword. I’m always mindful of other signals that align with the domain name and may have contributed to its success. These include inbound links, mentions, and local citations.
These features describe link metrics for the individual ranking page (such as number of links, PageRank, etc).
Highest influence: Raw quantity of links from high-authority sites, 7.78
Lowest influence: Sentiment of the external links pointing to the page, 3.85
> High-quality links still rule rankings. The way a brand can earn links has become more important over the years, whereas link schemes can hurt a site more than ever before. There is a lot of FUD slinging in this respect!

> Similar to my thoughts on content, I suspect link-based metrics are going to be used increasingly with a focus on verisimilitude (whether content is actually true or not) and relationships between nodes in Knowledge Graph. Google’s recent issues with things, such as the snippet results for “evolution,” highlight the importance of them only pulling things that are factually correct for featured parts of a SERP. Thus, just counting traditional link metrics won’t cut it anymore.

> While anchor text is still a powerful ranking factor, using targeted anchor text carries a significant amount of risk and can easily wipe out your previous success.
These features describe elements that indicate qualities of branding and brand metrics.
Highest influence: Search volume for the brand/domain, 6.54
Lowest influence: Popularity of business’s official social media profiles, 3.99
> This is clearly on deck to change very soon with the reintegration of Twitter into Google’s Real-Time Results. It will be interesting to see how this affects the “Breaking News” box and trending topics. Social influencers, quality and quantity of followers, RTs, and favorites will all be a factor. And what’s this?! Hashtags will be important again?! Have mercy!

> Google has to give the people what they want, and if most of the time they are searching for a brand, Google is going to give them that brand. Google doesn’t have a brand bias, we do.

> It’s already noticeable; brands are more prominently displayed in search results for both informational and commercial queries. I’m expecting Google will be paying more attention to brand-related metrics from now on (and certainly more initiatives to encourage site owners to optimize for better entity detection).
These features relate to third-party metrics from social media sources (Facebook, Twitter, Google+, etc) for the ranking page.
Highest influence: Engagement with content/URL on social networks, 3.87
Lowest influence: Upvotes for the page on social sites, 2.7
> Social ranking factors are important in a revamped Query Deserves Freshness algorithm. Essentially, if your content gets a lot of natural tweets, shares, and likes, it will rank prominently for a short period of time, until larger and more authoritative sites catch up.

> Social popularity has several factors to consider: (1) Years ago, Google and Bing said they take into account the authority of a social profile sharing a link and the popularity of the link being shared (retweets/reshares), and there was more complexity to social signals that was never revealed even back then. (2) My experience has been that social links and shares have more power for newsy/fresh-type content. For example, a lot of social shares for a dentist’s office website wouldn’t be nearly as powerful (or relevant to consider) as a lot of social shares for an article on a site with a constant flow of fresh content.

> Honestly, I do not think that the so-called “social signals” have any direct influence on the Google Algorithm (that does not mean that a correlation doesn’t exist, though). My only doubt is related to Twitter, because of the renewed contract between Google and Twitter itself. That said, as of now I do not consider Twitter to offer any ranking signals, except for very specific niches related to news and “news-able” content, where QDF plays a fundamental role.
These elements describe non-keyword-usage, non-link-metrics features of individual pages (such as length of the page, load speed, etc).
Highest influence: Uniqueness of the content on the page, 7.85
Lowest influence: Page contains Open Graph data and/or Twitter cards, 3.64
> By branching mobile search off of Google’s core ranking algorithm, having a “mobile-friendly” website is probably now less important for desktop search rankings. Our clients are seeing an ever-increasing percentage of organic search traffic coming from mobile devices, though (particularly in retail), so this is certainly not an excuse to ignore responsive design – the opposite, in fact. Click-through rate from the SERPs has been an important ranking signal for a long time and continues to be, flagging irrelevant or poor-quality search listings.

> I believe many of these will be measured within the ecosystem, rather than absolutely. For example, the effect of bounce rate (or rather, bounce speed) on a site will be relative to the bounce speeds on other pages in similar positions for similar terms.

> I want to answer these a certain way because, while I have been told by Google what matters to them, what I see in the SERPs does not back up what Google claims they want. There are a lot of sites out there with horrible UX that rank in the top three. While I believe it’s really important for conversion and to bring customers back, I don’t feel as though Google is all that concerned, based on the sites that rank highly. Additionally, Google practically screams “unique content,” yet sites that more or less steal and republish content from other sites are still ranking highly. What I think should matter to Google doesn’t seem to matter to them, based on the results they give me.
These features describe link metrics about the domain hosting the page.
Highest influence: Quantity of unique linking domains to the domain, 7.45
Lowest influence: Sentiment of the external links pointing to the site, 3.91
> Quantity and quality of unique linking domains at the domain level is still among the most significant factors in determining how a domain will perform as a whole in the organic search results, and is among the best SEO “spot checks” for determining if a site will be successful relative to other competitor sites with similar content and selling points.

> Throughout this survey, when I say “no direct influence,” this is interchangeable with “no direct positive influence.” For example, I’ve marked exact match domain as low numbers, while their actual influence may be higher – though negatively.

> Topical relevancy has, in my opinion, gained much ground as a relevant ranking factor. Although I find it most at play when at page level, I am seeing significant shifts at overall domain relevancy, by long-tail growth or by topically-relevant domains linking to sites. One way I judge such movements is the growth of the long-tail relevant to the subject or ranking, when neither anchor text (exact match or synonyms) nor exact phrase is used in a site’s content, yet it still ranks very highly for long-tail and mid-tail synonyms.
These features relate to the entire root domain, but don’t directly describe link- or keyword-based elements. Instead, they relate to things like the length of the domain name in characters.
Highest influence: Uniqueness of content across the whole site, 7.52
Lowest influence: Length of time until domain name expires, 2.45
> Character length of domain name is another correlative yet not causative factor, in my opinion. They don’t need to rule these out – it just so happens that longer domain names get clicked on, so they get ruled out quickly.

> A few points: Google’s document inception date patents describe how Google might handle freshness and maturity of content for a query. The “trust signal” pages sound like a site quality metric that Google might use to score a page on the basis of site quality. Some white papers from Microsoft on web spam signals identified multiple hyphens in subdomains as evidence of web spam. The length of time until the domain expires was cited as a potential signal in Google’s patent on information retrieval through historic data, and was refuted by Matt Cutts after domain sellers started trying to use that information to sell domain extensions to “help the SEO” of a site.

> I think that page speed only becomes a factor when it is significantly slow. I think that having error pages on the site doesn’t matter, unless there are so many that it greatly impacts Google’s ability to crawl.
To bring it back to the beginning, we asked the experts if they had any comments or alternative signals they think will become more or less important over the next 12 months.
> While I expect that static factors, such as incoming links and anchor text, will remain influential, I think the power of these will be mediated by the presence or absence of engagement factors.

> The app world and webpage world are getting lumped together. If you have the more popular app relative to your competitors, expect Google to notice.

> Mobile will continue to increase, with directly-related factors increasing as well. Structured data will increase, along with more data partners and user segmentation/personalization of SERPs to match query intent, localization, and device-specific need states.

> User location may have more influence in mobile SERPs as (a) more connected devices like cars and watches allow voice search, and (b) sites evolve accordingly to make such signals more accurate.

> I really think that over the next 12-18 months we are going to see a larger impact of structured data in the SERPs. In fact, we are already seeing this. Google has teams that focus on artificial intelligence and machine learning. They are studying “relationships of interest” and, at the heart of what they are doing, are still looking to provide the most relevant result in the quickest fashion. Things like schema that help “educate” the search engines as to a given topic or entity are only going to become more important as a result.
For more data, check out the complete Ranking Factors Survey results.
2015 Ranking Factors Expert Survey
Finally, we leave you with this infographic created by Kevin Engle which shows the relative weighting of broad areas of Google’s algorithm, according to the experts.
What’s your opinion on the future of search and SEO? Let us know in the comments below.
Posted by Cyrus-Shepard
We’re excited to announce the results of Moz’s biennial Search Engine Ranking Correlation Study and Expert Survey, a.k.a. Ranking Factors.
Moz’s Ranking Factors study helps identify which attributes of pages and sites have the strongest association with ranking highly in Google. The study consists of two parts: a survey of professional SEOs and a large correlation study.
This year, with the help of Moz’s data scientist Dr. Matt Peters, new data partners, and over 150 search marketing professionals, we were able to study more data points than in any year past. All together, we measured over 170 correlations and collected over 15,000 data points from our panel of SEO experts.
Ready to dig in?
We want to especially thank our data partners. SimilarWeb, Ahrefs, and DomainTools each gave us unparalleled access, and their data was essential to helping make this study a success. It’s amazing and wonderful when different companies—even competitors—can come together for the advancement of knowledge.
You can see all of our findings within the study now. In the coming days and weeks we’ll dive into deeper analysis as to what we can learn from these correlations.
Moz’s Ranking Correlation Study measures which attributes of pages and websites are associated with higher rankings in Google’s search results. This means we look at characteristics such as:
To be clear, the study doesn’t tell us if Google actually uses these attributes in its core ranking algorithm. Instead, it shows which features of pages and sites are most associated with higher rankings. It’s a fine, but important, distinction.
While correlation studies can’t prove or disprove which attributes Google considers in its algorithm, they do provide valuable hints. In fact, many would argue that correlation data is even more important than causation when working with today’s increasingly complex algorithms.
For the study, Dr. Peters examined the top 50 Google results of 16,521 search queries, resulting in over 700,000 unique URLs. You can read about the full methodology here.
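As a rough illustration of the statistic behind a study like this, here is a hedged sketch using SciPy’s `spearmanr`. The feature values are invented, and the actual study aggregates correlations across thousands of queries, but the core computation for a single SERP looks something like this:

```python
# Toy example of a rank correlation: Spearman correlation between a
# page feature and rank position for the results of one query.
# All numbers below are invented for illustration.
from scipy.stats import spearmanr

# Rank positions 1-10 and a feature value for each ranking page
ranks = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
linking_root_domains = [420, 310, 350, 120, 90, 60, 75, 30, 25, 10]

rho, p_value = spearmanr(ranks, linking_root_domains)
# rho comes out negative when the feature is larger at better
# (lower-numbered) positions; flipping the sign gives the usual
# "positive = associated with ranking well" convention.
print(f"Spearman rho: {-rho:.2f} (p = {p_value:.3f})")
```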
Here’s a sample of our findings:
The features in the chart below describe link metrics to the individual ranking page (such as number of links, PageRank, etc.) and their correlation to higher rankings in Google.
Despite rumors to the contrary, links continue to show one of the strongest associations with higher rankings out of all the features we studied. While this doesn’t prove how Google uses links in its algorithm, this information combined with statements from Google and the observations of many professional marketers leads us to strongly believe that links remain hugely important for SEO.
Link-based features were only one of the features categories we examined. The complete correlation study includes 12 different categories of data.
While correlation data can provide valuable insight into the workings of Google’s algorithm, we often learn much more by gathering the collective wisdom of search marketing experts working at the top of their game.
For this reason, every two years we conduct the Ranking Factors Expert Survey.
The survey itself is famously grueling: over 100 questions covering every aspect of Google’s ranking algorithm. This year, we sent the invitation-only survey to 150 industry professionals.
Stay tuned for a deeper dive into the Expert Survey later this week. We’re honored to have the participation of so many knowledgeable professionals.
In the meantime, you can freely view all the findings and results right now:
Ranking Factors wouldn’t be possible without the contribution of dozens of very talented people, but we’d especially like to thank Dr. Matt Peters, Kevin Engle, Rand Fishkin, Casey Coates, Trevor Klein, and Kelly Cooper for their efforts, along with our data partners and all the survey participants.
What ranking factors or correlations stand out to you? Leave your thoughts in the comments below.
[ccw-atrib-link]
Posted by EricEnge
How do the SERPs for commercial queries vary from the treatment of informational queries? Moz is about to publish its new Search Engine Ranking Factors, and was kind enough to provide me with access to their raw ranking data. Today I am going to share some of what I found.
In addition, I am going to compare it against raw ranking data pulled by my company, Stone Temple Consulting (STC). What makes this so interesting is that the Moz data is based on commercial queries across 165,177 pages and the STC data is based on informational queries over 182,340 pages (for a total of 347,517 result pages). Let’s dive in!
Google rolled out their Mobile-Friendly Update on April 21 to much fanfare. We published our study results on how big that impact was here, and in that test, we tracked a set of 15,235 SERPs both before and after the update.
The following chart shows the percentage of the top 10 results in the SERPs that are mobile friendly for the Moz (commercial) queries, and the STC informational queries before and after the mobile update:
Clearly, the commercial queries are returning a much larger percentage of mobile-friendly results than the informational queries. Much of this may be because mobile-friendliness is more important to people running e-commerce sites.
What this suggests to us is that publishers of e-commerce sites have been faster to adopt mobile friendliness than publishers of informational sites. That makes sense. Of course, our friends at Google know this is more important for commercial queries, too.
Regardless of query type, you can see that more than 60% of the results meet Google’s current definition for mobile friendliness. For commercial queries, it’s nearly 3/4 of them. Obviously, if you are not currently mobile friendly, then solve that, but that’s not the whole story.
Over time, I believe that what is considered mobile friendly is going to change. The mobile world will become much more than just viewing your current desktop site with a smaller screen and a crappier keyboard. What are some more things you can expect in the long term?
My third point is an item that is already in progress, and the first two are really not for most people at this time. However, I put them out there to stimulate some thinking that much more is going to happen in this space than meets the eye. In the short term, what can you do?
My suggestion is that you start looking at the mobile version of your site as more than a different rendering of your desktop site. What are the different use cases between mobile and desktop? Consider running two surveys of your users, one of desktop users and one of smartphone users, and ask them what they are looking for, and what they would like to see. My bet is that you will quickly see that the use cases are different in material ways.
In the near term, you can leverage this information to make your mobile site optimization work better for users, probably without re-architecting it entirely. In the longer term, collecting this type of data will prepare you for considering more radical design differences between your desktop and mobile sites.
Another one of the newer ranking factors is whether or not a site uses HTTPS. Just this past July 22, Google’s Gary Illyes clarified again that this is a minor ranking signal that acts like a tiebreaker in cases where the ranking for two competing pages are “more or less equal.”
How has that played out in the SERPs? Let’s take a look:
As with mobile-friendliness, we once again see the commercial queries placing significantly more emphasis on this factor than the informational queries. Yet, the penetration levels are clearly far lower than they are for mobile friendliness. So should I care about this then?
Yes, it matters. Here are three reasons why:
Yes, I know there is much debate about whether or not you need to have HTTPS if all you are doing is running a content site. But a lot of big players out there are taking a simple stance: that it’s time for the plain text web to come to an end.
The big thing that HTTPS helps prevent is man-in-the-middle attacks. Do read the linked article if you don’t know what that is. Basically though, when you communicate with a non-secure website, it’s pretty trivial for someone to intercept the communication and monitor or alter the information flow between you and the sending website.
The most trivial form of this can occur any time you connect to a third-party Wi-Fi connection. People can inject ads you don’t want, or simply monitor everything you do and build a profile about you. Is that what you want?
Let me offer a simple example: Have you ever connected to Wi-Fi in a hotel? What’s the first thing that happens? You try to go to a website, but instead you get a login screen asking for your room number and last name to sign in – and most times they charge you some fee.
That’s the concept – you tried to go to a website, and instead got served different content (the Wi-Fi login screen). The hotel can do this at any time. Even after you log in and pay their fee, they can intercept your communication with other websites and modify the content. A simple application for this is to inject ads. They can also monitor and keep a record of every site you visit. They can do this because they are in the middle.
In an HTTPS world, they will still be able to intercept the initial connection, but once you are connected, they will no longer be able to see inside the content going back and forth between you and the https websites you choose to access.
Eventually, the plain text web will come to an end. As this movement grows, more and more publishers will make the switch to HTTPS, and Google will dial up the strength of this signal as a ranking factor. If you have not made the switch, then get it into your longer term plans.
Both mobile-friendliness and HTTPS support appear to matter more to commercial sites today. I tend to think that this is more a result of e-commerce site publishers making the conversion faster than informational site publishers, rather than the impact of the related Google algorithms. Regardless of that, the importance of both of these factors will grow, and it would be wise to aggressively prepare for the future.
Posted by CarloSeo
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.
The spam in Google Analytics (GA) is becoming a serious issue. Due to a deluge of referral spam from social buttons, adult sites, and many, many other sources, people are starting to become overwhelmed by all the filters they are setting up to manage the useless data they are receiving.
The good news is, there is no need to panic. In this post, I’m going to focus on the most common mistakes people make when fighting spam in GA, and explain an efficient way to prevent it.
But first, let’s make sure we understand how spam works. A couple of months ago, Jared Gardner wrote an excellent article explaining what referral spam is, including its intended purpose. He also pointed out some great examples of referral spam.
The spam in Google Analytics can be categorized into two types: ghosts and crawlers.
The vast majority of spam is this type. They are called ghosts because they never access your site. It is important to keep this in mind, as it’s key to creating a more efficient solution for managing spam.
As unusual as it sounds, this type of spam doesn’t have any interaction with your site at all. You may wonder how that is possible since one of the main purposes of GA is to track visits to our sites.
They do it by using the Measurement Protocol, which allows people to send data directly to Google Analytics’ servers. Using this method, and probably randomly generated tracking codes (UA-XXXXX-1) as well, the spammers leave a “visit” with fake data, without even knowing who they are hitting.
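To make that mechanism concrete, here is a minimal sketch (in Python, using the `requests` library) of the kind of hit the Universal Analytics Measurement Protocol accepts over plain HTTP. The tracking ID, path, and referrer below are placeholders; the point is that nothing here requires visiting the target site, which is exactly what ghost spammers exploit with randomly generated tracking IDs:

```python
# A Measurement Protocol "pageview" hit sent straight to GA's
# collection endpoint. No browser, no website visit involved.
# All values below are placeholders for illustration.
import requests
import uuid

payload = {
    "v": "1",                     # protocol version
    "tid": "UA-XXXXX-1",          # tracking ID (spammers guess these at random)
    "cid": str(uuid.uuid4()),     # anonymous client ID
    "t": "pageview",              # hit type
    "dp": "/fake-page",           # document path that need not exist
    "dr": "http://spam-referrer.example",  # forged referrer
}

resp = requests.post("https://www.google-analytics.com/collect", data=payload)
print(resp.status_code)  # the endpoint returns 200 even for junk hits
```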
This type of spam, the opposite to ghost spam, does access your site. As the name implies, these spam bots crawl your pages, ignoring rules like those found in robots.txt that are supposed to stop them from reading your site. When they exit your site, they leave a record on your reports that appears similar to a legitimate visit.
Crawlers are harder to identify because they know their targets and use real data. But it is also true that new ones seldom appear. So if you detect a referral in your analytics that looks suspicious, researching it on Google or checking it against this list might help you answer the question of whether or not it is spammy.
I’ve been following this issue closely for the last few months. According to the comments people have made on my articles and conversations I’ve found in discussion forums, there are primarily three mistakes people make when dealing with spam in Google Analytics.
One of the biggest mistakes people make is trying to block Ghost Spam from the .htaccess file.
For those who are not familiar with this file, one of its main functions is to allow/block access to your site. Now we know that ghosts never reach your site, so adding them here won’t have any effect and will only add useless lines to your .htaccess file.
Ghost spam usually shows up for a few days and then disappears. As a result, sometimes people think that they successfully blocked it from here when really it’s just a coincidence of timing.
Then when the spammers later return, they get worried because the solution is not working anymore, and they think the spammer somehow bypassed the barriers they set up.
The truth is, the .htaccess file can only effectively block crawlers such as buttons-for-website.com and a few others since these access your site. Most of the spam can’t be blocked using this method, so there is no other option than using filters to exclude them.
Another error is trying to use the referral exclusion list to stop the spam. The name may confuse you, but this list is not intended to exclude referrals in the way we want to for the spam. It has other purposes.
For example, when a customer buys something, sometimes they get redirected to a third-party page for payment. After making a payment, they’re redirected back to your website, and GA records that as a new referral. It is appropriate to use the referral exclusion list to prevent this from happening.
If you try to use the referral exclusion list to manage spam, however, the referral part will be stripped since there is no preexisting record. As a result, a direct visit will be recorded, and you will have a bigger problem than the one you started with, since you will still have spam, and direct visits are harder to track.
When people see that the bounce rate changes drastically because of the spam, they start worrying about the impact that it will have on their rankings in the SERPs.
This is another mistake commonly made. With or without spam, Google doesn’t take into consideration Google Analytics metrics as a ranking factor. Here is an explanation about this from Matt Cutts, the former head of Google’s web spam team.
And if you think about it, Cutts’ explanation makes sense; because although many people have GA, not everyone uses it.
Another common concern when people see strange landing pages coming from spam on their reports is that they have been hacked.
The page that the spam shows on the reports doesn’t exist, and if you try to open it, you will get a 404 page. Your site hasn’t been compromised.
But you have to make sure the page doesn’t exist. Because there are cases (not spam) where some sites have a security breach and get injected with pages full of bad keywords to defame the website.
Now that we’ve discarded security issues and their effects on rankings, the only thing left to worry about is your data. The fake trail that the spam leaves behind pollutes your reports.
It might have greater or lesser impact depending on your site traffic, but everyone is susceptible to the spam.
Small and midsize sites are the most easily impacted – not only because a big part of their traffic can be spam, but also because usually these sites are self-managed and sometimes don’t have the support of an analyst or a webmaster.
Big sites with a lot of traffic can also be impacted by spam, and although the impact can be insignificant, invalid traffic means inaccurate reports no matter the size of the website. As an analyst, you should be able to explain what’s going on even in the most granular reports.
Usually it is recommended to add the referral to an exclusion filter after it is spotted. Although this is useful for a quick action against the spam, it has three big disadvantages.
Luckily, there is a good way to prevent all these problems. Most of the spam (ghost) works by hitting GA’s random tracking-IDs, meaning the offender doesn’t really know who is the target, and for that reason either the hostname is not set or it uses a fake one. (See report below)
You can see that they use some weird names or don’t even bother to set one. Although there are some known names in the list, these can be easily added by the spammer.
On the other hand, valid traffic will always use a real hostname. In most cases, this will be the domain. But it can also result from paid services, translation services, or any other place where you’ve inserted GA tracking code.
Based on this, we can make a filter that will include only hits that use real hostnames. This will automatically exclude all hits from ghost spam, whether it shows up as a referral, keyword, or pageview; or even as a direct visit.
To create this filter, you will need to find the report of hostnames. Here’s how:
You will see a list of all hostnames, including the ones that the spam uses. Make a list of all the valid hostnames you find, as follows:
For small to medium sites, this list of hostnames will likely consist of the main domain and a couple of subdomains. After you are sure you got all of them, create a regular expression similar to this one:
yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com
You don’t need to put all of your subdomains in the regular expression. The main domain will match all of them. If you don’t have a view set up without filters, create one now.
Then create a Custom Filter.
Make sure you select INCLUDE, then select “Hostname” on the filter field, and copy your expression into the Filter Pattern box.
You might want to verify the filter before saving to check that everything is okay. Once you’re ready, set it to save, and apply the filter to all the views you want (except the view without filters).
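Beyond GA’s built-in verification, it can also help to dry-run the expression outside of GA. The sketch below (plain Python, reusing the example expression from above) checks a handful of hostnames against the include pattern; the spam hostname shown is just an illustrative example of a name you might see in your own report:

```python
# Dry-run the hostname include filter before applying it in GA:
# valid hostnames should match the pattern, spam hostnames should not.
import re

pattern = re.compile(
    r"yourmaindomain\.com|anotheruseddomain\.com|"
    r"payingservice\.com|translatetool\.com"
)

hostnames = [
    "yourmaindomain.com",       # main domain: keep
    "shop.yourmaindomain.com",  # subdomain, matched by the main domain: keep
    "darodar.com",              # a known ghost-spam referrer name: exclude
    "(not set)",                # no hostname at all: exclude
]

for host in hostnames:
    verdict = "keep" if pattern.search(host) else "exclude"
    print(f"{host:30} -> {verdict}")
```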
This single filter will get rid of future occurrences of ghost spam that use invalid hostnames, and it doesn’t require much maintenance. But it’s important that every time you add your tracking code to any service, you add it to the end of the filter.
Now you should only need to take care of the crawler spam. Since crawlers access your site, you can block them by adding these lines to the .htaccess file:
```apache
## STOP REFERRER SPAM
RewriteCond %{HTTP_REFERER} semalt\.com [NC,OR]
RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]
RewriteRule .* - [F]
```
It is important to note that this file is very sensitive, and a single misplaced character in it can bring down your entire site. Therefore, make sure you create a backup copy of your .htaccess file prior to editing it.
If you don’t feel comfortable messing around with your .htaccess file, you can alternatively make an expression with all the crawlers and add it to an exclude filter by Campaign Source.
Implement these combined solutions, and you will worry much less about spam contaminating your analytics data. This will have the added benefit of freeing up more time for you to spend actually analyzing your valid data.
After stopping spam, you can also get clean reports from the historical data by using the same expressions in an Advanced Segment to exclude all the spam.
If you still need more information to help you understand and deal with the spam on your GA reports, you can read my main article on the subject here: http://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/.
Additional information on how to stop spam can be found at these URLs:
In closing, I am eager to hear your ideas on this serious issue. Please share them in the comments below.
(Editor’s Note: All images featured in this post were created by the author.)
Posted by Isla_McKetta
“How can I learn SEO?” is a deceptively simple question. The standard approach is to attempt to appeal to anyone who’s interested in SEO without any idea of your previous experience or the actual reasons you want to learn SEO. That’s fun. Especially the part about weeding through tons of information that might not even apply to what you want to learn.
So let’s fix that. This guide is written to help you choose your own SEO adventure. If you know very little about SEO and just want to learn enough to impress your CMO, start at the beginning and stop when you feel like you understand enough concepts. Or if you’ve been doing SEO for years but need a brush up on the latest tips and tricks before impressing a potential client or employer, there’s a path for you too. Be sure to follow the links. They refer you to resources that are much more in-depth than we could reproduce in one post.
You may know what a title tag is, but you aren’t quite sure how to use it or why. The SEO Newbie could be a web development hobbyist on the verge of a new obsession or someone looking for the next growing career path. Regardless, you have the most to learn (and the most to gain) from this adventure.
Start at the very beginning with What is SEO? and explore as many paths as you can. You might be surprised at the bits of information you pick up along the way. For a guided tour, follow the teal boxes. Don’t forget to bookmark this page so you can come back and learn more once you’ve absorbed each batch of info.
You were doing SEO back in the days of AltaVista, so you know all the things to know. Except maybe you took a break for a few years or decided to swap that black hat for a gray (or even white) one and need to know what’s the what with the major changes in the past few years.
Make a quick stop at the Algorithm Change History to catch up on the latest updates and penalties. After that, we’ll guide you through some of the topics that are more likely to have changed since you last checked. Just look for the purple boxes.
You’ve heard of SEO. You might even have worked with a few SEOs. Now you’re ready to dig in and understand what everyone’s talking about and how you can use all that new info to improve your marketing (and maybe level up your career at the same time).
Start with What is SEO? and look for shortcuts in orange boxes along the path to gather highlights. You can always dig deeper into any topic you find especially interesting.
Whichever path you choose, don’t worry, we’ll keep weaving you in and out of the sections that are relevant to your learning needs; just look for the color that’s relevant to your chosen character.
For you table of contents types who like to read straight through rather than have someone set the path for you, here’s a quick look at what we’ll be covering:
First things first. It’s hard to learn the ins and outs of SEO (search engine optimization) before you even know what it is. In the following short video, Rand Fishkin (a.k.a. the Wizard of Moz) defines SEO as “The practice of increasing the quantity and quality of the traffic that you earn through the organic results in search engines like Google, Yahoo, and Bing.”
Watch it to understand the difference between paid search and organic search and a few basic things about improving click-throughs from search pages.
A lot of different factors, from site speed to content quality, are important in SEO. These are, as far as anyone can tell, the factors that search engines use in determining whether or not to show your page to searchers. For a great intro to those elements and how they interact to affect your site’s overall ranking, check out Search Engine Land’s Periodic Table of SEO Success Factors.
That’s all nice, but if SEO is starting to seem like a lot of work, you probably want to understand whether SEO is even worth it. The short answer is that yes, SEO is worth it, but only if you want potential customers to be able to find your site when they’re searching on Google (or any other search engine).
Yes, search engines are crawling your site, but those crawlers aren’t as sophisticated as you might like. SEO gives you more control over how your site is represented in those search engine results pages. Good SEO can also improve how users experience your site. Learn more with Why Search Engine Marketing is Necessary.
Who are these search engines anyway and why do we spend so much time worrying about how they see our sites? To get the best answer, let’s look at that question from two points of view: search engines and searchers.
First, it’s important to understand how search engines crawl sites, build their indexes, and ultimately determine what’s relevant to a user’s query. Some of the specifics are trade secrets, but this section of the Beginner’s Guide to SEO offers a solid overview. And for an introduction to how Google ranks pages, watch this video:
As you’re learning about SEO, remember that not everything you read on the Internet should be treated as gospel. Here are some common myths and misconceptions about search engines.
Understanding how people use search engines is as crucial to SEO as understanding their needs is to marketing. Learn about classic search query patterns and how people scan search results here.
So far we’ve dropped a lot of phrases like “search results” and “search pages,” but what does any of that really mean? Search Engine Land does a great job of decoding the standard search engine results page (SERP). It’s a strong foundation for understanding why everyone is shooting to be in the top ten search results. But one thing you’ll find the more you get into SEO is that SERPs are rapidly evolving. Ads move, knowledge graphs appear (and disappear) and sometimes local search results invade. Dr. Pete takes you on a tour of how SERPs have changed and why ten blue links are probably a thing of the past in this article.
And then there’s the darker side of SEO, because once there’s a system, there’s someone trying to game that system. Spend more than a few minutes talking to anyone about SEO and you’ll hear something or other about black hat tactics like keyword stuffing and unnatural linking.
If you decide to use these tactics, you might soon become acquainted with search engine penalties. These algorithm updates, like Hummingbird and Penguin, are implemented by search engines at various intervals. The official word is that these updates improve user experience, but they can also be effective ways to penalize SEOs using spammy tactics. Learn more about Google’s algorithm updates. That page includes not only a full history of prior penalties, but it’s consistently refreshed when a new algorithm update is confirmed.
SEO veterans, you get to skip ahead of the class now to learn about the current state of page speed, mobile web development, and competitive research along with info on the best tools available today.
As you can see, a lot of work can go into SEO, but the results can be pretty incredible, too. To track your progress in topping the SERPs, make sure you’re using an analytics platform like Google Analytics or Omniture (now Adobe Analytics). You can get by with something like Rank Tracker to track keyword rankings as a start, but eventually you’re going to want some of the data those more sophisticated tools offer.
Brain full? You’ve just learned everything a beginner needs to know about what SEO is. Go take a walk or get some coffee and let all that info soak in.
Before you go, save this bookmark.
SEO newbies, when you come back, you’ll be in exactly the right place to start putting some of your new knowledge into action by practicing how to build an SEO-friendly site.
SEO-curious marketers, you might not want to go to the trouble of actually building a site, but you’ll learn a lot by reading through the next section and the related materials.
First of all, don’t freak out; you don’t have to build a totally new site to get something out of this section. But if you’re an SEO newbie intent on making a career of this, you might want to set up a practice site so you can really get your hands dirty and learn everything you can.
Before you start worrying about site content and structure (aka the fun stuff), you have a real chance to set your site up for success by using a strong domain name and developing a URL structure that’s SEO and user friendly. This stuff can be hard to change later when you have hundreds (or thousands) of pages in place, so you’ll be glad you started out on the right foot.
While you’re decades too late to score “buy.com,” it’s never too late to find the right domain name for you. This resource will help you sort through the SEO dos and SEO don’ts of selecting a root domain and TLD (don’t worry, all is explained) that are memorable without being spammy. There’s even info on what to consider if you have to change your domain name.
Don’t skip the section on subdomains—it could save you from making some rookie duplicate content errors.
Oh, the SEO havoc that can ensue when your URLs aren’t set up quite right. Learn what not to do.
Woo-hoo! Now that you have a site, it’s time to think about how best to structure it. Remember that you want to be thinking about both search engines and users as you set up that site. For example, that amazing JavaScript menu you had designed might not be bot-friendly.
At this point, make sure your content is indexable (that crawlers can actually find it) and that you don’t have any orphaned pages (pages no internal link points to). Learn more about those issues here.
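To make “orphaned” concrete, here’s a minimal sketch of the underlying idea; the page list and link graph are made up, and in practice you’d build them from a crawl of your own site. Walk every internal link starting from the homepage, and any known page you never reach is an orphan:

```python
# A minimal orphan-detection sketch. The page list and link graph
# below are hypothetical stand-ins for a real crawl of your site.
from collections import deque

all_pages = {"/", "/about", "/products", "/products/whisks", "/old-promo"}

internal_links = {  # page -> pages it links to
    "/": {"/about", "/products"},
    "/about": {"/"},
    "/products": {"/products/whisks", "/"},
    "/products/whisks": {"/products"},
    "/old-promo": set(),  # nothing links TO this page
}

def reachable_from(start, links):
    """Breadth-first walk of the internal link graph from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, ()):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

orphans = all_pages - reachable_from("/", internal_links)
print("Orphaned pages:", orphans)  # -> {'/old-promo'}
```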
And then you’re going to need a sitemap. Sitemaps help search engines index your content and understand the relationships between pages. So where better to get advice on how to build and implement a sitemap than straight from Google?
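If you’re curious what building one actually looks like, here’s a minimal sketch using Python’s standard library. The URLs are placeholders, and real sitemaps have a few extra niceties (like <lastmod> and a 50,000-URL-per-file limit) covered in Google’s docs:

```python
# A minimal sketch of generating an XML sitemap with the standard
# library. The URLs are hypothetical.
import xml.etree.ElementTree as ET

urls = [
    "https://www.example.com/",
    "https://www.example.com/products",
    "https://www.example.com/products/whisks",
]

urlset = ET.Element(
    "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
)
for page in urls:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write(
    "sitemap.xml", encoding="utf-8", xml_declaration=True
)
```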
Another vital way to show search engines which pages are most important and how they relate (and to help humans navigate your content) is through internal links. You want enough links to show users what’s what, but not so many that it’s impossible to tell what really matters. Read more about optimal link structure and passing ranking power.
How long a page on your site takes to load (page speed) mattered back when we were all on desktops, but it’s crucial now that so much Internet traffic comes from mobile devices; it’s also one factor in how pages get ranked. So whether you’re new to SEO or looking for new tricks, page speed is a good place to start.
Use Google’s PageSpeed Insights to get specific recommendations on how to speed up your site and then get crackin’.
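PageSpeed Insights is the right tool for real diagnostics. Just to demystify the measurement itself, here’s a crude sketch that times a single HTML fetch; it ignores images, CSS, JS, and rendering entirely, so treat it as a rough probe only (and the URL is a placeholder):

```python
# A crude latency probe, not a substitute for PageSpeed Insights.
# It times one full fetch of the HTML document and nothing else.
import time
import urllib.request

url = "https://www.example.com/"

start = time.perf_counter()
with urllib.request.urlopen(url, timeout=10) as resp:
    body = resp.read()
elapsed = time.perf_counter() - start

print(f"Fetched {len(body)} bytes in {elapsed:.2f}s")
```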
Speaking of mobile traffic, is your site mobile friendly? Learn about the difference between responsive designs and device-specific solutions on our mobile optimization page. You’ll also see a list of don’ts for mobile design (ever tried to close a pop-up on your iPhone?). This only gets more important the more mobile traffic you get (and want).
Phew! That was a lot of information, but once you’ve absorbed it all, you’ll have an excellent handle on site structure (which will save you a lot of trouble down the line). Bookmark this spot, then take a well-deserved break. We’ll start back here together when you’re ready.
Now that you have that site framework all set up, it’s time to get to the good stuff—populating it with content!
Before you write or post too much of your own content, you might want to see what’s working (and what isn’t) for your competitors. This analysis helps you identify those competitors and then understand what their links, rankings, and keywords look like. It’s important to update this research occasionally because your competition might change over time.
Veteran SEOs, you can skip straight ahead to Schema structured data unless you want a refresh on any other topics related to content.
SEO newbies, you’ll want a deep understanding of keyword research, SEO copywriting, and the other content-related topics in this section. Get yourself a coffee and then settle back in to learn a ton.
Marketers, this is your chance to learn all the basics for SEO-friendly content, so stick with us for a spell. You won’t need the same depth of understanding as someone who plans to do SEO for a living, so let your curiosity guide you as deep into any of these topics as you want to go.
You may feel like you just did keyword research in the last step, but it’s crucial enough that we’re going to dive a little deeper here. Understand the value of a particular keyword and see what kind of shot you have at ranking for it by reading Chapter 5 of the Beginner’s Guide to SEO.
We promised you’d get to actually create content and that time is finally here! Now that you have an understanding of the competitive landscape and the keywords you want to (and can) rank for, write away. Remember that while you’re really writing content for users, a few simple tips can help your content stand out to search engines too. Isn’t it nice when something does double duty?
For really search engine-friendly content, you’ll want to make sure your metadata is all in order. That includes title tags, meta descriptions, and alt attributes.
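To show what checking that metadata might look like, here’s a minimal audit sketch in Python. The page HTML and the length limits are illustrative (search engines truncate titles and descriptions by pixel width, not character count, so these are rough rules of thumb):

```python
# A minimal metadata audit sketch using only the standard library.
# The HTML below is a stand-in for a real fetched page.
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = """<html><head><title>Wire Whisks | Example Cook Shop</title>
<meta name="description" content="Hand-picked wire whisks and other
omelet essentials, shipped from Pittsburgh."></head><body></body></html>"""

audit = MetaAudit()
audit.feed(html)
print("Title OK:", 0 < len(audit.title) <= 60)
print("Description OK:", audit.description and len(audit.description) <= 160)
```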
Go the extra mile by incorporating Schema structured data into your content. This additional markup gives search engines the data they need to show rich snippets (like review stars) with your search listings.
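As a taste of what that markup looks like, here’s a minimal sketch that emits a JSON-LD block using schema.org’s Product vocabulary. The product details are made up; see schema.org and Google’s structured data docs for the properties each rich result type actually requires:

```python
# A minimal sketch of emitting a JSON-LD structured data block.
# The product details are hypothetical.
import json

data = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Balloon Wire Whisk",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "112",
    },
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(data, indent=2)
    + "\n</script>"
)
print(snippet)  # paste into the page's <head>
```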
Veteran SEOs, it’s a good idea to skip ahead to on-site related topics now.
Newbies, your SEO education is not complete without a solid understanding of duplicate content, SEO for video, and how to measure success, so stick with this section until the end.
Marketers, duplicate content is something you’re going to hear a lot about, and it doesn’t hurt to understand video SEO and how to measure success, so read on through this section.
Duplicate content is the bane of many a website. Even if you think you’ve done everything right with your content, there’s a chance that a dynamic URL or something else is serving the same content to crawlers more than once. Google doesn’t see the logic in “twice as much is twice as nice”: at best the extra copies get filtered out of results, and at worst spammy-looking duplication can hurt your rankings. Navigate around the most common pitfalls.
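One rough way to catch exact duplicates during an audit is to hash each page body and group URLs that share a hash. Here’s a minimal sketch with made-up pages; note it only catches byte-identical content, not near-duplicates:

```python
# A minimal exact-duplicate detector. The pages dict is a hypothetical
# stand-in for bodies fetched by a crawler.
import hashlib
from collections import defaultdict

pages = {  # url -> body
    "https://example.com/whisks": "<p>Our finest wire whisks.</p>",
    "https://example.com/whisks?sessionid=42": "<p>Our finest wire whisks.</p>",
    "https://example.com/spatulas": "<p>Spatulas for every omelet.</p>",
}

by_hash = defaultdict(list)
for url, body in pages.items():
    digest = hashlib.sha256(body.strip().encode("utf-8")).hexdigest()
    by_hash[digest].append(url)

for urls in by_hash.values():
    if len(urls) > 1:
        print("Duplicate content at:", urls)
```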
Content doesn’t just mean words, but unfortunately, the crawlers aren’t (yet) sophisticated enough to fully parse things like images and video. If your alt attributes are in good shape, you’re covered for images, but there are some SEO tactics you’ll need if you’re using video on your site. The good news is that once your video SEO is in good shape, well-optimized video can stand out in results in ways plain text can’t.
So you’ve got all that content on your site, but how do you know if it’s actually helping your SEO? The beginning is the right time to set up measurement so you can establish a baseline. Learn more about which metrics you should be tracking and how.
Time for yet another well-earned break. Grab a nap if you can and then spend a day or so observing how these issues are handled by other sites on the web. For maximum learning, try practicing some of your newfound knowledge on a site you have access to.
Set your bookmarks before you go.
When you’re ready to continue learning SEO, newbies should make a stop at on-site related topics to get familiar with robots.txt and HTTPS.
Any veterans still hanging about might want to take a quick read through on-site related topics to see what might have changed with robots.txt and to take in the latest wisdom on HTTPS.
Marketers, you get to sit that one out and head straight on over to link-related topics.
For the true SEO aficionado, there are some technical details that you must get right. We’ve all heard stories of people accidentally blocking their site from being crawled and then wondering where all the traffic went. To keep from becoming one of them, learn about robots.txt: how it helps you get found and when blocking robots is not actually effective.
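Handily, Python’s standard library ships a robots.txt parser, so you can sanity-check your own file. The URLs below are placeholders; swap in your own site to confirm you haven’t accidentally blocked a page you want crawled:

```python
# A minimal robots.txt sanity check using the standard library.
# The site and paths are hypothetical.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the live file

for path in ("/", "/checkout", "/products/whisks"):
    allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```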
The other technical on-site topic you’ll want to master is switching your site from HTTP to HTTPS without slowing it down or losing traffic. This is especially important since Google announced that HTTPS is a ranking signal.
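Here’s a minimal sketch of one sanity check after a migration: confirm that the plain-HTTP version of your homepage answers with a single 301 straight to HTTPS. The hostname is a placeholder, and http.client is used because it doesn’t auto-follow redirects, so you can inspect the first response directly:

```python
# A minimal post-migration check that HTTP 301s to HTTPS.
# The hostname is hypothetical.
import http.client

host = "www.example.com"
conn = http.client.HTTPConnection(host, timeout=10)
conn.request("GET", "/")
resp = conn.getresponse()

location = resp.getheader("Location", "")
if resp.status == 301 and location.startswith("https://"):
    print("OK: 301 ->", location)
else:
    print(f"Check your config: got {resp.status}, Location={location!r}")
conn.close()
```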
See how far you’ve leveled up already by getting current on just those two topics? Bet you aren’t even tired yet.
Newbies, it’s time to dive straight into link-related topics.
Veterans, go check out guest blogging for a look at how that practice has changed.
You now know a lot about how to make your site SEO friendly. Now it’s time to look at how to bend the rest of the Internet to your SEO will. Don’t worry, this’ll be TAGFEE.
External links are a fantastic way to show search engines that your site is credible and useful. They’re also a great way for users to find you by navigating from sites they already use. In short, they build your authority with humans and bots.
There are two effective ways to get more links from external sources: you can either earn them or build them. Chances are that you’ll get the best results by focusing on some combination of those two tactics.
Notice how we didn’t say “buy them”? Don’t buy links.
One tried and true way to build external links is through guest blogging, although this tactic has evolved a lot in the past few years. What used to be an “I give you content, you give me a link” sort of exchange has given way to guest blogging with a purpose.
Veterans, go ahead and pop on over to conversion rate optimization unless you want a refresh on link-related topics like link nofollow and canonicalization.
When you’re out there on the Internet trying to build links, be sure you’re looking for good quality links. Those are links that come from sites that are trustworthy, popular, and relevant to your content. For more information on factors search engines use to determine link value, read this page.
Anchor text is simply the clickable text of a link, whether that link points to another site or to a page within the same site. Its implications reach further, though: keywords in anchor text can help your site rank for those words, but keyword-stuffed anchor text is easy to read as spam. Learn more about best practices for anchor text.
“Nofollow” is a designation you can apply to a link to keep it from passing any link equity (that’s kind of like the SEO equivalent of an up-vote). What might surprise you is that links don’t need to be “followed” to pass human authority. Even nofollowed links can help you build awareness and get more links. So when you’re linking to a site (or to other content on your site) think about whether that link leads to something you’re proud to be associated with.
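If you’re wondering how to tell which links on a page are nofollowed, here’s a minimal sketch using Python’s standard library; the HTML is a stand-in for a fetched page:

```python
# A minimal sketch that lists a page's links and flags nofollow.
# The HTML snippet below is hypothetical.
from html.parser import HTMLParser

class LinkLister(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href:
            rel = (attrs.get("rel") or "").split()
            flag = "nofollow" if "nofollow" in rel else "followed"
            print(f"{flag:9} {href}")

html = """<a href="https://partner.example.com">A partner we vouch for</a>
<a href="https://sketchy.example.net" rel="nofollow">User-submitted link</a>"""

LinkLister().feed(html)
```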
Every Internet user eventually encounters a 404 error page, but that’s just one of the many HTTP status codes found on the web. Learn the difference between a 500 and a 503 along with some best practices for 404 pages here.
One of the most useful HTTP status codes for SEOs is the 301 redirect, which tells search engines that a page has permanently moved elsewhere (and passes most of its link equity to the new URL). Gather all the in-depth info you’ll ever need about 301s and other redirects.
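A handy companion check is to walk a redirect chain hop by hop and print each status code, so you can confirm a clean single 301 rather than a long chain (each extra hop can leak link equity and slow users down). This sketch uses a placeholder URL and, for simplicity, assumes absolute Location headers:

```python
# A minimal redirect-chain walker. The start URL is hypothetical.
import http.client
from urllib.parse import urlsplit

url, hops = "http://www.example.com/old-page", 0
while hops < 5:  # stop runaway chains
    parts = urlsplit(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=10)
    conn.request("GET", parts.path or "/")
    resp = conn.getresponse()
    print(resp.status, url)
    location = resp.getheader("Location")
    conn.close()
    if resp.status not in (301, 302, 307, 308) or not location:
        break
    url, hops = location, hops + 1
```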
Perhaps because it’s one of the hardest SEO words to pronounce, canonicalization has a reputation for being complex. But the basic concept is simple: when two (or more) pages have similar content, canonicalization lets you either combine them (using redirects) or indicate which version search engines should treat as the primary one. Read up on the details of using canonicalization to handle duplicate content.
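Here’s a tiny sketch of the concept: several parameterized URLs serve the same content, and each should point a rel=canonical tag at the version you want indexed. The URLs are hypothetical, and stripping the query string is just one simple way to derive the canonical version:

```python
# A minimal canonicalization sketch. The duplicate URLs are made up;
# stripping the query string yields the canonical each should declare.
from urllib.parse import urlsplit, urlunsplit

duplicates = [
    "https://example.com/whisks?sort=price",
    "https://example.com/whisks?sessionid=42",
    "https://example.com/whisks",
]

for url in duplicates:
    parts = urlsplit(url)
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    print(f'{url}\n  <link rel="canonical" href="{canonical}">')
```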
You’ve now mastered so much SEO knowledge that you could teach the stuff (at least on a 101 level). If you’ve read and digested all the links along the way, you now know so much more about SEO than when you started.
But you’re so self-motivated that you want to know even more, don’t you?
Newbies, read closely through other optimization to refine your knowledge and apply those newly-minted optimization skills to even more aspects of the sites you’re working on.
Marketers, you’ve done a fabulous job powering through all these topics and there’s no doubt you can hold your own in the next SEO team meeting. To take your understanding of optimization even further, skim other optimization.
Or scoot on ahead and test your skills with the SEO Expert Quiz.
There are many ways (beyond the basic SEO knowledge you’ve been accruing here) to give your site an optimization boost. Find (and fix) what’s keeping potential customers from converting with conversion rate optimization, get your storefronts found on the web with local SEO, and find out how to prep your site to show up in international SERPs with international SEO.
If shoppers are abandoning their carts so fast you’re looking around for the tornado, your marketing funnel is acting more like a sieve and it’s time to plug some holes. Stop the bleeding with Paddy Moogan’s five-step framework for CRO. And keep on learning by keeping up with the latest CRO posts from the Moz Blog.
Even if you do most of your business in person at a local shop, customers are still trying to use the Internet to find you (and your hours, phone number, menu, etc.). Make sure they’re getting the right info (and finding you before they find your competitor across the street) by investing some time learning about local SEO. On that page you can also sign up for the Local 7-Pack, a monthly newsletter highlighting the top local SEO news you need to know. Or, watch for the latest local SEO developments on the Moz Blog.
A global customer base is a good thing to have, but you want to use international SEO to make sure potential customers in the UK are finding your British shipping policies instead of your American ones. Master hreflang to direct Chinese customers to content using simplified Chinese characters while you send Taiwanese customers to content that uses the traditional characters they’re used to. And find out how your site structure and whether you’re using a country code top-level domain (ccTLD, like “.uk”) affect your SEO and potential ranking in international SERPs.
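Since hreflang tags are fiddly to write by hand, here’s a minimal sketch that generates a reciprocal set. The URLs are made up; note the script subtags (zh-Hans for simplified Chinese, zh-Hant for traditional), and remember that every version of a page should list every other version plus an x-default:

```python
# A minimal hreflang generator. The URLs are hypothetical; this block
# would go in the <head> of every listed version of the page.
versions = {
    "en-us": "https://example.com/us/shipping",
    "en-gb": "https://example.com/uk/shipping",
    "zh-Hans": "https://example.com/cn/shipping",
    "zh-Hant": "https://example.com/tw/shipping",
}

for lang, href in versions.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{href}">')
print(f'<link rel="alternate" hreflang="x-default" href="{versions["en-us"]}">')
```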
SEO newbies, we really can’t call you newbies anymore. Congratulations! No one has read deeper into this blog post or learned more along the way than you have.
SEO veterans, you knew a lot of this already, but now you’re up to date on the latest tips, tricks, and techniques.
And SEO-curious marketers, if you’re still hanging around, bravo! You can safely add “speaks SEO” as a feather in your cap.
You’re ready to test your skills against the experts and prove just how much you’ve learned: take the SEO Expert Quiz, see how you stack up, and brag about your score.
Congratulations! You’re well on your way to SEO mastery. Bask in that glow for a moment or two before moving on to your next project.
The fun thing about a developing field like SEO is that the learning and adventure never end. Whether you’re looking for more advanced knowledge or just to learn in a different format, try Distilled U’s interactive modules or Market Motive’s web-based classes. If you’re looking for a job in SEO, Carl Hendy might just have your roadmap.
Thanks for following along with this choose your own adventure version of how to learn SEO. Share your favorite resources and ask us about any topics we might have missed in the comments.