Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But the article has become increasingly popular, and I think there’s this hopefulness from worlds outside of the hardcore SEO world, folks who are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.
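To make that kind of diagnosis concrete, here's a minimal sketch (not from the video; it assumes the Python `requests` package plus a hypothetical URL and phrase) of the most basic version of that check: does the text you expect to rank for actually appear in the raw HTML a crawler fetches, or is it only added after load?

```python
# A minimal sketch, not from the video: requires the `requests` package; the URL and
# phrase are hypothetical placeholders.
import requests

URL = "https://example.com/some-page"
PHRASE = "this exact sentence from our page"   # text you expect to be indexable

html = requests.get(URL, timeout=10, headers={"User-Agent": "seo-audit-sketch"}).text

if PHRASE.lower() in html.lower():
    print("Phrase found in the static HTML, so a crawler can at least see it.")
else:
    print("Phrase NOT in the static HTML; it's probably loaded client-side after the page.")
```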

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of the information we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and the other is SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank are broadly correlated, but it’s not one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to the page authority range. I bet it’s like a 0.35 correlation or something. If you were to actually measure this broadly across the web and say, “Hey, were you happier with result one, two, three, four, or five?” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, in the 10 minutes before we filmed this video, I brainstormed a list just off the top of my head. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we block them in robots.txt? Should we keep them out of the index with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console? (There’s a rough audit sketch just after this list.)
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages actually return a 200 status, you’ve made a big error there, and Google could be crawling tons of 404 pages that it thinks are real pages, because you’ve returned a status code 200. Or you’ve used a 404 code when you should have used a 410, which means “permanently removed,” to get the page completely out of the index, as opposed to having Google revisit it and keep it in the index.
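Here's the rough audit sketch mentioned above, covering the last two bullets. It is an assumption-laden example, not anything from the article or video: it uses the Python `requests` and `beautifulsoup4` packages and hypothetical URLs. For each URL it reports the HTTP status code, any meta robots directive, and any rel=canonical target, which is enough to spot soft 404s and to confirm which indexation directives a thin page is actually sending.

```python
# A rough audit sketch, not from the article: requires the `requests` and
# `beautifulsoup4` packages; the URLs are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://example.com/search?q=widgets",   # a thin internal search-results page
    "https://example.com/old-page",           # a page that was removed
]

for url in URLS:
    # allow_redirects=False so we see the page's own status code, not its target's
    resp = requests.get(url, timeout=10, allow_redirects=False)
    soup = BeautifulSoup(resp.text, "html.parser")

    robots_meta = soup.find("meta", attrs={"name": "robots"})
    canonical = None
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            canonical = link.get("href")
            break

    print(url)
    print("  status:   ", resp.status_code)  # removed pages should be 404/410, not 200
    print("  robots:   ", robots_meta.get("content") if robots_meta else "(none)")
    print("  canonical:", canonical or "(none)")
```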

Downtime procedures. There’s a specific 5xx code you can use for this, a 503, that says, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
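For the downtime case specifically, the usual advice is to answer every request with a 503 plus a Retry-After header during planned maintenance, rather than a 404. Here's a minimal sketch of that behavior; Flask is used purely as an example framework, and the one-hour Retry-After value is arbitrary.

```python
# A minimal sketch of a maintenance responder; Flask is just an example framework,
# and the Retry-After value (one hour, in seconds) is arbitrary.
from flask import Flask

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def maintenance(path):
    body = "We're down for maintenance and will be back shortly."
    # 503 = Service Unavailable (temporary); Retry-After hints when crawlers should return.
    return body, 503, {"Retry-After": "3600"}

if __name__ == "__main__":
    app.run()
```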

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics, connecting those up and trying to see: Why did we go up or down? Did we have fewer pages indexed or more pages indexed? More or fewer pages getting traffic? More or fewer keywords sending visits? (There’s a rough sketch of this kind of comparison a little further down.)
  • Understanding advanced search parameters. Today, just today, I was checking out the related: operator in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site (well, most other sites on the web), it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. inanchor: doesn’t work anymore, never mind about that one.
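Here's the rough before/after comparison sketch promised a few bullets up, for diagnosing traffic drops. Everything in it is an assumption: it uses the Python `pandas` package and two hypothetical CSV exports, each with `landing_page` and `sessions` columns, for the two periods you want to compare.

```python
# A hedged sketch: assumes `pandas` and two hypothetical CSV exports, each with
# "landing_page" and "sessions" columns, for the two periods being compared.
import pandas as pd

before = pd.read_csv("sessions_before.csv")   # e.g., the four weeks before the drop
after = pd.read_csv("sessions_after.csv")     # e.g., the four weeks after the drop

merged = before.merge(after, on="landing_page", how="outer",
                      suffixes=("_before", "_after")).fillna(0)
merged["delta"] = merged["sessions_after"] - merged["sessions_before"]

print("Pages with traffic before:", int((merged["sessions_before"] > 0).sum()))
print("Pages with traffic after: ", int((merged["sessions_after"] > 0).sum()))
print(merged.sort_values("delta").head(20))   # the pages that lost the most
```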

I have to go faster, because we’re just going to run out of time. Like, come on. Interpreting and leveraging data in Google Search Console: if you don’t know how to use that, Google could be telling you that you have all sorts of errors, and you don’t know what they are.
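As one hedged example of pulling Search Console data programmatically rather than eyeballing the UI, the sketch below queries the Search Console (webmasters v3) API via the `google-api-python-client` and `google-auth` packages. The service-account key path, property URL, and date range are all hypothetical placeholders.

```python
# A hedged sketch, not an official recipe: assumes the `google-api-python-client`
# and `google-auth` packages, a service-account key at a hypothetical path, and a
# verified property that the account can read.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)          # hypothetical key file

service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",                  # hypothetical verified property
    body={
        "startDate": "2015-07-01",
        "endDate": "2015-07-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 1))
```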

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, X-Robots-Tag, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, and you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or MozScape, or Ahrefs, or Majestic, or SEMrush, or the Alchemy API. Those APIs can do powerful things for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There are a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly. (See the sketch after this list.)
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
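For the page load speed bullet above, here's a hedged sketch that queries Google's PageSpeed Insights API for a score and a list of failing rules. The v2 endpoint and response fields shown are assumptions worth double-checking against the current API documentation, and the URL is a placeholder.

```python
# A hedged sketch using the PageSpeed Insights API; the v2 endpoint and response
# fields are assumptions worth verifying against current docs, and the URL is a
# placeholder. Requires the `requests` package; an API key may be needed at volume.
import requests

API = "https://www.googleapis.com/pagespeedonline/v2/runPagespeed"
data = requests.get(API, params={"url": "https://example.com/", "strategy": "mobile"},
                    timeout=60).json()

print("Speed score:", data.get("ruleGroups", {}).get("SPEED", {}).get("score"))
for rule, details in data.get("formattedResults", {}).get("ruleResults", {}).items():
    print("-", details.get("localizedRuleName", rule))
```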

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


The 2015 #MozCon Video Bundle Has Arrived!

Posted by EricaMcGillivray

The bird has landed, and by bird, I mean the MozCon 2015 Video Bundle! That’s right, 27 sessions and over 15 hours of knowledge from our top notch speakers right at your fingertips. Watch presentations about SEO, personalization, content strategy, local SEO, Facebook graph search, and more to level up your online marketing expertise.

If these videos were already on your wish list, skip ahead:

If you attended MozCon, the videos are included with your ticket. You should have an email in your inbox (sent to the address you registered for MozCon with) containing your unique URL for a free “purchase.”

MozCon 2015 was fantastic! This year, we opened up the room for a few more attendees and to fit our growing staff, which meant 1,600 people showed up. Each year we work to bring our programming one step further with incredible speakers, diverse topics, and tons of tactics and tips for you.


What did attendees say?

We heard directly from 30% of MozCon attendees. Here’s what they had to say about the content:

Did you find the presentations to be advanced enough? 74% found them to be just perfect.

Wil Reynolds at MozCon 2015


What do I get in the bundle?

Our videos feature the presenter and their presentation side-by-side, so there’s no need to flip to another program to view a slide deck. You’ll have easy access to links and reference tools, and the videos even offer closed captioning for your enjoyment and ease of understanding.

For $299, the 2015 MozCon Video Bundle gives you instant access to:

  • 27 videos (over 15 hours) from MozCon 2015
  • Stream or download the videos to your computer, tablet, phone, phablet, or whatever you’ve got handy
  • Downloadable slide decks for all presentations


Bonus! A free full session from 2015!

Because some sessions are just too good to hide behind a paywall. Sample what the conference is all about with a full session from Cara Harshman about personalization on the web:


Surprised and excited to see these videos so early? Huge thanks is due to the Moz team for working hard to process, build, program, write, design, and do all the necessaries to make these happen. You’re the best!

Still not convinced you want the videos? Watch the preview for the Sherlock Christmas Special. Want to attend the live show? Buy your early bird ticket for MozCon 2016. We’ve sold out the conference for the last five years running, so grab your ticket now!


How to Rid Your Website of Six Common Google Analytics Headaches

Posted by amandaecking

I’ve been in and out of Google Analytics (GA) for the past five or so years agency-side. I’ve seen three different code libraries, dozens of new different features and reports roll out, IP addresses stop being reported, and keywords not-so-subtly phased out of the free platform.

Analytics has been a focus of mine for the past year or so—mainly, making sure clients get their data right. Right now, our new focus is closed loop tracking, but that’s a topic for another day. If you’re using Google Analytics, and only Google Analytics for the majority of your website stats, or it’s your primary vehicle for analysis, you need to make sure it’s accurate.

Not having data pulling in or reporting properly is like building a house on a shaky foundation: It doesn’t end well. Usually there are tears.

For some reason, a lot of people, including many of my clients, assume everything is tracking properly in Google Analytics… because Google. But it’s not Google who sets up your analytics. People do that. And people are prone to make mistakes.

I’m going to go through six scenarios where issues are commonly encountered with Google Analytics.

I’ll outline the remedy for each issue, and in the process, show you how to move forward with a diagnosis or resolution.

1. Self-referrals

This is probably one of the areas we’re all familiar with. If you’re seeing a lot of traffic from your own domain, there’s likely a problem somewhere—or you need to extend the default session length in Google Analytics. (For example, if you have a lot of long videos or music clips and don’t use event tracking; a website like TEDx or SoundCloud would be a good equivalent.)

Typically one of the first things I’ll do to help diagnose the problem is include an advanced filter to show the full referrer string. You do this by creating a filter, as shown below:

Filter Type: Custom filter > Advanced
Field A: Hostname
Extract A: (.*)
Field B: Request URI
Extract B: (.*)
Output To: Request URI
Constructor: $A1$B1

You’ll then start seeing the subdomains pulling in. Experience has shown me that if you have a separate subdomain hosted in another location (say, if you work with a separate company and they host and run your mobile site or your shopping cart), it gets treated by Google Analytics as a separate domain. Thus, you’ll need to implement cross-domain tracking. This way, you can narrow down whether or not it’s one particular subdomain that’s creating the self-referrals.

In this example below, we can see all the revenue is being reported to the booking engine (which ended up being cross domain issues) and their own site is the fourth largest traffic source:

It’s also a good idea to check the browser and device reports to start narrowing down whether the issue is specific to a particular element. If it’s not, keep digging. Look at pages pulling the self-referrals and go through the code with a fine-tooth comb, drilling down as much as you can.

2. Unusually low bounce rate

If you have a crazy-low bounce rate, it could be too good to be true. Unfortunately. An unusually low bounce rate could (and probably does) mean that at least some pages of your website have the same Google Analytics tracking code installed twice.

Take a look at your source code, or use Google Tag Assistant (though it does have known bugs) to see if you’ve got GA tracking code installed twice.

While I tell clients that having Google Analytics installed twice on the same page can lead to double the pageviews, I’ve not actually encountered that—I usually just say it to scare them into removing the duplicate implementation more quickly. Don’t tell on me.
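A quick, hedged way to check for a double installation without eyeballing the source by hand: fetch the page and count the tracker references. This sketch assumes the Python `requests` package and a hypothetical URL, and it only looks at the static HTML, so anything injected by a tag manager won't show up here.

```python
# A minimal sketch: assumes the `requests` package and a hypothetical URL, and only
# inspects the static HTML (tag-manager-injected trackers won't appear here).
import re
import requests

html = requests.get("https://example.com/", timeout=10).text

property_ids = re.findall(r"UA-\d{4,10}-\d{1,4}", html)
snippet_hits = len(re.findall(r"analytics\.js|ga\('create'|_gaq\.push", html))

print("GA property IDs found:", property_ids)
print("Tracker snippet references:", snippet_hits)
if len(property_ids) > 1 and len(set(property_ids)) == 1:
    print("The same property ID appears more than once: likely a duplicate installation.")
```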

3. Iframes anywhere

I’ve heard directly from Google engineers and Google Analytics evangelists that Google Analytics does not play well with iframes, and that it will never play nice with this dinosaur technology.

If you track the iframe, you inflate your pageviews, plus you still aren’t tracking everything with 100% clarity.

If you don’t track across iframes, you lose the source/medium attribution and everything becomes a self-referral.

Damned if you do; damned if you don’t.

My advice: Stop using iframes. They’re Netscape-era technology anyway, with rainbow marquees and Comic Sans on top. Interestingly, and unfortunately, a number of booking engines (for hotels) and third-party carts (for ecommerce) still use iframes.

If you have any clients in those verticals, or if you’re in the vertical yourself, check with your provider to see if they use iframes. Or you can check for yourself, by right-clicking as close as you can to the actual booking element:


There is no neat and tidy way to address iframes with Google Analytics, and usually iframes are not the only complicated element of setup you’ll encounter. I spent eight months dealing with a website on a subfolder, which used iframes and had a cross domain booking system, and the best visibility I was able to get was about 80% on a good day.

Typically, I’d approach diagnosing iframes (if, for some reason, I had absolutely no access to viewing a website or talking to the techs) similarly to diagnosing self-referrals, as self-referrals are one of the biggest symptoms of iframe use.

4. Massive traffic jumps

Massive jumps in traffic don’t typically just happen. (Unless, maybe, you’re Geraldine.) There’s always an explanation—a new campaign launched, you just turned on paid ads for the first time, you’re using content amplification platforms, you’re getting a ton of referrals from that recent press in The New York Times. And if you think it just happened, it’s probably a technical glitch.

I’ve seen everything from inflated pageviews resulting from including tracking on iframes and unnecessary implementation of virtual pageviews, to not realizing the tracking code was installed on other microsites for the same property. Oops.

Usually I’ve seen this happen when the tracking code was somewhere it shouldn’t be, so if you’re investigating a situation of this nature, first confirm the Google Analytics code is only in the places it needs to be. Tools like Google Tag Assistant and Screaming Frog can be your BFFs in helping you figure this out.

Also, I suggest bribing the IT department with sugar (or booze) to see if they’ve changed anything lately.

5. Cross-domain tracking

I wish cross-domain tracking with Google Analytics out of the box didn’t require any additional setup. But it does.

If you don’t have it set up properly, things break down quickly, and can be quite difficult to untangle.

The older the GA library you’re using, the harder it is. The easiest setup, by far, is Google Tag Manager with Universal Analytics. Hard-coded Universal Analytics is a bit more difficult because you have to implement autoLink manually and decorate forms, if you’re using them (and you probably are). Beyond that, rather than try and deal with it, I say update your Google Analytics code. Then we can talk.

Where I’ve seen the most murkiness with tracking is when parts of cross domain tracking are implemented, but not all. For some reason, if allowLinker isn’t included, or you forget to decorate all the forms, the cookies aren’t passed between domains.

The absolute first place I would start with this would be confirming the cookies are all passing properly at all the right points, forms, links, and smoke signals. I’ll usually use a combination of the Real Time report in Google Analytics, Google Tag Assistant, and GA debug to start testing this. Any debug tool you use will mean you’re playing in the console, so get friendly with it.
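One rough way to sanity-check the linker piece from the server side: when autoLink and allowLinker are working, visits arriving on the second domain should carry a `_ga` parameter in the landing URL. The sketch below checks a hypothetical list of landing URLs (for example, pulled from the destination site's access logs) for that parameter; the URLs shown are made up.

```python
# A hedged server-side spot check: assumes a hypothetical list of landing URLs on
# the second domain (e.g., pulled from its access logs). When autoLink/allowLinker
# are working, those URLs should carry a `_ga` linker parameter.
from urllib.parse import urlparse, parse_qs

landing_urls = [
    "https://booking.example.com/start?_ga=1.123456789.1234567890.1441234567",
    "https://booking.example.com/start",
]

decorated = [u for u in landing_urls if "_ga" in parse_qs(urlparse(u).query)]
print(f"{len(decorated)} of {len(landing_urls)} landing URLs carry the _ga linker parameter")
for u in landing_urls:
    if u not in decorated:
        print("Undecorated:", u)
```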

6. Internal use of UTM strings

I’ve saved the best for last. Internal use of campaign tagging. We may think, oh, I use Google to tag my campaigns externally, and we’ve got this new promotion on site which we’re using a banner ad for. That’s a campaign. Why don’t I tag it with a UTM string?

Step away from the keyboard now. Please.

When you tag internal links with UTM strings, you override the original source/medium. So that visitor who came in through your paid ad and then who clicks on the campaign banner has now been manually tagged. You lose the ability to track that they came through on the ad the moment they click on the tagged internal link. Their source and medium is now your internal campaign, not that paid ad you’re spending gobs of money on and have to justify to your manager. See the problem?

I’ve seen at least three pretty spectacular instances of this in the past year, and a number of smaller instances of it. Annie Cushing also talks about the evils of internal UTM tags and the odd prevalence of it. (Oh, and if you haven’t explored her blog, and the amazing spreadsheets she shares, please do.)

One clothing company I worked with tagged all of their homepage offers with UTM strings, which resulted in the loss of visibility for one-third of their audience: One million visits over the course of a year, and $2.1 million in lost revenue.

Let me say that again. One million visits, and $2.1 million. That couldn’t be attributed to an external source/campaign/spend.

Another client I audited included campaign tagging on nearly every navigational element on their website. It still gives me nightmares.

If you want to see if you have any internal UTM strings, head straight to the Campaigns report in Acquisition in Google Analytics, and look for anything like “home” or “navigation” or any language you may use internally to refer to your website structure.
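If you'd rather scan the site itself than the Campaigns report, here's a minimal sketch that flags internal links carrying UTM parameters. It assumes the Python `requests` and `beautifulsoup4` packages and a hypothetical start URL, and it only checks a single page; a real audit would crawl the whole site.

```python
# A minimal sketch: assumes the `requests` and `beautifulsoup4` packages and a
# hypothetical start URL, and only checks one page; a real audit would crawl the site.
from urllib.parse import urljoin, urlparse, parse_qs

import requests
from bs4 import BeautifulSoup

START = "https://example.com/"
host = urlparse(START).netloc

soup = BeautifulSoup(requests.get(START, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    full = urljoin(START, a["href"])
    parts = urlparse(full)
    if parts.netloc == host and any(k.startswith("utm_") for k in parse_qs(parts.query)):
        print("Internal UTM-tagged link:", full)
```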

And if you want to see how users are moving through your website, go to the Flow reports. Or if you really, really, really want to know how many people click on that sidebar link, use event tracking. But please, for the love of all things holy (and to keep us analytics lovers from throwing our computers across the room), stop using UTM tagging on your internal links.

Now breathe and smile

Odds are, your Google Analytics setup is fine. If you are seeing any of these issues, though, you have somewhere to start in diagnosing and addressing the data.

We’ve looked at six of the most common points of friction I’ve encountered with Google Analytics and how to start investigating them: self-referrals, bounce rate, iframes, traffic jumps, cross domain tracking and internal campaign tagging.

What common data integrity issues have you encountered with Google Analytics? What are your favorite tools to investigate?


How to Combat 5 of the SEO World’s Most Infuriating Problems – Whiteboard Friday

Posted by randfish

These days, most of us have learned that spammy techniques aren’t the way to go, and we have a solid sense for the things we should be doing to rank higher, and ahead of our often spammier competitors. Sometimes, maddeningly, it just doesn’t work. In today’s Whiteboard Friday, Rand talks about five things that can infuriate SEOs with the best of intentions, why those problems exist, and what we can do about them.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

What SEO problems make you angry?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about some of the most infuriating things in the SEO world, specifically five problems that I think plague a lot of folks and some of the ways that we can combat and address those.

I’m going to start with one of the things that really infuriates a lot of new folks to the field, especially folks who are building new and emerging sites and are doing SEO on them. You have all of these best practices lists. You might look at a web developer’s cheat sheet or sort of a guide to on-page and on-site SEO. You go, “Hey, I’m doing it. I’ve got my clean URLs, my good, unique content, my solid keyword targeting, schema markup, useful internal links, my XML sitemap, and my fast load speed. I’m mobile friendly, and I don’t have manipulative links.”

Great. “Where are my results? What benefit am I getting from doing all these things, because I don’t see one?” I took a site that was not particularly SEO friendly, maybe it’s a new site, one I just launched or an emerging site, one that’s sort of slowly growing but not yet a power player. I do all this right stuff, and I don’t get SEO results.

This makes a lot of people stop investing in SEO, stop believing in SEO, and stop wanting to do it. I can understand where you’re coming from. The challenge is not that you’ve done something wrong. It’s that this stuff, all of these things that you do right, especially things that you do right on your own site or from a best practices perspective, they don’t increase rankings. They don’t. That’s not what they’re designed to do.

1) Following best practices often does nothing for new and emerging sites

This stuff, all of these best practices are designed to protect you from potential problems. They’re designed to make sure that your site is properly optimized so that you can perform to the highest degree that you are able. But this is not actually rank boosting stuff unfortunately. That is very frustrating for many folks. So following a best practices list, the idea is not, “Hey, I’m going to grow my rankings by doing this.”

On the flip side, many folks do these things on larger, more well-established sites, sites that have a lot of ranking signals already in place. They’re bigger brands, they have lots of links to them, and they have lots of users and usage engagement signals. You fix this stuff. You fix stuff that’s already broken, and boom, rankings pop up. Things are going well, and more of your pages are indexed. You’re getting more search traffic, and it feels great. This is a challenge, on our part, of understanding what this stuff does, not a challenge on the search engine’s part of not ranking us properly for having done all of these right things.

2) My competition seems to be ranking on the back of spammy or manipulative links

What’s going on? I thought Google had introduced all these algorithms to kind of shut this stuff down. This seems very frustrating. How are they pulling this off? I look at their link profile, and I see a bunch of directories, Web 2.0 sites — I love that the spam world decided that that’s Web 2.0 sites — article sites, private blog networks, and dofollow blogs.

You look at this stuff and you go, “What is this junk? It’s terrible. Why isn’t Google penalizing them for this?” The answer, the right way to think about this and to come at this is: Are these really the reason that they rank? I think we need to ask ourselves that question.

One thing that we don’t know, that we can never know, is: Have these links been disavowed by our competitor here?

I’ve got my HulksIncredibleStore.com and their evil competitor Hulk-tastrophe.com. Hulk-tastrophe has got all of these terrible links, but maybe they disavowed those links and you would have no idea. Maybe they didn’t build those links. Perhaps those links came in from some other place. They are not responsible. Google is not treating them as responsible for it. They’re not actually what’s helping them.

If they are helping, and it’s possible they are, there are still instances where we’ve seen spam propping up sites. No doubt about it.

I think the next logical question is: Are you willing to lose your site or brand? What we don’t see anymore is sites like this, which are ranking on the back of these things and have generally less legitimate and good links, ranking for two or three or four years. You can see it for a few months, maybe even a year, but this stuff is getting hit hard and getting hit frequently. So unless you’re willing to lose your site, pursuing their links is probably not a strategy.

Then, what other signals, potentially links you might not be considering, but also non-linking signals, could be helping them rank? I think a lot of us get blinded in the SEO world by link signals, and we forget to look at things like: Do they have a phenomenal user experience? Are they growing their brand? Are they doing offline kinds of things that are influencing online? Are they gaining engagement from other channels that’s then influencing their SEO? Do they have things coming in that I can’t see? If you don’t ask those questions, you can’t really learn from your competitors, and you just feel the frustration.

3) I have no visibility or understanding of why my rankings go up vs down

On my HulksIncredibleStore.com, I’ve got my infinite stretch shorts, which I don’t know why he never wears — he should really buy those — my soothing herbal tea, and my anger management books. I look at my rankings and they kind of jump up all the time, jump all over the place all the time. Actually, this is pretty normal. I think we’ve done some analyses here, and the average page one search results shift is 1.5 or 2 position changes daily. That’s sort of the MozCast dataset, if I’m recalling correctly. That means that, over the course of a week, it’s not uncommon or unnatural for you to be bouncing around four, five, or six positions up, down, and those kind of things.

I think we should understand what can be behind these things. That’s a very simple list. You made changes, Google made changes, your competitors made changes, or searcher behavior has changed in terms of volume, in terms of what they were engaging with, what they’re clicking on, what their intent behind searches are. Maybe there was just a new movie that came out and in one of the scenes Hulk talks about soothing herbal tea. So now people are searching for very different things than they were before. They want to see the scene. They’re looking for the YouTube video clip and those kind of things. Suddenly Hulk’s soothing herbal tea is no longer directing as well to your site.

So changes like these things can happen. We can’t understand all of them. I think it’s up to us to determine the degree of analysis and action that’s actually going to provide a return on investment. Looking at these day over day or week over week and throwing up our hands and getting frustrated probably provides very little return on investment. Looking over the long term and saying, “Hey, over the last 6 months, we can observe 26 weeks of ranking change data, and we can see that in aggregate we are now ranking higher and for more keywords than we were previously, and so we’re going to continue pursuing this strategy. This is the set of keywords that we’ve fallen most on, and here are the factors that we’ve identified that are consistent across that group.” I think looking at rankings in aggregate can give us some real positive ROI. Looking at one or two keywords, one week to the next, gives probably very little ROI.
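Here's a hedged sketch of that aggregate view. It assumes a hypothetical CSV of weekly rank-tracking data with `keyword`, `week`, and `rank` columns and the Python `pandas` package; it splits the period in half and counts how many keywords improved versus declined.

```python
# A hedged sketch: assumes a hypothetical CSV of weekly rank-tracking data with
# "keyword", "week", and "rank" columns, and the `pandas` package.
import pandas as pd

ranks = pd.read_csv("weekly_rankings.csv", parse_dates=["week"])

weeks = sorted(ranks["week"].unique())
midpoint = weeks[len(weeks) // 2]
first_half = ranks[ranks["week"] < midpoint]
second_half = ranks[ranks["week"] >= midpoint]

summary = pd.DataFrame({
    "avg_rank_early": first_half.groupby("keyword")["rank"].mean(),
    "avg_rank_late": second_half.groupby("keyword")["rank"].mean(),
}).dropna()
# Positive change = the average position number went down = the keyword improved
summary["change"] = summary["avg_rank_early"] - summary["avg_rank_late"]

print("Keywords improving:", int((summary["change"] > 0).sum()))
print("Keywords declining:", int((summary["change"] < 0).sum()))
print(summary.sort_values("change").head(10))   # the keywords we've fallen most on
```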

4) I cannot influence or affect change in my organization because I cannot accurately quantify, predict, or control SEO

That’s true, especially with things like keyword not provided and certainly with the inaccuracy of data that’s provided to us through Google’s Keyword Planner inside of AdWords, for example, and the fact that no one can really control SEO, not fully anyway.

You get up in front of your team, your board, your manager, your client and you say, “Hey, if we don’t do these things, traffic will suffer,” and they go, “Well, you can’t be sure about that, and you can’t perfectly predict it. Last time you told us something, something else happened. So because the data is imperfect, we’d rather spend money on channels that we can perfectly predict, that we can very effectively quantify, and that we can very effectively control.” That is understandable. I think that businesses have a lot of risk aversion naturally, and so wanting to spend time and energy and effort in areas that you can control feels a lot safer.

Some ways to get around this are, first off, know your audience. If you know who you’re talking to in the room, you can often determine the things that will move the needle for them. For example, I find that many managers, many boards, many executives are much more influenced by competitive pressures than they are by, “We won’t do as well as we did before, or we’re losing out on this potential opportunity.” Saying that is less powerful than saying, “This competitor, who I know we care about and we track ourselves against, is capturing this traffic and here’s how they’re doing it.”

Show multiple scenarios. Many of the SEO presentations that I see and have seen and still see from consultants and from in-house folks come with kind of a single, “Hey, here’s what we predict will happen if we do this or what we predict will happen if we don’t do this.” You’ve got to show multiple scenarios, especially when you know you have error bars because you can’t accurately quantify and predict. You need to show ranges.

So instead of this, I want to see: What happens if we do it a little bit? What happens if we really overinvest? What happens if Google makes a much bigger change on this particular factor than we expect or our competitors do a much bigger investment than we expect? How might those change the numbers?

Then I really do like bringing case studies, especially if you’re a consultant, but even in-house there are so many case studies in SEO on the web today, you can almost always find someone who’s analogous or nearly analogous and show some of their data, some of the results that they’ve seen. Places like SEMrush, a tool that offers competitive intelligence around rankings, can be great for that. You can show, hey, this media site in our sector made these changes. Look at the delta of keywords they were ranking for over the next six months. Correlation is not causation, but that can be a powerful influencer showing those kinds of things.

Then last, but not least, any time you’re going to get up like this and present to a group around these topics, if you possibly can, try to talk one-on-one with the participants before the meeting actually happens. I have found it almost universally the case that if you haven’t had the discussions beforehand (“What are your concerns? What do you think is not valid about this data? Hey, I want to run this by you and get your thoughts before we go to the meeting”), people can gang up and pile on once you’re in the group setting. One person says, “Hey, I don’t think this is right,” and everybody in the room kind of looks around and goes, “Yeah, I also don’t think that’s right.” Then it just turns into warfare and conflict that you don’t want or need. If you address those things beforehand, then you can include the data, the presentations, and the “I don’t know the answer to this and I know this is important to so-and-so” in that presentation or in that discussion. It can be hugely helpful. Big difference between winning and losing with that.

5) Google is biasing to big brands. It feels hopeless to compete against them

A lot of people are feeling this hopelessness, hopelessness in SEO about competing against them. I get that pain. In fact, I’ve felt that very strongly for a long time in the SEO world, and I think the trend has only increased. This comes from all sorts of stuff. Brands now have the little dropdown next to their search result listing. There are these brand and entity connections. As Google is using answers and knowledge graph more and more, it’s feeling like those entities are having a bigger influence on where things rank and where they’re visible and where they’re pulling from.

User and usage behavior signals on the rise means that big brands, who have more of those signals, tend to perform better. Brands in the knowledge graph, brands growing links without any effort, they’re just growing links because they’re brands and people point to them naturally. Well, that is all really tough and can be very frustrating.

I think you have a few choices on the table. First off, you can choose to compete with brands where they can’t or won’t. So this is areas like we’re going after these keywords that we know these big brands are not chasing. We’re going after social channels or people on social media that we know big brands aren’t. We’re going after user generated content because they have all these corporate requirements and they won’t invest in that stuff. We’re going after content that they refuse to pursue for one reason or another. That can be very effective.

You better be building, growing, and leveraging your competitive advantage. Whenever you build an organization, you’ve got to say, “Hey, here’s who is out there. This is why we are uniquely better or a uniquely better choice for this set of customers than these other ones.” If you can leverage that, you can generally find opportunities to compete and even to win against big brands. But those things have to become obvious, they have to become well-known, and you need to essentially build some of your brand around those advantages, or they’re not going to give you help in search. That includes media, that includes content, that includes any sort of press and PR you’re doing. That includes how you do your own messaging, all of these things.

Third, you can choose to serve a market or a customer that they don’t or won’t. That can be a powerful way to go about search, because usually search is bifurcated by the customer type. There will be slightly different forms of search queries that are entered by different kinds of customers, and you can pursue one of those that isn’t pursued by the competition.

Last, but not least, I think for everyone in SEO we all realize we’re going to have to become brands ourselves. That means building the signals that are typically associated with brands — authority, recognition from an industry, recognition from a customer set, awareness of our brand even before a search has happened. I talked about this in a previous Whiteboard Friday, but I think because of these things, SEO is becoming a channel that you benefit from as you grow your brand rather than the channel you use to initially build your brand.

All right, everyone. Hope these have been helpful in combating some of these infuriating, frustrating problems and that we’ll see some great comments from you guys. I hope to participate in those as well, and we’ll catch you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Should I Rebrand and Redirect My Site? Should I Consolidate Multiple Sites/Brands? – Whiteboard Friday

Posted by randfish

Making changes to your brand is a huge step, and while it’s sometimes the best path forward, it isn’t one to be taken lightly. In today’s Whiteboard Friday, Rand offers some guidance to marketers who are wondering whether a rebrand/redirect is right for them, and also those who are considering consolidating multiple sites under a single brand.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

To rebrand, or not to rebrand, that is the question

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Today we’re going to chat a little bit about whether you should rebrand and consider redirecting your existing website or websites and whether you should potentially consolidate multiple websites and brands that you may be running.

So we’ve talked before about redirection moves best practices. We’ve also talked about the splitting of link equity and domain authority and those kinds of things. But one of the questions that people have is, “Gosh, you know I have a website today and given the moves that Google has been making, that the social media world has been making, that content marketing has been making, I’m wondering whether I should potentially rebrand my site.” Lots of people bought domains back in the day that were exact match domains or partial match domains or that they thought reflected a move of the web toward or away from less brand-centric stuff and toward more keyword matching, topic matching, intent matching kinds of things.

Maybe you’re reconsidering those moves and you want to know, “Hey, should I be thinking about making a change now?” That’s what I’m here to answer. So this question of whether to rebrand or not to rebrand is a tough one, because you know that when you do that rebrand, you will almost certainly take a traffic hit, and SEO is one of the biggest places where people typically take that traffic hit.

Moz previously was at SEOmoz.org and moved to moz.com. We saw a dip in our traffic over about 3 to 4 months before it fully recovered, and I would say that dip was between 15% and 25% of our search traffic, depending on week to week. I’ll link to a list of metrics that I put on my personal blog, Moz.com/rand, so that you can check those out if you’d like to see them. But it was a short recovery time for us.

One of the questions that people always have is, “Well wait, did you lose rankings for SEO since SEO used to be in your domain name?” The answer is no. In fact, six months after the move, we were ranking higher for SEO related terms and phrases.

Scenario A: Rebranding or redirecting scifitoysandgames.com

So let’s imagine that today you are running SciFiToysAndGames.com, which is right on the borderline. In my opinion, that’s right on the borderline of barely tolerable. It could be brandable, but it’s not great. I don’t love the “sci-fi” in here, partially because of how the Syfy channel, the entity that broadcasts this stuff on television, has chosen to spell its name; “sci-fi” can be misinterpreted as to how it’s spelled. I don’t love having to have “and” in a domain name. This is long. All sorts of stuff.

Let’s say you also own StarToys.com, but you haven’t used it. Previously StarToys.com has been redirecting to SciFiToysAndGames.com, and you’re thinking, “Well, man, is it the right time to make this move? Should I make this change now? Should I wait for the future?”

How memorable or amplifiable is your current brand?

Well, these are the questions that I would urge you to consider. How memorable and amplifiable is your current brand? That’s something that if you are recognizing like, “Hey I think our brand name, in fact, is holding us back in search results and social media amplification, press, in blog mentions, in journalist links and these kinds of things,” well, that’s something serious to think about. Word of mouth too.

Will you maintain your current brand name long term?

So if you know that sometime in the next two, three, four, or five years you do want to move to StarToys, I would actually strongly urge you to do that right now, because the longer you wait, the longer it will take to build up the signals around the new domain and the more pain you’ll potentially incur by having to keep branding this and working on this old brand name. So I would strongly urge you, if you know you’re going to make the move eventually, make it today. Take the pain now, rather than more pain later.

Can or have you tested brand preference with your target audience?

I would urge you to find two different groups, one who are loyal customers today, people who know SciFiToysAndGames.com and have used it, and two, people who are potential customers, but aren’t yet familiar with it.

You don’t need to do big sample-sizes. If you can get 5, 10, or 15 people either in a room or talk to them in person, you can try some web surveys, you can try using some social media ads like things on Facebook. I’ve seen some companies do some testing around this. Even buying potential PPC ads and seeing how click-through rates perform and sentiment and those kinds of things, that is a great way to help validate your ideas, especially if you’re forced to bring data to a table by executives or other stakeholders.

How much traffic would you need in one year to justify a URL move?

The last thing I think about is imagine, and I want you to either imagine or even model this out, mathematically model it out. If your traffic growth rate — so let’s say you’re growing at 10% year-over-year right now — if that improved 1%, 5%, or 10% annually with a new brand name, would you make the move? So knowing that you might take a short-term hit, but then that your growth rate would be incrementally higher in years to come, how big would that growth rate need to be?

I would say that, in general, if I were thinking about these two domains, granted this is a hard case because you don’t know exactly how much more brandable or word-of-mouth-able or amplifiable your new one might be compared to your existing one. Well, gosh, my general thing here is if you think that’s going to be a substantive percentage, say 5% plus, almost always it’s worth it, because compound growth rate over a number of years will mean that you’re winning big time. Remember that growth rate is different than raw growth. If you can incrementally increase your growth rate, you get tremendously more traffic when you look back two, three, four, or five years later.
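As a small worked example of that compounding point (purely illustrative numbers, not anything from the video): start both scenarios from the same baseline traffic and compare a 10% annual growth rate against a 15% one, i.e. the kind of lift a more brandable name might buy you.

```python
# Purely illustrative numbers, not from the video: the same baseline traffic under a
# 10% annual growth rate (keep the current name) versus 15% (rebrand lifts the rate).
baseline = 100_000  # hypothetical annual visits today

for year in range(1, 6):
    keep = baseline * (1.10 ** year)
    rebrand = baseline * (1.15 ** year)
    print(f"Year {year}: keep={keep:,.0f}  rebrand={rebrand:,.0f}  "
          f"extra={rebrand - keep:,.0f}")
```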

Where does your current and future URL live on the domain/brand name spectrum?

I also made this domain name, brand name spectrum, because I wanted to try and visualize crappiness of domain name, brand name to really good domain name, brand name. I wanted to give some examples and then extract out some elements so that maybe you can start to build on these things thematically as you’re considering your own domains.

So from awful, we go to tolerable, good, and great. So Science-Fi-Toys.net is obviously terrible. I’ve taken a contraction of the name and the actual one. It’s got a .net. It’s using hyphens. It’s infinitely unmemorable up to what I think is tolerable — SciFiToysAndGames.com. It’s long. There are some questions about how type-in-able it is, how easy it is to type in. SciFiToys.com, which that’s pretty good. SciFiToys, relatively short, concise. It still has the “sci-fi” in there, but it’s a .com. We’re getting better. All the way up to, I really love the name, StarToys. I think it’s very brandable, very memorable. It’s concise. It’s easy to remember and type in. It has positive associations probably with most science fiction toy buyers who are familiar with at least “Star Wars” or “Star Trek.” It’s cool. It has some astronomy connotations too. Just a lot of good stuff going on with that domain name.

Then, another one, Region-Data-API.com. That sucks. NeighborhoodInfo.com. Okay, at least I know what it is. Neighborhood is a really hard name to type because it is very hard for many people to spell and remember. It’s long. I don’t totally love it. I don’t love the “info” connotation, which is generic-y.

DistrictData.com has a nice, alliterative ring to it. But maybe we could do even better and actually there is a company, WalkScore.com, which I think is wonderfully brandable and memorable and really describes what it is without being too in your face about the generic brand of we have regional data about places.

What if you’re doing mobile apps? BestAndroidApps.com. You might say, “Why is that in awful?” The answer is two things. One, it’s the length of the domain name and then the fact that you’re actually using someone else’s trademark in your name, which can be really risky. Especially if you start blowing up, getting big, Google might go and say, “Oh, do you have Android in your domain name? We’ll take that please. Thank you very much.”

BestApps.io, in the tech world, it’s very popular to use domains like .io or .ly. Unfortunately, I think once you venture outside of the high tech world, it’s really tough to get people to remember that that is a domain name. If you put up a billboard that says “BestApps.com,” a majority of people will go, “Oh, that’s a website.” But if you use .io, .ly, or one of the new domain names, .ninja, a lot of people won’t even know to connect that up with, “Oh, they mean an Internet website that I can type into my browser or look for.”

So we have to remember that we sometimes live in a bubble. Outside of that bubble are a lot of people for whom, if it’s not .com, it’s questionable whether they’re even going to know what it is. Remember that outside of the U.S., country code domain names work equally well — .co.uk, .ca, .co.za, wherever you are.

InstallThis.com. Now we’re getting better. Memorable, clear. Then all the way up to AppCritic.com, a name I really like. I have positive associations with it: “Oh yeah, restaurant critics, food critics, and movie critics, and this is an app critic. Great, that’s very cool.”

What are the things that land a name at this end? Well, stuff at this end of the spectrum tends to be generic, forgettable, hard to type in. It’s long, brand-infringing (danger, danger), and sketchy sounding. It’s hard to quantify what “sketchy sounding” is, but you know it when you see it. When you’re reviewing domain names, you’re looking for links, you’re looking at things in the SERPs, and you’re like, “Hmm, I don’t know about this one.” Having that sixth sense is something that we all develop over time, so “sketchy sounding” is not quite as scientific a description as I might want, but it’s powerful.

On this end of the spectrum, though, domain names and brand names tend to be unique, memorable, and short. They use .com, which is, unfortunately, still the gold standard. They’re easy to type in and pronounceable. That’s a powerful thing too, especially because of word of mouth. We suffered with that for a long time with SEOmoz, because many people saw it and thought, “Oh, ShowMoz, COMoz, SeeMoz.” It sucked. They have positive associations, like StarToys or WalkScore or AppCritic. They have these positive, pre-built-in associations psychologically that suggest something brandable.

Scenario B: Consolidating two sites

Scenario B, and then we’ll get to the end, but scenario B is the question, “Should I consolidate?” Let’s say I’m running both of these today. Or, more realistically (I see people in this situation all the time), you’re running AppCritic.com and StarToys.com, and you think, “Boy, these are pretty separate.” But then you keep finding overlap between them. Your content tends to overlap, the audience tends to overlap. I find this with many, many folks who run multiple domains.

How much audience and content overlap is there?

So we’ve got to consider a few things. First off, that audience and content overlap. If you’ve got StarToys and AppCritic and the overlap is very thin, just that little, tiny piece in the middle there, where the content doesn’t overlap much and the audience doesn’t overlap much, then consolidating probably doesn’t make that much sense.

But what if you’re finding like, “Gosh, man, we’re writing more and more about apps and tech and mobile and web stuff on StarToys, and we’re writing more and more about other kinds of geeky, fun things on AppCritic. Slowly it feels like these audiences are merging.” Well, now you might want to consider that consolidation.

Is there potential for separate sales or exits?

Second point of consideration, the potential for separate exits or sales. So if you know that you’re going to sell AppCritic.com to someone in the future and you want to make sure that’s separate from StarToys, you should keep them separate. If you think to yourself, “Gosh, I’d never sell one without the other. They’re really part of the same company, brand, effort,” well, I’d really consider that consolidation.

Will you dilute marketing or branding efforts?

The last point of consideration is dilution of marketing and branding efforts. Remember that you’re going to be working on marketing. You’re going to be working on branding. You’re going to be working on growing traffic to these. When you split your efforts, unless you have two relatively large, separate teams, this is very, very hard to do at the same rate that it could be done if you combined those efforts. So another big point of consideration. That compound growth rate that we talked about, that’s another big consideration with this.

Is the topical focus out of context?

What I don’t recommend you consider, and what has unfortunately been considered by a lot of folks in the SEO-centric world in the past, is the topical focus of the content. I’m actually crossing this out. Not a big consideration. You might say to yourself, “But Rand, we talked previously on Whiteboard Friday about how I can have topical authority around toys and games that are related to science fiction stuff, and I can have topical authority related to mobile apps.”

My answer is if the content overlap is strong and the audience overlap is strong, you can do both on one domain. You can see many, many examples of this across the web, Moz being a great example, where we talk about startups and technology and sometimes venture capital and team building and broad marketing and paid search marketing and organic search marketing and just a ton of topics, but all serving the same audience with overlapping content. Because that overlap is strong, we can be an authority in all of these realms. The same goes for any time you’re considering these things.

All right, everyone, hope you’ve enjoyed this edition of Whiteboard Friday. I look forward to some great comments, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com



I Can’t Drive 155: Meta Descriptions in 2015

Posted by Dr-Pete

For years now, we (and many others) have been recommending keeping your meta descriptions shorter than about 155-160 characters. For months, people have been sending me examples of search snippets that clearly broke that rule, like this one (on a search for “hummingbird food”):

For the record, this one clocks in at 317 characters (counting spaces). So, I set out to discover if these long descriptions were exceptions to the rule, or if we need to change the rules. I collected the search snippets across the MozCast 10K, which resulted in 92,669 snippets. All of the data in this post was collected on April 13, 2015.

The Basic Data

The minimum snippet length was zero characters. There were 69 zero-length snippets, but most of these were the new generation of answer box, which appears organic but doesn’t have a snippet. To put it another way, these were misidentified as organic by my code. The other zero-length snippets were local one-boxes that appeared as organic but had no snippet, such as this one for “chichen itza”:

These zero-length snippets were removed from further analysis, but considering that they only accounted for 0.07% of the total data, they didn’t really impact the conclusions either way. The shortest legitimate, non-zero snippet was 7 characters long, on a search for “geek and sundry”, and appears to have come directly from the site’s meta description:

The maximum snippet length that day (this is a highly dynamic situation) was 372 characters. The winner appeared on a search for “benefits of apple cider vinegar”:

The average length of all of the snippets in our data set (not counting zero-length snippets) was 143.5 characters, and the median length was 152 characters. Of course, this can be misleading, since some snippets are shorter than the limit and others are being artificially truncated by Google. So, let’s dig a bit deeper.

The Bigger Picture

To get a better idea of the big picture, let’s take a look at the display length of all 92,600 snippets (with non-zero length), split into 20-character buckets (0-20, 21-40, etc.):

Most of the snippets (62.1%) cut off as expected, right in the 141-160 character bucket. Of course, some snippets were shorter than that, and didn’t need to be cut off, and some broke the rules. About 1% (1,010) of the snippets in our data set measured 200 or more characters. That’s not a huge number, but it’s enough to take seriously.
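For anyone who wants to reproduce this kind of distribution on their own crawl data, here’s a minimal sketch of the bucketing. It assumes you already have the display snippets as a list of strings; the variable name and the sample values below are placeholders, not the MozCast data:

from collections import Counter

snippets = ["Example snippet text ...", "Another, longer example snippet ..."]  # placeholder data

# Bucket non-zero snippet lengths (spaces included) into 20-character bins: 1-20, 21-40, ...
buckets = Counter()
for s in snippets:
    length = len(s)
    if length == 0:
        continue  # drop zero-length snippets, as in the analysis above
    upper = ((length - 1) // 20 + 1) * 20
    buckets[f"{upper - 19}-{upper}"] += 1

total = sum(buckets.values())
for label, count in sorted(buckets.items(), key=lambda kv: int(kv[0].split("-")[0])):
    print(f"{label:>9}: {count} ({count / total:.1%})")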

That 141-160 character bucket is dwarfing everything else, so let’s zoom in a bit on the cut-off range, and just look at snippets in the 120-200 character range (in this case, by 5-character bins):

Zooming in, the bulk of the snippets are displaying at lengths between about 146-165 characters. There are plenty of exceptions to the 155-160 character guideline, but for the most part, they do seem to be exceptions.

Finally, let’s zoom in on the rule-breakers. This is the distribution of snippets displaying 191+ characters, bucketed in 10-character bins (191-200, 201-210, etc.):

Please note that the Y-axis scale is much smaller than in the previous 2 graphs, but there is a pretty solid spread, with a decent chunk of snippets displaying more than 300 characters.

Without looking at every original meta description tag, it’s very difficult to tell exactly how many snippets have been truncated by Google, but we do have a proxy. Snippets that have been truncated end in an ellipsis (…), which rarely appears at the end of a natural description. In this data set, more than half of all snippets (52.8%) ended in an ellipsis, so we’re still seeing a lot of meta descriptions being cut off.
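If you want to apply the same proxy to your own SERP data, the check itself is simple. This is just a sketch under the same assumption (snippets as plain strings); it looks for a trailing ellipsis, either three dots or the single “…” character:

def looks_truncated(snippet):
    """Return True if a display snippet appears to have been cut off."""
    s = snippet.rstrip()
    return s.endswith("...") or s.endswith("\u2026")  # "..." or the single-character ellipsis

snippets = ["A description that Google cut off mid-sentence ...", "A short, intact description."]
truncated = sum(looks_truncated(s) for s in snippets)
print(f"{truncated}/{len(snippets)} snippets ({truncated / len(snippets):.0%}) end in an ellipsis")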

I should add that, unlike titles/headlines, it isn’t clear whether Google is cutting off snippets by pixel width or character count, since that cut-off is done on the server-side. In most cases, Google will cut before the end of the second line, but sometimes they cut well before this, which could suggest a character-based limit. They also cut off at whole words, which can make the numbers a bit tougher to interpret.

The Cutting Room Floor

There’s another difficulty with telling exactly how many meta descriptions Google has modified – some edits are minor, and some are major. One minor edit is when Google adds some additional information to a snippet, such as a date at the beginning. Here’s an example (from a search for “chicken pox”):

With the date (and minus the ellipsis), this snippet is 164 characters long, which suggests Google isn’t counting the added text against the length limit. What’s interesting is that the rest comes directly from the meta description on the site, except that the site’s description starts with “Chickenpox.” and Google has removed that keyword. As a human, I’d say this matches the meta description, but a bot has a very hard time telling a minor edit from a complete rewrite.

Another minor rewrite occurs in snippets that start with search result counts:

Here, we’re at 172 characters (with spaces and minus the ellipsis), and Google has even let this snippet roll over to a third line. So, again, it seems like the added information at the beginning isn’t counting against the length limit.

All told, 11.6% of the snippets in our data set had some kind of Google-generated data, so this type of minor rewrite is pretty common. Even if Google honors most of your meta description, you may see small edits.

Let’s look at our big winner, the 372-character description. Here’s what we saw in the snippet:

Jan 26, 2015 – Health• Diabetes Prevention: Multiple studies have shown a correlation between apple cider vinegar and lower blood sugar levels. … • Weight Loss: Consuming apple cider vinegar can help you feel more full, which can help you eat less. … • Lower Cholesterol: … • Detox: … • Digestive Aid: … • Itchy or Sunburned Skin: … • Energy Boost:1 more items

So, what about the meta description? Here’s what we actually see in the tag:

Were you aware of all the uses of apple cider vinegar? From cleansing to healing, to preventing diabetes, ACV is a pantry staple you need in your home.

That’s a bit more than just a couple of edits. So, what’s happening here? Well, there’s a clue on that same page, where we see yet another rule-breaking snippet:

You might be wondering why this snippet is any more interesting than the other one. If you could see the top of the SERP, you’d know why, because it looks something like this:

Google is automatically extracting list-style data from these pages to fuel the expansion of the Knowledge Graph. In one case, that data is replacing a snippet and going directly into an answer box, but they’re performing the same translation even for some other snippets on the page.

So, does every 2nd-generation answer box yield long snippets? After 3 hours of inadvisable MySQL queries, I can tell you that the answer is a resounding “probably not.” You can have 2nd-gen answer boxes without long snippets and you can have long snippets without 2nd-gen answer boxes, but there does appear to be a connection between long snippets and the Knowledge Graph in some cases.

One interesting connection is that Google has begun bolding keywords that seem like answers to the query (and not just synonyms for the query). Below is an example from a search for “mono symptoms”. There’s an answer box for this query, but the snippet below is not from the site in the answer box:

Notice the bolded words – “fatigue”, “sore throat”, “fever”, “headache”, “rash”. These aren’t synonyms for the search phrase; these are actual symptoms of mono. This data isn’t coming from the meta description, but from a bulleted list on the target page. Again, it appears that Google is trying to use the snippet to answer a question, and has gone well beyond just matching keywords.

Just for fun, let’s look at one more, where there’s no clear connection to the Knowledge Graph. Here’s a snippet from a search for “sons of anarchy season 4”:

This page has no answer box, and the information extracted is odd at best. The snippet bears little or no resemblance to the site’s meta description. The number string at the beginning comes out of a rating widget, and some of the text isn’t even clearly available on the page. This seems to be an example of Google acknowledging IMDb as a high-authority site and desperately trying to match any text they can to the query, resulting in a Frankenstein’s snippet.

The Final Verdict

If all of this seems confusing, that’s probably because it is. Google is taking a lot more liberties with snippets these days, whether to better match queries, to add details they feel are important, or to help build and support the Knowledge Graph.

So, let’s get back to the original question – is it time to revise the 155(ish) character guideline? My gut feeling is: not yet. To begin with, the vast majority of snippets are still falling in that 145-165 character range. In addition, the exceptions to the rule are not only atypical situations, but in most cases those long snippets don’t seem to represent the original meta description. In other words, even if Google does grant you extra characters, they probably won’t be the extra characters you asked for in the first place.

Many people have asked: “How do I make sure that Google shows my meta description as is?” I’m afraid the answer is: “You don’t.” If this is very important to you, I would recommend keeping your description below the 155-character limit, and making sure that it’s a good match to your target keyword concepts. I suspect Google is going to take more liberties with snippets over time, and we’re going to have to let go of our obsession with having total control over the SERPs.
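If you do want to audit your own pages against that guideline, a tiny helper like the one below is enough. This is a hypothetical sketch, not code from this study; adjust the limit to taste:

def audit_meta_description(description, limit=155):
    """Report the character count of a meta description against a length guideline."""
    n = len(description)
    status = "OK" if n <= limit else f"over by {n - limit} characters"
    return f"{n} characters ({status})"

print(audit_meta_description("A short example meta description, used here only to illustrate the check."))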


Give It Up for Our MozCon 2015 Community Speakers

Posted by EricaMcGillivray

Super thrilled that we’re able to announce this year’s community speakers for MozCon, July 13-15th in Seattle!

Wow. Each year I say the pool keeps getting more and more talented, and it’s the truth! We had more quality pitches this year than in the past, and quantity-wise, there were 241 entries, around 100 more than in previous years. Let me tell you, many of the review committee members filled our email thread with amazement at this.

And even though we had an unprecedented six slots, the choices seemed even tougher!

241 pitches
Let that number sink in for a little while.

Because we get numerous questions about what makes a great pitch, I wanted to share both information about the speakers and their great pitches—with some details removed for spoilers. (We’re still working with each speaker to polish and finalize their topic.) I’ve also included my or Matt Roney‘s own notes on each one from when we read them without knowing who the authors were.

Please congratulate our MozCon 2015 community speakers!

Adrian Vender

Adrian is the Director of Analytics at IMI and a general enthusiast of coding and digital marketing. He’s also a life-long drummer and lover of music. Follow him at @adrianvender.

Adrian’s pitch:

Content Tracking with Google Tag Manager

While marketers have matured in the use of web analytics tools, our ability to measure how users interact with our sites’ content needs improvement. Users are interacting with dynamic content in ways that just aren’t captured in a pageview. While there are JavaScript tricks to help track these details, working with IT to place new code is usually the major hurdle that stops us.

Finally, Google Tag Manager is that bridge to advanced content analysis. GTM may appear technical, but it can easily be used by any digital marketer to track almost any action on a site. My goal is to make ALL attendees users of GTM.

My talk will cover the following GTM concepts:

[Adrian lists 8 highly-actionable tactics he’ll cover.]

I’ll share a client example of tracking content interaction in GA. I’ll also share a link to a GTM container file that can help people pre-load the above tag templates into their own GTM.

Matt’s notes: Could be good. I know a lot of people have questions about Tag Manager, and the ubiquity of GA should help it be pretty well-received.


Chris Dayley

Chris is a digital marketing expert and owner of Dayley Conversion. His company provides full-service A/B testing for businesses, including design, development, and test execution. Follow him at @chrisdayley.

Chris’ pitch:

I would like to present a super actionable 15 minute presentation focused on the first two major steps businesses should take to start A/B testing:

1. Radical Redesign Testing

2. Iterative Testing (Test EVERYTHING)

I am one of the few CROs out there who recommends that businesses start with a radical redesign test. My reasoning is that most businesses have done absolutely no testing on their current website, so the current landing page/website really isn’t a “best practice” design yet.

I will show several case studies where clients saw more than a 50% lift in conversion rates just from this first step of radical redesign testing, and will offer several tips for how to create a radical redesign test. Some of the tips include:

[Chris lists three direct and interesting tips he’ll share.]

Next I suggest moving into the iterative phase.

I will show several case studies of how to move through iterative testing so you eventually test every element on your page.

Erica’s notes: Direct, interesting, and with promise of multiple case studies.


Duane Brown

Duane is a digital marketer with 10 years’ experience having lived and worked in five cities across three continents. He’s currently at Unbounce. When not working, you can find Duane traveling to some far-flung location around the world to eat food and soak up the culture. Follow him at @DuaneBrown.

Duane’s pitch:

What Is Delightful Remarketing & How You Can Do It Too

A lot of people find remarketing creepy and weird. They don’t get why they are seeing those ads around the internet... let alone how to make them stop showing.

This talk will focus on the difference between remarketing & creating delightful remarketing that can help grow the revenue & profit at a company and not piss customers off. 50% of US marketers don’t use remarketing, according to eMarketer (2013).

– [Duane’s direct how-to for e-commerce customers.] Over 60% of customers abandon a shopping cart each year: http://baymard.com/lists/cart-abandonment-rate (3 minute)

– Cover a SaaS company using retargeting to [Duane’s actionable item]. This remarketing helps show your products sticky features while showing off your benefits (3 minute)

– The Dos: [Duane’s actionable tip], a variety of creative & a dedicated landing page creates delightful remarketing that grows revenue (3 minute)

– Wrap up and review main points. (2 minutes)

Matt’s notes: Well-detailed, an area in which there’s a lot of room for improvement.


Gianluca Fiorelli

Moz Associate, official blogger for StateofDigital.com, and known international SEO and inbound strategist, Gianluca works in the digital marketing industry, but he still believes that he just knows that he knows nothing. Follow him at @gfiorelli1.

Gianluca’s pitch:

Unusual Sources for Keyword and Topical Research

A big percentage of SEOs equate keyword and topical research with using Keyword Planner and Google Suggest.

However, using only those, we cannot achieve a really deep knowledge of the interests, psychology, and language of our target.

In this talk, I will present unusual sources and unnoticed features of very well-known tools, and offer a final example based on a true story.

Arguments touched in the speech (not necessarily in this order):

[Gianluca lists seven how-tos and one unique case study.]

Erica’s notes: Theme of Google not giving good keyword info. Lots of unique actionable points and resources. Will work in 15 minute time limit.


Ruth Burr Reedy

Ruth is the head of on-site SEO for BigWing Interactive, a full-service digital marketing agency in Oklahoma City, OK. At BigWing, she manages a team doing on-site, technical, and local SEO. Ruth has been working in SEO since 2006. Follow her at @ruthburr.

Ruth’s pitch:

Get Hired to Do SEO

This talk will go way beyond “just build your own website” and talk about specific ways SEOs can build evidence of their skills across the web, including:

[Ruth lists 7 how-tos with actionable examples.]

All in a funny, actionable, beautiful, easy-to-understand get-hired masterpiece.

Erica’s notes: Great takeaways. Wanted to do a session about building your resume as a marketer for a while.


Stephanie Wallace

Stephanie is director of SEO at Nebo, a digital agency in Atlanta. She helps clients navigate the ever-changing world of SEO by understanding their audience and helping them create a digital experience that both the user and Google appreciate. Follow her at @SWallaceSEO.

Stephanie’s pitch:

Everyone knows PPC and SEO complement one another – increased visibility in search results helps increase perceived authority and drive more clickthroughs to your site overall. But are you actively leveraging the wealth of PPC data available to build on your existing SEO strategy? The key to effectively using this information lies in understanding how to test SEO tactics and how to apply the results to your on-page strategies. This session will delve into actionable strategies for using PPC campaign insights to influence on-page SEO and content strategies. Key takeaways include:

[Stephanie lists four how-tos.]

Erica’s notes: Nice and actionable. Like this a lot.


As mentioned, we had 241 entries, and many of them were stage quality. Notable runners-up included AJ Wilcox, Ed Reese, and Daylan Pearce, and a big pat on the back to all those who tossed their hats in the ring.

Also, a huge thank you to my fellow selection committee members for 2015: Charlene Inoncillo, Cyrus Shepard, Danie Launders, Jen Lopez, Matt Roney, Rand Fishkin, Renea Nielsen, and Trevor Klein.

Buy your ticket now

