Labour Wins Shock Election and Forms Coalition with Lib Dems*

If this is front-page news on Friday morning, remember you read it here first.

Theresa May conveniently called the snap election right after we released our Hitting the Mark benchmarking report, so we thought we would apply the same methodology to the election, with some obvious modifications. First, unlike the ecommerce brands in Hitting the Mark, political parties don’t really sell anything (insert your own joke here), so we could not analyse the post-purchase journey. Second, because the election was such a surprise, we only have about a month’s worth of emails.

We signed up for emails from all of the parties that were represented in Parliament after the 2015 General Election.

General Themes

Across all of these parties, some general themes emerged. First, not all of the parties sent us emails, even after we had subscribed. In many cases, this makes sense. We used two London postcodes as part of our sign-up. Many of these parties are regional and therefore have no candidates running in our constituencies; there is no real need to send us emails. If that were the case, however, a simple email saying so would have been the polite thing to do. Additionally, if they did not intend to use the data, why did they collect it? It would have been better to let us know at the point of data capture that they would not be emailing us and would not be storing our data.

The second theme is that interest in finding out more about your organisation does not necessarily mean I want to ‘join’ your party, but that was the base assumption across all of the parties. Some came right out and said that upfront, while others were happy to capture your email address and only talk about ‘membership’, ‘accounts’, and ‘public profiles’ as part of the post-sign-up or welcome messaging. We found that very off-putting. We wanted to know what each party stands for and why we should support them. Then, and only then, would we consider joining or donating money and time to get that party’s candidates elected. Perhaps because the election was so near, there was no sense of lead nurturing or a customer journey:

  • Would you like more information?
  • Are you interested?
  • Are you going to vote?
  • Are you going to vote for us?
  • Would you like to donate time, give money or join the party?

In the end, we only received emails from:

  • The Conservative Party
  • The Labour Party
  • Liberal Democrats
  • UK Independence Party (UKIP)
  • The Green Party
  • Plaid Cymru

One interesting trend across all of these parties is that they did not fully embrace the use of HTML, which we found surprising for a couple of reasons. Not only is a picture worth a thousand words, but some of the parties’ websites, such as the Liberal Democrats’, follow a very image-heavy design ethos.

Another trend we saw across all of the parties was that email has clearly become more important to them as we get closer to Election Day. When we first signed up in early May, email capture was prominent, but when we looked again earlier this week, we found that many of the parties had implemented email capture pop-ups or homepage takeovers.

The Labour Party

As you have probably already guessed, Labour came out on top with a score of 35.5 out of a possible 58. While they won 27% of the total points awarded across all parties, they were still almost 40% away from a perfect score. Where Labour did well was in varying their communications strategy based on the constituency from which we registered. One of our addresses was in Hampstead and Kilburn, which is currently held by Labour, versus the Richmond Park constituency, which is very definitely not. The additional emails sent to the Hampstead and Kilburn constituency were primarily ‘get out and vote’ emails, designed to ensure their supporters did not get complacent and actually turned out to the polls on June 8th. It was also clear that Labour was testing subject lines and content.

Liberal Democrats

Next up were the Liberal Democrats with 31.5 points out of a possible 58. Not only were they testing both subject lines and content, but they were also the only party to use emojis in their subject lines. The other things that the Liberal Democrats did well were:

  • Integrating their emails with other channels
  • Surveying recipients on the issues that matter to them in this election
  • Asking readers if each email was useful
  • Offering a preference centre as part of the unsubscribe process
  • Using a tone of voice that really spoke to voters

For all of the good things that they did, they were clearly not perfect. Their copywriting really let them down by including spelling mistakes, split infinitives and bad sentence structure.

Aside from a single Labour email, the Liberal Democrats were the only party to include any design elements other than a logo by trying to include buttons. Unfortunately, the buttons did not really look like buttons. They looked like coloured rectangles and it was not clear if we were supposed to press them or if they were there for emphasis.

While we are on the topic of design, they could have done so much more. The Liberal Democrats have the best website from a design perspective, but it is actually a little jarring when you click through from their plain emails to their heavily designed site.

The Conservative Party

The biggest failing in the Conservatives’ email program was their failure to send a welcome email. All of the other parties that sent us something started with at least an email confirmation, and some had fully fledged two- or three-step welcome programs. The Conservatives had none. This failing not only left a potential 21 points on the table, but also did nothing to reinforce that they are a party that cares about voters. They also sent the fewest emails, with only three messages going to the Hampstead and Kilburn constituency versus the 20 that Labour sent. The Conservatives did send one extra email to the Richmond Park constituency, specifically targeting the Liberal Democrats, who won the seat in a by-election on December 1st.

The Conservatives, like many of the other parties, were also not consistent with their ‘friendly from’ name, preferring the name of the ‘sender’ as the from name. The first email came from Philip Hammond twelve days after we registered, and then we received two from Theresa May. The extra email to the Richmond Park constituency, however, came from Patrick McLoughlin, who it turns out is the Chairman of the Conservative Party, but we wondered whether most people would know that (especially those who are not Conservative diehards). It would have been easy to skip over this email if we were not being paid to read every one.

Other Parties

Interestingly, the Green Party, Plaid Cymru and UKIP were the only parties that used confirmed opt-in, clearly indicating that they have a higher regard for voter data and privacy. That said, none of these three parties executed this well.

The Green Party’s confirmation email was clearly generated from their website and did not include their organisational details. Once the confirmation link is clicked, there is a page confirming that the account is active, which is followed by an email carrying the exact same message. Clearly, this is overkill, which ironically is a waste of electricity.

Plaid Cymru sent a confirmation email and then did not honour the fact that we did not click the confirmation link, sending a further three emails. Perhaps the reason we did not click the confirmation link was that the email was written only in Welsh, when all of their other emails were bilingual.

UKIP’s confirmation email turned out to be an account activation email. At no point up to then in the journey had they made it clear that we were setting up an account on the UKIP website. That only became clear when we got to the confirmation page. Not only would we be setting up an account by completing the page, but we would also have a public profile.

Conclusion

There have been so many articles written about how recent communications have been driven by the clever use of data. Based on the reported millions being spent on social media, this may be the case again on June 8th – but I cannot help but think that the UK’s political parties are missing a trick. Email is the most popular channel for consumers to maintain relationships with brands, but the parties are clearly not interested in building relationships or at least they have not been during this election season.

Email is the most effective marketing channel, but only when it is used properly can organisations have human conversations at scale. The parties are getting some things right: personalisation, testing, location-based targeting. On the other hand, they are leaving a number of the email marketer’s standard tools and techniques in the toolbox, such as automation, advanced segmentation and dynamic content.

*I cannot or will not make a prediction on the outcome of this election (I have gotten this horribly wrong over the past couple of years), but I can say that regardless of who wins it will not be based on the quality of their email programs.

The post Labour Wins Shock Election and Forms Coalition with Lib Dems* appeared first on The Email Marketing Blog.

Reblogged 3 weeks ago from blog.dotmailer.com

Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.


Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But the article keeps growing in popularity, and I think there’s, like, this hopefulness from worlds outside of the hardcore SEO world, folks who are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. He mentions two sources: one is things that Google tells us, and the other is SEO experiments. I think both of those are true, although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results, having a feel for why things rank and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really sticks in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase, a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.
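The correlation Rand is eyeballing here is the Pearson coefficient. As a purely illustrative sketch, with entirely made-up searcher-happiness ratings, here is how such a number would be computed, and how noisy a "broadly correlated" relationship can look:

```python
# Toy illustration (made-up numbers): happiness scores vs. search positions.
# A broad trend can coexist with a far-from-perfect Pearson correlation.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical searcher-happiness ratings (1-10) for results ranked 1..10.
# Happiness should fall as position number rises if the ordering were perfect;
# here it only roughly does.
happiness = [8, 5, 9, 6, 7, 4, 8, 3, 5, 2]
position = list(range(1, 11))

r = pearson(happiness, position)
print(round(r, 2))  # negative (happiness drops with position), but nowhere near -1.0
```

A perfectly user-happiness-ordered results page would score ±1.0; noisy data like this lands well short of it, which is the point being made.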

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not-found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve given them a status code of 200. Or you’ve used a 404 code when you should have used a 410, which means permanently removed, to get the page completely out of the index, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a 5xx code you can use for this, a 503, which says, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
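The status-code rules in that story boil down to a small lookup. Here is a minimal sketch (the function and its page states are invented for illustration, not any framework's API):

```python
# Illustrative only: mapping the crawl scenarios above to the HTTP status
# codes search engines expect to see.

def status_for(page_state: str) -> int:
    """Return the status code appropriate for a page's state."""
    codes = {
        "ok": 200,             # page exists and renders normally
        "missing": 404,        # not found; Google revisits before dropping it
        "gone_forever": 410,   # permanently removed; drops from the index faster
        "maintenance": 503,    # temporary downtime; "revisit later", not an error
    }
    return codes[page_state]

# The classic soft-404 mistake: a "not found" page served with a 200,
# so the crawler indexes the error page as real content.
assert status_for("missing") != 200

# Disney's mistake, per the anecdote: serving 404 during downtime
# instead of 503.
print(status_for("maintenance"))  # 503
```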

  • International and multi-language targeting issues. I won’t go into that, but you have to know the protocols there.
  • Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content and put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics, connecting those up and trying to see: Why did we go up or down? Did we have fewer pages being indexed or more, more pages getting traffic or fewer, more keywords driving visits or fewer?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. Inanchor: doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of these. Like, come on.

  • Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you that you have all sorts of errors, and you don’t know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, X-Robots-Tag, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, and you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords, or MozScape, or Ahrefs, or Majestic, or SEMrush, or the Alchemy API. Those APIs can have powerful things that they can do for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
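On the API point above: pulling link data into a script really is only a few lines. A sketch, parsing a canned response shaped like what a link-metrics API might return; the payload and field names are invented for illustration, not any real service's schema:

```python
import json

# Hypothetical JSON payload, loosely shaped like a link-metrics API response
# (MozScape, Majestic, etc.). Endpoint and field names are invented.
raw_response = """
[
  {"url": "example.com/a", "linking_domains": 120, "page_authority": 41},
  {"url": "example.com/b", "linking_domains": 15,  "page_authority": 22}
]
"""

rows = json.loads(raw_response)

# The kind of quick analysis you'd drop into a spreadsheet:
# sort pages by how many domains link to them.
for row in sorted(rows, key=lambda r: r["linking_domains"], reverse=True):
    print(f"{row['url']}: {row['linking_domains']} linking domains")
```

In a real script, `raw_response` would come from an authenticated HTTP request to the service's API; everything after that line is the same.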

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Reblogged 1 year ago from tracking.feedpress.it

Distance from Perfect

Posted by wrttnwrd

In spite of all the advice, the strategic discussions and the conference talks, we Internet marketers are still algorithmic thinkers. That’s obvious when you think of SEO.

Even when we talk about content, we’re algorithmic thinkers. Ask yourself: How many times has a client asked you, “How much content do we need?” How often do you still hear “How unique does this page need to be?”

That’s 100% algorithmic thinking: Produce a certain amount of content, move up a certain number of spaces.

But you and I know it’s complete bullshit.

I’m not suggesting you ignore the algorithm. You should definitely chase it. Understanding a little bit about what goes on in Google’s pointy little head helps. But it’s not enough.

A tale of SEO woe that makes you go “whoa”

I have this friend.

He ranked #10 for “flibbergibbet.” He wanted to rank #1.

He compared his site to the #1 site and realized the #1 site had five hundred blog posts.

“That site has five hundred blog posts,” he said, “I must have more.”

So he hired a few writers and cranked out five thousand blog posts that melted Microsoft Word’s grammar check. He didn’t move up in the rankings. I’m shocked.

“That guy’s spamming,” he decided, “I’ll just report him to Google and hope for the best.”

What happened? Why didn’t adding five thousand blog posts work?

It’s pretty obvious: My, uh, friend added nothing but crap content to a site that was already outranked. Bulk is no longer a ranking tactic. Google’s very aware of that tactic. Lots of smart engineers have put time into updates like Panda to compensate.

He started with a #10 ranking and a normal amount of content, and ended up with more posts, no rankings.

Alright, yeah, I was Mr. Flood The Site With Content, way back in 2003. Don’t judge me, whippersnappers.

Reality’s never that obvious. You’re scratching and clawing to move up two spots, you’ve got an overtasked IT team pushing back on changes, and you’ve got a boss who needs to know the implications of every recommendation.

Why fix duplication if rel=canonical can address it? Fixing duplication will take more time and cost more money. It’s easier to paste in one line of code. You and I know it’s better to fix the duplication. But it’s a hard sell.

Why deal with 302 versus 404 response codes and home page redirection? The basic user experience remains the same. Again, we just know that a server should return one home page without any redirects and that it should send a ‘not found’ 404 response if a page is missing. If it’s going to take 3 developer hours to reconfigure the server, though, how do we justify it? There’s no flashing sign reading “Your site has a problem!”

Why change this thing and not that thing?

At the same time, our boss/client sees that the site above theirs has five hundred blog posts and thousands of links from sites selling correspondence MBAs. So they want five thousand blog posts and cheap links as quickly as possible.

Cue crazy music.

SEO lacks clarity

SEO is, in some ways, for the insane. It’s an absurd collection of technical tweaks, content thinking, link building and other little tactics that may or may not work. A novice gets exposed to one piece of crappy information after another, with an occasional bit of useful stuff mixed in. They create sites that repel search engines and piss off users. They get more awful advice. The cycle repeats. Every time it does, best practices get more muddled.

SEO lacks clarity. We can’t easily weigh the value of one change or tactic over another. But we can look at our changes and tactics in context. When we examine the potential of several changes or tactics before we flip the switch, we get a closer balance between algorithm-thinking and actual strategy.

Distance from perfect brings clarity to tactics and strategy

At some point you have to turn that knowledge into practice. You have to take action based on recommendations, your knowledge of SEO, and business considerations.

That’s hard when we can’t even agree on subdomains vs. subfolders.

I know subfolders work better. Sorry, couldn’t resist. Let the flaming comments commence.

To get clarity, take a deep breath and ask yourself:

“All other things being equal, will this change, tactic, or strategy move my site closer to perfect than my competitors?”

Breaking it down:

“Change, tactic, or strategy”

A change takes an existing component or policy and makes it something else. Replatforming is a massive change. Adding a new page is a smaller one. Adding ALT attributes to your images is another example. Changing the way your shopping cart works is yet another.

A tactic is a specific, executable practice. In SEO, that might be fixing broken links, optimizing ALT attributes, optimizing title tags or producing a specific piece of content.

A strategy is a broader decision that’ll cause change or drive tactics. A long-term content policy is the easiest example. Shifting away from asynchronous content and moving to server-generated content is another example.

“Perfect”

No one knows exactly what Google considers “perfect,” and “perfect” can’t really exist, but you can bet a perfect web page/site would have all of the following:

  1. Completely visible content that’s perfectly relevant to the audience and query
  2. A flawless user experience
  3. Instant load time
  4. Zero duplicate content
  5. Every page easily indexed and classified
  6. No mistakes, broken links, redirects or anything else generally yucky
  7. Zero reported problems or suggestions in each search engine’s webmaster tools (sorry, “Search Consoles”)
  8. Complete authority through immaculate, organically-generated links

These 8 categories (and any of the other bazillion that probably exist) give you a way to break down “perfect” and help you focus on what’s really going to move you forward. These different areas may involve different facets of your organization.

Your IT team can work on load time and creating an error-free front- and back-end. Link building requires the time and effort of content and outreach teams.

Tactics for relevant, visible content and current best practices in UX are going to be more involved, requiring research and real study of your audience.

What you need and what resources you have are going to impact which tactics are most realistic for you.

But there’s a basic rule: If a website would make Googlebot swoon and present zero obstacles to users, it’s close to perfect.

“All other things being equal”

Assume every competing website is optimized exactly as well as yours.

Now ask: Will this [tactic, change or strategy] move you closer to perfect?

That’s the “all other things being equal” rule. And it’s an incredibly powerful rubric for evaluating potential changes before you act. Pretend you’re in a tie with your competitors. Will this one thing be the tiebreaker? Will it put you ahead? Or will it cause you to fall behind?

“Closer to perfect than my competitors”

Perfect is great, but unattainable. What you really need is to be just a little perfect-er.

Chasing perfect can be dangerous. Perfect is the enemy of the good (I love that quote. Hated Voltaire. But I love that quote). If you wait for the opportunity/resources to reach perfection, you’ll never do anything. And the only way to reduce distance from perfect is to execute.

Instead of aiming for pure perfection, aim for more perfect than your competitors. Beat them feature-by-feature, tactic-by-tactic. Implement strategy that supports long-term superiority.

Don’t slack off. But set priorities and measure your effort. If fixing server response codes will take one hour and fixing duplication will take ten, fix the response codes first. Both move you closer to perfect. Fixing response codes may not move the needle as much, but it’s a lot easier to do. Then move on to fixing duplicates.

Do the 60% that gets you a 90% improvement. Then move on to the next thing and do it again. When you’re done, get to work on that last 40%. Repeat as necessary.

Take advantage of quick wins. That gives you more time to focus on your bigger solutions.

Sites that are “fine” are pretty far from perfect

Google has lots of tweaks, tools and workarounds to help us mitigate sub-optimal sites:

  • Rel=canonical lets us guide Google past duplicate content rather than fix it
  • HTML snapshots let us reveal content that’s delivered asynchronously via JavaScript frameworks
  • We can use rel=next and prev to guide search bots through outrageously long pagination tunnels
  • And we can use rel=nofollow to hide spammy links and banners

Easy, right? All of these solutions may reduce distance from perfect (the search engines don’t guarantee it). But they don’t reduce it as much as fixing the problems.

Just fine does not equal fixed

The next time you set up rel=canonical, ask yourself:

“All other things being equal, will using rel=canonical to make up for duplication move my site closer to perfect than my competitors?”

Answer: Not if they’re using rel=canonical, too. You’re both using imperfect solutions that force search engines to crawl every page of your site, duplicates included. If you want to pass them on your way to perfect, you need to fix the duplicate content.

When you use Angular.js to deliver regular content pages, ask yourself:

“All other things being equal, will using HTML snapshots instead of actual, visible content move my site closer to perfect than my competitors?”

Answer: No. Just no. Not in your wildest, code-addled dreams. If I’m Google, which site will I prefer? The one that renders for me the same way it renders for users? Or the one that has to deliver two separate versions of every page?

When you spill banner ads all over your site, ask yourself…

You get the idea. Nofollow is better than follow, but banner pollution is still pretty dang far from perfect.

Mitigating SEO issues with search engine-specific tools is “fine.” But it’s far, far from perfect. If search engines are forced to choose, they’ll favor the site that just works.

Not just SEO

By the way, distance from perfect absolutely applies to other channels.

I’m focusing on SEO, but think of other Internet marketing disciplines. I hear stuff like “How fast should my site be?” (Faster than it is right now.) Or “I’ve heard you shouldn’t have any content below the fold.” (Maybe in 2001.) Or “I need background video on my home page!” (Why? Do you have a reason?) Or, my favorite: “What’s a good bounce rate?” (Zero is pretty awesome.)

And Internet marketing venues are working to measure distance from perfect. Pay-per-click marketing has the quality score: a codified financial reward for reducing distance from perfect across as many elements of your advertising program as possible.

Social media venues are aggressively building their own forms of graphing, scoring and ranking systems designed to separate the good from the bad.

Really, all marketing includes some measure of distance from perfect. But no channel is more influenced by it than SEO. Instead of arguing one rule at a time, ask yourself and your boss or client: Will this move us closer to perfect?

Hell, you might even please a customer or two.

One last note for all of the SEOs in the crowd. Before you start pointing out edge cases, consider this: We spend our days combing Google for embarrassing rankings issues. Every now and then, we find one, point, and start yelling “SEE! SEE!!!! THE GOOGLES MADE MISTAKES!!!!” Google’s got lots of issues. Screwing up the rankings isn’t one of them.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


How to Rid Your Website of Six Common Google Analytics Headaches

Posted by amandaecking

I’ve been in and out of Google Analytics (GA) for the past five or so years agency-side. I’ve seen three different code libraries, dozens of new different features and reports roll out, IP addresses stop being reported, and keywords not-so-subtly phased out of the free platform.

Analytics has been a focus of mine for the past year or so—mainly, making sure clients get their data right. Right now, our new focus is closed loop tracking, but that’s a topic for another day. If you’re using Google Analytics, and only Google Analytics for the majority of your website stats, or it’s your primary vehicle for analysis, you need to make sure it’s accurate.

Not having data pulling in or reporting properly is like building a house on a shaky foundation: It doesn’t end well. Usually there are tears.

For some reason, a lot of people, including many of my clients, assume everything is tracking properly in Google Analytics… because Google. But it’s not Google who sets up your analytics. People do that. And people are prone to make mistakes.

I’m going to go through six scenarios where issues are commonly encountered with Google Analytics.

I’ll outline the remedy for each issue, and in the process, show you how to move forward with a diagnosis or resolution.

1. Self-referrals

This is probably one of the areas we’re all familiar with. If you’re seeing a lot of traffic from your own domain, there’s likely a problem somewhere—or you need to extend the default session length in Google Analytics. (For example, if you have a lot of long videos or music clips and don’t use event tracking; a website like TEDx or SoundCloud would be a good equivalent.)

Typically one of the first things I’ll do to help diagnose the problem is include an advanced filter to show the full referrer string. You do this by creating a filter, as shown below:

Filter Type: Custom filter > Advanced
Field A: Hostname
Extract A: (.*)
Field B: Request URI
Extract B: (.*)
Output To: Request URI
Constructor: $A1$B1
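To make the filter mechanics concrete, here’s a minimal sketch of what that configuration does (the hostname and path are hypothetical; GA applies the extracts and constructor internally): each `(.*)` captures the whole field, and `$A1$B1` joins the two capture groups into the reported Request URI.

```javascript
// Sketch of the advanced filter above: the two (.*) extracts each capture
// an entire field, and the $A1$B1 constructor concatenates them, so reports
// show hostname + path instead of the path alone.
function applyFilter(hostname, requestUri) {
  const extractA = /(.*)/.exec(hostname)[1];   // Extract A: (.*) on Hostname
  const extractB = /(.*)/.exec(requestUri)[1]; // Extract B: (.*) on Request URI
  return extractA + extractB;                  // Constructor: $A1$B1 -> Request URI
}

console.log(applyFilter('shop.example.com', '/checkout/step1'));
// "shop.example.com/checkout/step1"
```

With the full hostname visible in every row, a rogue subdomain generating self-referrals stands out immediately.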

You’ll then start seeing the subdomains pulling in. Experience has shown me that if you have a separate subdomain hosted in another location (say, if you work with a separate company and they host and run your mobile site or your shopping cart), Google Analytics treats it as a separate domain, and you’ll need to implement cross-domain tracking. This way, you can narrow down whether it’s one particular subdomain that’s creating the self-referrals.

In this example below, we can see all the revenue is being reported to the booking engine (which ended up being cross domain issues) and their own site is the fourth largest traffic source:

It’s also a good idea to check the browser and device reports to start narrowing down whether the issue is specific to a particular element. If it’s not, keep digging. Look at the pages pulling in the self-referrals and go through the code with a fine-tooth comb, drilling down as much as you can.

2. Unusually low bounce rate

If you have a crazy-low bounce rate, it could be too good to be true. Unfortunately, an unusually low bounce rate could (and probably does) mean that at least some pages of your website have the same Google Analytics tracking code installed twice.

Take a look at your source code, or use Google Tag Assistant (though it does have known bugs) to see if you’ve got GA tracking code installed twice.
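If you’d rather script the check than eyeball the source, here’s a hypothetical sketch (the function name, page snippet, and tracking ID are all made up for illustration): count how many times the same tracking ID appears in the page HTML. Seeing the same `ga('create', ...)` call twice usually means the snippet was pasted into, say, both a template and a plugin.

```javascript
// Hypothetical helper: count occurrences of a tracking ID in page source.
// More than one hit for the same ID suggests a duplicate installation
// (and explains a suspiciously low bounce rate).
function countTrackerInstalls(html, trackingId) {
  const matches = html.match(new RegExp(trackingId, 'g'));
  return matches ? matches.length : 0;
}

const pageSource = `
  <script>ga('create', 'UA-12345-1', 'auto'); ga('send', 'pageview');</script>
  <script>ga('create', 'UA-12345-1', 'auto'); ga('send', 'pageview');</script>
`;
console.log(countTrackerInstalls(pageSource, 'UA-12345-1')); // 2
```

Run something like this against the rendered HTML of a few key templates; anything above 1 is worth investigating.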

While I tell clients that having Google Analytics installed twice on the same page can lead to double the pageviews, I’ve not actually encountered that—I usually just say it to scare them into removing the duplicate implementation more quickly. Don’t tell on me.

3. Iframes anywhere

I’ve heard directly from Google engineers and Google Analytics evangelists that Google Analytics does not play well with iframes, and that it never will play nice with this dinosaur technology.

If you track the iframe, you inflate your pageviews, plus you still aren’t tracking everything with 100% clarity.

If you don’t track across iframes, you lose the source/medium attribution and everything becomes a self-referral.

Damned if you do; damned if you don’t.

My advice: Stop using iframes. They’re Netscape-era technology anyway, with rainbow marquees and Comic Sans on top. Interestingly, and unfortunately, a number of booking engines (for hotels) and third-party carts (for ecommerce) still use iframes.

If you have any clients in those verticals, or if you’re in the vertical yourself, check with your provider to see if they use iframes. Or you can check for yourself, by right-clicking as close as you can to the actual booking element:

iframe-booking.png

There is no neat and tidy way to address iframes with Google Analytics, and usually iframes are not the only complicated element of setup you’ll encounter. I spent eight months dealing with a website on a subfolder, which used iframes and had a cross domain booking system, and the best visibility I was able to get was about 80% on a good day.

Typically, I’d approach diagnosing iframes (if, for some reason, I had absolutely no access to viewing a website or talking to the techs) similarly to diagnosing self-referrals, as self-referrals are one of the biggest symptoms of iframe use.

4. Massive traffic jumps

Massive jumps in traffic don’t typically just happen. (Unless, maybe, you’re Geraldine.) There’s always an explanation—a new campaign launched, you just turned on paid ads for the first time, you’re using content amplification platforms, you’re getting a ton of referrals from that recent press in The New York Times. And if you think it just happened, it’s probably a technical glitch.

I’ve seen everything from inflated pageviews result from including tracking on iframes and unnecessary implementation of virtual pageviews, to not realizing the tracking code was installed on other microsites for the same property. Oops.

Usually I’ve seen this happen when the tracking code was somewhere it shouldn’t be, so if you’re investigating a situation of this nature, first confirm the Google Analytics code is only in the places it needs to be. Tools like Google Tag Assistant and Screaming Frog can be your BFFs in helping you figure this out.

Also, I suggest bribing the IT department with sugar (or booze) to see if they’ve changed anything lately.

5. Cross-domain tracking

I wish cross-domain tracking with Google Analytics out of the box didn’t require any additional setup. But it does.

If you don’t have it set up properly, things break down quickly, and can be quite difficult to untangle.

The older the GA library you’re using, the harder it is. The easiest setup, by far, is Google Tag Manager with Universal Analytics. Hard-coded Universal Analytics is a bit more difficult because you have to implement autoLink manually and decorate forms, if you’re using them (and you probably are). Beyond that, rather than try and deal with it, I say update your Google Analytics code. Then we can talk.

Where I’ve seen the most murkiness with tracking is when parts of cross domain tracking are implemented, but not all. For some reason, if allowLinker isn’t included, or you forget to decorate all the forms, the cookies aren’t passed between domains.
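For reference, the pieces mentioned above look roughly like this in hard-coded Universal Analytics. This is a sketch, not a drop-in implementation: the `ga()` function below is a recording stub so the sequence is self-contained (on a real page, the analytics.js snippet defines it), `'UA-XXXXX-Y'` is the usual placeholder property ID, and the domain is hypothetical.

```javascript
// Stub of the ga() command queue so this sketch runs standalone;
// on a live page, the analytics.js snippet defines ga() for you.
const gaCalls = [];
function ga(...args) { gaCalls.push(args); }

// Universal Analytics cross-domain setup: allowLinker accepts the linker
// parameter arriving from the other domain, and linker:autoLink decorates
// outbound links pointing at the listed domains.
ga('create', 'UA-XXXXX-Y', 'auto', { allowLinker: true });
ga('require', 'linker');
ga('linker:autoLink', ['booking.example.com']);

console.log(gaCalls.length); // 3
```

If any one of these three pieces is missing on either domain, the session breaks at the handoff and you get exactly the self-referral murkiness described above.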

The absolute first place I would start with this would be confirming the cookies are all passing properly at all the right points, forms, links, and smoke signals. I’ll usually use a combination of the Real Time report in Google Analytics, Google Tag Assistant, and GA debug to start testing this. Any debug tool you use will mean you’re playing in the console, so get friendly with it.

6. Internal use of UTM strings

I’ve saved the best for last: internal use of campaign tagging. We may think, “Oh, I use UTM strings to tag my campaigns externally, and we’ve got this new promotion on site which we’re using a banner ad for. That’s a campaign. Why don’t I tag it with a UTM string?”

Step away from the keyboard now. Please.

When you tag internal links with UTM strings, you override the original source/medium. So that visitor who came in through your paid ad and then clicks on the campaign banner has now been manually tagged. You lose the ability to track that they came through on the ad the moment they click the tagged internal link. Their source and medium are now your internal campaign, not the paid ad you’re spending gobs of money on and have to justify to your manager. See the problem?
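To make the override concrete, here’s a small sketch (hypothetical URLs) of why this happens: GA reads the `utm_` parameters from whichever tagged URL the visitor hit most recently, so the internal banner tag replaces the paid ad’s source/medium.

```javascript
// Parse the campaign fields GA would read from a tagged URL.
function campaignFromUrl(url) {
  const params = new URL(url).searchParams;
  return { source: params.get('utm_source'), medium: params.get('utm_medium') };
}

// 1) Visitor arrives via a paid ad:
const fromAd = campaignFromUrl('https://example.com/?utm_source=google&utm_medium=cpc');

// 2) Then clicks an internally tagged homepage banner, which re-tags the session:
const fromBanner = campaignFromUrl('https://example.com/sale?utm_source=homepage&utm_medium=banner');

console.log(fromAd.source, fromAd.medium);         // google cpc
console.log(fromBanner.source, fromBanner.medium); // homepage banner (the ad attribution is gone)
```

The second, internal tag is what ends up in your reports, which is exactly how that paid spend becomes invisible.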

I’ve seen at least three pretty spectacular instances of this in the past year, and a number of smaller instances of it. Annie Cushing also talks about the evils of internal UTM tags and the odd prevalence of it. (Oh, and if you haven’t explored her blog, and the amazing spreadsheets she shares, please do.)

One clothing company I worked with tagged all of their homepage offers with UTM strings, which resulted in the loss of visibility for one-third of their audience: One million visits over the course of a year, and $2.1 million in lost revenue.

Let me say that again. One million visits, and $2.1 million. That couldn’t be attributed to an external source/campaign/spend.

Another client I audited included campaign tagging on nearly every navigational element on their website. It still gives me nightmares.

If you want to see if you have any internal UTM strings, head straight to the Campaigns report in Acquisition in Google Analytics, and look for anything like “home” or “navigation” or any language you may use internally to refer to your website structure.

And if you want to see how users are moving through your website, go to the Flow reports. Or if you really, really, really want to know how many people click on that sidebar link, use event tracking. But please, for the love of all things holy (and to keep us analytics lovers from throwing our computers across the room), stop using UTM tagging on your internal links.

Now breathe and smile

Odds are, your Google Analytics setup is fine. If you are seeing any of these issues, though, you have somewhere to start in diagnosing and addressing the data.

We’ve looked at six of the most common points of friction I’ve encountered with Google Analytics and how to start investigating them: self-referrals, bounce rate, iframes, traffic jumps, cross domain tracking and internal campaign tagging.

What common data integrity issues have you encountered with Google Analytics? What are your favorite tools to investigate?


How to Combat 5 of the SEO World’s Most Infuriating Problems – Whiteboard Friday

Posted by randfish

These days, most of us have learned that spammy techniques aren’t the way to go, and we have a solid sense for the things we should be doing to rank higher, and ahead of our often spammier competitors. Sometimes, maddeningly, it just doesn’t work. In today’s Whiteboard Friday, Rand talks about five things that can infuriate SEOs with the best of intentions, why those problems exist, and what we can do about them.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

What SEO problems make you angry?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about some of the most infuriating things in the SEO world, specifically five problems that I think plague a lot of folks and some of the ways that we can combat and address those.

I’m going to start with one of the things that really infuriates a lot of new folks to the field, especially folks who are building new and emerging sites and are doing SEO on them. You have all of these best practices lists. You might look at a web developer’s cheat sheet or sort of a guide to on-page and on-site SEO. You go, “Hey, I’m doing it. I’ve got my clean URLs, my good, unique content, my solid keyword targeting, schema markup, useful internal links, my XML sitemap, and my fast load speed. I’m mobile friendly, and I don’t have manipulative links.”

Great. “Where are my results? What benefit am I getting from doing all these things, because I don’t see one?” I took a site that was not particularly SEO friendly, maybe it’s a new site, one I just launched or an emerging site, one that’s sort of slowly growing but not yet a power player. I do all this right stuff, and I don’t get SEO results.

This makes a lot of people stop investing in SEO, stop believing in SEO, and stop wanting to do it. I can understand where you’re coming from. The challenge is not one of you’ve done something wrong. It’s that this stuff, all of these things that you do right, especially things that you do right on your own site or from a best practices perspective, they don’t increase rankings. They don’t. That’s not what they’re designed to do.

1) Following best practices often does nothing for new and emerging sites

This stuff, all of these best practices are designed to protect you from potential problems. They’re designed to make sure that your site is properly optimized so that you can perform to the highest degree that you are able. But this is not actually rank boosting stuff unfortunately. That is very frustrating for many folks. So following a best practices list, the idea is not, “Hey, I’m going to grow my rankings by doing this.”

On the flip side, many folks do these things on larger, more well-established sites, sites that have a lot of ranking signals already in place. They’re bigger brands, they have lots of links to them, and they have lots of users and usage engagement signals. You fix this stuff. You fix stuff that’s already broken, and boom, rankings pop up. Things are going well, and more of your pages are indexed. You’re getting more search traffic, and it feels great. This is a challenge, on our part, of understanding what this stuff does, not a challenge on the search engine’s part of not ranking us properly for having done all of these right things.

2) My competition seems to be ranking on the back of spammy or manipulative links

What’s going on? I thought Google had introduced all these algorithms to kind of shut this stuff down. This seems very frustrating. How are they pulling this off? I look at their link profile, and I see a bunch of the directories, Web 2.0 sites — I love that the spam world decided that that’s Web 2.0 sites — article sites, private blog networks, and do follow blogs.

You look at this stuff and you go, “What is this junk? It’s terrible. Why isn’t Google penalizing them for this?” The answer, the right way to think about this and to come at this is: Are these really the reason that they rank? I think we need to ask ourselves that question.

One thing that we don’t know, that we can never know, is: Have these links been disavowed by our competitor here?

I’ve got my HulksIncredibleStore.com and their evil competitor Hulk-tastrophe.com. Hulk-tastrophe has got all of these terrible links, but maybe they disavowed those links and you would have no idea. Maybe they didn’t build those links. Perhaps those links came in from some other place. They are not responsible. Google is not treating them as responsible for it. They’re not actually what’s helping them.

If they are helping, and it’s possible they are, there are still instances where we’ve seen spam propping up sites. No doubt about it.

I think the next logical question is: Are you willing to lose your site or brand? We almost never see sites like this, ranking on the back of these things with generally less legitimate links, holding on for two or three or four years. You can see it for a few months, maybe even a year, but this stuff is getting hit hard and getting hit frequently. So unless you’re willing to lose your site, pursuing their links is probably not a strategy.

Then what other signals, that you might not be considering potentially links, but also non-linking signals, could be helping them rank? I think a lot of us get blinded in the SEO world by link signals, and we forget to look at things like: Do they have a phenomenal user experience? Are they growing their brand? Are they doing offline kinds of things that are influencing online? Are they gaining engagement from other channels that’s then influencing their SEO? Do they have things coming in that I can’t see? If you don’t ask those questions, you can’t really learn from your competitors, and you just feel the frustration.

3) I have no visibility or understanding of why my rankings go up vs down

On my HulksIncredibleStore.com, I’ve got my infinite stretch shorts, which I don’t know why he never wears — he should really buy those — my soothing herbal tea, and my anger management books. I look at my rankings and they kind of jump up all the time, jump all over the place all the time. Actually, this is pretty normal. I think we’ve done some analyses here, and the average page one search results shift is 1.5 or 2 position changes daily. That’s sort of the MozCast dataset, if I’m recalling correctly. That means that, over the course of a week, it’s not uncommon or unnatural for you to be bouncing around four, five, or six positions up, down, and those kind of things.

I think we should understand what can be behind these things. That’s a very simple list. You made changes, Google made changes, your competitors made changes, or searcher behavior has changed in terms of volume, in terms of what they were engaging with, what they’re clicking on, what their intent behind searches are. Maybe there was just a new movie that came out and in one of the scenes Hulk talks about soothing herbal tea. So now people are searching for very different things than they were before. They want to see the scene. They’re looking for the YouTube video clip and those kind of things. Suddenly Hulk’s soothing herbal tea is no longer directing as well to your site.

So changes like these things can happen. We can’t understand all of them. I think what’s up to us to determine is the degree of analysis and action that’s actually going to provide a return on investment. Looking at these day over day or week over week and throwing up our hands and getting frustrated probably provides very little return on investment. Looking over the long term and saying, “Hey, over the last 6 months, we can observe 26 weeks of ranking change data, and we can see that in aggregate we are now ranking higher and for more keywords than we were previously, and so we’re going to continue pursuing this strategy. This is the set of keywords that we’ve fallen most on, and here are the factors that we’ve identified that are consistent across that group.” I think looking at rankings in aggregate can give us some real positive ROI. Looking at one or two, one week or the next week probably very little ROI.

4) I cannot influence or affect change in my organization because I cannot accurately quantify, predict, or control SEO

That’s true, especially with things like keyword not provided and certainly with the inaccuracy of data that’s provided to us through Google’s Keyword Planner inside of AdWords, for example, and the fact that no one can really control SEO, not fully anyway.

You get up in front of your team, your board, your manager, your client and you say, “Hey, if we don’t do these things, traffic will suffer,” and they go, “Well, you can’t be sure about that, and you can’t perfectly predict it. Last time you told us something, something else happened. So because the data is imperfect, we’d rather spend money on channels that we can perfectly predict, that we can very effectively quantify, and that we can very effectively control.” That is understandable. I think that businesses have a lot of risk aversion naturally, and so wanting to spend time and energy and effort in areas that you can control feels a lot safer.

Some ways to get around this are, first off, know your audience. If you know who you’re talking to in the room, you can often determine the things that will move the needle for them. For example, I find that many managers, many boards, many executives are much more influenced by competitive pressures than they are by, “We won’t do as well as we did before, or we’re losing out on this potential opportunity.” Saying that is less powerful than saying, “This competitor, who I know we care about and we track ourselves against, is capturing this traffic and here’s how they’re doing it.”

Show multiple scenarios. Many of the SEO presentations that I see and have seen and still see from consultants and from in-house folks come with kind of a single, “Hey, here’s what we predict will happen if we do this or what we predict will happen if we don’t do this.” You’ve got to show multiple scenarios, especially when you know you have error bars because you can’t accurately quantify and predict. You need to show ranges.

So instead of this, I want to see: What happens if we do it a little bit? What happens if we really overinvest? What happens if Google makes a much bigger change on this particular factor than we expect or our competitors do a much bigger investment than we expect? How might those change the numbers?

Then I really do like bringing case studies, especially if you’re a consultant, but even in-house there are so many case studies in SEO on the Web today, you can almost always find someone who’s analogous or nearly analogous and show some of their data, some of the results that they’ve seen. Places like SEMrush, a tool that offers competitive intelligence around rankings, can be great for that. You can show, hey, this media site in our sector made these changes. Look at the delta of keywords they were ranking for versus ours over the next six months. Correlation is not causation, but showing those kinds of things can be a powerful influencer.

Then last, but not least, any time you’re going to get up like this and present to a group around these topics, if you very possibly can, try to talk one-on-one with the participants before the meeting actually happens. I have found it almost universally the case that when you get into a group setting, if you haven’t had the discussions beforehand (“What are your concerns? What do you think is not valid about this data? Hey, I want to run this by you and get your thoughts before we go to the meeting”), people can gang up and pile on. One person says, “Hey, I don’t think this is right,” and everybody in the room kind of looks around and goes, “Yeah, I also don’t think that’s right.” Then it just turns into warfare and conflict that you don’t want or need. If you address those things beforehand, then you can include the data, the presentations, and the “I don’t know the answer to this and I know this is important to so-and-so” in that presentation or in that discussion. It can be hugely helpful. Big difference between winning and losing with that.

5) Google is biased toward big brands. It feels hopeless to compete against them

A lot of people are feeling this hopelessness, hopelessness in SEO about competing against them. I get that pain. In fact, I’ve felt that very strongly for a long time in the SEO world, and I think the trend has only increased. This comes from all sorts of stuff. Brands now have the little dropdown next to their search result listing. There are these brand and entity connections. As Google is using answers and knowledge graph more and more, it’s feeling like those entities are having a bigger influence on where things rank and where they’re visible and where they’re pulling from.

User and usage behavior signals on the rise means that big brands, who have more of those signals, tend to perform better. Brands in the knowledge graph, brands growing links without any effort, they’re just growing links because they’re brands and people point to them naturally. Well, that is all really tough and can be very frustrating.

I think you have a few choices on the table. First off, you can choose to compete with brands where they can’t or won’t. So this is areas like we’re going after these keywords that we know these big brands are not chasing. We’re going after social channels or people on social media that we know big brands aren’t. We’re going after user generated content because they have all these corporate requirements and they won’t invest in that stuff. We’re going after content that they refuse to pursue for one reason or another. That can be very effective.

You better be building, growing, and leveraging your competitive advantage. Whenever you build an organization, you’ve got to say, “Hey, here’s who is out there. This is why we are uniquely better or a uniquely better choice for this set of customers than these other ones.” If you can leverage that, you can generally find opportunities to compete and even to win against big brands. But those things have to become obvious, they have to become well-known, and you need to essentially build some of your brand around those advantages, or they’re not going to give you help in search. That includes media, that includes content, that includes any sort of press and PR you’re doing. That includes how you do your own messaging, all of these things.

Next, you can choose to serve a market or a customer that they don’t or won’t. That can be a powerful way to go about search, because usually search is bifurcated by the customer type. There will be slightly different forms of search queries that are entered by different kinds of customers, and you can pursue one of those that isn’t pursued by the competition.

Last, but not least, I think for everyone in SEO we all realize we’re going to have to become brands ourselves. That means building the signals that are typically associated with brands — authority, recognition from an industry, recognition from a customer set, awareness of our brand even before a search has happened. I talked about this in a previous Whiteboard Friday, but I think because of these things, SEO is becoming a channel that you benefit from as you grow your brand rather than the channel you use to initially build your brand.

All right, everyone. Hope these have been helpful in combating some of these infuriating, frustrating problems and that we’ll see some great comments from you guys. I hope to participate in those as well, and we’ll catch you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 2 years ago from tracking.feedpress.it

How Much Has Link Building Changed in Recent Years?

Posted by Paddy_Moogan

I get asked this question a lot. It’s mainly asked by people who are considering buying my link building book and want to know whether it’s still up to date. This is understandable given that the first edition was published in February 2013 and our industry has a deserved reputation for always changing.

I find myself giving the same answer, even though I’ve been asked it probably dozens of times in the last two years—”not that much”. I don’t think this is solely due to the book itself standing the test of time, although I’ll happily take a bit of credit for that 🙂 I think it’s more a sign of our industry as a whole not changing as much as we’d like to think.

I started to question whether I was right and honestly, it’s one of the reasons it has taken me over two years to release the second edition of the book.

So I posed this question to a group of friends not so long ago, some via email and some via a Facebook group. I was expecting to be called out by many of them because my position was that in reality, it hasn’t actually changed that much. The thing is, many of them agreed and the conversations ended with a pretty long thread with lots of insights. In this post, I’d like to share some of them, share what my position is and talk about what actually has changed.

My personal view

Link building hasn’t changed as much as we think it has.

The core principles of link building haven’t changed. The signals around link building have changed, but mainly around new machine learning developments that have indirectly affected what we do. One thing that has definitely changed is the mindset of SEOs (and now clients) towards link building.

I think the last big change to link building came in April 2012 when Penguin rolled out. This genuinely did change our industry and put to bed a few techniques that should never have worked so well in the first place.

Since then, we’ve seen some things change, but the core principles haven’t changed if you want to build a business that will be around for years to come and not run the risk of being hit by a link related Google update. For me, these principles are quite simple:

  • You need to deserve links – either an asset you create or your product
  • You need to put this asset in front of a relevant audience who have the ability to share it
  • You need consistency – one new asset every year is unlikely to cut it
  • Anything that scales is at risk

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years and it’s still going.

Let’s dive a bit deeper into all of this and I’ll talk about how this relates to link building.

The typical mindset for building links has changed

I think that most SEOs are coming round to the idea that you can’t get away with building low quality links any more, not if you want to build a sustainable, long-term business. Spammy link building still works in the short-term and I think it always will, but it’s much harder than it used to be to sustain websites that are built on spam. The approach is more “churn and burn” and spammers are happy to churn through lots of domains and just make a small profit on each one before moving onto another.

For everyone else, it’s all about the long-term and not putting client websites at risk.

This has led to many SEOs embracing different forms of link building and generally starting to use content as an asset when it comes to attracting links. A big part of me feels that it was actually Penguin in 2012 that drove the rise of content marketing amongst SEOs, but that’s a post for another day…! For today though, this goes some way towards explaining the trend we see below.

Slowly but surely, I’m seeing clients come to my company already knowing that low quality link building isn’t what they want. It’s taken a few years after Penguin for it to filter down to client / business owner level, but it’s definitely happening. This is a good thing but unfortunately, the main reason for this is that most of them have been burnt in the past by SEO companies who have built low quality links without giving thought to building good quality ones too.

I have no doubt that it’s this change in mindset which has led to trends like this:

The thing is, I don’t think this was by choice.

Let’s be honest. A lot of us used the kind of link building tactics that Google no longer like because they worked. I don’t think many SEOs were under the illusion that it was genuinely high quality stuff, but it worked and it was far less risky to do than it is today. Unless you were super-spammy, the low-quality links just worked.

Fast forward to a post-Penguin world, and things are far riskier. For me, it’s because of this that we see trends like the above. As an industry, we had the easiest link building methods taken away from us and we’re left with fewer options. One of the main options is content marketing which, if you do it right, can lead to good quality links and, importantly, the types of links you won’t be removing in the future. Get it wrong and you’ll lose budget and lose the trust of your boss or client in the power of content when it comes to link building.

There are still plenty of other methods to build links and sometimes we can forget this. Just look at this epic list from Jon Cooper. Even with this many tactics still available to us, it’s hard work. Way harder than it used to be.

My summary here is that as an industry, our mindset has shifted but it certainly wasn’t a voluntary shift. If the tactics that Penguin targeted still worked today, we’d still be using them.

A few other opinions…

“I definitely think too many people want the next easy win. As someone surfing the edge of what Google is bringing our way, here’s my general take—SEO, in broad strokes, is changing a lot, *but* any given change is more and more niche and impacts fewer people. What we’re seeing isn’t radical, sweeping changes that impact everyone, but a sort of modularization of SEO, where we each have to be aware of what impacts our given industries, verticals, etc.”

Dr. Pete

 

“I don’t feel that techniques for acquiring links have changed that much. You can either earn them through content and outreach or you can just buy them. What has changed is the awareness of “link building” outside of the SEO community. This makes link building / content marketing much harder when pitching to journalists and even more difficult when pitching to bloggers.

“Link building has to be more integrated with other channels and struggles to work in its own environment unless supported by brand, PR and social. Having other channels supporting your link development efforts also creates greater search signals and more opportunity to reach a bigger audience which will drive a greater ROI.”

Carl Hendy

 

SEO has grown up in terms of more mature staff and SEOs becoming more ingrained into businesses so there is a smarter (less pressure) approach. At the same time, SEO has become more integrated into marketing and has made marketing teams and decision makers more intelligent in strategies and not pushing for the quick win. I’m also seeing that companies who used to rely on SEO and building links have gone through IPOs and the need to build 1000s of links per quarter has rightly reduced.

Danny Denhard

Signals that surround link building have changed

There is no question about this one in my mind. I actually wrote about this last year in my previous blog post where I talked about signals such as anchor text and deep links changing over time.

Many of the people I asked felt the same, here are some quotes from them, split out by the types of signal.

Domain level link metrics

I think domain level links have become increasingly important compared with page level factors, i.e. you can get a whole site ranking well off the back of one insanely strong page, even with sub-optimal PageRank flow from that page to the rest of the site.

Phil Nottingham

I’d agree with Phil here and this is what I was getting at in my previous post on how I feel “deep links” will matter less over time. It’s not just about domain level links here, it’s just as much about the additional signals available for Google to use (more on that later).

Anchor text

I’ve never liked anchor text as a link signal. I mean, who actually uses exact match commercial keywords as anchor text on the web?

SEOs. 🙂

Sure there will be natural links like this, but honestly, I struggle with the idea that it took Google so long to start turning down the dial on commercial anchor text as a ranking signal. They are starting to turn it down though, slowly but surely. Don’t get me wrong, it still matters and it still works. But like pure link spam, the bar is a lot lower now in terms of what constitutes too much.

Rand feels that they matter more than we’d expect and I’d mostly agree with this statement:

Exact match anchor text links still have more power than you’d expect—I think Google still hasn’t perfectly sorted what is “brand” or “branded query” from generics (i.e. they want to start ranking a new startup like meldhome.com for “Meld” if the site/brand gets popular, but they can’t quite tell the difference between that and https://moz.com/learn/seo/redirection getting a few manipulative links that say “redirect”)

Rand Fishkin

What I do struggle with though, is that Google still haven’t figured this out and that short-term, commercial anchor text spam is still so effective. Even for a short burst of time.

I don’t think link building as a concept has changed loads—but I think links as a signal have, mainly because of filters and penalties but I don’t see anywhere near the same level of impact from coverage anymore, even against 18 months ago.

Paul Rogers

New signals have been introduced

It isn’t just about established signals changing though, there are new signals too and I personally feel that this is where we’ve seen the most change in Google algorithms in recent years—going all the way back to Panda in 2011.

With Panda, we saw a new level of machine learning where it almost felt like Google had found a way of incorporating human reaction / feelings into their algorithms. They could then run this against a website and answer questions like the ones included in this post. Things such as:

  • “Would you be comfortable giving your credit card information to this site?”
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?”
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?”

It is a touch scary that Google was able to run machine learning against answers to questions like this and write an algorithm to predict the answers for any given page on the web. They have, though, and that was four years ago now.

Since then, they’ve made various moves to utilize machine learning and AI to build out new products and improve their search results. For me, this was one of the biggest changes, and it went pretty unnoticed by our industry. Well, until Hummingbird came along. I feel pretty sure that we have Ray Kurzweil to thank for at least some of that.

There seems to be more weight on theme/topic related to sites, though it’s hard to tell if this is mostly link based or more user/usage data based. Google is doing a good job of ranking sites and pages that don’t earn the most links but do provide the most relevant/best answer. I have a feeling they use some combination of signals to say “people who perform searches like this seem to eventually wind up on this website—let’s rank it.” One of my favorite examples is the Audubon Society ranking for all sorts of birding-related searches with very poor keyword targeting, not great links, etc. I think user behavior patterns are stronger in the algo than they’ve ever been.

– Rand Fishkin

Leading on from what Rand has said, it’s becoming more and more common to see search results that just don’t make sense if you look at the link metrics—but are a good result.

For me, the move towards user data driving search results + machine learning advances has been the biggest change we’ve seen in recent years and it’s still going.

Edit: since drafting this post, Tom Anthony released this excellent blog post on his views on the future of search and the shift to data-driven results. I’d recommend reading that as it approaches this whole area from a different perspective and I feel that an off-shoot of what Tom is talking about is the impact on link building.

You may be asking at this point, what does machine learning have to do with link building?

Everything. Because as strong as links are as a ranking signal, Google want more signals, and user signals are far, far harder to manipulate than established link signals. Yes, it can be done—I’ve seen it happen. There have even been a few public tests done. But it’s very hard to scale and I’d venture a guess that only the top 1% of spammers are capable of doing it, let alone maintaining it for a long period of time. When I think about the process for manipulation here, I actually think we go a step beyond spammers towards hackers and more clear-cut illegal activity.

For link building, this means that traditional methods of manipulating signals are going to become less and less effective as these user signals become stronger. For us as link builders, it means we can’t keep searching for that silver bullet or the next method of scaling link building just for an easy win. The fact is that scalable link building is always going to be at risk from penalization from Google—I don’t really want to live a life where I’m always worried about my clients being hit by the next update. Even if Google doesn’t catch up with a certain method, machine learning and user data mean that these methods may naturally become less effective and cost efficient over time.

There are of course other things, such as social signals, that have come into play. I certainly don’t feel like these are a strong ranking factor yet, but with deals like this one between Google and Twitter being signed, I wouldn’t be surprised if that ever-growing dataset is used at some point in organic results. The one advantage that Twitter has over Google is its breaking-news freshness. Twitter is still way quicker at breaking news than Google is—140 characters in a tweet is far quicker than Google News! Google know this, which is why I feel they’ve pulled this partnership back into existence after a couple of years apart.

There is another important point to remember here and it’s nicely summarised by Dr. Pete:

“At the same time, as new signals are introduced, these are layers, not replacements. People hear social signals or user signals or authorship and want it to be the link-killer, because they already fucked up link-building, but these are just layers on top of on-page and links and all of the other layers. As each layer is added, it can verify the layers that came before it, and what you need isn’t the magic signal but a combination of signals that generally matches what Google expects to see from real, strong entities. So, links still matter, but they matter in concert with other things, which basically means it’s getting more complicated and, frankly, a bit harder. Of course, no one wants to hear that.”

– Dr. Pete

The core principles have not changed

This is the crux of everything for me. With all the changes listed above, the key is that the core principles around link building haven’t changed. I could even argue that Penguin didn’t change the core principles because the techniques that Penguin targeted should never have worked in the first place. I won’t argue this too much though because even Google advised website owners to build directory links at one time.

You need an asset

You need to give someone a reason to link to you. Many won’t do it out of the goodness of their heart! One of the most effective ways to do this is to develop a content asset and use this as your reason to make people care. Once you’ve made someone care, they’re more likely to share the content or link to it from somewhere.

You need to promote that asset to the right audience

I really dislike the stance that some marketers take when it comes to content promotion—build great content and links will come.

No. Sorry, but for the vast majority of us, that’s simply not true. The exceptions are people who skydive from space or have huge existing audiences to leverage.

You simply have to spend time promoting your content or your asset for it to get shares and links. It is hard work, and sometimes you can spend a long time on it and get little return, but it’s important to keep working at it until you’re at a point where you have two things:

  • A big enough audience where you can almost guarantee at least some traffic to your new content along with some shares
  • Enough strong relationships with relevant websites who you can speak to when new content is published and stand a good chance of them linking to it

Getting to this point is hard—but that’s kind of the point. There are various hacks you can use along the way but it will take time to get right.

You need consistency

Leading on from the previous point: it takes time and hard work to get links to your content—the types of links that stand the test of time and that you won’t be removing in 12 months’ time anyway! This means that you need to keep pushing content out and getting better each and every time. This isn’t to say you should just churn content out for the sake of it, far from it. I am saying that with each piece of content you create, you will learn to do at least one thing better the next time. Try to give yourself the leverage to do this.

Anything scalable is at risk

Scalable link building is exactly what Google has been trying to crack down on for the last few years. Penguin was the biggest move and hit some of the most scalable tactics we had at our disposal. When you scale something, you often lose some level of quality, which is exactly what Google doesn’t want when it comes to links. If you’re still relying on tactics that could fall into the scalable category, I think you need to be very careful and just look at the trend in the types of links Google has been penalizing to understand why.

The part Google plays in this

To finish up, I want to briefly talk about the part that Google plays in all of this and shaping the future they want for the web.

I’ve always tried to steer clear of arguments involving the idea that Google is actively pushing FUD into the community. I’ve preferred to concentrate more on things I can actually influence and change with my clients rather than what Google is telling us all to do.

However, for the purposes of this post, I want to talk about it.

General paranoia has increased. My bet is there are some companies out there carrying out zero specific linkbuilding activity through worry.

Dan Barker

Dan’s point is a very fair one and just a day or two after reading this in an email, I came across a page related to a client’s target audience that said:

“We are not publishing guest posts on SITE NAME any more. All previous guest posts are now deleted. For more information, see www.mattcutts.com/blog/guest-blogging/”.

I’ve reworded this so as not to reveal the name of the site, but you get the point.

This is silly. Honestly, so silly. They are a good site, publish good content, and had good editorial standards. Yet they have ignored all of their own policies, hard work, and objectives to follow a blog post from Matt. I’m 100% confident that it wasn’t sites like this one that Matt was talking about in this blog post.

This is, of course, from the publishers’ angle rather than the link builders’ angle, but it does go to show the effect that statements from Google can have. Google know this so it does make sense for them to push out messages that make their jobs easier and suit their own objectives—why wouldn’t they? In a similar way, what did they do when they were struggling to classify at scale which links are bad vs. good and they didn’t have a big enough web spam team? They got us to do it for them 🙂

I’m mostly joking here, but you see the point.

The most recent infamous mobilegeddon update, discussed here by Dr. Pete, is another example of Google pushing out messages that ultimately scared a lot of people into action. Although, to be fair, I think that despite the apparent small impact so far, the broad message from Google is a very serious one.

Because of this, I think we need to remember that Google does have their own agenda and many shareholders to keep happy. I’m not in the camp of believing everything that Google puts out is FUD, but I’m much more sensitive and questioning of the messages now than I’ve ever been.

What do you think? I’d love to hear your feedback and thoughts in the comments.



Inbound Lead Generation: eCommerce Marketing’s Missing Link

Posted by Everett

If eCommerce businesses hope to remain competitive with Amazon, eBay, big box brands, and other online retail juggernauts, they’ll need to learn how to conduct content marketing, lead generation, and contact nurturing as part of a comprehensive inbound marketing strategy.

First, I will discuss some of the ways most online retailers are approaching email from the bottom of the funnel upward, and why this needs to be turned around. Then we can explore how to go about doing this within the framework of “Inbound Marketing” for eCommerce businesses. Lastly, I’ll discuss popular marketing automation and email marketing solutions in the context of inbound marketing for eCommerce.

Key differences between eCommerce and lead generation approaches to email

Different list growth strategies

Email acquisition sources differ greatly between lead gen. sites and online stores. The biggest driver of email acquisition for most eCommerce businesses is their shoppers, especially when the business doesn’t collect an email address for its contact database until the shopper provides it during the check-out process—possibly not until the very end.

With most B2B/B2C lead gen. websites, the entire purpose of every landing page is to get visitors to submit a contact form or pick up the phone. Often, the price tag for their products or services is much higher than those of an eCommerce site or involves recurring payments. In other words, what they’re selling is more difficult to sell. People take longer to make those purchasing decisions. For this reason, leads—in the form of contact names and email addresses—are typically acquired and nurtured without having first become a customer.

Contacts vs. leads

Whether it is a B2B or B2C website, lead gen. contacts (called leads) are thought of as potential customers (clients, subscribers, patients) who need to be nurtured to the point of becoming “sales qualified,” meaning they’ll eventually get a sales call or email that attempts to convert them into a customer.

On the other hand, eCommerce contacts are often thought of primarily as existing customers to whom the marketing team can blast coupons and other offers by email.

Retail sites typically don’t capture leads at the top or middle of the funnel. Only once a shopper has checked out do they get added to the list. Historically, the buying cycle has been short enough that eCommerce sites could move many first-time visitors directly to customers in a single visit.

But this has changed.

Unless your brand is very strong—possibly a luxury brand or one with an offline retail presence—it is probably getting more difficult (i.e. expensive) to acquire new customers. At the same time, attrition rates are rising. Conversion optimization helps by converting more bottom of the funnel visitors. SEO helps drive more traffic into the site, but mostly for middle-of-funnel (category page) and bottom-of-funnel (product page) visitors who may not also be price/feature comparison shopping, or are unable to convert right away because of device or time limitations.

Even savvy retailers publishing content for shoppers higher up in the funnel, such as buyer guides and reviews, aren’t getting an email address and are missing a lot of opportunities because of it.


Here’s a thought. If your eCommerce site has a 10 percent conversion rate, you’re doing pretty well by most standards. But what happened to the other 90 percent of those visitors? Will you have the opportunity to connect with them again? Even if you bump that up a few percentage points with retargeting, a lot of potential revenue has seeped out of your funnel without a trace.

I don’t mean to bash the eCommerce marketing community with generalizations. Most lead gen. sites aren’t doing anything spectacular either, and a lot of opportunity is missed all around.

There are many eCommerce brands doing great things marketing-wise. I’m a big fan of Crutchfield for their educational resources targeting early-funnel traffic, and Neman Tools, Saddleback Leather and Feltraiger for the stories they tell. Amazon is hard to beat when it comes to scalability, product suggestions and user-generated reviews.

Sadly, most eCommerce sites (including many of the major household brands) still approach marketing in this way…

The ol’ bait n’ switch: promising value and delivering spam

Established eCommerce brands have gigantic mailing lists (compared with lead gen. counterparts), to whom they typically send out at least one email each week with “offers” like free shipping, $ off, buy-one-get-one, or % off their next purchase. The lists are minimally segmented, if at all. For example, there might be lists for repeat customers, best customers, unresponsive contacts, recent purchasers, shoppers with abandoned carts, purchases by category, etc.

The missing points of segmentation include which campaign resulted in the initial contact (sometimes referred to as a cohort) and—most importantly—the persona and buying cycle stage that best applies to each contact.

Online retailers often send frequent “blasts” to their entire list or to a few of the large segments mentioned above. Lack of segmentation means contacts aren’t receiving emails based on their interests, problems, or buying cycle stage, but instead, are receiving what they perceive as “generic” emails.
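To make those missing segments concrete, here is a minimal sketch (hypothetical field names and values; any real ESP or CRM schema will differ) of selecting an audience by persona, buying-cycle stage, and acquisition cohort instead of blasting the whole list:

```python
# Hypothetical contact records; field names are illustrative only.
contacts = [
    {"email": "a@example.com", "persona": "bargain_hunter",
     "stage": "awareness", "cohort": "2015-q1-buyers-guide"},
    {"email": "b@example.com", "persona": "gift_buyer",
     "stage": "decision", "cohort": "2015-q1-checkout"},
]

def segment(contacts, persona=None, stage=None, cohort=None):
    """Return only the contacts matching every criterion that was given."""
    return [
        c for c in contacts
        if (persona is None or c["persona"] == persona)
        and (stage is None or c["stage"] == stage)
        and (cohort is None or c["cohort"] == cohort)
    ]

# Email early-funnel bargain hunters only, not everyone.
audience = segment(contacts, persona="bargain_hunter", stage="awareness")
```

Even this toy version shows the point: each extra dimension of segmentation shrinks a “blast” into a message that can actually match the contact’s interests, problems, and stage.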

The result of these missing segments and the lack of overarching strategy looks something like this:

My, What a Big LIST You Have!


TIME reported in 2012 on stats from Responsys that the average online retailer sent out between five and six emails the week after Thanksgiving. Around the same time, the Wall Street Journal reported that the top 100 online retailers sent an average of 177 emails to each of their contacts in 2011. Averaged out, that’s somewhere between three and four emails each week that the contact is receiving from each of these retailers.
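The weekly figure follows directly from the annual one; a quick sanity check of the arithmetic:

```python
# WSJ figure: top 100 online retailers averaged 177 emails per contact in 2011.
emails_per_year = 177
weeks_per_year = 52
emails_per_week = emails_per_year / weeks_per_year
print(round(emails_per_week, 1))  # prints 3.4, i.e. between three and four a week
```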

The better to SPAM you with!


A 2014 whitepaper from SimpleRelevance titled Email Fail: An In-Depth Evaluation of Top 20 Internet Retailer’s Email Personalization Capabilities (PDF) found that, while 70 percent of marketing executives believed personalization was of “utmost importance” to their business…

“Only 17 percent of marketing leaders are going beyond basic transactional data to deliver personalized messages to consumers.”

Speaking of email overload, the same report found that some major online retailers sent ten or more emails per week!

simplerelevance-email-report-frequency.png

The result?

All too often, the eCommerce business will carry around big, dead lists of contacts who don’t even bother reading their emails anymore. They end up scrambling toward other channels to “drive more demand,” but because the real problems were never addressed, this ends up increasing new customer acquisition costs.

The cycle looks something like this:

  1. Spend a fortune driving in unqualified traffic from top-of-the-funnel channels
  2. Ignore the majority of those visitors who aren’t ready to purchase
  3. Capture email addresses only for the few visitors who made a purchase
  4. Spam the hell out of those people until they unsubscribe
  5. Spend a bunch more money trying to fill the top of the funnel with even more traffic

It’s like trying to fill your funnel with a bucket full of holes, some of them patched with band-aids.

The real problems

  1. Lack of a cohesive strategy across marketing channels
  2. Lack of a cohesive content strategy throughout all stages of the buying cycle
  3. Lack of persona, buying cycle stage, and cohort-based list segmentation to nurture contacts
  4. Lack of tracking across customer touchpoints and devices
  5. Lack of gated content that provides enough value to early-funnel visitors to get them to provide their email address

So, what’s the answer?

Inbound marketing allows online retailers to stop competing with Amazon and other “price focused” competitors with leaky funnels, and to instead focus on:

  1. Persona-based content marketing campaigns designed to acquire email addresses from high-quality leads (potential customers) by offering them the right content for each stage in their buyer’s journey
  2. A robust marketing automation system that makes true personalization scalable
  3. Automated contact nurturing emails triggered by certain events, such as viewing specific content, abandoning their shopping cart, adding items to their wish list or performing micro-conversions like downloading a look book
  4. Intelligent SMM campaigns that match visitors and customers with social accounts by email addresses, interests and demographics—as well as social monitoring
  5. Hyper-segmented email contact lists to support the marketing automation described above, as well as to provide highly-customized email and shopping experiences
  6. Cross-channel, closed loop reporting to provide a complete “omnichannel” view of online marketing efforts and how they assist offline conversions, if applicable
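As a sketch of what the triggered-email idea in item 3 looks like in practice (hypothetical event names, template names, and delays; real automation platforms expose this as configuration rather than code), a nurture rule is just a mapping from a tracked behavior to an email template plus a send delay:

```python
from datetime import datetime, timedelta

# Hypothetical trigger rules: tracked event -> (email template, send delay).
NURTURE_RULES = {
    "cart_abandoned":    ("cart_reminder",  timedelta(hours=4)),
    "lookbook_download": ("style_series_1", timedelta(days=1)),
    "wishlist_add":      ("wishlist_nudge", timedelta(days=3)),
}

def schedule_nurture_email(event, occurred_at):
    """Return (template, send_time) for a tracked event, or None if untracked."""
    rule = NURTURE_RULES.get(event)
    if rule is None:
        return None
    template, delay = rule
    return template, occurred_at + delay

# A cart abandoned at noon queues a reminder for 4 p.m. the same day.
template, send_at = schedule_nurture_email(
    "cart_abandoned", datetime(2015, 6, 1, 12, 0))
```

The value of wiring email to behavior rather than to a calendar is that the message arrives while the intent it responds to is still fresh.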

Each of these areas will be covered in more detail below. First, let’s take a quick step back and define what it is we’re talking about here.

Inbound marketing: a primer

A lot of people think “inbound marketing” is just a way some SEO agencies are re-cloaking themselves to avoid negative associations with search engine optimization. Others think it’s synonymous with “internet marketing.” I think it goes more like this:

Inbound marketing is to Internet marketing as SEO is to inbound marketing: One piece of a larger whole.

There are many ways to define inbound marketing. A cursory review of definitions from several trusted sources reveals some fundamental similarities:

Rand Fishkin

randfishkin.jpeg

“Inbound Marketing is the practice of earning traffic and attention for your business on the web rather than buying it or interrupting people to get it. Inbound channels include organic search, social media, community-building content, opt-in email, word of mouth, and many others. Inbound marketing is particularly powerful because it appeals to what people are looking for and what they want, rather than trying to get between them and what they’re trying to do with advertising. Inbound’s also powerful due to the flywheel-effect it creates. The more you invest in Inbound and the more success you have, the less effort required to earn additional benefit.”


Mike King

mikeking.jpeg

“Inbound Marketing is a collection of marketing activities that leverage remarkable content to penetrate earned media channels such as Organic Search, Social Media, Email, News and the Blogosphere with the goal of engaging prospects when they are specifically interested in what the brand has to offer.”

This quote is from 2012, and is still just as accurate today. It’s from an Inbound.org comment thread where you can also see many other takes on it from the likes of Ian Lurie, Jonathon Colman, and Larry Kim.


Inflow

inflow-logo.jpeg

“Inbound Marketing is a multi-channel, buyer-centric approach to online marketing that involves attracting, engaging, nurturing and converting potential customers from wherever they are in the buying cycle.”

From Inflow’s Inbound Services page.


Wikipedia

wikipedia.jpeg

“Inbound marketing refers to marketing activities that bring visitors in, rather than marketers having to go out to get prospects’ attention. Inbound marketing earns the attention of customers, makes the company easy to be found, and draws customers to the website by producing interesting content.”

From Inbound Marketing – Wikipedia.


Larry Kim

Larry-Kim.jpeg

“‘Inbound marketing’ refers to marketing activities that bring leads and customers in when they’re ready, rather than you having to go out and wave your arms to try to get people’s attention.”

Via Marketing Land in 2013. You can also read more of Larry Kim’s interpretation, along with many others, on Inbound.org.


HubSpot

“Instead of the old outbound marketing methods of buying ads, buying email lists, and praying for leads, inbound marketing focuses on creating quality content that pulls people toward your company and product, where they naturally want to be.”

Via HubSpot, a marketing automation platform for inbound marketing.

When everyone has their own definition of something, it helps to think about what they have in common, as opposed to how they differ. In the case of inbound, this includes concepts such as:

  • Pull (inbound) vs. push (interruption) marketing
  • “Earning” media coverage, search engine rankings, visitors and customers with outstanding content
  • Marketing across channels
  • Meeting potential customers where they are in their buyer’s journey

Running your first eCommerce inbound marketing campaign

Audience personas—priority no. 1

The magic happens when retailers begin to hyper-segment their lists based on buyer personas and other relevant information (e.g. what they’ve downloaded, what they’ve purchased, whether they abandoned their cart). This all starts with audience research to develop personas, and there are plenty of good guides on persona development available if you need more background.

Once personas are developed, retailers should choose one on which to focus. A complete campaign strategy should be developed around this persona, with the aim of providing the “right value” to them at the “right time” in their buyer’s journey.
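
As a rough illustration of how persona- and behavior-based segmentation works, the filtering logic is essentially the following. The contact fields here are assumptions for the sketch, not any vendor’s schema:

```python
# Illustrative contact records; real platforms store far richer profiles.
contacts = [
    {"email": "a@example.com", "persona": "trail_hiker", "abandoned_cart": True},
    {"email": "b@example.com", "persona": "casual_camper", "abandoned_cart": False},
    {"email": "c@example.com", "persona": "trail_hiker", "abandoned_cart": False},
]

def segment(contacts, persona=None, abandoned_cart=None):
    """Return the sub-list of contacts matching the given criteria."""
    result = []
    for c in contacts:
        if persona is not None and c["persona"] != persona:
            continue
        if abandoned_cart is not None and c["abandoned_cart"] != abandoned_cart:
            continue
        result.append(c)
    return result

hikers = segment(contacts, persona="trail_hiker")  # 2 contacts
```

Every additional attribute you capture (downloads, purchases, cart events) becomes another axis you can filter on, which is what makes the hyper-segmented lists described above possible.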

Ready to get started?

We’ve developed a quick-start guide in the form of a checklist for eCommerce marketers who want to get started with inbound marketing, which you can access below.

inbound ecommerce checklist

Hands-on experience running one campaign will teach you more about inbound marketing than a dozen articles. My advice: Just do one. You will make mistakes. Learn from them and get better each time.

Example inbound marketing campaign

Below is an example of how a hypothetical inbound marketing campaign might play out, assuming you have completed all of the steps in the checklist above. Imagine you handle marketing for an online retailer of high-end sporting goods.

AT Hiker Tommy campaign: From awareness to purchase

When segmenting visitors and customers for a “high-end sporting goods / camping retailer” based on the East Coast, you identified a segment of “Trail Hikers.” These are people with disposable income who care about high-quality gear, and will pay top dollar if they know it is tested and reliable. The top trail on their list of destinations is the Appalachian Trail (AT).

Top of the Funnel: SEO & Strategic Content Marketing

at-tommy.jpg

Tommy’s first action is to do “top of the funnel” research from search engines (one reason why SEO is still so important to a complete inbound marketing strategy).

A search for “Hiking the Appalachian Trail” turns up your article titled “What NOT to Pack When Hiking the Appalachian Trail,” which lists common items that are bulky/heavy, and highlights slimmer, lighter alternatives from your online catalog.

It also highlights the difference between cheap gear and the kind that won’t let you down on your 2,181-mile journey through the wilderness of Appalachia, something you learned was important to Tommy when developing his persona. This allows you to get the company’s value proposition of “tested, high-end, quality gear only” in front of readers very early in their buyer’s journey—important if you want to differentiate your site from all of the retailers racing Amazon to the bottom of their profit margins.

So far you have yet to make “contact” with AT Hiker Tommy. The key to “acquiring” a contact before the potential customer is ready to make a purchase is to provide something of value to that specific type of person (i.e. their persona) at that specific point in time (i.e. their buying cycle stage).

In this case, we need to provide value to AT Hiker Tommy while he is getting started on his research about hiking the Appalachian Trail. He has an idea of what gear not to bring, as well as some lighter, higher-end options sold on your site. At this point, however, he is not ready to buy anything without researching the trail more. This is where retailers lose most of their potential customers. But not you. Not this time…

Middle of the funnel: Content offers, personalization, social & email nurturing

at-hiker-ebook.png

On the “What NOT to Pack When Hiking the Appalachian Trail” article (and probably several others), you have placed a call-to-action (CTA) in the form of a button that offers something like:

Download our Free 122-page Guide to Hiking the Appalachian Trail

This takes Tommy to a landing page showcasing some of the quotes from the book, and highlighting things like:

“We interviewed over 50 ‘thru-hikers’ who completed the AT and have curated and organized the best first-hand tips, along with our own significant research, to develop a free eBook that should answer most of your questions about the trail.”

By entering their email address, potential customers agree to let you send them the free downloadable PDF guide to hiking the AT, as well as other relevant information about hiking.

An automated email is sent with a link to the downloadable PDF guide, and several other useful content links, such as “The AT Hiker’s Guide to Gear for the Appalachian Trail”—content designed to move Tommy further toward the purchase of hiking gear.

If Tommy still has not made a purchase within the next two weeks, another automated email is sent asking for feedback about the PDF guide (providing the link again), and to again provide the link to the “AT Hiker’s Guide to Gear…” along with a compelling offer just for him, perhaps “Get 20% off your first hiking gear purchase, and a free wall map of the AT!”
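
The two-week follow-up rule described above is simple enough to sketch. The template name and contact fields below are hypothetical, for illustration only:

```python
from datetime import datetime

def next_drip_email(contact, now):
    """Decide whether the 14-day follow-up email is due for this contact."""
    if contact.get("purchased"):
        return None  # purchasers drop out of this nurture sequence
    days_since = (now - contact["guide_downloaded_at"]).days
    if days_since >= 14 and not contact.get("followup_sent"):
        return "guide-feedback-plus-20-percent-offer"
    return None

# A contact who downloaded the guide 19 days ago and hasn't bought yet:
print(next_drip_email({"guide_downloaded_at": datetime(2024, 1, 1)},
                      datetime(2024, 1, 20)))
```

In a real marketing automation system this decision runs as a scheduled workflow rather than a function call, but the branching logic is the same.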

Having Tommy’s email address also allows you to hyper-target him on social channels, while also leveraging his initial visit to initiate retargeting efforts.

Bottom of the funnel: Email nurturing & strategic, segmented offers

Eventually Tommy makes a purchase, and he may or may not receive further emails related to this campaign, such as post-purchase emails for reviews, up-sells and cross-sells.

Upon checkout, Tommy checked the box to opt in to weekly promotional emails, so he is now on multiple lists. Your marketing automation system will automatically update Tommy’s status from “Contact” (or lead) to “Customer,” and potentially remove or deactivate him in the marketing automation database. This is accomplished either by default integration features or with the help of integration tools like Zapier and IFTTT.
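
Conceptually, that status update looks something like the sketch below. Real systems do this through native integrations or tools like Zapier rather than hand-rolled code, and the field names here are made up:

```python
def on_purchase(contact):
    """Promote a lead to customer and move them onto the promo list."""
    contact["status"] = "Customer"
    # Add the weekly promo list without dropping existing list memberships.
    contact["lists"] = sorted(set(contact.get("lists", [])) | {"weekly-promos"})
    # Optionally deactivate them in the lead-nurture workflows.
    contact["nurture_active"] = False
    return contact
```

The key point is that a single purchase event should update status, list membership, and nurture eligibility together, so the contact never receives lead-stage emails after becoming a customer.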

You have now nurtured Tommy from his initial research on Google all the way to his first purchase without ever having sent a spammy newsletter email full of irrelevant coupons and other offers. However, now that he is a loyal customer, Tommy finds value in these bottom-of-funnel email offers.

And this is just the start

Every inbound marketing campaign will have its own mix of appropriate channels. This post has focused mostly on email because acquiring the initial permission to contact the person is what fuels most of the other features offered by marketing automation systems, including:

  • Personalization of offers and other content on the site
  • Knowing exactly which visitors are interacting on social media
  • Knowing where visitors and social followers are in the buying cycle, and which persona best represents them
  • Smart forms that don’t require visitors to enter the same information twice, and allow you to build out more detailed profiles of them over time
  • Blogging platforms that tie into email and marketing automation systems
  • Analytics data that isn’t blocked by Google and is tied directly to real people
  • Closed-loop reporting that integrates with call-tracking and Google’s Data Import tool
  • Up-sell, cross-sell, and abandoned cart reclamation features

Three more things…

  1. If you can figure out a way to get Tommy to “log in” when he comes to your site, the personalization possibilities are nearly limitless.
  2. The persona above is based on a real customer segment. I named it after my friend Tommy Bailey, who actually did write the eBook Guide to Hiking the Appalachian Trail, featured in the image above.
  3. This Moz post is part of an inbound marketing campaign targeting eCommerce marketers, a segment Inflow identified while building out our own personas. Our hope, and the whole point of inbound marketing, is that it provides value to you.

Current state of the inbound marketing industry

Inbound has, for the most part, been applied to businesses in which the website’s objective is to generate leads for a sales team to follow up with and close. An examination of various marketing automation platforms—a key component of scalable inbound marketing programs—highlights this issue.

Popular marketing automation systems

Most of the major marketing automation systems can be used very effectively as the backbone of an inbound marketing program for eCommerce businesses. However, only one of them (Silverpop) has made significant efforts to court the eCommerce market with content and out-of-box features. The next closest is HubSpot, so let’s start with those two:

Silverpop – an IBM® Company

silver-pop.jpeg

Unlike the other platforms below, right out of the box Silverpop allows marketers to tap into very specific behaviors, including the items purchased or left in the cart.

You can easily segment based on metrics like the Recency, Frequency and Monetary Value (RFM) of purchases:

silverpop triggered campaigns
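
Since Silverpop’s actual scoring isn’t public, here is a simplified, hypothetical illustration of how RFM segmentation works, using fixed thresholds instead of the quintile-based scoring many tools use:

```python
from datetime import date

def rfm_score(last_purchase, num_orders, total_spent, today):
    """Score a customer 1-5 on Recency, Frequency, and Monetary value.

    Thresholds are arbitrary examples; real systems usually derive them
    from the distribution of the whole customer base (e.g. quintiles).
    """
    recency_days = (today - last_purchase).days
    r = 5 if recency_days <= 30 else 3 if recency_days <= 90 else 1
    f = 5 if num_orders >= 10 else 3 if num_orders >= 3 else 1
    m = 5 if total_spent >= 500 else 3 if total_spent >= 100 else 1
    return r, f, m

print(rfm_score(date(2024, 5, 1), 4, 250.0, date(2024, 5, 20)))  # (5, 3, 3)
```

High-R/high-F customers might get loyalty offers, while high-M but low-R customers are natural winback targets; the point is that each (R, F, M) combination can drive a different automated campaign.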

You can automate personalized shopping cart abandonment recovery emails:

silverpop cart abandonment recovery

You can integrate with many leading brands offering complementary services, including: couponing, CRM, analytics, email deliverability enhancement, social and most major eCommerce platforms.

What you can’t do with Silverpop is blog, find pricing info, sign up for a free trial on their website, or enjoy a modern-looking user experience. Sounds like an IBM® company, doesn’t it?

HubSpot

Out of all the marketing automation platforms on this list, HubSpot is the most capable of handling “inbound marketing” campaigns from start to finish. This should come as no surprise, given that the phrase is credited to Brian Halligan, HubSpot’s co-founder and CEO.

While they don’t specifically cater to eCommerce marketing needs with the same gusto they give to lead-gen marketing, HubSpot does have an eCommerce landing page and a demo landing page for eCommerce leads, which suggests that their own personas include eCommerce marketers. Additionally, there is some good content on their blog written specifically for eCommerce.

HubSpot has allowed some key partners to develop plug-ins that integrate with leading eCommerce platforms. This curated approach works well, and is not dissimilar to how Google and Apple handle their approved apps.

magento and hubspot

The Magento Connector for HubSpot, which costs $80 per month, was developed by EYEMAGiNE, a creative design firm for eCommerce websites. A similar HubSpot-approved third-party integration is on the way for Bigcommerce.

Another eCommerce integration for HubSpot is a Shopify plug-in called HubShoply, which was developed by Groove Commerce and costs $100 per month.

You can also use HubSpot’s native integration capabilities with Zapier to sync data between HubSpot and most major eCommerce SaaS vendors, including the ones above, as well as WooCommerce, Shopify, PayPal, Infusionsoft and more. However, the same could be said of some of the other marketing automation platforms, and using these third-party solutions can sometimes feel like fitting a square peg into a round hole.

HubSpot can and does handle inbound marketing for eCommerce websites. All of the features are there, or easy enough to integrate. But let’s put some pressure on them to up their eCommerce game even more. The least they can do is put an eCommerce link in the footer:

hubspot menus

Despite the lack of clear navigation to their eCommerce content, HubSpot seems to be paying more attention to the needs of eCommerce businesses than the rest of the platforms below.

Marketo

Nothing about Marketo’s in-house marketing strategy suggests “Ecommerce Director Bob” might be one of their personas. The description for each of their marketing automation packages (from Spark to Enterprise) mentions that it is “for B2B” websites.

marketo screenshot

“Driving Sales” could apply to a retail business, so I clicked on the link. Nope. Clearly, this is for lead generation.

marketo marketing automation

Passing “purchase-ready leads” over to your “sales reps” is a good example of the type of language used throughout the site.

Make no mistake, Marketo is a top-notch marketing automation platform. It’s powerful and clean, and it’s a shame they don’t launch a full-scale eCommerce version of their core product. In the meantime, there’s the Magento Integration for Marketo Plug-in developed by an agency out of Australia called Hoosh Marketing.

magento marketo integration

I’ve never used this integration, but it’s part of Marketo’s LaunchPoint directory, which I imagine is vetted, and Hoosh seems like a reputable agency.

Their pricing page is blurred and gated, which is annoying, but perhaps they’ll come on here and tell everyone how much they charge.

marketo pricing page

As with all others except Silverpop, the Marketo navigation provides no easy paths to landing pages that would appeal to “Ecommerce Director Bob.”

Pardot

This option is a Salesforce product, so—though I’ve never had the opportunity to use it—I can imagine Pardot is heavy on B2B/sales and very light on B2C marketing for retail sites.

The hero image on their homepage says as much.

pardot tagline

pardot marketing automation

Again, no mention of eCommerce or retail, but clear navigation to lead gen and sales.

Eloqua / OMC

eloqua-logo.jpeg

Eloqua, now part of the Oracle Marketing Cloud (OMC), has a landing page for the retail industry, on which they proclaim:

“Retail marketers know that the path to lifelong loyalty and increased revenue goes through building and growing deep client relationships.”

Since when did retail marketers start calling customers “clients”?

eloqua integration

The Integration tab on OMC’s “…Retail.html” page helpfully informs eCommerce marketers that their sales teams can continue using CRM systems like Salesforce and Microsoft Dynamics, but it doesn’t mention anything about eCommerce platforms or other SaaS solutions for eCommerce businesses.

Others

There are many other players in this arena. Though I haven’t used them yet, three I would love to try out are SharpSpring, Hatchbuck and Act-On. But none of them appear to be any better suited to handle the concerns of eCommerce websites.

Where there’s a gap, there’s opportunity

The purpose of the section above wasn’t to highlight deficiencies in the tools themselves, but to illustrate a gap in who they are being marketed to and developed for.

So far, most of your eCommerce competitors probably aren’t using tools like these because they are not marketed to by the platforms, and don’t know how to apply the technology to online retail in a way that would justify the expense.

The thing is, a tool is just a tool

The key concepts behind inbound marketing apply just as much to online retail as they do to lead generation.

In order to “do inbound marketing,” a marketing automation system isn’t even strictly necessary (in theory). These systems simply help make the activities scalable for most businesses.

They also bring a lot of different marketing activities under one roof, which saves time and allows data to be moved and utilized between channels and systems. For example, what a customer is doing on social media could influence the emails they receive, or the content they see on your site. Here are some potential uses for most of the platforms above:

Automated marketing uses

  • Personalized abandoned cart emails
  • Post-purchase nurturing/reorder marketing
  • Welcome campaigns for newsletter (or other free offer) signups
  • Winback campaigns
  • Lead-nurturing email campaigns for cohorts and persona-based segments

Content marketing uses

  • Optimized, strategic blogging platforms, and frameworks
  • Landing pages for pre-transactional/educational offers or contests
  • Social media reporting, monitoring, and publishing
  • Personalization of content and user experience

Reporting uses

  • Revenue reporting (by segment or marketing action)
  • Attribution reporting (by campaign or content)

Assuming you don’t have the budget for a marketing automation system, but already have a good email marketing platform, you can still get started with inbound marketing. Eventually, however, you may want to graduate to a dedicated marketing automation solution to reap the full benefits.

Email marketing platforms

Most of the marketing automation systems claim to replace your email marketing platform, while many email marketing platforms claim to be marketing automation systems. Neither statement is completely accurate.

Marketing automation systems, especially those created specifically for the type of “inbound” campaigns described above, provide a powerful suite of tools all in one place. On the other hand, dedicated email platforms tend to offer “email marketing” features that are better and more robust than those offered by marketing automation systems. Some of them are also considerably cheaper—such as MailChimp—but those are often light on even the email-specific features for eCommerce.

A different type of campaign

Email “blasts” in the form of B.O.G.O. (buy one, get one), $10-off, or free-shipping offers can still be very successful in generating incremental revenue boosts — especially for existing customers and seasonal campaigns.

The conversion rate on a 20% off coupon sent to existing customers, for instance, would likely pulverize the conversion rate of an email going out to middle-of-funnel contacts with a link to content (at least with how conversion rates are currently calculated by email platforms).
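
With purely hypothetical numbers, that gap might look like this:

```python
# Illustrative, made-up figures: same list size, very different send types.
coupon_cr = 150 / 5000   # purchases / coupon emails sent to existing customers
content_cr = 12 / 5000   # purchases / content emails sent to mid-funnel contacts

print(f"{coupon_cr:.1%} vs {content_cr:.1%}")  # 3.0% vs 0.2%
```

Judged on immediate conversion rate alone, the content email looks like a failure, which is exactly why mid-funnel sends need to be measured on longer-term customer growth instead.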

Inbound marketing campaigns can also offer quick wins, but they tend to focus mostly on non-customers after the first segmentation campaign (a campaign for the purpose of segmenting your list, such as an incentivized survey). This means lower initial conversion rates, but long-term success through the growth of new customers.

Here’s a good bet if it works with your budget: rely on a marketing automation system for inbound marketing to drive new customer acquisition from initial visit to first purchase, while using a good email marketing platform to run your “promotional email” campaigns to existing customers.

If you have to choose one or the other, I’d go with a robust marketing automation system.

Some of the most popular email platforms used by eCommerce businesses, with a focus on how they handle various Inbound Marketing activities, include:

Bronto

bronto.jpeg

This platform builds in features like abandoned cart recovery, advanced email list segmentation and automated email workflows that nurture contacts over time.

They also offer a host of eCommerce-related features that you just don’t get with marketing automation systems like HubSpot and Marketo. This includes easy integration with a variety of eCommerce platforms like ATG, Demandware, Magento, Miva Merchant, Mozu and MarketLive, not to mention apps for coupons, product recommendations, social shopping and more. Integration with enterprise eCommerce platforms is one reason why Bronto is seen over and over again when browsing the Internet Retailer Top 500 reports.

On the other hand, Bronto—like the rest of these email platforms—doesn’t have many of the features that assist with content marketing outside of emails. As an “inbound” marketing automation system, it is incomplete because it focuses almost solely on one channel: email.

Vertical Response

verticalresponse.jpeg

Another juggernaut among eCommerce email marketing platforms, Vertical Response has even fewer inbound-related features than Bronto, though it is a good email platform with a free version that includes up to 1,000 contacts and 4,000 emails per month (i.e. four emails to a full list of 1,000).

Oracle Marketing Cloud (OMC)

Responsys (the email platform), like Eloqua (the marketing automation system), was gobbled up by Oracle and is now part of their “Marketing Cloud.”

It has been my experience that when a big technology firm like IBM or Oracle buys a great product, it isn’t “great” for the users. Time will tell.

Listrak

listrak.jpeg

Of the established email platforms for eCommerce, Listrak may do the best job of positioning itself as a full inbound marketing platform.

Listrak’s value proposition is that they’re an “Omnichannel” solution. Everything is all in one “Single, Integrated Digital Marketing Platform for Retailers.” The homepage image promises solutions for Email, Mobile, Social, Web and In-Store channels.

I haven’t had the opportunity to work with Listrak yet, but would love to hear feedback in the comments on whether they could handle the kind of persona-based content marketing and automated email nurturing campaigns described in the example campaign above.

Key takeaways

Congratulations on making it this far! Here are a few things I hope you’ll take away from this post:

  • There is a lot of opportunity right now for eCommerce sites to take advantage of marketing automation systems and robust email marketing platforms as the infrastructure to run comprehensive inbound marketing campaigns.
  • There is a lot of opportunity right now for marketing automation systems to develop content and build in eCommerce-specific features to lure eCommerce marketers.
  • Inbound marketing isn’t email marketing, although email is an important piece to inbound because it allows you to begin forming lasting relationships with potential customers much earlier in the buying cycle.
  • To see the full benefits of inbound marketing, you should focus on getting the right content to the right person at the right time in their shopping journey. This necessarily involves several different channels, including search, social and email. One of the many benefits of marketing automation systems is their ability to track your efforts here across marketing channels, devices and touch-points.

Tools, resources, and further reading

There is a lot of great content on the topic of inbound marketing, some of which has greatly informed my own understanding and approach, and it is well worth seeking out.
