Do more with your data after busy shopping periods: five minutes with the data gurus

Keep your eyes peeled for part two.

Congratulations! Your business has weathered one of the busiest shopping periods of the year. But whether it’s Black Friday, Christmas, January, scorching summer sales, or even Halloween bonanzas, we all know that the work doesn’t stop post-event. It’s not just distribution and accounts that will have their hands full now – at least it shouldn’t be. As a marketer or a CX professional, this is where you can really get your hands dirty with data.

We speak with Ian Pollard, one of our senior product managers, and Sam Crawley, a product data scientist, to uncover what you should pay attention to once the dust has settled on a busy shopping period, and how to make the most of it in Engagement Cloud.

What are the most common challenges you hear about from retail customers following a surge in sales? Did this inform the product design of Engagement Cloud?

Ian: If you’ve just run a big sale, you’ll have some newly acquired customers. Having spent big on acquisition and likely taken a margin hit with discounting, you don’t want to lose them. Getting that all-important second purchase is the difference between never hearing from them again and building loyalty. This was what led us to build out the ‘single purchase customer’ tile as one of the nine metrics Engagement Cloud users can keep a close eye on and drill down into. 

Sam: There are nine tiles in total and things to learn from all of them, especially after a big sales event, as long as you keep context in mind! The ‘average items per order’ tile, for example, might show whether people were just picking up single discounted items during the sale, or in fact buying lots at once. The latter might indicate a successful use of on-site product recommendations.

As Ian mentioned, the acquisition of new customers is a major part of these events, and many of them may not end up re-purchasing at all. It’s important to keep the average value of these newly acquired customers in mind, especially when comparing it with the amount of money and effort that went into acquiring them. Drilling down into the CLV tile, for instance, might give an idea of the ROI you’d expect compared with a standard period.

Now if you were head of marketing or customer experience, what would you do with this data? How would it help you achieve your goals of improving ROI, lifetime value, or overall customer experience?

Ian: Our segment builder lets you target customers by RFM persona. Drag in the RFM data block and target anyone in the ‘recent customers’ persona.

Marketing to these people is difficult at such an early stage of the relationship; all you really know is that they’re a new customer. They may have nothing else in common with each other. For this reason, I would follow up by using the most reliable data point you have — what they just bought. 

To target effectively against that, I recommend using our ‘also bought’ product recommendation. This looks at the highest value item in the recent checkout and finds other shoppers who have also purchased it. Within that group of shoppers, Engagement Cloud will then find other products they have bought and recommend the most popular. 

Sam: There is no magical method to improve ROI or lifetime value, but different marketing methods can be optimized and refined over time in order to see more success. This is where context becomes important.

We’ve given you the ability to filter the metrics and drill down into reports on specific segments or RFM personas. What this means is actually really cool. You can trial different methods on different categories of customers. Then you can compare the effects on CLV and ‘average delay’ over time by selecting different date ranges.

Use these tools to find what works best for you and your customers.

Ian, what was the drive behind developing the recency, frequency, and monetary (RFM) personas (as well as the persona movement reports) in Engagement Cloud? What value do these data-driven metrics bring to a business?

Ian: RFM had been in our plans for a while and we knew it was a popular wish-list feature with customers. The ability to manually create RFM-like segments had always been possible in Engagement Cloud, so the decision to make a formal data model for it wasn’t something we rushed into.

I’m really pleased with our model: it took a lot of thought, but I think it’s the right balance of power and simplicity. The core model is built around six easy-to-understand personas grouped across a lifecycle timeline familiar to any retailer – inactive, lapsing, active. It’s incredibly valuable to anyone wanting to do behavior-based targeting or reporting.

The movement reporting came from insights we uncovered whilst building the RFM model. Some of our customers were really interested in how customers moved through personas over time. That stuck with us and we started modelling these movements and found interesting stories in the data. Finding a way to show this to our customers was a little more of a challenge. We have some big opinions on data visualization in the team, but I think we’re all happy with where we ended up. Even if we did need to define a whole new color palette to make it work!

Which personas should businesses keep an eye on? And how should they be treated after a large sales event?

Sam: New customers, for sure.  After a large sales event you are likely to have a much larger chunk of new customers than normal, and this represents a great opportunity to increase your loyal customer base.  You should focus on marketing to these people, with the aim of converting them into repeat customers.  Make use of the persona movement report to keep track of them and figure out which tactics work best.

Any other advice on doing more with data for businesses using (or thinking of using…) Engagement Cloud?

Ian: We have a great feature called web behavior tracking (WBT). It tracks page views, and, when we can identify the contact, it matches those web sessions with them. If you combine WBT with RFM, you get the ability to identify emerging purchase intent. 

Why does that matter?

Ian: Think about a win-back campaign for your inactive customers. You’ve already paid to acquire them and they’re now giving you every sign that they can realistically be won back. They’re worth spending money on; they’re your best leads.

I would create a multi-stage and multi-channel campaign. If they don’t buy or engage via email, then re-target via Facebook, Instagram, or Google (which you can do via our program builder). If they engage again on your website but still don’t buy, then it may be worth looking at a coupon campaign.

Any top tips Sam?

Sam: Try combining automation and programs with the persona movement report.  The report isn’t just useful for tracking what happens to your new customers after a sale, but can be used to see what the overall engagement lifecycle of your customers looks like. Filtering based on segments might reveal insights into what can be improved in your automation and programs, or where you are excelling.

Thanks both!

Want to hear more from Ian and Sam? They’ll be speaking at our dotlive event on Wednesday 11th December.

And don’t forget, this is the first of our three-part series on what to do after a surge in sales. Check back soon for part two, or sign up for blog updates and more here.

The post Do more with your data after busy shopping periods: five minutes with the data gurus appeared first on dotdigital blog.


These five email programs will make you stand out in the inbox

So, you’ve made the decision to take on an ecommerce connector using Commerce Flow or Magento, for instance; it’s all hooked up with dotmailer and ready to go. What next?

As with all data-driven marketing, your customer insight has to sit at the very heart of your email. This is particularly the case if you wish to move away from ‘batch and blast’ – or ‘spray and pray’ as I like to call it – to sending the right message, to the right person, at the right time.

An extension to this mantra is also “on the right channel”. With the recent acquisition of COMAPI, dotmailer customers now have more choice over the digital channels on which to convey marketing messages – but that’s a blog for another day.

Automation is a method that enables time-poor marketers to deliver the right message, to the right person, at the right time. BUT, the triggers to enrol a subscriber onto an automation program (or the logic behind them) are all based on… Well, you guessed it – data.

Hooking up your ecommerce platform with dotmailer ensures you get valuable transactional data into your email platform, empowering you to tailor the experience around the actions of subscribers.

So, I’ve collated the must-have automations that – regardless of what product you’re selling – should be at the top of your list. And once implemented, you can scope programs as per the industry or sector you’re operating in.

1. Welcome program

The welcome journey would be the first email contact you have with subscribers after they’ve signed up. As with the real world, you’ve only one shot to make a good impression, so make it count!

Your welcome program should set the tone of what subscribers will receive moving forward. A good starting place is a three-part series:

  • In the first email, thank your subscribers for signing up and reinforce the reasons why they did so in the first place – this is the prime time to shout about your USPs and the exclusive benefits of being opted-in.
  • As a follow up to the initial email, you should begin building your relationship with customers by telling them your brand story. Make sure subscribers understand your USPs and make your proposition as compelling as possible, recommending the most-viewed or most-purchased products on your site for instance.
  • Email three is the perfect opportunity to prompt subscribers to act. If they haven’t made their first purchase yet, encourage them to do so. If they have made their first purchase, ask them instead to fill in your preference center so you can further tailor their digital experience.

Charlotte Tilbury’s stylish welcome email provides subscribers with a backdrop to the brand and asks for details – such as hair color and skin tone – to drive relevant communications.

2. Post-purchase program

Now that subscribers have made their first purchase, it’s time to enrol them onto an aftersales program that communicates thanks and rewards them for their business with you over that of a competitor. It doesn’t have to be a discount (although research we’ve conducted with the Direct Marketing Association indicates that discounts and money off are most effective); you can reinforce that their decision was the right one by highlighting relevant customer testimonials and awarding loyalty points. Remember, reassurance (especially of an impulsive buy) goes a long way.

With every purchase that follows, customers can be automatically enrolled onto an aftersales program that collects product reviews and drives value-added content; both are likely to bolster UGC for other email programs and enrich your brand’s credibility.

Tangle Teezer does a great job of maximizing its customer loyalty with product-focused tips and inspiring UGC.

3. Abandoned cart

In my experience, this is the automation of automations. Abandoned cart is without a doubt the highest revenue-generating program I’ve come across. Subscribers – whether intentionally or unintentionally – leave their carts full and unattended; sending them a prompt reminder so that they ‘don’t miss out’ or even a cheeky discount (if it’s been a slow month) is guaranteed to produce great results.

I would experiment between a one- and three-stage abandoned cart program to see what works best for you. But be warned: some of the savvier consumers will abandon their purchase intentionally to seek out a discount code, as they recognize that many brands will use one as an incentive to recover lost carts.

Oliver Bonas tempts subscribers back to their cart using data-triggered notifications.

4. Loyalty program

Loyalty programs can be straightforward and help you generate the advocacy enjoyed by the likes of ASOS, winner of our 2017 benchmark report – Hitting the Mark.

The logic behind a loyalty program could be as simple as having the enrollment criteria set to customers’ average order value (AOV) or a minimum number of orders made in the last 6 months. Conditions to enroll might be an AOV that’s equal to or greater than £100, for instance.

An automation can be triggered when the rules you’ve set up have been met, informing loyal customers that they’ve qualified for membership in a special VIP club, and of their exclusive access to additional benefits or gifts.

Triggering this automation will update the relevant data field within dotmailer, marking customers that have enrolled as ‘VIPs’. You can then leverage this insight to enrich the relevancy of your business-as-usual newsletters, using dynamic content to display extra information that’s exclusive to your more loyal customers.

Conversely, another automation could be built and triggered if customers’ AOV or number of orders (made over a certain period) are below the prerequisites of entering the VIP club; they’d be either encouraged to make a purchase or enrolled onto a winback program.
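The VIP enrollment rule described above can be sketched as simple segmentation logic. This is a hypothetical illustration only: the field names (`avg_order_value`, `orders_last_6_months`, `vip`) and thresholds are assumptions for the example, not actual dotmailer data fields or API calls.

```python
# Hypothetical sketch of the VIP enrollment rule: qualify on average
# order value (AOV >= £100) OR a minimum number of recent orders.
# Field names and thresholds are illustrative, not a real dotmailer API.

def qualifies_for_vip(customer, min_aov=100.0, min_orders=3):
    """Return True if the customer meets either VIP enrollment rule."""
    return (customer["avg_order_value"] >= min_aov
            or customer["orders_last_6_months"] >= min_orders)

def update_vip_flags(customers):
    """Mark qualifying customers as VIPs; the rest could be routed to a
    purchase-encouragement or winback program instead."""
    for customer in customers:
        customer["vip"] = qualifies_for_vip(customer)
    return customers

customers = update_vip_flags([
    {"name": "A", "avg_order_value": 120.0, "orders_last_6_months": 1},
    {"name": "B", "avg_order_value": 40.0,  "orders_last_6_months": 1},
])
```

In a live setup, the result of a rule like this would be written back to a contact data field, which dynamic content and program enrollment conditions can then reference.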

Ellisons drip-feeds discounts to customers as means to strengthen retention and inspire loyalty.

5. Re-engagement program

A re-engagement (or winback) program could be based on a period of subscriber inactivity – i.e. the last time subscribers opened or clicked an email. Pro:Direct, for example, prompts unengaged subscribers to remain in the loop or connect on other channels.

But in an ecommerce scenario, I’d base the criterion on when they last made a purchase.

If a number of days or months has elapsed and a subscriber hasn’t made a purchase, sending a ‘we miss you’ campaign is sure to rouse their engagement and compel them to act. Things to potentially include are some inspirational UGC, USPs, freebies or a discretionary discount.

Ready, set, go!

Once you’ve got these five automations in place, you’re well on the road to delivering the right message, to the right person, at the right time. If you’d like more advice on tailoring automations to your business, please feel free to contact your account manager.

And for more automation inspiration, check out our best practice guide on growing your ecommerce business with email.

The post These five email programs will make you stand out in the inbox appeared first on The Marketing Automation Blog.


Five reasons why email is THE core channel for “on-the-go” customers of an e-commerce business

People say that email – a channel with a history spanning five decades – is dead. But I don’t buy it. According to Forrester, approximately 122,500,453,020 emails are sent every hour.

Here are five reasons why email is the core channel for your everyday consumer:

  1. Effectively reflects your brand image
  2. Cheap & drives conversions
  3. Dynamic (i.e. personalization & segmentation)
  4. Consistent, coordinated and deliverable
  5. Very measurable for marketers

Email is very much alive, and has in fact undergone years of evolution into the channel that the consumer wants it to be. Nowadays, for a marketing channel to prove valuable to business strategy, it needs to provide the flexibility and adaptability that the hyper-connected consumer desires. Email is a blank canvas for the marketer – it’s a cost-effective channel to reach prospects & customers, with an on-brand message that drives ROI through dynamic campaigns that keep people engaged. As long as you use email intelligently (i.e. you have the data and tech in place) and employ an on-brand strategy, this channel could prove an essential component in the success of your e-commerce business, whether B2B or B2C. These case studies on workwear provider Alexandra and British homeware brand Cabbages and Roses provide the perfect illustration.

If you’re looking to up your email marketing automation game, these campaigns will give you a jump-start:

Abandoned cart email – “We’re ready when you are.”

Triggering emails on the back of abandoned baskets is a great way to drive revenue and increase conversion. Having insight data in your email marketing platform allows you to store behavioral information on your subscribers in order to drive intelligent and engaging interactions, ultimately leading to conversion. For example, a customer might log onto your website via desktop, browse, and add products to their cart. Then, for some reason they may close their browser. Rather than this information being lost, it can be pushed into your email platform, stored, and then utilized to send the customer an email with their basket details and a CTA for checkout. Email is therefore the perfect tool to recover the abandoned customer journey.
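The flow just described (cart filled, browser closed, reminder triggered) can be sketched as a minimal abandonment check. This is a hypothetical sketch, not a real platform API: the cart structure, the one-hour threshold, and the function names are assumptions made for the example.

```python
from datetime import datetime, timedelta

# Hypothetical sketch: detect carts that have items, were never checked
# out, and have seen no activity for an hour. In practice the platform's
# insight data store would hold these events; this structure is invented
# for illustration.

ABANDON_AFTER = timedelta(hours=1)

def find_abandoned_carts(carts, now):
    """Return carts eligible for an abandoned-cart reminder email."""
    return [
        cart for cart in carts
        if cart["items"]
        and not cart["checked_out"]
        and now - cart["last_activity"] >= ABANDON_AFTER
    ]

now = datetime(2017, 3, 1, 12, 0)
carts = [
    {"email": "a@example.com", "items": ["shoes"], "checked_out": False,
     "last_activity": datetime(2017, 3, 1, 10, 30)},
    {"email": "b@example.com", "items": ["hat"], "checked_out": True,
     "last_activity": datetime(2017, 3, 1, 10, 30)},
]
to_email = find_abandoned_carts(carts, now)
```

Each returned cart would then feed an email containing the basket details and a checkout CTA.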

Post-purchase email – “We’ve missed you, have you missed us?”

Triggering emails on the back of a customer’s purchase information is something every online retailer should be doing; it’s integral to the e-commerce handbook. Segmenting customers based on recency, frequency and monetary value (the RFM model) is a great way to target your audience because it will subsequently drive ROI. For example, rather than sending generic offers on shoes to your entire database, you might want to send a particular segment a 10% discount offer on a high-value pair of stilettos, because this segment has an average lifetime value (ALV) of over £2,000 and has bought more than one pair of stilettos in the past year. They also haven’t purchased for three months, hence the offer. This segment is more likely to take action than the rest of the database. Sending highly personalized messages through dynamic content will have a greater chance of increasing key metrics, such as click-through rates (CTRs) and conversions.
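The stiletto segment described above combines all three RFM dimensions, and can be sketched as a filter. This is an illustrative sketch only: the customer record shape, the fixed “today”, and the thresholds are assumptions for the example, not a real segmentation API.

```python
from datetime import date, timedelta

# Hypothetical RFM-style segment filter: lifetime value over £2,000
# (monetary), more than one pair of stilettos in the past year
# (frequency), and no purchase in the last 3 months (recency).

TODAY = date(2017, 6, 1)  # fixed "today" so the example is reproducible

def in_stiletto_segment(customer):
    """Return True if the customer matches the example segment."""
    bought_recently = (TODAY - customer["last_purchase"]) <= timedelta(days=90)
    stilettos_last_year = sum(
        qty for order_date, qty in customer["stiletto_orders"]
        if (TODAY - order_date).days <= 365
    )
    return (customer["lifetime_value"] > 2000
            and stilettos_last_year > 1
            and not bought_recently)

customers = [
    {"lifetime_value": 2500,
     "last_purchase": date(2017, 1, 15),
     "stiletto_orders": [(date(2016, 9, 1), 1), (date(2017, 1, 15), 1)]},
    {"lifetime_value": 900,
     "last_purchase": date(2017, 1, 15),
     "stiletto_orders": [(date(2016, 9, 1), 2)]},
]
segment = [c for c in customers if in_stiletto_segment(c)]
```

Only customers clearing all three thresholds land in the segment, which is what makes the resulting 10% offer feel relevant rather than generic.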

At the end of the day, it’s the simple measures that prove essential – segmenting, targeting and personalization drives value right to the top of your customer’s inbox (if your deliverability is top notch, that is).

To see eight other key programs you should be sending to grow your e-commerce business, pick up your free copy of our best practice guide.


Google updates the AMP report in the Google Search Console

Five months after launching the AMP error report in the Google Search Console, Google has updated the report to make it easier to spot errors.

The post Google updates the AMP report in the Google Search Console appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.


The Local SEO Playbook To Increased Visibility And Customer Acquisition

Columnist Thomas Stern discusses five essential components of local optimization that increase online exposure and drive offline traffic to brick-and-mortar stores.

The post The Local SEO Playbook To Increased Visibility And Customer Acquisition appeared first on Search Engine Land.



Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and others are SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which is a permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there.
  • Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, look at analytics, connect those up, and try to see why you went up or down. Did we have fewer pages being indexed, or more? Are more pages getting traffic, or less? More keywords ranking, or fewer?
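One way to connect those numbers up is a quick period-over-period comparison. A rough sketch, assuming you've already exported per-metric totals (indexed pages, organic visits, ranking keywords) for two date ranges:

```python
def compare_periods(before, after):
    """Return the percent change per metric between two periods of site data."""
    changes = {}
    for metric, old_value in before.items():
        new_value = after.get(metric, 0)
        if old_value == 0:
            changes[metric] = None  # can't compute a percent change from zero
        else:
            changes[metric] = round(100 * (new_value - old_value) / old_value, 1)
    return changes
```

If indexed pages held steady while organic visits fell, the drop is more likely rankings or demand than deindexation, and that narrows where you dig next.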
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like in URL and in anchor, and da, da, da, da. In anchor doesn’t work anymore, never mind about that one.

I have to go faster, because we're just going to run out of these. Like, come on.

  • Interpreting and leveraging data in Google Search Console. If you don't know how to use that, Google could be telling you that you have all sorts of errors, and you don't know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
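Diagnosing a crawl export is mostly bucketing rows by issue. A hedged sketch that triages a CSV export from a crawler such as Screaming Frog; the column names ("Address", "Status Code", "Title 1") are assumptions, so adjust them to match your actual export:

```python
import csv
import io


def triage_crawl(csv_text):
    """Bucket rows from a crawl export by common technical issues.

    Column names are assumed; rename to match your crawler's CSV.
    """
    issues = {"broken": [], "redirected": [], "missing_title": []}
    for row in csv.DictReader(io.StringIO(csv_text)):
        code = int(row["Status Code"])
        if code >= 400:
            issues["broken"].append(row["Address"])
        elif 300 <= code < 400:
            issues["redirected"].append(row["Address"])
        if not row.get("Title 1", "").strip():
            issues["missing_title"].append(row["Address"])
    return issues
```

Even a crude bucketing like this turns "here's a giant spreadsheet" into "here are the three lists of URLs that need fixing."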
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or MozScape, or Ahrefs or Majestic, or SEMrush, or the Alchemy API. Those APIs can do powerful things for your site. There are some powerful problems they can help you solve if you know how to use them. It's actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There are a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
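Pulling from an API really can be that small. A sketch using only Python's standard library; the endpoint URL and the response fields here are made up for illustration, not any real service's API:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoint, stand-in for whichever metrics API you use.
API_URL = "https://api.example.com/v1/url-metrics"


def fetch_metrics(target_url, api_key):
    """Request metrics for a URL from a (hypothetical) JSON API."""
    req = urllib.request.Request(
        f"{API_URL}?url={urllib.parse.quote(target_url)}",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_metrics(resp.read().decode("utf-8"))


def parse_metrics(raw_json):
    """Pull the handful of fields we care about out of the API response."""
    data = json.loads(raw_json)
    return {"links": data.get("links", 0), "authority": data.get("authority", 0)}
```

Swap in the real endpoint, auth scheme, and field names from whichever service's documentation you're working against; the shape of the script stays the same.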
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
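Diagnosing that usually starts with seeing where the time goes. A toy sketch that flags the slowest resources from timing data you might pull out of a browser network panel or a HAR file; the per-resource budget is an arbitrary assumption:

```python
def slowest_resources(timings, budget_ms=200):
    """Return resources exceeding a load-time budget, slowest first.

    `timings` maps resource URL to its load time in milliseconds.
    """
    over_budget = {url: ms for url, ms in timings.items() if ms > budget_ms}
    return sorted(over_budget.items(), key=lambda item: item[1], reverse=True)
```

Nine times out of ten the answer is sitting at the top of that sorted list: one oversized hero image or one blocking script.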
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


The 2015 #MozCon Video Bundle Has Arrived!

Posted by EricaMcGillivray

The bird has landed, and by bird, I mean the MozCon 2015 Video Bundle! That’s right, 27 sessions and over 15 hours of knowledge from our top-notch speakers right at your fingertips. Watch presentations about SEO, personalization, content strategy, local SEO, Facebook graph search, and more to level up your online marketing expertise.

If these videos were already on your wish list, skip ahead:

If you attended MozCon, the videos are included with your ticket. You should have an email in your inbox (sent to the address you registered for MozCon with) containing your unique URL for a free “purchase.”

MozCon 2015 was fantastic! This year, we opened up the room to a few more attendees and to our growing staff, which meant 1,600 people showed up. Each year we work to take our programming one step further with incredible speakers, diverse topics, and tons of tactics and tips for you.


What did attendees say?

We heard directly from 30% of MozCon attendees. Here’s what they had to say about the content:

Did you find the presentations to be advanced enough? 74% found them to be just perfect.

Wil Reynolds at MozCon 2015


What do I get in the bundle?

Our videos feature the presenter and their presentation side-by-side, so there’s no need to flip to another program to view a slide deck. You’ll have easy access to links and reference tools, and the videos even offer closed captioning for your enjoyment and ease of understanding.

For $299, the 2015 MozCon Video Bundle gives you instant access to:

  • 27 videos (over 15 hours) from MozCon 2015
  • Stream or download the videos to your computer, tablet, phone, phablet, or whatever you’ve got handy
  • Downloadable slide decks for all presentations


Bonus! A free full session from 2015!

Because some sessions are just too good to hide behind a paywall. Sample what the conference is all about with a full session from Cara Harshman about personalization on the web:


Surprised and excited to see these videos so early? Huge thanks is due to the Moz team for working hard to process, build, program, write, design, and do all the necessaries to make these happen. You’re the best!

Still not convinced you want the videos? Watch the preview for the Sherlock Christmas Special. Want to attend the live show? Buy your early bird ticket for MozCon 2016. We’ve sold out the conference for the last five years running, so grab your ticket now!


Distance from Perfect

Posted by wrttnwrd

In spite of all the advice, the strategic discussions and the conference talks, we Internet marketers are still algorithmic thinkers. That’s obvious when you think of SEO.

Even when we talk about content, we’re algorithmic thinkers. Ask yourself: How many times has a client asked you, “How much content do we need?” How often do you still hear “How unique does this page need to be?”

That’s 100% algorithmic thinking: Produce a certain amount of content, move up a certain number of spaces.

But you and I know it’s complete bullshit.

I’m not suggesting you ignore the algorithm. You should definitely chase it. Understanding a little bit about what goes on in Google’s pointy little head helps. But it’s not enough.

A tale of SEO woe that makes you go “whoa”

I have this friend.

He ranked #10 for “flibbergibbet.” He wanted to rank #1.

He compared his site to the #1 site and realized the #1 site had five hundred blog posts.

“That site has five hundred blog posts,” he said, “I must have more.”

So he hired a few writers and cranked out five thousand blog posts that melted Microsoft Word’s grammar check. He didn’t move up in the rankings. I’m shocked.

“That guy’s spamming,” he decided, “I’ll just report him to Google and hope for the best.”

What happened? Why didn’t adding five thousand blog posts work?

It’s pretty obvious: My, uh, friend added nothing but crap content to a site that was already outranked. Bulk is no longer a ranking tactic. Google’s very aware of that tactic. Lots of smart engineers have put time into updates like Panda to compensate.

He started like this:

And ended up like this:
[Chart: more posts, no rankings]

Alright, yeah, I was Mr. Flood The Site With Content, way back in 2003. Don’t judge me, whippersnappers.

Reality’s never that obvious. You’re scratching and clawing to move up two spots, you’ve got an overtasked IT team pushing back on changes, and you’ve got a boss who needs to know the implications of every recommendation.

Why fix duplication if rel=canonical can address it? Fixing duplication will take more time and cost more money. It’s easier to paste in one line of code. You and I know it’s better to fix the duplication. But it’s a hard sell.

Why deal with 302 versus 404 response codes and home page redirection? The basic user experience remains the same. Again, we just know that a server should return one home page without any redirects and that it should send a ‘not found’ 404 response if a page is missing. If it’s going to take 3 developer hours to reconfigure the server, though, how do we justify it? There’s no flashing sign reading “Your site has a problem!”
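Auditing that is just walking the chain of responses hop by hop. A minimal sketch that simulates the check; `responses` stands in for what a real HTTP client would see at each hop, so you can reason about it without touching the network:

```python
def trace_redirects(url, responses, max_hops=10):
    """Follow a redirect chain described by `responses`.

    `responses` maps URL -> (status_code, location_or_None), standing in
    for the hop-by-hop answers a real HTTP client would get.
    """
    chain = [url]
    for _ in range(max_hops):
        status, location = responses.get(url, (404, None))
        if status in (301, 302) and location:
            url = location
            chain.append(url)
        else:
            return chain, status
    raise RuntimeError("Redirect loop or chain longer than max_hops")
```

The "perfect" outcome is exactly what the paragraph above describes: the home page resolves in one hop with a 200, and a missing page ends the walk with a 404.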

Why change this thing and not that thing?

At the same time, our boss/client sees that the site above theirs has five hundred blog posts and thousands of links from sites selling correspondence MBAs. So they want five thousand blog posts and cheap links as quickly as possible.

Cue crazy music.

SEO lacks clarity

SEO is, in some ways, for the insane. It’s an absurd collection of technical tweaks, content thinking, link building and other little tactics that may or may not work. A novice gets exposed to one piece of crappy information after another, with an occasional bit of useful stuff mixed in. They create sites that repel search engines and piss off users. They get more awful advice. The cycle repeats. Every time it does, best practices get more muddled.

SEO lacks clarity. We can’t easily weigh the value of one change or tactic over another. But we can look at our changes and tactics in context. When we examine the potential of several changes or tactics before we flip the switch, we get a closer balance between algorithm-thinking and actual strategy.

Distance from perfect brings clarity to tactics and strategy

At some point you have to turn that knowledge into practice. You have to take action based on recommendations, your knowledge of SEO, and business considerations.

That’s hard when we can’t even agree on subdomains vs. subfolders.

I know subfolders work better. Sorry, couldn’t resist. Let the flaming comments commence.

To get clarity, take a deep breath and ask yourself:

“All other things being equal, will this change, tactic, or strategy move my site closer to perfect than my competitors?”

Breaking it down:

“Change, tactic, or strategy”

A change takes an existing component or policy and makes it something else. Replatforming is a massive change. Adding a new page is a smaller one. Adding ALT attributes to your images is another example. Changing the way your shopping cart works is yet another.

A tactic is a specific, executable practice. In SEO, that might be fixing broken links, optimizing ALT attributes, optimizing title tags or producing a specific piece of content.

A strategy is a broader decision that’ll cause change or drive tactics. A long-term content policy is the easiest example. Shifting away from asynchronous content and moving to server-generated content is another example.

“Perfect”

No one knows exactly what Google considers “perfect,” and “perfect” can’t really exist, but you can bet a perfect web page/site would have all of the following:

  1. Completely visible content that’s perfectly relevant to the audience and query
  2. A flawless user experience
  3. Instant load time
  4. Zero duplicate content
  5. Every page easily indexed and classified
  6. No mistakes, broken links, redirects or anything else generally yucky
  7. Zero reported problems or suggestions in each search engine’s webmaster tools, sorry, “Search Consoles”
  8. Complete authority through immaculate, organically-generated links

These 8 categories (and any of the other bazillion that probably exist) give you a way to break down “perfect” and help you focus on what’s really going to move you forward. These different areas may involve different facets of your organization.

Your IT team can work on load time and creating an error-free front- and back-end. Link building requires the time and effort of content and outreach teams.

Tactics for relevant, visible content and current best practices in UX are going to be more involved, requiring research and real study of your audience.

What you need and what resources you have are going to impact which tactics are most realistic for you.

But there’s a basic rule: If a website would make Googlebot swoon and present zero obstacles to users, it’s close to perfect.

“All other things being equal”

Assume every competing website is optimized exactly as well as yours.

Now ask: Will this [tactic, change or strategy] move you closer to perfect?

That’s the “all other things being equal” rule. And it’s an incredibly powerful rubric for evaluating potential changes before you act. Pretend you’re in a tie with your competitors. Will this one thing be the tiebreaker? Will it put you ahead? Or will it cause you to fall behind?

“Closer to perfect than my competitors”

Perfect is great, but unattainable. What you really need is to be just a little perfect-er.

Chasing perfect can be dangerous. Perfect is the enemy of the good (I love that quote. Hated Voltaire. But I love that quote). If you wait for the opportunity/resources to reach perfection, you’ll never do anything. And the only way to reduce distance from perfect is to execute.

Instead of aiming for pure perfection, aim for more perfect than your competitors. Beat them feature-by-feature, tactic-by-tactic. Implement strategy that supports long-term superiority.

Don’t slack off. But set priorities and measure your effort. If fixing server response codes will take one hour and fixing duplication will take ten, fix the response codes first. Both move you closer to perfect. Fixing response codes may not move the needle as much, but it’s a lot easier to do. Then move on to fixing duplicates.

Do the 60% that gets you a 90% improvement. Then move on to the next thing and do it again. When you’re done, get to work on that last 40%. Repeat as necessary.

Take advantage of quick wins. That gives you more time to focus on your bigger solutions.

Sites that are “fine” are pretty far from perfect

Google has lots of tweaks, tools and workarounds to help us mitigate sub-optimal sites:

  • Rel=canonical lets us guide Google past duplicate content rather than fix it
  • HTML snapshots let us reveal content that’s delivered using asynchronous content and JavaScript frameworks
  • We can use rel=next and prev to guide search bots through outrageously long pagination tunnels
  • And we can use rel=nofollow to hide spammy links and banners
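As a concrete illustration of the first and third workarounds, here's a small sketch that emits the link tags for one page of a canonicalized, paginated series. The `?page=N` URL pattern is made up for the example:

```python
def pagination_links(base_url, page, total_pages):
    """Build rel=canonical / rel=prev / rel=next link tags for one page
    of a paginated series (the URL pattern here is illustrative)."""

    def page_url(n):
        return base_url if n == 1 else f"{base_url}?page={n}"

    tags = [f'<link rel="canonical" href="{page_url(page)}">']
    if page > 1:
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{page_url(page + 1)}">')
    return "\n".join(tags)
```

Which is the point of the next section: tags like these guide the bots around the problem, but they don't make the pagination tunnel or the duplication go away.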

Easy, right? All of these solutions may reduce distance from perfect (the search engines don’t guarantee it). But they don’t reduce it as much as fixing the problems.
Just fine does not equal fixed

The next time you set up rel=canonical, ask yourself:

“All other things being equal, will using rel=canonical to make up for duplication move my site closer to perfect than my competitors?”

Answer: Not if they’re using rel=canonical, too. You’re both using imperfect solutions that force search engines to crawl every page of your site, duplicates included. If you want to pass them on your way to perfect, you need to fix the duplicate content.

When you use Angular.js to deliver regular content pages, ask yourself:

“All other things being equal, will using HTML snapshots instead of actual, visible content move my site closer to perfect than my competitors?”

Answer: No. Just no. Not in your wildest, code-addled dreams. If I’m Google, which site will I prefer? The one that renders for me the same way it renders for users? Or the one that has to deliver two separate versions of every page?

When you spill banner ads all over your site, ask yourself…

You get the idea. Nofollow is better than follow, but banner pollution is still pretty dang far from perfect.

Mitigating SEO issues with search engine-specific tools is “fine.” But it’s far, far from perfect. If search engines are forced to choose, they’ll favor the site that just works.

Not just SEO

By the way, distance from perfect absolutely applies to other channels.

I’m focusing on SEO, but think of other Internet marketing disciplines. I hear stuff like “How fast should my site be?” (Faster than it is right now.) Or “I’ve heard you shouldn’t have any content below the fold.” (Maybe in 2001.) Or “I need background video on my home page!” (Why? Do you have a reason?) Or, my favorite: “What’s a good bounce rate?” (Zero is pretty awesome.)

And Internet marketing venues are working to measure distance from perfect. Pay-per-click marketing has the quality score: a codified financial reward for reducing distance from perfect across as many elements of your advertising program as possible.

Social media venues are aggressively building their own forms of graphing, scoring and ranking systems designed to separate the good from the bad.

Really, all marketing includes some measure of distance from perfect. But no channel is more influenced by it than SEO. Instead of arguing one rule at a time, ask yourself and your boss or client: Will this move us closer to perfect?

Hell, you might even please a customer or two.

One last note for all of the SEOs in the crowd. Before you start pointing out edge cases, consider this: We spend our days combing Google for embarrassing rankings issues. Every now and then, we find one, point, and start yelling “SEE! SEE!!!! THE GOOGLES MADE MISTAKES!!!!” Google’s got lots of issues. Screwing up the rankings isn’t one of them.
