Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But the article is increasingly popular, and there's this hopefulness from folks outside of the hardcore SEO world who are looking to this piece and going, "Look, this is great. We don't have to be technical. We don't have to worry about technical things in order to do SEO." That's why I want to address it.

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

If you make a great website that delivers a great user experience, provides the answers to searchers' questions, and gives them extraordinarily good content, way better than what's already out there in the search results, then generally speaking you're going to see happy searchers, and that's going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and others are SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really sticks in my craw: I don't think "write quality content" means anything. You tell me. When you hear that, to me it is a totally non-actionable, non-useful phrase, a piece of advice so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (he notes that most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, in the 10 minutes before we filmed this video, I brainstormed a list just off the top of my head. The list was so long that, as you can see, I filled up the whole whiteboard and then didn't have any more room. I'm not going to bother to erase and try to be absolutely complete.

But there's a huge, huge number of things that are important, critically important, for technical SEO. If you don't know how to do these things, you are sunk in many cases. You can't be an effective SEO analyst, or consultant, or in-house team member, because you simply can't diagnose potential problems, rectify them, identify the strategies your competitors are using, or diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not-found pages. If your 404 pages all turn out to be 200 pages, well, now you've made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you've given them a status code of 200. Or you've used a 404 code when you should have used a 410, which means permanently removed, to get the page completely out of the index, as opposed to having Google revisit it and keep it in the index.
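As a sketch of the on-page options mentioned a couple of bullets up for thin internal search-results pages (the URL below is a hypothetical example):

```html
<!-- Option A: let crawlers fetch the page but keep it out of the index -->
<meta name="robots" content="noindex, follow">

<!-- Option B: consolidate thin variants onto one canonical URL -->
<link rel="canonical" href="https://example.com/widgets/">
```

Blocking the same URLs in robots.txt instead (e.g. `Disallow: /search`) stops crawling entirely, which also means Google never sees a noindex on those pages, so it's generally one approach or the other for a given set of URLs.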

Downtime procedures. There's a specific 5xx code for this, a 503, that tells search engines, "Revisit later. We're having some downtime right now." Google urges you to use that specific code rather than a 404, which tells them, "This page is now an error."

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
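The status-code distinctions above can be sketched as a tiny lookup; the page-state names are invented for illustration, and in practice your web server or framework sets these codes:

```python
# Map a page's state to the HTTP status code search engines should see.
# The state names here are invented for illustration.
def status_for(state: str) -> int:
    codes = {
        "ok": 200,                   # normal, indexable page
        "missing": 404,              # not found; Google may keep revisiting it
        "gone_forever": 410,         # permanently removed; drops from the index faster
        "temporary_downtime": 503,   # "revisit later"; send a Retry-After header too
    }
    return codes[state]

print(status_for("gone_forever"))        # 410
print(status_for("temporary_downtime"))  # 503
```

The Disney outage above is exactly the "temporary_downtime" case: serving 404s for an hour told Google the pages were errors, when a 503 would have said "come back soon."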

  • International and multi-language targeting issues. I won't go into that here, but you have to know the protocols there.
  • Duplicate content, syndication, and scrapers. How do we handle all that? Somebody else wants to take our content and put it on their site; what should we do? Someone's scraping our content; what can we do? We have duplicate content on our own site; what should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics, connecting those up and trying to see: Why did we go up or down? Did we have fewer or more pages indexed? More or fewer pages getting traffic? More or fewer keywords sending traffic?
  • Understanding advanced search parameters. Just today, I was checking out the related: parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages. That can be fascinating, could expose opportunities for links, and could expose an understanding of how they view your site versus your competition, or who they think your competition is.

Then there are tons of other parameters, like inurl: and inanchor:, and so on. (inanchor: doesn't work anymore; never mind about that one.)

I have to go faster, because we're just going to run out of time.

  • Interpreting and leveraging data in Google Search Console. If you don't know how to use that, Google could be telling you that you have all sorts of errors, and you won't know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There's also the new Moz Context API, which will be coming out soon, and the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, X-Robots-Tag, NOODP, XML sitemaps, and rel=nofollow. The list goes on and on. If you're not technical, you don't know what those are, and you think you just need to write good content and lower your bounce rate, it's not going to work.
  • Using APIs from services like AdWords, Mozscape, Ahrefs, Majestic, SEMrush, or the Alchemy API. Those APIs can do powerful things for your site. There are some tough problems they could help you solve if you know how to use them. It's actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There are a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, and I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
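To make the API point a few bullets up concrete, here's a minimal sketch of flattening an already-fetched JSON response into spreadsheet-ready CSV. The field names and values are hypothetical stand-ins, not a real Mozscape or AdWords schema:

```python
import csv
import io

# Pretend this dict is the parsed JSON body from some SEO metrics API.
# The field names are invented for illustration.
api_response = {
    "results": [
        {"url": "https://example.com/", "linking_domains": 120, "page_authority": 54},
        {"url": "https://example.com/blog/", "linking_domains": 35, "page_authority": 41},
    ]
}

def to_csv(payload: dict) -> str:
    """Flatten the 'results' list into CSV text you could paste into Excel."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["url", "linking_domains", "page_authority"])
    writer.writeheader()
    writer.writerows(payload["results"])
    return buf.getvalue()

print(to_csv(api_response))
```

The real work in practice is the HTTP fetch and authentication, which each API documents differently; once you have the JSON back, turning it into rows is about this simple.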

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

Understanding and Applying Moz’s Spam Score Metric – Whiteboard Friday

Posted by randfish

This week, Moz released a new feature that we call Spam Score, which helps you analyze your link profile and weed out the spam (check out the blog post for more info). There have been some fantastic conversations about how it works and how it should (and shouldn’t) be used, and we wanted to clarify a few things to help you all make the best use of the tool.

In today’s Whiteboard Friday, Rand offers more detail on how the score is calculated, just what those spam flags are, and how we hope you’ll benefit from using it.

For reference, here’s a still of this week’s whiteboard. 

Click on the image above to open a high resolution version in a new tab!

Video transcription

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week, we’re going to chat a little bit about Moz’s Spam Score. Now I don’t typically like to do Whiteboard Fridays specifically about a Moz project, especially when it’s something that’s in our toolset. But I’m making an exception because there have been so many questions and so much discussion around Spam Score and because I hope the methodology, the way we calculate things, the look at correlation and causation, when it comes to web spam, can be useful for everyone in the Moz community and everyone in the SEO community in addition to being helpful for understanding this specific tool and metric.

The 17-flag scoring system

I want to start by describing the 17-flag system. As you might know, Spam Score is shown as a score from 0 to 17; each flag either fires or it doesn't. You can see a list of those 17 flags in the blog post. Essentially, those flags correlate to the percentage of sites we found with that count of flags, not those specific flags, just any count of those flags, that were penalized or banned by Google. I'll show you a little bit more in the methodology.

Basically, what this means is that for sites with 0 spam flags, where none of the 17 flags fired, 99.5% of those sites were not penalized or banned, on average, in our analysis, and 0.5% were. At 3 flags, 4.2% of those sites were penalized or banned. That's actually still a huge number, probably in the millions of domains or subdomains that Google has potentially banned. All the way down here at 11 flags, we found 87.3% penalized or banned, which seems pretty risky. But the remaining 12.7% is still a very big number, again probably in the hundreds of thousands of unique websites that are not banned but still have these flags.

If you're looking at a specific subdomain and you're saying, "Hey, gosh, this only has 3 or 4 flags on it, but it's clearly been penalized by Google; Moz's score must be wrong," no, that fits comfortably within these numbers. Same thing down here: if you see a site that is not penalized but has a high number of flags, that's potentially an indication that you're in the percentage of sites that we found not to be penalized.

So this is an indication of percentile risk, not a "this is absolutely spam" or "this is absolutely not spam." The only caveat is that for anything with, I think, more than 13 flags, we found 100% of those to have been penalized or banned. Maybe you'll find an odd outlier or two. Probably you won't.
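As a sketch, the "percentile risk" reading above can be expressed with just the data points quoted in this video; the full 17-row table is in the blog post, and the green/orange/red thresholds are the color-coding bands Rand describes later in the transcript:

```python
# Share of sites penalized/banned at a given flag count, using only the
# numbers quoted in this video (0, 3, 4, and 11 flags).
PCT_PENALIZED = {0: 0.5, 3: 4.2, 4: 7.5, 11: 87.3}

def risk_band(flag_count: int) -> str:
    """Rough green/orange/red banding: <10% penalized is green,
    10-50% is orange, 50%+ is red."""
    pct = PCT_PENALIZED.get(flag_count)
    if pct is None:
        return "unknown (see the full table in the blog post)"
    if pct < 10:
        return "green"
    if pct < 50:
        return "orange"
    return "red"

print(risk_band(3))   # green
print(risk_band(11))  # red
```

The point of the banding is exactly the caveat above: even "red" means "a large share of sites like this were penalized," not "this site is spam."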

Correlation ≠ causation

Correlation is not causation. This is something we repeat all the time here at Moz and in the SEO community. We do a lot of correlation studies around these things. I think people understand those very well in the fields of social media and in marketing in general. Certainly in psychology and electoral voting and election polling results, people understand those correlations. But for some reason in SEO we sometimes get hung up on this.

I want to be clear. Spam flags and the count of spam flags correlates with sites we saw Google penalize. That doesn’t mean that any of the flags or combinations of flags actually cause the penalty. It could be that the things that are flags are not actually connected to the reasons Google might penalize something at all. Those could be totally disconnected.

We are not trying to say that the 17 flags are causes for concern or that you need to fix them. We are merely saying that this feature existed on this website when we crawled it (and maybe it still does). We saw this count of these features, which correlates to this percentile number, so we're giving you that number. That's all the score intends to say. That's all it's trying to show, and it's trying to be very transparent about that. It's not trying to say you need to fix these.

A lot of flags and features that are measured are perfectly fine things to have on a website, like no social accounts or email links. That’s a totally reasonable thing to have, but it is a flag because we saw it correlate. A number in your domain name, I think it’s fine if you want to have a number in your domain name. There’s plenty of good domains that have a numerical character in them. That’s cool.

A TLD extension that happens to be used by lots of spammers, like .info or .cc or a number of others, is also totally reasonable. Just because lots of spammers happen to use those TLD extensions doesn't mean you are necessarily spam because you use one.

Or low link diversity. Maybe you’re a relatively new site. Maybe your niche is very small, so the number of folks who point to your site tends to be small, and lots of the sites that organically naturally link to you editorially happen to link to you from many of their pages, and there’s not a ton of them. That will lead to low link diversity, which is a flag, but it isn’t always necessarily a bad thing. It might still nudge you to try and get some more links because that will probably help you, but that doesn’t mean you are spammy. It just means you fired a flag that correlated with a spam percentile.

The methodology we use

The methodology that we use, for those who are curious — and I do think this is a methodology that might be interesting to potentially apply in other places — is we brainstormed a large list of potential flags, a huge number. We cut that down to the ones we could actually do, because there were some that were just unfeasible for our technology team, our engineering team to do.

Then, we got a huge list, many hundreds of thousands of sites that were penalized or banned. When we say penalized or banned, what we mean is they didn't rank on page one for either their own domain name or their own brand name, the thing between the www and the .com (or .net, .info, whatever it was). If you didn't rank for either your full domain name or your brand name (for us, Moz), we said, "Hey, you're penalized or banned."

Now you might say, "Hey, Rand, there are probably some sites that don't rank on page one for their own brand name or their own domain name but aren't actually penalized or banned." I agree. But that's a very small number, and statistically speaking it probably isn't going to be impactful on this data set, so we ended up not controlling for it.

Then we found which of the features we'd brainstormed actually correlated with the penalties and bans, and we created the 17 flags that you see in the product today. There are lots of things that I thought were going to correlate, for example spammy-looking anchor text, or poison keywords on the page, like Viagra, Cialis, Texas Hold'em online, or pornography. Not all of those turned out to correlate well, and so they didn't make it into the 17-flag list. I hope over time we'll add more flags. That's how things worked out.

How to apply the Spam Score metric

When you're applying Spam Score, I think there are a few important things to think about. Just like Domain Authority, or Page Authority, or a metric from Majestic, or a metric from Google, or any other kind of metric that you might come up with, you should add it to your toolbox and your metrics where you find it useful. I think playing around with Spam Score, experimenting with it, is a great thing. If you don't find it useful, just ignore it. It doesn't actually hurt your website. It's not like this information goes to Google or anything like that; they have way more sophisticated stuff to figure out things on their end.

Do not just disavow everything with seven or more flags, or eight or more flags, or nine or more flags. We use the color coding to indicate flag counts where 0% to 10% of sites were penalized or banned (green), 10% to 50% (orange), and 50% or above (red). But you should use the count and line it up with the percentile, which we show inside the tool as well.

Don't just take everything and disavow it all. That can get you into serious trouble. Remember what happened with Cyrus Shepard, Moz's head of content and SEO. He disavowed all the backlinks to his site, and it took more than a year for him to rank for anything again. Google almost treated it like he was banned, not completely, but they seriously took away all of his link power and didn't let him back in, even though he changed the disavow file and all that.

Be very careful submitting disavow files. You can hurt yourself tremendously. The reason we offer it in disavow format is because many of the folks in our customer testing said that's how they wanted it, so they could copy and paste, easily review it, and merge it into their existing disavow file. But you should not submit it as-is. You'll see a bunch of warnings if you try to generate a disavow file, and you even have to edit the file before you can submit it to Google, because we want to be that careful that you don't go and submit it blindly.
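For context, Google's disavow file is just a plain-text list: "#" comment lines, "domain:" lines for whole domains, and bare URLs for individual pages. A sketch of assembling one (the domains and URLs are invented examples, and, per the warning above, every line should be reviewed by hand before anything is submitted):

```python
def build_disavow(domains: list, urls: list) -> str:
    """Assemble disavow-file text in Google's plain-text format."""
    lines = ["# Reviewed by hand -- do not submit blindly"]
    lines += [f"domain:{d}" for d in domains]  # disavow every link from these domains
    lines += urls                              # disavow these individual pages only
    return "\n".join(lines)

print(build_disavow(["spammy-links.example"], ["http://blog.example/bad-page"]))
```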

You should also set your expectations for Spam Score's accuracy. If you're doing spam investigation, you're probably looking at spammier sites. If you're looking at a random hundred sites, you should expect the flags to correlate with the percentages. If I look at a random hundred sites with a Spam Score of 4 flags, I'd expect 7.5% of those, on average, to be penalized or banned. If you're seeing sites that don't fit that, they probably fall into the percentage that was not penalized (or, at the high end, the percentage that was), that kind of thing.

Hopefully, you find Spam Score useful and interesting and you add it to your toolbox. We would love to hear from you on iterations and ideas that you’ve got for what we can do in the future, where else you’d like to see it, and where you’re finding it useful/not useful. That would be great.

Hopefully, you’ve enjoyed this edition of Whiteboard Friday and will join us again next week. Thanks so much. Take care.

Video transcription by Speechpad.com

ADDITION FROM RAND: I also urge folks to check out Marie Haynes’ excellent Start-to-Finish Guide to Using Google’s Disavow Tool. We’re going to update the feature to link to that as well.



My Favorite 5 Analytics Dashboards – Whiteboard Friday

Posted by Sixthman

Finding effective ways of organizing your analytics dashboards is quite a bit easier if you can get a sense for what has worked for others. To that end, in today’s Whiteboard Friday the founder of Sixth Man Marketing, Ed Reese, shares his five favorite approaches.

UPDATE: At the request of several commenters, Ed has generously provided GA templates for these dashboards. Check out the links in his comment below!

For reference, here’s a still of this week’s whiteboard!

Video transcription

Hi, I’m Ed Reese with Sixth Man Marketing and Local U. Welcome to this edition of Whiteboard Friday. Today we’re going to talk about one of my favorite things in terms of Google Analytics — the dashboard.

So think of your dashboard like the dashboard on your car — what’s important to you and what’s important to your client. I have the new Tesla dashboard, you might recognize it. So, for my Tesla dashboard, I want navigation, tunes, calendar, everything and a bag of chips. You notice my hands are not on the wheel because it drives itself now. Awesome.

So, what's important? I have the top five dashboards that I like to create for my clients and share with them: an executive dashboard, one for the CMO on the marketing side, new markets, content, and a tech check, where you build a dashboard to make sure that everything is working.

These on the side are some of the few that I think people don’t take a look at as often. It’s my opinion that we have a lot of very generic dashboards, so I like to really dive in and see what we can learn so that your client can really start using them for their advantage.

#1 – Executives

Let’s start with the executive dashboard. There is a lot of debate on whether or not to go from left to right or right to left. So in terms of outcome, behavior, and acquisition, Google Analytics gives you those areas. They don’t mark them as these three categories, but I follow Avinash’s language and the language that GA uses.

When you’re talking to executives or CFOs, it’s my personal opinion that executives always want to see the money first. So focus on financials, conversion rates, number of sales, number of leads. They don’t want to go through the marketing first and then get to the numbers. Just give them what they want. On a dashboard, they’re seeing that first.

So let's start with the result and then go back to behavior. Now, this is where a lot of people have very generic metrics: pages viewed, overall bounce rate, very broad numbers. To really dive in, I like using filters to focus on specific areas of the site. So if it's a destination like a hotel: Are visitors viewing the pages that help them get there? Are they looking at the directional information? Are they viewing discounts and packages? Think of the behavior on the types of pages you want to measure, and then reverse engineer. That way you can tell the executive, "Hey, this hotel reservation came from someone who viewed these packages, which came from these sources, campaigns, search, and social." Remember, you're building it so that they can view it for themselves and really see, "Oh, that's working, and this campaign from this source drove these behaviors that generated a reservation," in that example.

#2 – CMO

Now, let’s look at it from a marketing perspective. You want to help make them look awesome. So I like to reverse it and start with the marketing side in terms of acquisition, then go to behavior on the website, and then end up with the same financials — money, conversion rate percentages, number of leads, number of hotel rooms booked, etc. I like to get really, really focused.

So when you’re building a dashboard for a CMO or anyone on the marketing side, talk to them about what metrics matter. What do they really want to learn? A lot of times you need to know their exact territory and really fine-tune it to figure out exactly what they want to find out.

Again, I’m a huge fan of filters. What behavior matters? So for example, one of our clients is Beardbrand. They sell beard oil and they support the Urban Beardsman. We know that their main markets are New York, Texas, California, and the Pacific Northwest. So we could have a very broad regional focus for acquisition, but we don’t. We know where their audience lives, we know what type of behavior they like, and ultimately what type of behavior on the website influences purchases.

So really think from a marketing perspective, “How do we want to measure the acquisition to the behavior on the website and ultimately what does that create?”

These are pretty common, so I think most people are using a marketing and executive dashboard. Here are some that have really made a huge difference for clients of ours.

#3 – New markets

Love new market dashboards. Let’s say, for example, you’re a hotel chain and you normally have people visiting your site from Washington, Oregon, Idaho, and Montana. Well, in our case we excluded those states and looked at a broader set — Hawaii, Alaska, Colorado, Texas — not the people who would normally come to this particular hotel.

Well, we discovered in the dashboard — and it was actually the client that discovered it — that we suddenly had a 6000% increase in Hawaii. They called me and said, “Are we marketing to Hawaii?” I said no. They said, “Well, according to the dashboard, we’ve had 193 room nights in the past 2 months.” Like, “Wow, 193 room nights from Hawaii, what happened?” So we started reverse engineering that, and we found out that Allegiant Airlines suddenly had a direct flight from Honolulu to Spokane, and the hotel in this case was two miles from the airport. They could then do paid search campaigns in Hawaii and try to connect with Allegiant to co-op some advertising and messaging. Boom. Would never have been discovered without that dashboard.
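The Hawaii discovery is essentially an anomaly check. Here’s a hedged sketch of how you might automate it, comparing two periods of bookings per region; the threshold and the sample numbers are my assumptions, not from the original setup:

```javascript
// Compare a region's bookings in the current period against the previous
// one and flag big jumps. The threshold is an arbitrary choice for this sketch.
function flagSpikes(previous, current, thresholdPct = 500) {
  const flags = [];
  for (const region of Object.keys(current)) {
    const before = previous[region] || 0;
    if (before === 0 && current[region] > 0) {
      flags.push({ region, change: "new market" });
    } else if (before > 0) {
      const pct = ((current[region] - before) / before) * 100;
      if (pct >= thresholdPct) flags.push({ region, change: pct });
    }
  }
  return flags;
}

// A jump like Hawaii's (a handful of room nights to 193) gets flagged at once.
console.log(flagSpikes({ Hawaii: 3, Texas: 50 }, { Hawaii: 193, Texas: 55 }));
```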

#4 – Top content

Another example: top content. Again, going back to Beardbrand, they have a site called the Urban Beardsman, and they publish a lot of content for help and videos and tutorials. Measuring that content is really important, because they’re putting a lot of work into educating their market and new people who are growing beards and using their product. They want to know, “Is it worth it?” They’re hiring photographers, they’re hiring writers, and we’re able to see if people are reading the content they’re providing. Ultimately, we’re focusing much more on the behavior side of their content and then figuring out what that outcome is.

A lot of people have content or viewing of the blog as part of an overall dashboard, let’s say for your CMO. I’m a big fan of, in addition to having that, also having a very specific content dashboard so you can see your top blog posts. Whatever content you provide, I want you to always know what that’s driving on your website.

#5 – Tech check

One of the things that I’ve never heard anyone talk about before, that we use all the time, is a tech check. We want a setup where we can view mobile, tablet, desktop, and browsers. What are your gaps? Where is your site possibly not being used to its fullest potential? Are there any issues with shopping carts? Where do people fall off on your website? Set up any possible tech that you can track. I’m a big fan of looking at mobile, tablet, and every type of desktop, and especially browsers, to see where people are falling off. For a lot of our clients, we’ll have two, three, or four different tech dashboards. Get them to the technical person on the client side so they can immediately see if there’s an issue. If they’ve updated the website but forgot to update a certain portion of it, they’ve got a technical issue, and the dashboard can help detect that.
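As a rough sketch of what a tech check computes (sample numbers invented, not client data): compare each device/browser segment’s conversion rate to the site-wide rate and surface the outliers.

```javascript
// Invented sample segments; a real dashboard would pull these from analytics.
const segments = [
  { name: "desktop/Chrome", sessions: 1000, conversions: 30 },
  { name: "mobile/Safari", sessions: 800, conversions: 2 },
  { name: "tablet/Firefox", sessions: 200, conversions: 6 },
];

// Flag segments converting at less than half the site-wide rate: a hint
// that something technical (cart, layout, script) may be broken there.
function techCheck(rows) {
  const sessions = rows.reduce((sum, r) => sum + r.sessions, 0);
  const conversions = rows.reduce((sum, r) => sum + r.conversions, 0);
  const siteRate = conversions / sessions;
  return rows
    .filter(r => r.conversions / r.sessions < siteRate / 2)
    .map(r => r.name);
}

console.log(techCheck(segments)); // → [ 'mobile/Safari' ]
```

A segment that converts far below the rest is exactly the kind of thing a technical person can act on the same day.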

So these are just a few. I’m a huge fan of dashboards. They’re very powerful. But the big key is to make sure that not only you, but your client understands how to use them, and they use them on a regular basis.

I hope that’s been very helpful. Again, I’m Ed Reese, and these are my top five dashboards. Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

Developing Innovative Content: What You Need to Know

Posted by richardbaxterseo

A few weeks ago, I attended a breakfast meeting with a bunch of entrepreneurs in the technology, space (yes, space travel), software, and engineering industries. I was blown away by the incredible talent of the speakers. You know, there are people out there building things like private satellite networks, bioprinting facilities, quantum computers, and self-driving cars. I was completely transfixed by the incredibly future-facing, innovative, and exceptionally inventive group in front of me. I also immediately wished I’d worked a little harder in my twenties.

After the presentations, one of the questions that came up during the Q&A session was: “What’s the next big thing?”

Wow. Have you ever thought about “the next big thing”?

Part of the magic of predicting innovation is that it’s really, really hard to get right. Those that can accurately predict the future (in my humble opinion) are those that tend to understand how people will respond to an idea once they’re exposed to it. I think predicting this is a very special skill indeed.

Then again, we’re expected to be able to predict the outcome of our marketing, all the time. While predicting it is one thing, making it happen is a whole different ball game.

Competition for the attention of our customers is getting tougher

In our industry, when you really boil down what it is we do, we’re fixing things, making things, or we’re communicating things.

Most of the time, we’re building content that communicates: ideas, stories, news, and guidance. You get the idea. The problem is, no matter which vertical you work in, we’re all competing for something: the attention of our customers.

As our customers get smarter, that competition is getting tougher and tougher.

The most successful marketers in our industry all have a special trait in common. They are good at finding new ways to communicate ideas. Take a look at classic presentations like this from Ross Hudgens to see just how powerful it can be to observe, imitate, and develop an idea with astounding viral reach.

I particularly enjoy the idea of taking a piece of content and making improvements, be it through design, layout or simply updating what’s there. I like it because it’s actually pretty easy to do, and there’s growing evidence of it happening all over the Internet. Brands are taking a second look at how they’re developing their content to appeal to a wider audience, or to appeal to a viral audience (or both!).

For example, take a look at this beautiful travel guide to Vietnam (credit: travelindochina.com) or this long-form guide to commercial property insurance (credit: Towergate Insurance / Builtvisible.com) for examples of brands in competitive verticals developing their existing content. In verticals where ordinary article content has been done to death, redeveloping the medium itself feels like an important next step.

Innovative isn’t the same thing as technical

I’ve felt for a long time that there’s a conflict between our interpretation of “innovative” and “technical”. As I’ve written before, those that really understand how the web works are at a huge advantage. Learn how it’s built, and you’ll find yourself able to make great things happen on your own, simply by learning and experimenting.

In my opinion, though, you don’t have to build your own site or be a developer. All you have to do is learn the vocabulary and build a broad understanding of how things work in a browser. I actually think we all need to be doing this, right now. Why?

We need more innovation in content marketing

I think our future depends on our industry’s ability to innovate. Of course, you still need to have your basics in place. We’ll always be T-shaped marketers, executing a bit of technical SEO here, a bit of content strategy there. But we’re all SEOs, and we know we need to acquire links, build audiences, and generally think big about our ambitions. When your goal is to attract new followers, fans, and links, and garner shares in their thousands, you need to do something pretty exciting to attract attention to yourself.

The vocabulary of content development

I’ve designed this post to be a primer on more advanced features found in innovative content development. My original MozCon 2014 presentation was designed to educate on some of the technologies we should be aware of in our content development projects and the process we follow to build things. We’ll save process for another post (shout in the comments if you think that would be useful!) and focus on the “what” for now.

At Builtvisible, we’re working hard on extending our in-house content development capabilities. We learn through sharing amazing examples with each other. Our policy is to always attempt to deconstruct how something might have been developed; that way, we’re learning. Some of the things we see on the web are amazing; they deserve so much respect for the talent and the skills behind them.

Here are some examples that I think demonstrate some of the most useful types of approach for content marketers. I hope that these help as much as they’ve helped us, and I hope you can form a perspective of what innovative features look like in more advanced content development. Of course, do feel welcome to share your own examples in the comments, too! The more, the merrier!

The story of EBoy

eBoy: the graphic design firm whose three co-founders and sole members are widely regarded as the “godfathers” of pixel art.

The consistent styling (as well as the beautifully written content) is excellent. Technically speaking, perhaps the most clever and elegant feature is the zoom of the image positioned on the Z axis in a <canvas> container (more on this in a moment).

An event listener (jQuery) resizes the canvas to fit the browser window, and the Z-axis position shifts on scroll to create an elegant zoom effect.


View the example here:

http://www.theverge.com/2014/6/17/5803850/pixel-perfect-the-story-of-eboy.
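As a guess at the mechanics (a sketch, not The Verge’s actual code): the scroll position can be mapped to a scale factor, which a scroll listener then applies to the image drawn on the canvas. The zoom range and linear easing here are assumptions.

```javascript
// Map how far the user has scrolled to a zoom scale for the canvas image.
function zoomForScroll(scrollY, pageHeight, minScale = 1, maxScale = 4) {
  const t = Math.min(Math.max(scrollY / pageHeight, 0), 1); // clamp to [0, 1]
  return minScale + (maxScale - minScale) * t;
}

// In the browser, this would feed a scroll listener that redraws the canvas:
//
//   window.addEventListener("scroll", () => {
//     const scale = zoomForScroll(window.scrollY, document.body.scrollHeight);
//     ctx.setTransform(scale, 0, 0, scale, 0, 0);
//     ctx.drawImage(image, 0, 0);
//   });
```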

<canvas> is an HTML element which can be used to draw graphics using scripting (usually JavaScript). It can, for instance, be used to draw graphs, compose photos, or build simple animations.

Colorizing the past

Take a look at Pixart Printing’s Guide to Colourizing the Past (credit: Pixartprinting / Builtvisible.com) for a clever example of <canvas> in use. Here’s one of the images (tip: mouse over and click the image):

The colorization feature takes advantage of the power of the canvas element. In this case, the color version of the image is applied to the canvas as a background image, with the black and white version on a layer above. Clicking (or touching, on mobile) erases portions of the top image, revealing the color version underneath.
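A minimal sketch of that erase-to-reveal technique (my reconstruction, not the Pixartprinting source): “destination-out” compositing makes anything you draw punch a transparent hole in the top layer, revealing the colour image behind it.

```javascript
// Erase a circular hole in the top (black-and-white) canvas layer at the
// click or touch position. `ctx` is the canvas's 2D drawing context.
function eraseAt(ctx, x, y, radius) {
  ctx.globalCompositeOperation = "destination-out"; // drawing now erases
  ctx.beginPath();
  ctx.arc(x, y, radius, 0, Math.PI * 2);
  ctx.fill();
  ctx.globalCompositeOperation = "source-over"; // restore normal drawing
}
```

In the page, a click or touch handler would translate the event’s coordinates into canvas space and call `eraseAt` with a brush radius.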

Chrome Experiments: Globe

Globe is a “simple” global data visualization of the Earth’s population growth over a set range of dates. The 3D visualization is based on WebGL: a JavaScript API for rendering interactive 3D and 2D graphics within any compatible web browser, without the use of plug-ins.


View the example here: http://globe.chromeexperiments.com/.

WebGL is a really exciting, emerging option available to content marketers who might want to experiment with immersive experiences or highly interactive, simulated environments.
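You don’t need to know WebGL’s internals, but the core idea it accelerates is simple: a perspective projection that maps 3D points onto the 2D screen. A toy illustration of just that math (not WebGL code; the focal length is an arbitrary choice):

```javascript
// Project a 3D point onto a 2D plane with a perspective divide:
// points farther away (larger z) are scaled down more.
function project(point, focalLength = 300) {
  const scale = focalLength / (focalLength + point.z);
  return { x: point.x * scale, y: point.y * scale };
}

console.log(project({ x: 100, y: 50, z: 300 })); // → { x: 50, y: 25 }
```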

Some of my favourite WebGL examples include Hello Racer and Tweetopia, a 3D Twitter hashtag visualizer.

If you’d like to see more examples of WebGL in action, take a look at Chrome Experiments. Don’t worry, this stuff works in the latest versions of Firefox and IE, too.

Polygon’s PS4 Review

You might have seen me cover this long-form concept over at Builtvisible. Polygon’s PlayStation 4 review is a fully featured “long form” review of Sony’s much-loved gaming machine. The bit that I love is the SVG visualizations:

“What’s SVG?”, I hear you ask!

SVG gives you super-fast, sharp rendering of vector images inside the browser. Unlike raster image files (.jpg, .gif, .png), SVG is XML-based, light on file size, loads quickly, and adjusts perfectly to responsive browser widths. SVG’s XML-based schema also lends itself to some interesting manipulation for stunning, easy-to-implement effects.

View Polygon’s example here: http://www.polygon.com/a/ps4-review

That line-tracing animation you see is known as path animation. Essentially, the path attribute in the SVG’s XML can be manipulated in the DOM with a little jQuery. What you get is a pretty snazzy animation that keeps your users’ eyes fixated on your content, and yet another nice little effect to keep eyeballs engaged.
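A common way to build this kind of effect works on the path’s stroke: set `stroke-dasharray` to the path’s own length, then animate `stroke-dashoffset` down to zero so the line appears to draw itself. A hedged sketch (my illustration, not Polygon’s code; function names are made up):

```javascript
// Prepare an SVG <path> element for the "draw itself" effect.
// getTotalLength() is a standard SVG DOM method.
function preparePathTrace(path) {
  const length = path.getTotalLength();
  path.style.strokeDasharray = String(length); // one dash as long as the path
  path.style.strokeDashoffset = String(length); // offset it fully out of view
  return length;
}

// Advance the trace: t runs from 0 (hidden) to 1 (fully drawn). Call this
// from a requestAnimationFrame loop or a scroll handler.
function tracePath(path, length, t) {
  path.style.strokeDashoffset = String(length * (1 - t));
}
```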

My favourite example of SVG execution is Lewis Lehe’s Gridlocks and Bottlenecks. Gridlocks is an AngularJS and d3.js based visualization of the surprisingly technical and oft-misunderstood “gridlock” and “bottleneck” events in road traffic management.

It’s also very cool:

View the example here: http://setosa.io/blog/2014/09/02/gridlock/.

I have a short vocabulary list that I expect our team to be able to explain (certainly these questions come up in an interview with us!). I think that if you can explain what these things are, as a developing content marketer you’re way ahead of the curve:

  • HTML5
  • Responsive CSS (& libraries)
  • CSS3 (& frameworks)
  • JavaScript (& frameworks: jQuery, MooTools, Jade, Handlebars)
  • JSON (API post and response data)
  • webGL
  • HTML5 audio & video
  • SVG
  • HTML5 History API manipulation with pushState
  • Infinite Scroll

Want to learn more?

I’ve amassed a series of videos on web development that I think marketers should watch. Not necessarily to learn web development, but definitely to be able to describe what it is you’d like your own content to do. My favourite: I really loved Wes Bos’s JS + HTML5 Video + Canvas tutorial. Amazing.

Innovation in content is such a huge topic, but I realize I’ve run out of space for now (this is already a 1,400-word post).

In my follow up, I’d like to talk about how to plan your content when it’s a little more extensive than just an article, give you some tips on how to work with (or find!) a developer, and how to make the most of every component in your content to get the most from your marketing efforts.

Until then, I’d love to see your own examples of great content and questions in the comments!


Reblogged 4 years ago from feedproxy.google.com