Information Architecture for SEO – Whiteboard Friday

Posted by randfish

It wasn’t too long ago that there was significant tension between information architects and SEOs; one group wanted to make things easier for humans, the other for search engines. That line is largely disappearing, and there are several best practices in IA that can lead to great benefits in search. In today’s Whiteboard Friday, Rand explains what they are and how we can benefit from them.

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about information architecture: specifically, how you can organize the content of your website in such a fashion that information architecture helps your SEO, your rankings, and how search engines interpret your pages and the links between them.

I want to start by talking broadly about IA and the interaction with SEO. IA is designed to say, “Hey, we want to help web users accomplish their goals on the website quickly and easily.” There are many more broad things around that, but basically that’s the concept.

This actually is not in conflict at all, should almost never be in conflict, even a little bit, with the goals that we have around SEO. In the past, this was not always true, and unfortunately in the past some mythology got created around the things that we have to worry about that could conflict between SEO and information architecture.

Here we’ve got a page that’s optimal for IA, and it’s got this top navigation and left side navigation, some footers, maybe a big image at the front and some text. Great, fine. Then we have this other version that I’m not going to call optimal for SEO, because it’s actually not optimal for SEO. It is instead SEO to the max! “At the Tacoma Dome this Sunday, Sunday, Sunday!”

The problem is this is kind of taking SEO much too far. It’s no longer SEO, it’s SE . . . I don’t know, ridiculousness.

The idea would be something like this: we know that keyword-rich anchors are important, and that internal links should be descriptive. We know that as people use those terms and links in other places on the web, that might help our rankings. So instead of making the navigation obvious for users, we make it keyword-stuffed for SEO. This makes no sense anymore, as I’m sure, hopefully, all of you know.

Text high up on the page actually does mean something, though it used to mean a little more than it does now. So maybe we think, “Oh yeah, we want to have that leader image right up at the top because it grabs people’s attention, and the headline flows nicely into that image. But for SEO purposes, we want the text to be even higher.” That doesn’t make any sense either.

Even if there is some part of Google’s algorithm, Bing’s algorithm, or Baidu’s algorithm that says, “Oh, text higher up on the page is a teensy little smattering more meaningful,” that is totally overwhelmed and dwarfed by the fact that SEO today cares a ton about engagement. If people come to this page and are less engaged, more likely to click the Back button, less likely to stay here and consume the content and link to it and share it and all those kinds of things, it’s going to lose out even to the slightly less optimized version of the page over here, which really does grab people’s attention.

If your IA folks and your usability folks and your testing is showing you that that leader image up top there is grabbing people’s attention and is working, don’t break it by saying, “Oh, but SEO demands content higher on the page.”

Likewise, suppose you say, “Hey, in order to flow or sculpt the link equity around these things, we don’t want to link to this page and this page. We do want to link to these things. We want to make sure that we’ve got a very keyword-heavy and link-heavy footer so that we can point to all the places we need to point to, even though they’re not really for users. It’s mostly for engines.” Also BS. One of the things that modern engines are doing is looking and saying, “Hey, if no one uses these links to navigate internally on a site, we’re not going to take them into consideration from a ranking perspective either.”

They have lots of modeling and machine learning and algorithmic ways to do that, but the basic story is: make links for users that search engines will also care about, because that’s the only thing search engines really do want to care about. So IA and SEO shouldn’t be in conflict.

Important information architecture best practices

Now that we know this, we can move on to some important IA best practices: generally speaking, IA best practices that are also SEO best practices, and that 99.99% of the time work really well together.

1. Broad-to-narrow organization

The first one: in general, you want to organize your content from broad to narrow. I’ll show you what I mean.

Let’s say that I’ve got a website about adorable animals, a particularly fun one this week, and on my adorable animals page I’ve got some subsections, sub-pages, one on the slow loris, which of course is super adorable, and hedgehogs, also super adorable. Then getting even more detailed from there, I have particular pages on hedgehogs in military uniforms — that page is probably going to bring down the Internet because it will be so popular — and hedgehogs wearing ridiculous hats. These are two sub-pages of my hedgehog page. My hedgehog page, subset of my adorable animals page.

This is generally speaking how I want to do things. At least from the top level down in my actual site architecture, I probably wouldn’t want to say adorable animals, and then directly beneath it a list of hedgehogs in military uniforms, a list of hedgehogs wearing ridiculous hats, and a list of slow lorises licking themselves. No. I want that organization to go from broad to more narrow to more narrow.

This makes general sense. By the way, for SEO purposes it does help if I link back and forth one level in each case. So for my hedgehog page, I do want to link down to my hedgehogs in military uniforms page, and I also want to link up to my adorable animals page.

You don’t have to do it with exactly these keyword anchor text phrases, that kind of stuff. Just make sure that you are linking. If you want, you can use breadcrumbs. Breadcrumbs are a somewhat old-fashioned system for showing this chain of links, around since the late ’90s, and they can work really well for some websites. They don’t have to be the only way things work, though.
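If you do use breadcrumbs, you can also expose the same broad-to-narrow trail to search engines as schema.org BreadcrumbList markup. Here’s a minimal sketch in Python that generates that JSON-LD for the example hierarchy above; the site and URLs are hypothetical.

```python
import json

# Hypothetical broad-to-narrow trail from the example:
# adorable animals > hedgehogs > hedgehogs in military uniforms
trail = [
    ("Adorable Animals", "https://example.com/adorable-animals/"),
    ("Hedgehogs", "https://example.com/adorable-animals/hedgehogs/"),
    ("Hedgehogs in Military Uniforms",
     "https://example.com/adorable-animals/hedgehogs/military-uniforms/"),
]

breadcrumb_ld = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {
            "@type": "ListItem",
            "position": i,  # 1-based position in the trail
            "name": name,
            "item": url,
        }
        for i, (name, url) in enumerate(trail, start=1)
    ],
}

# Emit the JSON-LD you would embed in a <script type="application/ld+json"> tag
print(json.dumps(breadcrumb_ld, indent=2))
```

The visible breadcrumb links still matter most; the markup just makes the hierarchy explicit to engines.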

2. Link to evergreen pages from fresh content

When you’re publishing fresh content is when I think many SEOs get into a lot of trouble. They’re like, “Well, I have a blog that does all this, but then I have the regular parts of my site that have all of my content or my product pages or my detailed descriptions. How do I make these two things work together?”

This has actually become much easier but different in the last five or six years. It used to be the case that we would talk, in the SEO world, about not having keyword cannibalization, meaning if I’ve got an adorable animals page in my main section of my website, I don’t actually want to publish a blog post called “New Adorable Animals to Add to My Collection,” because now I’m competing with myself and I’m diluting my link juice.

Actually, this has gotten way easier. Google, and Bing as well, have become much more intelligent about identifying what’s new content and what’s old, evergreen content, and they’ll promote one. You even sometimes have an opportunity to get both in there. Certainly if you’re posting fresh content that gets into Google News, the blog or the news section can be an opportunity to get into Google News. The old one can be an opportunity to just stay in the search results for a long time period. Getting links to one doesn’t actually dilute your ranking ability for the other, because of how Google is doing much more topic-focused association around entire websites.

So this can actually be a really good thing. However, that being said, you do still want to try and link back to the most relevant, evergreen, original page. If I publish a new blog post that has some aggregation of hedgehogs in military uniforms from the Swiss Naval Academy — I don’t know why Switzerland would have a navy since they’re landlocked — I would probably want to take that hedgehogs-in-Swiss-military-uniforms post and link it back to my original page here.

I wouldn’t necessarily want to do the same thing and link over here, unless I decide, hey, a lot of people who are interested in this are going to want to check out this article too, in which case it’s fine to do that.

I would worry a little bit that sometimes people bias to quantity over quality of links internally when they’re publishing their blog content or publishing these detail pages and they think, “Oh, I need to link to everything that’s possibly relevant.” I wouldn’t do that. I would actually link to the things that you are most certain that a high number, a high percent of the users who are enjoying or visiting or consuming one page, one piece of information are really going to want in their journey. If you don’t have that confidence, I wouldn’t necessarily put them in there. I wouldn’t try and stack those up with tons of extra links.

Like I said, you don’t need to worry about keyword cannibalization. If you want to publish a new article every week about hedgehogs in military uniforms, you go for it. That’s a great blog.
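As a quick sanity check on that habit, here’s a rough sketch (Python, using the requests and BeautifulSoup libraries; all URLs are hypothetical) that flags fresh posts which never link back to their evergreen parent page:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical inputs: the evergreen page and a list of fresh blog posts
EVERGREEN = "https://example.com/hedgehogs/military-uniforms/"
POSTS = [
    "https://example.com/blog/swiss-naval-academy-hedgehogs/",
    "https://example.com/blog/hedgehog-hat-roundup/",
]

def links_back(post_url, target):
    """Return True if the post contains at least one link to the target page."""
    html = requests.get(post_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return any(a.get("href", "").rstrip("/") == target.rstrip("/")
               for a in soup.find_all("a"))

for post in POSTS:
    if not links_back(post, EVERGREEN):
        print(f"Missing evergreen link: {post}")
```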

3. Make sub-pages if intent is unique, combine if not

Number three, and the last one here: make sub-pages when there’s unique intent. Information architecture practitioners are actually really good about this. They basically say, “Hey, why would we create a new page if we already have a page that serves the same goals and the same intent?” The objection people used to raise was, “Well, I know we have that page, but it doesn’t do a great job of targeting phrase A and phrase B, which both have the same intent but aren’t going to rank as one page for those two separate phrases.”

That’s also not the case anymore in the SEO world. Google and Bing have both become incredibly good at sorting out searcher intent and matching pages and keywords to those intents, even if the keyword match isn’t a perfect one-to-one.

So if I’ve got a page that’s on slow lorises yawning and another one on slow lorises that are sleepy, are those really all that different? Is the intent of the searcher very different? When someone is searching for a sleepy loris, are they looking for one that’s probably yawning? Yeah. You know what? I would say these are the same intent. I would make a single page for them.

However, over here I’ve got a slow loris in a sombrero and a slow loris wearing a top hat. Now, these are two very different kinds of headwear, and people who are searching for sombreros are not going to want to find a slow loris wearing a top hat. They might want to see a cross-link between the two. They might say, “Oh, top-hat-wearing slow lorises are also interesting to me.” But this is a very specific intent, different from that one. Two different intents means two different pages.
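If it helps to make that rule concrete, here’s a toy sketch of it in code, with made-up keywords and slugs: many keywords, but one page per intent.

```python
# Hypothetical content plan: keywords keyed by *intent*, not by exact phrase
INTENT_OF = {
    "slow loris yawning": "sleepy-loris",
    "sleepy slow loris": "sleepy-loris",        # same intent -> same page
    "slow loris in a sombrero": "loris-sombrero",
    "slow loris wearing a top hat": "loris-top-hat",  # different intent -> new page
}

def page_for(keyword):
    """Map a keyword to the single page that serves its intent."""
    slug = INTENT_OF.get(keyword)
    return f"https://example.com/lorises/{slug}/" if slug else None

print(page_for("sleepy slow loris"))  # same URL as "slow loris yawning"
```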

That’s how I do all of my information architecture from a keyword and SEO perspective. You want to go broad to narrow. You don’t want to worry too much about publishing fresh content, but you do want to link back to the original evergreen page. If there are pages or intents that are exactly the same, you make a single page. If the intents are different, you have different pages targeting those different intents.

All right everyone, look forward to the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

How Can the Value of Top-of-Funnel Channels be Measured – Whiteboard Friday

Posted by randfish

Rand has talked many times about what he calls “serendipitous marketing,” where the work we do at the top of the funnel can take winding and often unexpected paths to conversions. One of the most common questions about content marketing, public relations, and other top-of-funnel efforts is how to prove their value. 

In today’s Whiteboard Friday, Rand offers up three ways you can attempt those measurements, along with a bit of perspective you can bring to your clients and higher-ups.

For reference, here’s a still of this week’s whiteboard!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about the value of top-of-funnel demand creation, those sorts of channels and tactics, and how you can actually measure the value behind them.

I’m guilty of doing something. I’m going to own up to it. A lot of the time when I talk about these kinds of tactics, stuff that sits at the very top of the funnel that creates that demand or interest in your potential target market, I call them serendipitous and unmeasurable channels. It is true that many of them are very serendipitous, but it’s not entirely true that they’re completely unmeasurable. They’re just very, very hard to measure, but not impossible.

So today I’m going to walk you through that, not because I actually expect you to go and try to do this with every one of those serendipitous, hard-to-measure channels, but because I think you need to, as a marketer, have this in your toolbox and in your knowledge kit. That way, when your CMO, your boss, your client, your manager, your team says, “Hey, how do we know that XYZ is producing returns?” you can say, “Actually, we don’t know that.” Or, “We proved it once, and we have the data from then. We continue to believe that it’s worth the investment. But here’s how tough it is to measure, and this is why we continue to invest in it and believe in it as a channel even though we don’t have the proof.”

So bear with me for a second. You’ve got your classic marketing funnel. Top of funnel stuff is like creating that awareness of the issue, the problem, the challenge, your industry. Your middle of the funnel is where you’re showing off your solution. The bottom of the funnel is usually where you’re convincing folks to convert and then trying to retain people. So this is fairly simplistic. Most marketers are familiar with it.

The stuff that fits into this creating awareness bucket, that very top of funnel demand creation stuff, those are things like: public relations, getting in news and media and press coverage; a lot of social media engagement, especially social media that is not directly tied to either supporting your product or pushing your product is in that bucket; a lot of conferences, events, trade shows, booths; certainly all those coffee and beer meetings that you might have with people in your field, people outside of your field, and people who are curious; a lot of those serendipitous meetings.

Anything that fits into what we call top of funnel counts; I actually like the shortened acronym there, TOFU. TOFU content marketing is a big piece of this: much of the content that content marketers invest in and create is designed to sit above the funnel, before people are actually interested in your product or solution. This also includes a lot of things that are brand-advertising focused, that just create awareness of who you are as a company and that you exist, without specifically talking about the problem folks are facing or your solution to that problem.

So proving the value of this stuff is insanely hard. Let’s use public relations as an example. The classic yardstick that PR professionals have traditionally reported on is the number of stories, the quality of those stories and pieces, and where they’ve been published. That’s a lot like reporting rankings and traffic in the SEO world. They’re very high-level metrics. They’re sort of interesting to know. But then you have to have the belief that they connect up: that the rankings and the traffic are going to connect up to conversions, or that getting all those print pieces on the web, getting those links, or whatever is going to convert.

This is tough. To prove the value of this stuff, you basically have three options. You can segment, meaning that you segment by something like an industry vertical, by the demographics of your target, or by geography. I’ll give you an example of this.

So Moz might say, “Hey, we really think that among urban professionals in the technical marketing fields, that is who we’re going to bias all of our public relations efforts to over the next year.” So we’re going to tell our PR firm, our in-house PR person, “Hey, that’s what we want you to focus on. Get us the publications that are relevant to those folks, that are read by them on and off the Web. That’s where we want to be.”

This is interesting, because it means we can then actually go and measure it in the future: “Well, yeah, we had this kind of a result with that particular group that we targeted with PR, and we had this much lower result with this other group that we didn’t target with PR, which we could then target the next quarter or the next year.” This is one way of doing it.

Geography is actually the most common way I see a lot of startups and technology companies doing this. They basically focus all their efforts around a particular city or a particular state or region, sometimes even a country.

At one point, I actually did run a split test using Sweden and Norway, places that I and several other people from Moz visited over the course of a couple of years, speaking at some conferences and events. Then we looked at our traffic from those countries, our coverage in those countries, our links from those countries, and eventually our conversions from those countries. We did see a lift, suggesting to us that maybe there was some value in those conferences.
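To make that kind of geographic split test concrete, here’s a rough sketch of the arithmetic in Python with pandas. All the numbers, country codes, and column names are made up for illustration; in practice this would come from an analytics export.

```python
import pandas as pd

# Hypothetical analytics export: one row per (country, period)
df = pd.DataFrame({
    "country":     ["SE", "NO", "DK", "FI"] * 2,
    "period":      ["before"] * 4 + ["after"] * 4,
    "conversions": [120, 95, 110, 80, 180, 150, 115, 85],
})

targeted = {"SE", "NO"}  # countries targeted with conferences/PR
df["group"] = df["country"].map(lambda c: "targeted" if c in targeted else "control")

# Sum conversions per group and period, then compute percent lift
pivot = df.pivot_table(index="group", columns="period",
                       values="conversions", aggfunc="sum")
pivot["lift_pct"] = (pivot["after"] / pivot["before"] - 1) * 100
print(pivot)
```

A big gap between the targeted lift and the control lift is your (rough) evidence of value.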

Number two, the second way to do this is you can invest in a channel or tactic for only one of your product lines. If we’re at Moz, we’re going to say, “Hey, you know what? We’re going to do a lot of public relations for Followerwonk specifically, but we are not going to do it for our SEO products. We’re not going to do it for Moz Local. But let’s see how that goes.” This is another sort of segmentation tactic and can be effective. If you see that it works very well for one particular product, you might try repeating it for others.

Then the third one is that you can invest for a limited period of time. Now, what’s sad is that this one is the most common, but also the worst by far. The reason it’s usually the worst is that most of the work that goes into any of these types of channels (think about it: press and PR, a coffee and beer meeting, or going to conferences and events) oftentimes takes a long time to show its value. It builds upon itself. So if I’m doing lots of in-person meetings, some of those will filter back and build on themselves. If you hear about Moz from one or two people in Seattle, well, okay, that’s one signal. If you hear about it from 10, that’s another thing. That might have a different kind of impact on how our brand gets out there.

So this time period stuff I really don’t recommend and usually don’t like. There are cases where it can be okay.

In all three of these, though, what makes it so incredibly challenging is that we have to be able to observe a number of metrics and then try and take the segments that we’re supposed to be looking at, whether that’s time or a product or a vertical or geography, and we want to observe metrics like traffic. We might try to look at mentions, especially for PR and branding focused stuff. We might look at links. We might look at conversion rate and total conversions. Then we have to try and control for every other thing that we’re doing in our marketing that might or might not have affected those metrics as they apply to these channels.
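One standard way to attempt that control, assuming your segments would otherwise move together, is a difference-in-differences calculation: subtract the change in the untouched segment from the change in the segment you invested in, so shared background trends cancel out. A minimal sketch with made-up numbers:

```python
# Hypothetical quarterly conversions (before, after) for each segment
targeted_before, targeted_after = 400, 560   # segment that got the PR push
control_before,  control_after  = 300, 360   # segment left alone

targeted_change = (targeted_after - targeted_before) / targeted_before  # +40%
control_change  = (control_after - control_before) / control_before    # +20%

# Difference-in-differences: lift attributable to the campaign,
# assuming both segments share the same background trend
did_lift = targeted_change - control_change
print(f"Estimated campaign lift: {did_lift:.0%}")  # -> 20%
```

Of course, that shared-trend assumption is exactly what’s hard to defend, which is the point below.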

This is honestly why that control bit is so hard. Who’s to say whether those press mentions really came because we did a big PR effort and talked to a lot of folks? Or is it because our products got a lot better, customers started buzzing about us, and the industry was turning our way anyway? Maybe we would have gotten 50% of those mentions even if we hadn’t invested in PR. I don’t know.

This is why a lot of the time with these forms of marketing, my bias is to say, “You know what? You need to use your educated opinion, and you need to believe in and invest in the quantity of serendipity that you believe you can afford or that you can’t afford not to do, rather than trying to perfectly measure the value that you’re getting out of these.”

It’s possible, but it is tremendously challenging. These are some ways that you can try it if you’d like to. I’d love to hear from all of you in the comments, especially if you’ve invested in this type of stuff in the past or if you have other ways of valuing, of figuring out, and of convincing your managers, your clients, your bosses, your teams to go put some dollars and energy behind these.

All right everyone, we’ll see you next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Experiment: We Removed a Major Website from Google Search, for Science!

Posted by Cyrus-Shepard

The folks at Groupon surprised us earlier this summer when they reported the results of an experiment that showed that up to 60% of direct traffic is organic.

In order to accomplish this, Groupon de-indexed their site, effectively removing themselves from Google search results. That’s crazy talk!

Of course, we knew we had to try this ourselves.

We rolled up our sleeves and chose to de-index Followerwonk, both for its consistent Google traffic and its good analytics setup—that way we could properly measure everything. We were also confident we could quickly bring the site back into Google’s results, which minimized the business risks.

(We discussed de-indexing our main site moz.com, but… no soup for you!)

We wanted to measure and test several things:

  1. How quickly will Google remove a site from its index?
  2. How much of our organic traffic is actually attributed as direct traffic?
  3. How quickly can you bring a site back into search results using the URL removal tool?

Here’s what happened.

How to completely remove a site from Google

The fastest, simplest, and most direct method to completely remove an entire site from Google search results is by using the URL removal tool in Google Webmaster Tools.

We also understood, via statements from Google engineers, that using this method gave us the best chance of bringing the site back, with little risk. Other methods of de-indexing, such as using meta robots NOINDEX, might have taken weeks and caused recovery to take months.
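For reference, the meta robots method mentioned above means serving a noindex directive in the page’s head (or an X-Robots-Tag HTTP header). Here’s a small sketch, using Python with requests and BeautifulSoup against a hypothetical URL, that checks whether a page is currently carrying one:

```python
import requests
from bs4 import BeautifulSoup

def is_noindexed(url):
    """Check the X-Robots-Tag header and the meta robots tag for 'noindex'."""
    resp = requests.get(url, timeout=10)
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    return bool(meta) and "noindex" in meta.get("content", "").lower()

print(is_noindexed("https://example.com/"))  # hypothetical URL
```

Again, the URL removal tool, not noindex, is what we actually used for this experiment.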

CAUTION: Removing any URLs from a search index is potentially very dangerous, and should be taken very seriously. Do not try this at home; you will not pass go, and will not collect $200!

After submitting the request, Followerwonk URLs started disappearing from Google search results in 2-3 hours.

The information needs to propagate across different data centers around the globe, so the effect can be delayed in some areas. In fact, for the entire duration of the test, organic Google traffic continued to trickle in and never dropped to zero.

The effect on direct vs. organic traffic

In the Groupon experiment, they found that when they lost organic traffic, they actually lost a bunch of direct traffic as well. The Groupon conclusion was that a large amount of their direct traffic was actually organic—up to 60% on “long URLs”.

At first glance, the overall amount of direct traffic to Followerwonk didn’t change significantly, even when organic traffic dropped.

In fact, we could find no discrepancy in direct traffic outside the expected range.

I ran this by our contacts at Groupon, who said this wasn’t totally unexpected. You see, in their experiment they saw the biggest drop in direct traffic on long URLs, defined as URLs long enough to live in a subfolder or deeper, like https://followerwonk.com/bio/?q=content+marketer.

For Followerwonk, the vast majority of traffic goes to the homepage and a handful of other URLs. This means we didn’t have a statistically significant sample size of long URLs to judge the effect. For the long URLs we were able to measure, the results were nebulous. 

Conclusion: While we can’t confirm the Groupon results with our outcome, we can’t discount them either.

It’s quite likely that a portion of your organic traffic is attributed as direct. This is because different browsers, operating systems, and user privacy settings can potentially block referral information from reaching your website.

Bringing your site back from death

After waiting 2 hours, we deleted the request. Within a few hours all traffic returned to normal. Whew!

Does Google need to recrawl the pages?

If the time period is short enough, and you used the URL removal tool, apparently not.

In the case of Followerwonk, Google removed over 300,000 URLs from its search results, and made them all reappear in mere hours. This suggests that the domain wasn’t completely removed from Google’s index, but only “masked” from appearing for a short period of time.

What about longer periods of de-indexation?

In both the Groupon and Followerwonk experiments, the sites were only de-indexed for a short period of time, and bounced back quickly.

We wanted to find out what would happen if you de-indexed a site for a longer period, like two and a half days.

I couldn’t convince the team to remove any of our sites from Google search results for a few days, so I chose a smaller personal site that I often subject to merciless SEO experiments.

In this case, I de-indexed the site and didn’t remove the request until three days later. Even with this longer period, all URLs returned within just a few hours of cancelling the URL removal request.

In the chart below, we revoked the URL removal request on Friday the 25th. The next two days were Saturday and Sunday, both lower traffic days.

Test #2: De-index a personal site for 3 days

Likely, the URLs were still in Google’s index, so we didn’t have to wait for them to be recrawled. 

Here’s another shot of organic traffic before and after the second experiment.

For longer removal periods, a few weeks for example, I speculate that Google might drop these URLs semi-permanently from the index, and re-inclusion would take a much longer time.

What we learned

  1. While a portion of your organic traffic may be attributed as direct (due to browsers, privacy settings, etc.), in our case the effect on direct traffic was negligible.
  2. If you accidentally de-index your site using Google Webmaster Tools, in most cases you can quickly bring it back to life by deleting the request.
  3. Re-inclusion happened quickly, even after we removed a site for over two days. Longer than that, the result is unknown, and you could have problems getting all the pages of your site indexed again.

Further reading

Moz community member Adina Toma wrote an excellent YouMoz post on the re-inclusion process using the same technique, with some great tips for other, more extreme situations.

Big thanks to Peter Bray for volunteering Followerwonk for testing. You are a brave man!
