Moving 5 Domains to 1: An SEO Case Study

Posted by Dr-Pete

People often ask me if they should change domain names, and I always shudder just a little. Changing domains is a huge, risky undertaking, and too many people rush into it seeing only the imaginary upside. The success of the change also depends wildly on the details, and it’s not the kind of question anyone should be asking casually on social media.

Recently, I decided that it was time to find a new permanent home for my personal and professional blogs, which had gradually spread out over 5 domains. I also felt my main domain was no longer relevant to my current situation, and it was time for a change. So, ultimately I ended up with a scenario that looked like this:

The top three sites were active, with UserEffect.com being my former consulting site and blog (and relatively well-trafficked). The bottom two sites were both inactive and were both essentially gag sites. My one-pager, AreYouARealDoctor.com, did previously rank well for “are you a real doctor”, so I wanted to try to recapture that.

I started migrating the 5 sites in mid-January, and I’ve been tracking the results. I thought it would be useful to see how this kind of change plays out, in all of the gory details. As it turns out, nothing is ever quite “textbook” when it comes to technical SEO.

Why Change Domains at All?

The rationale for picking a new domain could fill a month’s worth of posts, but I want to make one critical point – changing domains should be about your business goals first, and SEO second. I did not change domains to try to rank better for “Dr. Pete” – that’s a crap shoot at best. I changed domains because my old consulting brand (“User Effect”) no longer represented the kind of work I do and I’m much more known by my personal brand.

That business case was strong enough that I was willing to accept some losses. We went through a similar transition here from SEOmoz.org to Moz.com. That was a difficult transition that cost us some SEO ground, especially short-term, but our core rationale was grounded in the business and where it’s headed. Don’t let an SEO pipe dream lead you into a risky decision.

Why did I pick a .co domain? I did it for the usual reason – the .com was taken. For a project of this type, where revenue wasn’t on the line, I didn’t have any particular concerns about .co. The evidence on how top-level domains (TLDs) impact ranking is tough to tease apart (so many other factors correlate with .com’s), and Google’s attitude tends to change over time, especially if new TLDs are abused. Anecdotally, though, I’ve seen plenty of .co’s rank, and I wasn’t concerned.

Step 1 – The Boring Stuff

It is absolutely shocking how many people build a new site, slap up some 301s, pull the switch, and hope for the best. It’s less shocking how many of those people end up in Q&A a week later, desperate and bleeding money.


Planning is hard work, and it’s boring – get over it.

You need to be intimately familiar with every page on your existing site(s), and, ideally, you should make a list. Not only do you have to plan for what will happen to each of these pages, but you’ll need that list to make sure everything works smoothly later.

In my case, I decided it might be time to do some housekeeping – the User Effect blog had hundreds of posts, many outdated and quite a few just not very good. So, I started with the easy data – recent traffic. I’m sure you’ve seen this Google Analytics report (Behavior > Site Content > All Pages):

Since I wanted to focus on recent activity, and none of the sites had much new content, I restricted myself to a 3-month window (Q4 of 2014). Of course, I looked much deeper than the top 10, but the principle was simple – I wanted to make sure the data matched my intuition and that I wasn’t cutting off anything important. This helped me prioritize the list.

Of course, from an SEO standpoint, I also didn’t want to lose content that had limited traffic but solid inbound links. So, I checked my “Top Pages” report in Open Site Explorer:

Since the bulk of my main site was a blog, the top trafficked and top linked-to pages fortunately correlated pretty well. Again, this is only a way to prioritize. If you’re dealing with sites with thousands of pages, you need to work methodically through the site architecture.

I’m going to say something that makes some SEOs itchy – it’s ok not to move some pages to the new site. It’s even ok to let some pages 404. In Q4, UserEffect.com had traffic to 237 URLs. The top 10 pages accounted for 91.9% of that traffic. I strongly believe that moving domains is a good time to refocus a site and concentrate your visitors and link equity on your best content. More is not better in 2015.

Letting go of some pages also means that you’re not 301-redirecting a massive number of old URLs to a new home-page. This can look like a low-quality attempt to consolidate link-equity, and at large scale it can raise red flags with Google. Content worth keeping should exist on the new site, and your 301s should have well-matched targets.
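To make that concrete, here’s a minimal sketch of what well-matched 301 targets look like in practice – a Node/Express example with hypothetical paths rather than my actual setup (the same logic applies to .htaccess rules or any other server config):

const express = require('express');
const app = express();

// One-to-one mappings for content that earned a new home
// (all paths here are hypothetical examples).
const redirectMap = {
  '/blog/split-test-calculator': '/split-test-calculator',
  '/blog/usability-checklist': '/usability-checklist',
};

app.use((req, res, next) => {
  const target = redirectMap[req.path];
  if (target) {
    return res.redirect(301, target); // permanent redirect; passes link equity
  }
  next(); // unmapped URLs fall through – some will 404, by design
});

app.listen(3000);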

In one case, I had a blog post that had a decent trickle of traffic due to ranking for “50,000 push-ups,” but the post itself was weak and the bounce rate was very high:

The post was basically just a placeholder announcing that I’d be attempting this challenge, but I never recapped anything after finishing it. So, in this case, I rewrote the post.

Of course, this process was repeated across the 3 active sites. The 2 inactive sites only constituted a handful of total pages. In the case of AreYouARealDoctor.com, I decided to turn the previous one-pager into a new page on the new site. That way, I had a very well-matched target for the 301-redirect, instead of simply mapping the old site to my new home-page.

I’m trying to prove a point – this is the amount of work I did for a handful of sites that were mostly inactive and producing no current business value. I don’t need consulting gigs and these sites produce no direct revenue, and yet I still considered this process worth the effort.

Step 2 – The Big Day

Eventually, you’re going to have to make the move, and in most cases, I prefer ripping off the bandage. Of course, doing something all at once doesn’t mean you shouldn’t be careful.

The biggest problem I see with domain switches (even if they’re 1-to-1) is that people rely on data that can take weeks to evaluate, like rankings and traffic, or directly checking Google’s index. By then, a lot of damage is already done. Here are some ways to find out quickly if you’ve got problems…

(1) Manually Check Pages

Remember that list you were supposed to make? It’s time to check it, or at least spot-check it. Someone needs to physically go to a browser and make sure that each major section of the site and each important individual page is resolving properly. It doesn’t matter how confident your IT department/guy/gal is – things go wrong.

(2) Manually Check Headers

Just because a page resolves, it doesn’t mean that your 301-redirects are working properly, or that you’re not firing some kind of 17-step redirect chain. Check your headers. There are tons of free tools, but lately I’m fond of URI Valet. Guess what – I screwed up my primary 301-redirects. One of my registrar transfers wasn’t working, so I had to have a setting changed by customer service, and I inadvertently ended up with 302s (Pro tip: don’t change registrars and domains in one step).
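If you’d rather script the check than paste URLs into a tool, here’s a rough sketch using Node’s built-in fetch (Node 18+). It prints each hop’s status code, so a stray 302 or a long redirect chain stands out immediately:

// Quick redirect-chain checker: follow redirects manually and
// print the status code at every hop.
async function checkRedirects(url, maxHops = 10) {
  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(url, { redirect: 'manual' });
    console.log(res.status, url);
    const location = res.headers.get('location');
    if (res.status < 300 || res.status >= 400 || !location) break;
    url = new URL(location, url).href; // resolve relative Location headers
  }
}

checkRedirects('http://www.usereffect.com/');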

Don’t think that because you’re an “expert”, your plan is foolproof. Mistakes happen, and because I caught this one I was able to correct it fairly quickly.

(3) Submit Your New Site

You don’t need to submit your site to Google in 2015, but now that Google Webmaster Tools allows it, why not do it? The primary argument I hear is “well, it’s not necessary.” True, but direct submission has one advantage – it’s fast.

To be precise, Google Webmaster Tools separates the process into “Fetch” and “Submit to index” (you’ll find this under “Crawl” > “Fetch as Google”). Fetching will quickly tell you if Google can resolve a URL and retrieve the page contents, which alone is pretty useful. Once a page is fetched, you can submit it, and you should see something like this:

This isn’t really about getting indexed – it’s about getting nearly instantaneous feedback. If Google has any major problems with crawling your site, you’ll know quickly, at least at the macro level.

(4) Submit New XML Sitemaps

Finally, submit a new set of XML sitemaps in Google Webmaster Tools, and preferably tiered sitemaps. While it’s a few years old now, Rob Ousbey has a great post on the subject of XML sitemap structure. The basic idea is that, if you divide your sitemap into logical sections, it’s going to be much easier to diagnose what kinds of pages Google is indexing and where you’re running into trouble.
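As a sketch of the idea, here’s how you might generate a tiered sitemap index from per-section sitemaps in Node (the section file names are hypothetical):

const fs = require('fs');

// Each logical section of the site gets its own sitemap file,
// so indexation problems can be traced back to a section.
const sections = ['sitemap-blog.xml', 'sitemap-tools.xml', 'sitemap-pages.xml'];

const entries = sections
  .map(f => `  <sitemap><loc>http://drpete.co/${f}</loc></sitemap>`)
  .join('\n');

const index = `<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</sitemapindex>`;

fs.writeFileSync('sitemap-index.xml', index);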

A couple of pro tips on sitemaps – first, keep your old sitemaps active temporarily. This is counterintuitive to some people, but unless Google can crawl your old URLs, they won’t see and process the 301-redirects and other signals. Let the old accounts stay open for a couple of months, and don’t cut off access to the domains you’re moving.

Second (I learned this one the hard way), make sure that your Google Webmaster Tools site verification still works. If you use file uploads or meta tags and don’t move those files/tags to the new site, GWT verification will fail and you won’t have access to your old accounts. I’d recommend using a more domain-independent solution, like verifying with Google Analytics. If you lose verification, don’t panic – your data won’t be instantly lost.

Step 3 – The Waiting Game

Once you’ve made the switch, the waiting begins, and this is where many people start to panic. Even when executed perfectly, it can take Google weeks or even months to process all of your 301-redirects and reevaluate a new domain’s capacity to rank. You have to expect short-term fluctuations in ranking and traffic.

During this period, you’ll want to watch a few things – your traffic, your rankings, your indexed pages (via GWT and the site: operator), and your errors (such as unexpected 404s). Traffic will recover the fastest, since direct traffic is immediately carried through redirects, but ranking and indexation will lag, and errors may take time to appear.

(1) Monitor Traffic

I’m hoping you know how to check your traffic, but actually trying to determine what your new levels should be and comparing any two days can be easier said than done. If you launch on a Friday, and then Saturday your traffic goes down on the new site, that’s hardly cause for panic – your traffic probably always goes down on Saturday.

In this case, I redirected the individual sites over about a week, but I’m going to focus on UserEffect.com, as that was the major traffic generator. That site was redirected in full on January 21st, and the Google Analytics data for January for the old site looked like this:

So far, so good – traffic bottomed out almost immediately. Of course, losing traffic is easy – the real question is what’s going on with the new domain. Here’s the graph for January for DrPete.co:

This one’s a bit trickier – the first spike, on January 16th, is when I redirected the first domain. The second spike, on January 22nd, is when I redirected UserEffect.com. Both spikes are meaningless – I announced these re-launches on social media and got a short-term traffic burst. What we really want to know is where traffic is leveling out.

Of course, there isn’t a lot of history here, but a typical day for UserEffect.com in January was about 1,000 pageviews. The traffic to DrPete.co after it leveled out was about half that (500 pageviews). It’s not a complete crisis, but we’re definitely looking at a short-term loss.

Obviously, I’m simplifying the process here – for a large, ecommerce site you’d want to track a wide range of metrics, including conversion metrics. Hopefully, though, this illustrates the core approach. So, what am I missing out on? In this day of [not provided], tracking down a loss can be tricky. Let’s look for clues in our other three areas…

(2) Monitor Indexation

You can get a broad sense of your indexed pages from Google Webmaster Tools, but this data often lags real-time and isn’t very granular. Despite its shortcomings, I still prefer the site: operator. Generally, I monitor a domain daily – any one measurement has a lot of noise, but what you’re looking for is the trend over time. Here’s the indexed page count for DrPete.co:

The first set of pages was indexed fairly quickly, and then the second set started being indexed soon after UserEffect.com was redirected. All in all, we’re seeing a fairly steady upward trend, and that’s what we’re hoping to see. The number is also in the ballpark of sanity (compared to the actual page count) and roughly matched GWT data once it started being reported.

So, what happened to UserEffect.com’s index after the switch?

The timeframe here is shorter, since UserEffect.com was redirected last, but we see a gradual decline in indexation, as expected. Note that the index size plateaus around 60 pages – about 1/4 of the original size. This isn’t abnormal – low-traffic and unlinked pages (or those with deep links) are going to take a while to clear out. This is a long-term process. Don’t panic over the absolute numbers – what you want here is a downward trend on the old domain accompanied by a roughly equal upward trend on the new domain.

The fact that UserEffect.com didn’t bottom out is definitely worth monitoring, but this timespan is too short for the plateau to be a major concern. The next step would be to dig into these specific pages and look for a pattern.

(3) Monitor Rankings

The old domain is dropping out of the index, and the new domain is taking its place, but we still don’t know why the new site is taking a traffic hit. It’s time to dig into our core keyword rankings.

Historically, UserEffect.com had ranked well for keywords related to “split test calculator” (near #1) and “usability checklist” (in the top 3). While [not provided] makes keyword-level traffic analysis tricky, we also know that the split-test calculator is one of the top trafficked pages on the site, so let’s dig into that one. Here’s the ranking data from Moz Analytics for “split test calculator”:

The new site took over the #1 position from the old site at first, but then quickly dropped down to the #3/#4 ranking. That may not sound like a lot, but given this general keyword category was one of the site’s top traffic drivers, the CTR drop from #1 to #3/#4 could definitely be causing problems.

When you have a specific keyword you can diagnose, it’s worth taking a look at the live SERP, just to get some context. The day after relaunch, I captured this result for “dr. pete”:

Here, the new domain is ranking, but it’s showing the old title tag. This may not be cause for alarm – weird things often happen in the very short term – but in this case we know that I accidentally set up a 302-redirect. There’s some reason to believe that Google didn’t pass full link equity during that period when 301s weren’t implemented.

Let’s look at a domain where the 301s behaved properly. Before the site was inactive, AreYouARealDoctor.com ranked #1 for “are you a real doctor”. Since there was an inactive period, and I dropped the exact-match domain, it wouldn’t be surprising to see a corresponding ranking drop.

In reality, the new site was ranking #1 for “are you a real doctor” within 2 weeks of 301-redirecting the old domain. The graph is just a horizontal line at #1, so I’m not going to bother you with it, but here’s a current screenshot (incognito):

Early on, I also spot-checked this result, and it wasn’t showing the strange title tag crossover that UserEffect.com pages exhibited. So, it’s very likely that the 302-redirects caused some problems.

Of course, these are just a couple of keywords, but I hope it provides a starting point for you to understand how to methodically approach this problem. There’s no use crying over spilled milk, and I’m not going to fire myself, so let’s move on to checking any other errors that I might have missed.

(4) Check Errors (404s, etc.)

A good first stop for unexpected errors is the “Crawl Errors” report in Google Webmaster Tools (Crawl > Crawl Errors). This is going to take some digging, especially if you’ve deliberately 404’ed some content. Over the couple of weeks after re-launch, I spotted the following problems:

The old site had a “/blog” directory, but the new site put the blog right on the home-page and had no corresponding directory. Doh. Hey, do as I say, not as I do, ok? Obviously, this was a big blunder, as the old blog home-page was well-trafficked.

The other two errors here are smaller but easy to correct. MinimalTalent.com had a “/free” directory that housed downloads (mostly PDFs). I missed it, since my other sites used a different format. Luckily, this was easy to remap.

The last error is a weird looking URL, and there are other similar URLs in the 404 list. This is where site knowledge is critical. I custom-designed a URL shortener for UserEffect.com and, in some cases, people linked to those URLs. Since those URLs didn’t exist in the site architecture, I missed them. This is where digging deep into historical traffic reports and your top-linked pages is critical. In this case, the fix isn’t easy, and I have to decide whether the loss is worth the time.
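Here’s a rough sketch of how that digging might be automated – cross-checking crawl-error URLs exported from GWT against your redirect map. The file name and paths are hypothetical:

const fs = require('fs');

// The redirects you intended to set up.
const redirectMap = {
  '/blog': '/',
  '/free/usability-checklist.pdf': '/downloads/usability-checklist.pdf',
};

// One crawl-error URL per line, exported from Webmaster Tools.
const errors = fs.readFileSync('crawl-errors.txt', 'utf8')
  .split('\n')
  .filter(Boolean);

for (const errorUrl of errors) {
  const path = new URL(errorUrl).pathname;
  if (!(path in redirectMap)) {
    console.log('Unmapped 404:', path); // decide: remap it, or let it die
  }
}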

What About the New EMD?

My goal here wasn’t to rank better for “Dr. Pete,” and finally unseat Dr. Pete’s Marinades, Dr. Pete the Sodastream flavor (yes, it’s hilarious – you can stop sending me your grocery store photos), and 172 dentists. Ok, it mostly wasn’t my goal. Of course, you might be wondering how switching to an EMD worked out.

In the short term, I’m afraid the answer is “not very well.” I didn’t track ranking for “Dr. Pete” and related phrases very often before the switch, but it appears that ranking actually fell in the short-term. Current estimates have me sitting around page 4, even though my combined link profile suggests a much stronger position. Here’s a look at the ranking history for “dr pete” since relaunch (from Moz Analytics):

There was an initial drop, after which the site evened out a bit. This less-than-impressive plateau could be due to the bad 302s during transition. It could be Google evaluating a new EMD and multiple redirects to that EMD. It could be that the prevalence of natural anchor text with “Dr. Pete” pointing to my site suddenly looked unnatural when my domain name switched to DrPete.co. It could just be that this is going to take time to shake out.

If there’s a lesson here (and, admittedly, it’s too soon to tell), it’s that you shouldn’t rush to buy an EMD in 2015 in the wild hope of instantly ranking for that target phrase. There are so many factors involved in ranking for even a moderately competitive term, and your domain is just one small part of the mix.

So, What Did We Learn?

I hope you learned that I should’ve taken my own advice and planned a bit more carefully. I admit that this was a side project and it didn’t get the attention it deserved. The problem is that, even when real money is at stake, people rush these things and hope for the best. There’s a real cheerleading mentality when it comes to change – people want to take action and only see the upside.

Ultimately, in a corporate or agency environment, you can’t be the one sour note among the cheering. You’ll be ignored, and possibly even fired. That’s not fair, but it’s reality. What you need to do is make sure the work gets done right and people go into the process with eyes wide open. There’s no room for shortcuts when you’re moving to a new domain.

That said, a domain change isn’t a death sentence, either. Done right, and with sensible goals in mind – balancing not just SEO but broader marketing and business objectives – a domain migration can be successful, even across multiple sites.

To sum up: Plan, plan, plan, monitor, monitor, monitor, and try not to panic.


Developing Innovative Content: What You Need to Know

Posted by richardbaxterseo

A few weeks ago, I attended a breakfast meeting with a bunch of entrepreneurs in the technology, space (yes, space travel), software and engineering industries. I was blown away by the incredible talent of the speakers. You know, there are people out there building things like private satellite networks, bio-printing facilities, quantum computers and self-driving cars. I was completely transfixed by the incredibly future-facing, innovative and exceptionally inventive group in front of me. I also immediately wished I’d worked a little harder in my twenties.

After the presentations, one of the questions that came up during the Q&A session was: “what’s the next big thing?”

Wow. Have you ever thought about “the next big thing”?

Part of the magic of predicting innovation is that it’s really, really hard to get right. Those that can accurately predict the future (in my humble opinion) are those that tend to understand how people will respond to an idea once they’re exposed to it. I think predicting this is a very special skill indeed.

Then again, we’re expected to be able to predict the outcome of our marketing, all the time. While predicting it is one thing, making it happen is a whole different ball game.

Competition for the attention of our customers is getting tougher

In our industry, when you really boil down what it is we do, we’re fixing things, making things, or we’re communicating things.

Most of the time, we’re building content that communicates: ideas, stories, news and guidance–you get the idea. The problem is, no matter which vertical you work in, we’re all competing for something: the attention of our customers.

As our customers get smarter, that competition is getting tougher and tougher.

The most successful marketers in our industry all have a special trait in common. They are good at finding new ways to communicate ideas. Take a look at classic presentations like this from Ross Hudgens to see just how powerful it can be to observe, imitate and develop an idea with astounding viral reach.

I particularly enjoy the idea of taking a piece of content and making improvements, be it through design, layout or simply updating what’s there. I like it because it’s actually pretty easy to do, and there’s growing evidence of it happening all over the Internet. Brands are taking a second look at how they’re developing their content to appeal to a wider audience, or to appeal to a viral audience (or both!).

For example, take a look at this beautiful travel guide to Vietnam (credit: travelindochina.com) or this long form guide to commercial property insurance (credit: Towergate Insurance / Builtvisible.com) for examples of brands in competitive verticals developing their existing content. In verticals where ordinary article content has been done to death, redeveloping the medium itself feels like an important next step.

Innovative isn’t the same thing as technical

I’ve felt for a long time that there’s a conflict between our interpretation of “innovative” and “technical”. As I’ve written before, those that really understand how the web works are at a huge advantage. Learn how it’s built, and you’ll find yourself able to make great things happen on your own, simply by learning and experimenting.

In my opinion, though, you don’t have to build your own site or become a developer. All you have to do is learn the vocabulary and build a broad understanding of how things work in a browser. I actually think we all need to be doing this, right now. Why?

We need more innovation in content marketing

I think our future depends on our industry’s ability to innovate. Of course, you still need to have your basics in place. We’ll always be T-Shaped marketers, executing a bit of technical SEO here, a bit of content strategy there. But, we’re all SEOs and we know we need to acquire links, build audiences and generally think big about our ambitions. When your goal is to attract new followers, fans, links, and garner shares in their thousands, you need to do something pretty exciting to attract attention to yourself.

The vocabulary of content development

I’ve designed this post to be a primer on more advanced features found in innovative content development. My original MozCon 2014 presentation was designed to educate on some of the technologies we should be aware of in our content development projects and the process we follow to build things. We’ll save process for another post (shout in the comments if you think that would be useful!) and focus on the “what” for now.

At Builtvisible, we’re working hard on extending our in-house content development capabilities. We learn through sharing amazing examples with each other. Our policy is to always attempt to deconstruct how something might have been developed, that way, we’re learning. Some of the things we see on the web are amazing–they deserve so much respect for the talent and the skills that surface the content.

Here are some examples that I think demonstrate some of the most useful types of approach for content marketers. I hope that these help as much as they’ve helped us, and I hope you can form a perspective of what innovative features look like in more advanced content development. Of course, do feel welcome to share your own examples in the comments, too! The more, the merrier!

The story of EBoy

eBoy: the graphic design firm whose three co-founders and sole members are widely regarded as the “godfathers” of pixel art.

The consistent styling (as well as the beautifully written content) is excellent. Technically speaking, perhaps the most clever and elegant feature is the zoom of the image positioned on the Z axis in a <canvas> container (more on this in a moment).

An event listener (jQuery) sizes the canvas to match the browser window, and the Z-axis position shifts on scroll to create an elegant zoom effect.


View the example here:

http://www.theverge.com/2014/6/17/5803850/pixel-perfect-the-story-of-eboy.

<canvas> is an HTML element which can be used to draw graphics using scripting (usually JavaScript). It can, for instance, be used to draw graphs, create photo compositions or build simple animations.
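Here’s a heavily simplified sketch of the scroll-driven zoom technique, in plain JavaScript rather than The Verge’s actual code (the image file is a stand-in):

const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');
const img = new Image();
img.src = 'artwork.png'; // hypothetical asset

// Size the canvas to the browser window, and resize with it.
function resize() {
  canvas.width = window.innerWidth;
  canvas.height = window.innerHeight;
}
window.addEventListener('resize', resize);
resize();

// Map scroll depth to a zoom factor, faking movement along the Z axis.
window.addEventListener('scroll', () => {
  const zoom = 1 + window.scrollY / 1000;
  const w = img.width * zoom;
  const h = img.height * zoom;
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.drawImage(img, (canvas.width - w) / 2, (canvas.height - h) / 2, w, h);
});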

Colorizing the past

Take a look at Pixart Printing’s Guide to Colourizing the Past (credit: Pixartprinting / Builtvisible.com) for a clever example of <canvas> in use. Here’s one of the images (tip: mouse over and click the image):

The colorization feature takes advantage of the power of the canvas element. In this case, the color version of the image is applied to the canvas as a background image, with the black and white version on a layer above. Clicking (or touching, on mobile) erases portions of the top image, revealing the color version underneath.
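Here’s a stripped-down sketch of that erase-to-reveal trick – not Pixart’s actual code, just the core canvas calls. The colour image sits behind the canvas (e.g. as a CSS background), and the black-and-white version is drawn on top:

const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

const bw = new Image();
bw.src = 'photo-bw.jpg'; // hypothetical asset
bw.onload = () => ctx.drawImage(bw, 0, 0, canvas.width, canvas.height);

canvas.addEventListener('click', (e) => {
  const rect = canvas.getBoundingClientRect();
  ctx.globalCompositeOperation = 'destination-out'; // erase instead of paint
  ctx.beginPath();
  ctx.arc(e.clientX - rect.left, e.clientY - rect.top, 30, 0, Math.PI * 2);
  ctx.fill(); // the transparent hole reveals the colour image beneath
});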

Chrome Experiments: Globe

Globe is a “simple” global data visualization of the Earth’s population growth over a set range of dates. The 3D visualization is based on WebGL: a JavaScript API for rendering interactive 3D and 2D graphics within any compatible web browser without the use of plug-ins.


View the example here:

http://globe.chromeexperiments.com/.

WebGL is a really exciting, emerging option available to content marketers who might want to experiment with immersive experiences or highly interactive, simulated environments.

Some of my favourite WebGL examples include Hello Racer and Tweetopia, a 3D Twitter hashtag visualizer.

If you’d like to see more examples of WebGL in action, take a look at Chrome Experiments. Don’t worry, this stuff works in the latest versions of Firefox and IE, too.
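To give you a feel for how little code a WebGL scene can take, here’s a bare-bones sketch using the three.js library (my choice for brevity – Globe’s real code is far more involved): a spinning wireframe sphere, the “hello world” of 3D globes.

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// A wireframe sphere standing in for the Earth.
const globe = new THREE.Mesh(
  new THREE.SphereGeometry(2, 32, 32),
  new THREE.MeshBasicMaterial({ color: 0x3399ff, wireframe: true })
);
scene.add(globe);

(function animate() {
  requestAnimationFrame(animate);
  globe.rotation.y += 0.005; // slow spin
  renderer.render(scene, camera);
})();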

Polygon’s PS4 Review

You might have seen me cover this long form concept over at Builtvisible. Polygon’s Playstation 4 review is a fully featured “long form” review of Sony’s much loved gaming machine. The bit that I love is the SVG visualizations:

“What’s SVG?”, I hear you ask!

SVG gives you super-fast, sharp rendering of vector images inside the browser. Unlike bitmap image files (.jpg, .gif, .png), SVG is XML-based, light on file size, loads quickly and adjusts to responsive browser widths perfectly. That XML-based schema also lends itself to some interesting manipulation for stunning, easy-to-implement effects.

View Polygon’s example here: http://www.polygon.com/a/ps4-review

That line tracing animation you see is known as path animation. Essentially, the path attribute in the SVG’s XML can be manipulated in the DOM with a little jQuery. What you get is a pretty snazzy animation that keeps your users’ eyes fixated on your content – yet another nice little effect to keep eyeballs engaged.
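The core of the trick is tiny. Here’s a sketch in plain JavaScript (not Polygon’s actual code): set the dash pattern to the path’s own length, then animate the offset back to zero so the line appears to draw itself.

const path = document.querySelector('svg path');
const length = path.getTotalLength();

// Start fully 'undrawn': one dash as long as the path, offset out of view.
path.style.strokeDasharray = length;
path.style.strokeDashoffset = length;

path.getBoundingClientRect(); // force a reflow so the start state applies

// Let a CSS transition pull the offset back to zero.
path.style.transition = 'stroke-dashoffset 2s ease-in-out';
path.style.strokeDashoffset = '0';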

My favourite example of SVG execution is Lewis Lehe’s Gridlocks and Bottlenecks. Gridlocks is an AngularJS- and d3.js-based visualization of the surprisingly technical and oft-misunderstood “gridlock” and “bottleneck” events in road traffic management.

It’s also very cool:

View the example here: http://setosa.io/blog/2014/09/02/gridlock/.

I have a short vocabulary list that I expect our team to be able to explain (certainly these questions come up in an interview with us!). I think that if you can explain what these things are, as a developing content marketer you’re way ahead of the curve:

  • HTML5
  • Responsive CSS (& libraries)
  • CSS3 (& frameworks)
  • JavaScript (& frameworks: jQuery, MooTools, Jade, Handlebars)
  • JSON (api post and response data)
  • webGL
  • HTML5 audio & video
  • SVG
  • HTML5 History API manipulation with pushState
  • Infinite Scroll

Want to learn more?

I’ve amassed a series of videos on web development that I think marketers should watch. Not necessarily to learn web development, but definitely to be able to describe what it is you’d like your own content to do. My favourite: I really loved Wes Bos’s JS + HTML5 Video + Canvas tutorial. Amazing.

Innovation in content is such a huge topic but I realize I’ve run out of space (this is already a 1,400 word post) for now.

In my follow up, I’d like to talk about how to plan your content when it’s a little more extensive than just an article, give you some tips on how to work with (or find!) a developer, and how to make the most of every component in your content to get the most from your marketing efforts.

Until then, I’d love to see your own examples of great content and questions in the comments!


Back to Fundamentals: 6 Untapped Keyword Sources that Will Boost Organic Traffic

Posted by neilpatel

I used to perform keyword research in the typical, perfunctory way—go to the Keyword Tool, type in some words, and punch out a list of terms.

Easy. Quick. Simple.

Today, things are different. The much-loved keyword tool has been replaced, long-tail keywords have the ascendancy, and it’s harder to figure out what users are actually searching for.

The rules have changed, and so have the ways of playing the game. I still use the Keyword Planner, but I’ve also discovered a medley of not-so-obvious ways to get keywords that improve my organic traffic.

1. Wikipedia

Do you think of Wikipedia as just a massive encyclopedia? Think again. I use Wikipedia for keyword research.

Image from Search Engine Journal.

My process is pretty simple.

Step 1: Google inurl:Wikipedia and my topic. Or just Google the topic or head term. Wikipedia is often the first organic result.

Step 2: Look at the SERP to identify the most relevant terms and possible keywords within a Wikipedia entry.

Step 3: Open the entry in Wikipedia and identify the most relevant terms from the first few paragraphs, morphing them into long-tail iterations.

Step 4: Identify other relevant terms from Wikipedia’s table of contents on the topic.

Step 5: Click through to other associated Wikipedia entries to see related subjects, and identify even more keywords. (A quick API sketch for pulling an article’s table of contents follows below.)
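For Step 4, you don’t even have to copy the table of contents by hand – the MediaWiki API exposes an article’s section headings. A rough sketch (the article title is just an example):

// Pull section headings from a Wikipedia article as keyword candidates.
const url = 'https://en.wikipedia.org/w/api.php' +
  '?action=parse&page=Search_engine_optimization' +
  '&prop=sections&format=json&origin=*';

fetch(url)
  .then(res => res.json())
  .then(data => {
    const headings = data.parse.sections.map(s => s.line);
    console.log(headings); // section titles to morph into long-tail terms
  });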

Wikipedia is the world’s sixth most popular website, and it sits at #4 on Google’s list of top sites. It boasts 310,000,000 unique visitors (20% of its traffic), and has 7,900,000,000 pageviews. All of this with absolutely no advertising.

In other words, Wikipedia has one of the best organic SEO strategies on the planet. Obviously, these are keywords that matter. Wikipedia’s popularity shows us that people want information. It’s like the greatest content marketing strategy ever, combining user-generated content with prolific publishing on a grand scale.

Do what Wikipedia does. Use the terms that people search for. You won’t outrank Wikipedia, but you will start to rank organically for the long-tail varieties that you discern from Wikipedia.

2. Google autocomplete

When you type stuff into Google’s search bar, Google predicts your query and types it out for you. The feature has been around for a long time. The more time that goes by, the more intelligent the autocomplete algorithm becomes.

These autocomplete suggestions are all based on real user queries. They vary based on geographic location and language. However, in spite of the variation, autocomplete provides a fairly accurate representation of what people are looking for.

Here is why autocomplete is a killer source of keywords:

  • It indicates some of the most popular keywords.
  • It provides long-tail suggestions.
  • The keywords are ranked according to the “freshness layer” algorithm, meaning currently popular search terms rank higher in the autocomplete list.

How do you use autocomplete for keyword research? Well, you can go about this the good old-fashioned spade and shovel way, like this:


Step 1: Open Google. To prevent Google from autocompleting previously searched-for terms, log out of Google or open an “incognito” window (Chrome: Shift + Cmd + N).

Step 2: Type in your main keyword or long-tail keyword, e.g., “lawnmower.”

Step 3: Write down the suggestions that appear in autocomplete.

Step 4: After you type in your main keyword or head term, type in “a” and write down the autocomplete suggestions.

Step 5: Repeat Step 4 for the rest of the alphabet.

Or, you can do it the easy way, with Übersuggest. It’s been called “suggest on steroids,” and it will do all the work for you. The only downside is that it doesn’t suggest keyword extensions based on search popularity.


If you can get past the eye-popping UI, Übersuggest is a pretty awesome tool.
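If you’d rather script the same idea yourself, Google exposes an unofficial suggest endpoint (unofficial meaning it can change or disappear without notice). A rough sketch in Node:

// Fetch autocomplete suggestions for a term from Google's
// unofficial suggest endpoint (returns [query, [suggestions]]).
async function suggest(term) {
  const url = 'https://suggestqueries.google.com/complete/search' +
    '?client=firefox&q=' + encodeURIComponent(term);
  const res = await fetch(url);
  const [, suggestions] = await res.json();
  return suggestions;
}

// "lawnmower a", "lawnmower b", ... the whole alphabet in one loop.
(async () => {
  for (const letter of 'abcdefghijklmnopqrstuvwxyz') {
    console.log(letter, await suggest('lawnmower ' + letter));
  }
})();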

Keep in mind that Google is not going to provide suggestions for everything. As quoted in Search Engine Land, here is what the algorithm will filter out:

  • Hate- or violence-related suggestions
  • Personally identifiable information in suggestions
  • Porn & adult content-related suggestions
  • Legally mandated removals
  • Piracy-related suggestions

3. Google Related Searches

Since Google is the biggest search engine, we’ve got to take our cues from its mighty algorithm, imperfect and agonizing though it may be.

Google’s related searches is a really easy way to snag some instant keyword research.


Step 1: Search for your keyword in Google.

Step 2: Scroll to the bottom, and ignore everything in between.

There, at the bottom, sits a harvest of keywords, ripe for the picking:

[Screenshot: related searches for “lawn mower”]

The idea is similar to Google suggest. However, instead of providing autocomplete suggestions, Google takes the keyword and mixes it up with other words. These other words may be at the end, at the beginning, or sprinkled throughout. These related searches might not even include the actual keyword, but are simply connected in a tangential way.

Whatever the case, you will undoubtedly find some keyword ideas from this list.

4. MetaGlossary.com

Not a whole lot of people know about MetaGlossary.com. You won’t find a lot of information about the company itself, but you will find a ton of keyword ideas.

Using it isn’t hard: just type in a term.

The whole point of the glossary is to provide definitions. But along with the many definitions, you’ll get “related terms.” That’s what we’re looking for.

When I type in “Search Engine Optimization,” my head term, here’s what I get:

[Screenshot: MetaGlossary related terms for “search engine optimization”]

All of those are potential keywords.

I can take this a step further by looking through the definitions. These can provide even more keyword fodder:

[Screenshot: MetaGlossary definitions for “search engine optimization”]

For this particular term, I found 117 definitions. That’s enough to keep me busy for a while.

5. Competitor keywords

Another great way to get keyword ideas is to snag them from the competition.

Not only are you going to identify some great keywords, but you’ll be able to gain these keywords ideas from the top-ranking organic sites in the SERPs.

Here’s how to do it.

Step 1: Google your top keyword.

Step 2: Click the first organic result.

Step 3: View the page source (Chrome: Cmd + Alt + U).

Step 4: Search for “<title>”. Identify any non-branded terms as possible keywords.

Step 5: Search for “<h1>”. Identify any potential keywords in the H1 text.

Step 6: Search for “keywords”. If the page declares a meta keywords tag, note any potential keywords in it. Some websites have this – typically WordPress sites using certain themes or an SEO plugin – but most don’t.

Step 7: Look at all the content and locate any additional long-tail keywords or keyword variations. (A script sketch automating Steps 3–6 follows below.)
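Here’s a rough sketch of that scrape in Node – simple regexes for eyeballing a page, not robust HTML parsing:

// Pull the title, h1s, and meta keywords out of a competitor's page source.
async function scrapeKeywordSignals(url) {
  const html = await (await fetch(url)).text();
  const title = (html.match(/<title[^>]*>([\s\S]*?)<\/title>/i) || [])[1];
  const h1s = [...html.matchAll(/<h1[^>]*>([\s\S]*?)<\/h1>/gi)].map(m => m[1]);
  const meta = (html.match(/<meta[^>]+name=["']keywords["'][^>]+content=["']([^"']*)["']/i) || [])[1];
  console.log({ title, h1s, meta });
}

scrapeKeywordSignals('http://example.com/'); // swap in the top-ranking result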

The competitors that are first in the SERP for a given head term or longtail query are ranking high for a variety of reasons. One of those reasons is their keyword selection. Sure, they may have good link profiles, but you can’t rank for a keyword unless you actually have that keyword (or some variation thereof) on your page.

6. Amazon.com

Amazon.com is king of the ecommerce jungle, no questions asked.

Part of their power is that they have total domination of the organic search results for just about any purchase-related keyword. When your audience circles closer to a transactional search query, Amazon is ranking somewhere.

Why? They’ve got keywords—lots of them. And they have reviews—lots of them. This means one thing for you: Lots of keywords ideas.

Let me make a quick clarification. Not everyone is going to find keyword ideas on Amazon. This works best if you have a physical product, and obviously only if Amazon sells it.

Here’s how to skim the cream off of Amazon’s great keywords.

Step 1: Google your keyword.

Step 2: Locate the Amazon entry in the SERP.

Step 3: Click on the result to see the product/landing page on Amazon.

Step 4: Locate keywords in the following places.

-“Show results for” menu

-Main header

-Text underneath main header

-“## Results for” text.

-Breadcrumb

-Items listed

Here’s a quick survey of where you can find these keywords. Notice the highlighted text.

[Screenshot: Amazon category page – “Bags & Cases: Electronics: Sleeves & Slipcases, Messenger Bags, Shoulder Bags, Backpacks & More”]

You’ll find even more keywords once you dive into individual products.

Pay special attention to these areas on product pages:

-“Customers Who Bought This Item Also Bought”

-“Product Description”

-“Product Ads from External Websites”

-“Customer Questions & Answers.” You’ll find some nice query-like longtail keywords here.

-“Customer Reviews.” Again, this is a great source of longtails.

Let Amazon be your guide. They’re the biggest e-retailer around, and they have some great keyword clout going for them.

Conclusion

Keyword research is a basic skill for any SEO. The actual process of finding those keywords, however, does not require expensive tools, formula-driven methods, or an extremely limited pool of options.

I’ve used each of these methods for myself and my clients with incredible success.


What is your favorite source for finding great keywords? 


6 Things I Wish I Knew Before Using Optimizely

Posted by tallen1985

Diving into Conversion Rate Optimization (CRO) for the first time can be a challenge. You are faced with a whole armoury of new tools, each containing a huge variety of features. Optimizely is one of those tools you will quickly encounter and through this post I’m going to cover 6 features I wish I had known from day one that have helped improve test performance/debugging and the ability to track results accurately.

1. You don’t have to use the editor

The editor within Optimizely is a useful tool if you don’t have much experience working with code. The editor should be used for making simple visual changes, such as changing an image, adjusting copy or making minor layout changes.

If you are looking to make changes that alter the behaviour of the page, rather than just straightforward visual changes, the editor can become troublesome. In that case you should use the “Edit Code” feature at the foot of the editor.

For any large-scale changes to the site, such as completely redesigning the page, Optimizely should be used for traffic allocation and not editing pages. To do this:

1. Build a new version of the page outside of Optimizely

2. Upload the variation page to your site.
Important: Ensure that the variation page is noindexed.

We now have two variations of our page:

www.myhomepage.com & www.myhomepage.com/variation1

3. Select the variation drop down menu and click Redirect to a new page

4. Enter the variation URL, apply the settings and save the experiment. You can now use Optimizely as an A/B test management tool to allocate traffic, exclude traffic/device types, and gather further test data.

If you do use the editor, be aware of excess code

One problem to be aware of here is that each time you move or change an element, Optimizely adds a new line of code – it’s easy to end up with variation code that repositions the same h2 title four times.

Instead, when using the editor, make sure you strip out any excess code: if you move and save a page element multiple times, open the <edit code> tab at the foot of the page and delete anything redundant. A single line can usually position an element exactly where several stacked edits left it, and over the course of multiple changes that excess code can add to Optimizely’s load time.
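To illustrate, here’s a hypothetical before-and-after sketch of the jQuery-style variation code Optimizely generates:

/* Before: four stacked edits, each overriding the last. */
$("h2").css({"position": "absolute", "top": "10px"});
$("h2").css({"top": "25px"});
$("h2").css({"top": "40px"});
$("h2").css({"top": "60px"});

/* After: one line that lands the element in exactly the same place,
   with three fewer lines of code. */
$("h2").css({"position": "absolute", "top": "60px"});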


2. Enabling analytics tracking

Turning on analytics tracking seems obvious, right? In fact, why would we even need to turn it on in the first place, surely it would be defaulted to on?

Optimizely currently sets analytics tracking to off by default. As a result, if you don’t manually change the setting, nothing will be reported into your analytics platform of choice.

To turn on analytics tracking, simply open the settings in the top right corner from within the editor mode and select Analytics Integration.

Turn on the relevant analytics tracking. If you are using Google Analytics, then at this point you should assign a vacant custom variable slot (for Classic Analytics) or a vacant custom dimension (Universal Analytics) to the experiment.

Once the test is live, wait for a while (up to 24 hours), then check to be sure the data is reporting correctly within the custom segments.


3. Test your variations in a live environment

Before you set your test live, it’s important that you test the new variation to ensure everything works as expected. To do this we need to see the test in a live environment, while ensuring no customers see the test versions yet. I’ve suggested a few ways to do this below:

Query parameter targeting

Query parameter targeting is available on all accounts and is our preferred method for sharing live versions with clients, mainly because, once set up, it is as simple as sharing a URL.

1. Click the audiences icon at the top of the page 

2. Select create a new audience

3. Drag Query Parameters from the possible conditions and enter parameters of your choice.

4. Click Apply and save the experiment.

5. To view the experiment, visit the test URL with the query parameter added. In the above example the URL would be: http://www.distilled.net?test=variation

Cookie targeting

1. Open the browser and create a bookmark on any page

2. Edit the bookmark and change both properties to:

a) Name: Set A Test Cookie

b) URL: The following JavaScript code:

javascript:(function(){ var hostname = window.location.hostname; var parts = hostname.split("."); var publicSuffix = hostname; var last = parts[parts.length - 1]; var expireDate = new Date(); expireDate.setDate(expireDate.getDate() + 7); var TOP_LEVEL_DOMAINS = ["com", "local", "net", "org", "xxx", "edu", "es", "gov", "biz", "info", "fr", "gr", "nl", "ca", "de", "kr", "it", "me", "ly", "tv", "mx", "cn", "jp", "il", "in", "iq"]; var SPECIAL_DOMAINS = ["jp", "uk", "au"]; if(parts.length > 2 && SPECIAL_DOMAINS.indexOf(last) != -1){ publicSuffix = parts[parts.length - 3] + "."+ parts[parts.length - 2] + "."+ last} else if(parts.length > 1 && TOP_LEVEL_DOMAINS.indexOf(last) != -1) {publicSuffix = parts[parts.length - 2] + "."+ last} document.cookie = "optly_"+publicSuffix.split(".")[0]+"_test=true; domain=."+publicSuffix+"; path=/; expires="+expireDate.toGMTString()+";"; })();

You should end up with the following:

3. Open the page where you want to place the cookie and click the bookmark

4. The cookie will now be set on the domain you are browsing and will look something like: ‘optly_YOURDOMAINNAME_test=true’

Next we need to target our experiment to only allow visitors who have the cookie set to see test variations.

5. Click the audiences icon at the top of the page

6. Select create a new audience

7. Drag Cookie into the Conditions and change the name to optly_YOURDOMAINNAME_test=true

8. Click Apply and save the experiment.

Source:
https://help.optimizely.com/hc/en-us/articles/200293784-Setting-a-test-cookie-for-your-site

IP address targeting (only available on Enterprise accounts)

Using IP address targeting is useful when you are looking to test variations in house and on a variety of different devices and browsers.

1. Click the audiences icon at the top of the page

2. Select create a new audience

3. Drag IP Address from the possible conditions and enter the IP address being used. (Not sure of your IP address? Head to http://whatismyipaddress.com/.)

4. Click Apply and Save the experiment.


4. Force variations using parameters when debugging pages

There will be times, particularly when testing new variations, when you will need to view a specific variation. Obviously this can be an issue if your browser has already been bucketed into an alternative variation. Optimizely overcomes this by allowing you to force the variation you wish to view, simply using query parameters.

The query parameter is structured in the following way: optimizely_xEXPERIMENTID=VARIATIONINDEX

1. The EXPERIMENTID can be found in the browser URL

2. VARIATIONINDEX is the variation you want to run: 0 is the original, 1 is variation #1, 2 is variation #2, etc.

3. Using the above example to force a variation, we would use the following URL structure to display variation 1 of our experiment: http://www.yourwebsite.com/?optimizely_x1845540742=1

Source:
https://help.optimizely.com/hc/en-us/articles/200107480-Forcing-a-specific-variation-to-run-and-other-advanced-URL-parameters


5. Don’t change the traffic allocation sliders

Once a test is live, it is important not to change the amount of traffic allocated to each variation. Doing so can massively affect test results, as one version would potentially begin to receive more return visitors, who in turn have a much higher chance of converting.

My colleague Tom Capper discussed the do’s and don’ts of statistical significance earlier this year, where he explained:

“At the start of your test, you decide to play it safe and set your traffic allocation to 90/10. After a time, it seems the variation is non-disastrous, and you decide to move the slider to 50/50. But return visitors are still always assigned their original group, so now you have a situation where the original version has a larger proportion of return visitors, who are far more likely to convert.”

To summarize, if you do need to adjust the amount of traffic allocated to each test variation, you should look to restart the test to have complete confidence that the data you receive is accurate.


6. Use segmentation to generate better analysis

Okay, I understand this one isn’t strictly about Optimizely, but it is certainly worth keeping in mind, particularly early on in the CRO process when producing hypotheses around device type.

Conversion rates can vary greatly, particularly when we start segmenting data by location, browser, medium, or return vs. new visits, just to name a few. However, by using segmentation we can unearth opportunities that we may have previously overlooked, allowing us to generate new hypotheses for future experiments.


Example

You have been running a test for a month and unfortunately the results are inconclusive. The test version of the page didn’t perform any better or worse than the original. Overall the test results look like the following:


Page Version    Visitors    Transactions    Conversion Rate
Original        41,781      1,196           2.86%
Variation       42,355      1,225           2.89%

In this case the test variation overall performed only 1% better than the original, with a significance of 60%. With these results, this test variation certainly wouldn’t be getting rolled out any time soon.

However, when these results are segmented by device, they tell a very different story:

Drilling into the desktop results, we actually find that the test variation saw a 10% increase in conversions over the original, with 97% significance. Yet those using a tablet converted way below the original, dragging down the overall conversion rate we saw in the first table.

Ultimately, with this data, we would be able to generate a new hypothesis: “we believe the variation will increase conversion rate for users on a desktop.” We would then re-run the test for desktop-only users to verify the previous data and the new hypothesis.
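If you want to sanity-check significance figures like the 60% and 97% quoted above, the underlying arithmetic is a two-proportion z-test. Here’s a rough sketch – plugging in the overall numbers from the table above reproduces the ~60% figure:

// One-tailed two-proportion z-test: how confident can we be that
// variation B converts better than original A?
function zTest(convA, nA, convB, nB) {
  const pA = convA / nA, pB = convB / nB;
  const p = (convA + convB) / (nA + nB); // pooled conversion rate
  const se = Math.sqrt(p * (1 - p) * (1 / nA + 1 / nB));
  const z = (pB - pA) / se;
  return { z, confidence: 0.5 * (1 + erf(z / Math.SQRT2)) };
}

// Numerical approximation of the error function (Abramowitz & Stegun 7.1.26).
function erf(x) {
  const sign = x < 0 ? -1 : 1;
  x = Math.abs(x);
  const t = 1 / (1 + 0.3275911 * x);
  const y = 1 - ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
    - 0.284496736) * t + 0.254829592) * t * Math.exp(-x * x);
  return sign * y;
}

console.log(zTest(1196, 41781, 1225, 42355)); // ~0.60 confidence, as above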

Using segmented data here could also potentially help the experiment reach significance at a much faster rate, as explained in this video from Opticon 2014.

Should the new test be successful and achieve significance, we would serve desktop users the new variation, whilst those on mobile and tablets continue to see the original site.

Key takeaways

  • Always turn on Google Analytics tracking (and then double check it is turned on).
  • If you plan to make behavioural changes to a page, use the “Edit Code” feature rather than the drag-and-drop editor.
  • Use IP address targeting for device testing and query parameters to share a live test with clients.
  • If you need to change the traffic allocation to test variations you should restart the test.
  • Be aware that test performance can vary greatly based on device.

What problems and solutions have you come across when creating CRO experiments with Optimizely? What pieces of information do you wish you had known 6 months ago?
