Measure Your Mobile Rankings and Search Visibility in Moz Analytics

Posted by jon.white

We have launched a couple of new things in Moz Pro that we are excited to share with you all: Mobile Rankings and a Search Visibility score. If you want, you can jump right in by heading to a campaign and adding a mobile engine, or keep reading for more details!

Track your mobile vs. desktop rankings in Moz Analytics

Mobilegeddon came and went with slightly less fanfare than expected, thanks in part to the wave of ‘Mobile Friendly’ updates we all made at super short notice (nice work, everyone!). Nevertheless, mobile rankings visibility is now firmly on everyone’s radar, and it will only become more important over time.

Now you can track your campaigns’ mobile rankings for all of the same keywords and locations you are tracking on desktop.

For this campaign my mobile visibility is almost 20% lower than my desktop visibility and falling; I can drill down to find out why

Clicking on this will take you into a new Engines tab within your Keyword Rankings page where you can find a more detailed version of this chart as well as a tabular view by keyword for both desktop and mobile. Here you can also filter by label and location.

Here I can see Search Visibility across engines including mobile; in this case, for my branded keywords.

We have given an extra engine to all campaigns

We’ve given customers an extra engine for each campaign, increasing the number from 3 to 4. Use the extra slot to add the mobile engine and unlock your mobile data!

We will begin tracking mobile rankings within 24 hours of the mobile engine being added to a campaign. Once you are set up, you will notice a new chart on your dashboard comparing your Desktop vs. Mobile Search Visibility.

Measure your Search Visibility score vs. competitors

The overall Search Visibility for my campaign

Along with this change we have also added a Search Visibility score to your rankings data. Use your visibility score to track and report on your overall campaign ranking performance, compare to your competitors, and look for any large shifts that might indicate penalties or algorithm changes. For a deeper drill-down into your data you can also segment your visibility score by keyword labels or locations. Visit the rankings summary page on any campaign to get started.

How is Search Visibility calculated?

Good question!

The Search Visibility score is the percentage of clicks we estimate you receive based on your ranking positions, across all of your keywords.

We take each ranking position for each keyword, multiply by an estimated click-through rate, and then take the average across all of your keywords. You can think of it as the percentage of your SERPs that you own. The score is expressed as a percentage, though scores of 100% would be almost impossible unless you are tracking keywords using the “site:” modifier. It is probably more useful to measure yourself vs. your competitors rather than focus on the actual score, but, as a rule of thumb, mid-40s is probably the realistic maximum for non-branded keywords.
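If you like to see the math as code, here’s a rough sketch of that calculation in Python. The click-through-rate curve below is an invented placeholder (not Moz’s actual model), and it assumes a single tracked position per keyword.

    # Minimal sketch of the Search Visibility arithmetic described above.
    # The CTR curve is an invented placeholder, not Moz's actual model.
    ESTIMATED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                     6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.015}

    def search_visibility(rankings):
        """rankings: dict of keyword -> best ranking position (None if not ranking)."""
        scores = [ESTIMATED_CTR.get(position, 0.0) for position in rankings.values()]
        # Average the estimated CTRs across all tracked keywords, expressed as a %.
        return 100 * sum(scores) / len(scores)

    print(search_visibility({
        "split test calculator": 1,
        "usability checklist": 3,
        "seo tools": None,          # not ranking in the tracked positions
    }))  # -> ~13.3

In the terms of the metaphor below, owning more than one plot for the same keyword would simply mean summing the estimated CTRs for that keyword before averaging.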

Jeremy, our Moz Analytics TPM, came up with this metaphor:

Think of the SERPs for your keywords as villages. Each position on the SERP is a plot of land in SERP-village. The Search Visibility score is the average share of plots you own in each SERP-village. Prime real estate plots (i.e., better ranking positions, like #1) are worth more. A complete monopoly of real estate in SERP-village would equate to a score of 100%. The Search Visibility score equates to how much total land you own in all SERP-villages.

Some neat ways to use this feature

  • Label and group your keywords, particularly when you add them – Because the visibility score is an average across all of your keywords, when you add or remove keywords from your campaign, you will likely see fluctuations in the score that are unrelated to performance. Solve this by getting in the habit of labeling keywords when you add them. Then segment your data by these labels to track the performance of specific keyword groups over time.
  • See how location affects your mobile rankings – Using the Engines tab in Keyword Rankings, use the filters to select just local keywords. Look for big differences between Mobile and Desktop where Google might be assuming local intent for mobile searches but not for desktop. Check out how your competitors perform for these keywords. Can you use this data?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Your Daily SEO Fix: Week 5

Posted by Trevor-Klein

We’ve arrived, folks! This is the last installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. If you haven’t been following along, these are each designed to solve a use case that we regularly hear about from Moz community members.

Here’s a quick recap of the previous round-ups in case you missed them:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.
  • Week 3: Compare link metrics in Open Site Explorer, find tweet topics with Followerwonk, create custom reports in Moz Analytics, use Spam Score to identify high-risk links, and get link building opportunities delivered to your inbox.
  • Week 4: Use Fresh Web Explorer to build links, analyze rank progress for a given keyword, use the MozBar to analyze your competitors’ site markup, use the Top Pages report to find content ideas, and find on-site errors with Crawl Test.

We’ve got five new fixes for you in this edition:

  • How to Use the Full SERP Report
  • How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer
  • How to Build Your Link Profile with Link Intersect
  • How to Find Local Citations Using the MozBar
  • Bloopers: How to Screw Up While Filming a Daily SEO Fix

Hope you enjoy them!


Fix 1: How to Use the Full SERP Report

Moz’s Full SERP Report shows the top ten ranking URLs for a specific keyword and presents the potential ranking signals in an easy-to-view format. In this Daily SEO Fix, Meredith breaks down the report so you can see all of the sections and how each is used.



Fix 2: How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer

The Just-Discovered Links report in Open Site Explorer helps you discover recently created links within an hour of them being published. In this fix, Nick shows you how to use the report to view who is linking to you, how they’re doing it, and what they are saying, so you can capitalize on link opportunities while they’re still fresh and join the conversation about your brand.


Fix 3: How to Build Your Link Profile with Link Intersect

The quantity and (more importantly) quality of backlinks to your website make up your link profile, one of the most important elements in SEO and an incredibly important factor in search engine rankings. In this Daily SEO Fix, Tori shows you how to use Moz’s Link Intersect tool to analyze your competition’s backlinks. Plus, learn how to find opportunities to build links and strengthen your own link profile.


Fix 4: How to Find Local Citations Using the MozBar

Citations are mentions of your business name and address on webpages other than your own, such as an online yellow pages directory or a local business association page. They are a key component in search engine ranking algorithms, so building consistent and accurate citations for your local business(es) is a key Local SEO tactic. In today’s Daily SEO Fix, Tori shows you how to use the MozBar to find local citations around the web.


Bloopers: How to Screw Up While Filming a Daily SEO Fix

We had a lot of fun filming this series, and there were plenty of laughs along the way. Like these ones. =)


Looking for more?

We’ve got more videos in the previous four weeks’ round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2

Your Daily SEO Fix: Week 3

Your Daily SEO Fix: Week 4


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Your Daily SEO Fix: Week 4

Posted by Trevor-Klein

This week, we’ve got the fourth (and second-to-last) installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. They’re each designed to solve a use case that we regularly hear about from Moz community members.

Here’s a quick recap of the previous round-ups in case you missed them:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.
  • Week 3: Compare link metrics in Open Site Explorer, find tweet topics with Followerwonk, create custom reports in Moz Analytics, use Spam Score to identify high-risk links, and get link building opportunities delivered to your inbox.

In this installment, we’ve got five brand new tutorials:

  • How to Use Fresh Web Explorer to Build Links
  • How to Analyze Rank Progress for a Given Keyword
  • How to Use the MozBar to Analyze Your Competitors’ Site Markup
  • How to Use the Top Pages Report to Find Content Ideas
  • How to Find On-Site Errors with Crawl Test

Hope you enjoy them!

Fix 1: How to Use Fresh Web Explorer to Build Links

If you have unique data or a particularly excellent resource on your site, that content can be a great link magnet. In this Daily SEO Fix, Felicia shows you how to set up alerts in Fresh Web Explorer to track mentions of relevant keyword phrases, find link opportunities, and build links to your content.



Fix 2: How to Analyze Rank Progress for a Given Keyword

Moz’s Rank Tracker tool retrieves search engine rankings for pages and keywords, storing them for easy comparison later. In this fix, James shows you how to use this helpful tool to track keywords, save time, and improve your rankings.


Fix 3: How to Use the MozBar to Analyze Your Competitors’ Site Markup

Schema markup helps search engines better identify what your (and your competitors’) website pages are all about, and as a result it can lead to a boost in rankings. In this Daily SEO Fix, Jordan shows you how to use the MozBar to analyze the schema markup of the competition and optimize your own site and pages for rich snippets.


Fix 4: How to Use the Top Pages Report to Find Content Ideas

With Moz’s Top Pages report in Open Site Explorer, you can see the pages on your site (and your competitors’ sites!) that are top performers. In this fix, Nick shows you how to use the report to analyze your competitors’ content marketing efforts and to inform your own.


Fix 5: How to Find On-Site Errors with Crawl Test

Identifying and understanding any potential errors on your site is crucial for any SEO. In this Daily SEO Fix, Sean shows you how to use the Crawl Test tool in Moz Analytics to pull reports and identify errors on your site.


Looking for more?

We’ve got more videos in the previous three weeks’ round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2

Your Daily SEO Fix: Week 3


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Your Daily SEO Fix: Week 3

Posted by Trevor-Klein

Welcome to the third installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. Each tutorial is designed to solve a use case that we regularly hear about from Moz community members—a need or problem for which you all could use a solution.

If you missed the previous roundups, you can find ’em here:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.

Today, we’ve got a brand-new roundup of the most recent videos:

  • How to Compare Link Metrics in Open Site Explorer
  • How to Find Tweet Topics with Followerwonk
  • How to Create Custom Reports in Moz Analytics
  • How to Use Spam Score to Identify High-Risk Links
  • How to Get Link Building Opportunities Delivered to Your Inbox

Hope you enjoy them!

Fix 1: How to Compare Link Metrics in Open Site Explorer

Not all links are created equal. In this Daily SEO Fix, Chiaryn shows you how to use Open Site Explorer to analyze and compare link metrics for up to five URLs to see which are strongest.



Fix 2: How to Find Tweet Topics with Followerwonk

Understanding what works best for your competitors on Twitter is a great place to start when forming your own Twitter strategy. In this fix, Ellie explains how to identify strong-performing tweets from your competitors and how to use those tweets to shape your own voice and plan.


Fix 3: How to Create Custom Reports in Moz Analytics

In this Daily SEO Fix, Kevin shows you how to create a custom report in Moz Analytics and schedule it to be delivered to your inbox on a daily, weekly, or monthly basis.


Fix 4: How to Use Spam Score to Identify High-Risk Links

Almost every site has a few bad links pointing to it, but lots of highly risky links can have a negative impact on your search engine rankings. In this fix, Tori shows you how to use Moz’s Spam Score metric to identify spammy links.


Fix 5: How to Get Link Building Opportunities Delivered to Your Inbox

Building high-quality links is one of the most important aspects of SEO. In this Daily SEO Fix, Erin shows you how to use Moz Analytics to set up a weekly custom report that will notify you of pages on the web that mention your site but do not include a link, so you can use this info to build more links.


Looking for more?

We’ve got more videos in the previous two weeks’ round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Your Daily SEO Fix: Week 2

Posted by Trevor-Klein

Last week, we began posting short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. Each tutorial is designed to solve a use case that we regularly hear about from Moz community members—a need or problem for which you all could use a solution.

Today, we’ve got a brand-new roundup of the most recent videos:

  • How to Examine and Analyze SERPs Using New MozBar Features
  • How to Boost Your Rankings through On-Page Optimization
  • How to Check Your Anchor Text Using Open Site Explorer
  • How to Do Keyword Research with OSE and the Keyword Difficulty Tool
  • How to Discover Keyword Opportunities in Moz Analytics

Let’s get right down to business!

Fix 1: How to Examine and Analyze SERPs Using New MozBar Features

The MozBar is a handy tool that helps you access important SEO metrics while you surf the web. In this Daily SEO Fix, Abe shows you how to use this toolbar to examine and analyze SERPs and access keyword difficulty scores for a given page—in a single click.



Fix 2: How to Boost Your Rankings through On-Page Optimization

There are several on-page factors that influence your search engine rankings. In this Daily SEO Fix, Holly shows you how to use Moz’s On-Page Optimization tool to identify pages on your website that could use some love and what you can do to improve them.



Fix 3: How to Check Your Anchor Text Using Open Site Explorer

Dive into OSE with Tori in this Daily SEO Fix to check out the anchor text opportunities for Moz.com. By highlighting all your anchor text you can discover other potential keyword ranking opportunities you might not have thought of before.



Fix 4: How to Do Keyword Research with OSE and the Keyword Difficulty Tool

Studying your competitors can help you identify keyword opportunities for your own site. In this Daily SEO Fix, Jacki walks through how to use OSE to research the anchor text for competitors’ websites and how to use the Keyword Difficulty Tool to identify potential expansion opportunities for your site.



Fix 5: How to Discover Keyword Opportunities in Moz Analytics

Digging into the organic traffic coming to your site is an easy way to surface potential keyword opportunities. In this Daily SEO Fix, Chiaryn walks through the keyword opportunity tab in Moz Analytics and highlights a quick tip for leveraging that tool.



Looking for more?

We’ve got more videos in last week’s round-up! Check it out here.


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moving 5 Domains to 1: An SEO Case Study

Posted by Dr-Pete

People often ask me if they should change domain names, and I always shudder just a little. Changing domains is a huge, risky undertaking, and too many people rush into it seeing only the imaginary upside. The success of the change also depends wildly on the details, and it’s not the kind of question anyone should be asking casually on social media.

Recently, I decided that it was time to find a new permanent home for my personal and professional blogs, which had gradually spread out over 5 domains. I also felt my main domain was no longer relevant to my current situation, and it was time for a change. So, ultimately I ended up with a scenario that looked like this:

The top three sites were active, with UserEffect.com being my former consulting site and blog (and relatively well-trafficked). The bottom two sites were both inactive and were both essentially gag sites. My one-pager, AreYouARealDoctor.com, did previously rank well for “are you a real doctor”, so I wanted to try to recapture that.

I started migrating the 5 sites in mid-January, and I’ve been tracking the results. I thought it would be useful to see how this kind of change plays out, in all of the gory details. As it turns out, nothing is ever quite “textbook” when it comes to technical SEO.

Why Change Domains at All?

The rationale for picking a new domain could fill a month’s worth of posts, but I want to make one critical point – changing domains should be about your business goals first, and SEO second. I did not change domains to try to rank better for “Dr. Pete” – that’s a crap shoot at best. I changed domains because my old consulting brand (“User Effect”) no longer represented the kind of work I do and I’m much more known by my personal brand.

That business case was strong enough that I was willing to accept some losses. We went through a similar transition here from SEOmoz.org to Moz.com. That was a difficult transition that cost us some SEO ground, especially short-term, but our core rationale was grounded in the business and where it’s headed. Don’t let an SEO pipe dream lead you into a risky decision.

Why did I pick a .co domain? I did it for the usual reason – the .com was taken. For a project of this type, where revenue wasn’t on the line, I didn’t have any particular concerns about .co. The evidence on how top-level domains (TLDs) impact ranking is tough to tease apart (so many other factors correlate with .com’s), and Google’s attitude tends to change over time, especially if new TLDs are abused. Anecdotally, though, I’ve seen plenty of .co’s rank, and I wasn’t concerned.

Step 1 – The Boring Stuff

It is absolutely shocking how many people build a new site, slap up some 301s, pull the switch, and hope for the best. It’s less shocking how many of those people end up in Q&A a week later, desperate and bleeding money.


Planning is hard work, and it’s boring – get over it.

You need to be intimately familiar with every page on your existing site(s), and, ideally, you should make a list. Not only do you have to plan for what will happen to each of these pages, but you’ll need that list to make sure everything works smoothly later.

In my case, I decided it might be time to do some housekeeping – the User Effect blog had hundreds of posts, many outdated and quite a few just not very good. So, I started with the easy data – recent traffic. I’m sure you’ve seen this Google Analytics report (Behavior > Site Content > All Pages):

Since I wanted to focus on recent activity, and none of the sites had much new content, I restricted myself to a 3-month window (Q4 of 2014). Of course, I looked much deeper than the top 10, but the principle was simple – I wanted to make sure the data matched my intuition and that I wasn’t cutting off anything important. This helped me prioritize the list.

Of course, from an SEO standpoint, I also didn’t want to lose content that had limited traffic but solid inbound links. So, I checked my “Top Pages” report in Open Site Explorer:

Since the bulk of my main site was a blog, the top trafficked and top linked-to pages fortunately correlated pretty well. Again, this is only a way to prioritize. If you’re dealing with sites with thousands of pages, you need to work methodically through the site architecture.

I’m going to say something that makes some SEOs itchy – it’s ok not to move some pages to the new site. It’s even ok to let some pages 404. In Q4, UserEffect.com had traffic to 237 URLs. The top 10 pages accounted for 91.9% of that traffic. I strongly believe that moving domains is a good time to refocus a site and concentrate your visitors and link equity on your best content. More is not better in 2015.

Letting go of some pages also means that you’re not 301-redirecting a massive number of old URLs to a new home-page. This can look like a low-quality attempt to consolidate link-equity, and at large scale it can raise red flags with Google. Content worth keeping should exist on the new site, and your 301s should have well-matched targets.

In one case, I had a blog post that had a decent trickle of traffic due to ranking for “50,000 push-ups,” but the post itself was weak and the bounce rate was very high:

The post was basically just a placeholder announcing that I’d be attempting this challenge, but I never recapped anything after finishing it. So, in this case, I rewrote the post.

Of course, this process was repeated across the 3 active sites. The 2 inactive sites only constituted a handful of total pages. In the case of AreYouARealDoctor.com, I decided to turn the previous one-pager into a new page on the new site. That way, I had a very well-matched target for the 301-redirect, instead of simply mapping the old site to my new home-page.

I’m trying to prove a point – this is the amount of work I did for a handful of sites that were mostly inactive and producing no current business value. I don’t need consulting gigs and these sites produce no direct revenue, and yet I still considered this process worth the effort.

Step 2 – The Big Day

Eventually, you’re going to have to make the move, and in most cases, I prefer ripping off the bandage. Of course, doing something all at once doesn’t mean you shouldn’t be careful.

The biggest problem I see with domain switches (even if they’re 1-to-1) is that people rely on data that can take weeks to evaluate, like rankings and traffic, or directly checking Google’s index. By then, a lot of damage is already done. Here are some ways to find out quickly if you’ve got problems…

(1) Manually Check Pages

Remember that list you were supposed to make? It’s time to check it, or at least spot-check it. Someone needs to physically go to a browser and make sure that each major section of the site and each important individual page is resolving properly. It doesn’t matter how confident your IT department/guy/gal is – things go wrong.

(2) Manually Check Headers

Just because a page resolves, it doesn’t mean that your 301-redirects are working properly, or that you’re not firing some kind of 17-step redirect chain. Check your headers. There are tons of free tools, but lately I’m fond of URI Valet. Guess what – I screwed up my primary 301-redirects. One of my registrar transfers wasn’t working, so I had to have a setting changed by customer service, and I inadvertently ended up with 302s (Pro tip: Don’t change registrars and domains in one step):
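If you’d rather script this spot-check than paste URLs into a tool one at a time, a minimal sketch with Python’s requests library looks something like the following. The URLs are placeholders; it simply flags 302s and multi-hop chains.

    # Minimal sketch: spot-check the redirect chain for a list of old URLs.
    # Requires the third-party "requests" library; the URLs are placeholders.
    import requests

    OLD_URLS = [
        "http://www.example.com/blog",
        "http://www.example.com/split-test-calculator",
    ]

    for url in OLD_URLS:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        print(url)
        for hop in resp.history:  # each redirect response in the chain, in order
            print(f"  {hop.status_code} at {hop.url} -> {hop.headers.get('Location')}")
        print(f"  final: {resp.status_code} {resp.url}")
        if any(hop.status_code == 302 for hop in resp.history):
            print("  WARNING: 302 (temporary) in chain -- you probably want a 301")
        if len(resp.history) > 1:
            print("  WARNING: multi-hop redirect chain")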

Don’t think that because you’re an “expert”, your plan is foolproof. Mistakes happen, and because I caught this one I was able to correct it fairly quickly.

(3) Submit Your New Site

You don’t need to submit your site to Google in 2015, but now that Google Webmaster Tools allows it, why not do it? The primary argument I hear is “well, it’s not necessary.” True, but direct submission has one advantage – it’s fast.

To be precise, Google Webmaster Tools separates the process into “Fetch” and “Submit to index” (you’ll find this under “Crawl” > “Fetch as Google”). Fetching will quickly tell you if Google can resolve a URL and retrieve the page contents, which alone is pretty useful. Once a page is fetched, you can submit it, and you should see something like this:

This isn’t really about getting indexed – it’s about getting nearly instantaneous feedback. If Google has any major problems with crawling your site, you’ll know quickly, at least at the macro level.

(4) Submit New XML Sitemaps

Finally, submit a new set of XML sitemaps in Google Webmaster Tools, and preferably tiered sitemaps. While it’s a few years old now, Rob Ousbey has a great post on the subject of XML sitemap structure. The basic idea is that, if you divide your sitemap into logical sections, it’s going to be much easier to diagnose what kinds of pages Google is indexing and where you’re running into trouble.
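To illustrate the tiered idea, here’s a minimal sketch that writes one sitemap per section of a site plus an index file that references them. The domain and section names are invented.

    # Minimal sketch: per-section sitemaps plus an index file that references them.
    # Domain and section names are invented for illustration.
    from xml.sax.saxutils import escape

    SECTIONS = {
        "blog":  ["https://example.com/blog/post-1", "https://example.com/blog/post-2"],
        "tools": ["https://example.com/tools/split-test-calculator"],
    }

    def write_sitemap(path, urls):
        entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
        with open(path, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                    f"{entries}\n</urlset>\n")

    for section, urls in SECTIONS.items():
        write_sitemap(f"sitemap-{section}.xml", urls)

    # The index lists each section sitemap, so indexation problems can be traced
    # to a specific section of the site.
    index = "\n".join(f"  <sitemap><loc>https://example.com/sitemap-{s}.xml</loc></sitemap>"
                      for s in SECTIONS)
    with open("sitemap-index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{index}\n</sitemapindex>\n")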

A couple of pro tips on sitemaps – first, keep your old sitemaps active temporarily. This is counterintuitive to some people, but unless Google can crawl your old URLs, they won’t see and process the 301-redirects and other signals. Let the old accounts stay open for a couple of months, and don’t cut off access to the domains you’re moving.

Second (I learned this one the hard way), make sure that your Google Webmaster Tools site verification still works. If you use file uploads or meta tags and don’t move those files/tags to the new site, GWT verification will fail and you won’t have access to your old accounts. I’d recommend using a more domain-independent solution, like verifying with Google Analytics. If you lose verification, don’t panic – your data won’t be instantly lost.

Step 3 – The Waiting Game

Once you’ve made the switch, the waiting begins, and this is where many people start to panic. Even executed perfectly, it can take Google weeks or even months to process all of your 301-redirects and reevaluate a new domain’s capacity to rank. You have to expect short term fluctuations in ranking and traffic.

During this period, you’ll want to watch a few things – your traffic, your rankings, your indexed pages (via GWT and the site: operator), and your errors (such as unexpected 404s). Traffic will recover the fastest, since direct traffic is immediately carried through redirects, but ranking and indexation will lag, and errors may take time to appear.

(1) Monitor Traffic

I’m hoping you know how to check your traffic, but actually trying to determine what your new levels should be and comparing any two days can be easier said than done. If you launch on a Friday, and then Saturday your traffic goes down on the new site, that’s hardly cause for panic – your traffic probably always goes down on Saturday.

In this case, I redirected the individual sites over about a week, but I’m going to focus on UserEffect.com, as that was the major traffic generator. That site was redirected, in full on January 21st, and the Google Analytics data for January for the old site looked like this:

So far, so good – traffic bottomed out almost immediately. Of course, losing traffic is easy – the real question is what’s going on with the new domain. Here’s the graph for January for DrPete.co:

This one’s a bit trickier – the first spike, on January 16th, is when I redirected the first domain. The second spike, on January 22nd, is when I redirected UserEffect.com. Both spikes are meaningless – I announced these re-launches on social media and got a short-term traffic burst. What we really want to know is where traffic is leveling out.

Of course, there isn’t a lot of history here, but a typical day for UserEffect.com in January was about 1,000 pageviews. The traffic to DrPete.co after it leveled out was about half that (500 pageviews). It’s not a complete crisis, but we’re definitely looking at a short-term loss.

Obviously, I’m simplifying the process here – for a large, ecommerce site you’d want to track a wide range of metrics, including conversion metrics. Hopefully, though, this illustrates the core approach. So, what am I missing out on? In this day of [not provided], tracking down a loss can be tricky. Let’s look for clues in our other three areas…

(2) Monitor Indexation

You can get a broad sense of your indexed pages from Google Webmaster Tools, but this data often lags real-time and isn’t very granular. Despite its shortcomings, I still prefer the site: operator. Generally, I monitor a domain daily – any one measurement has a lot of noise, but what you’re looking for is the trend over time. Here’s the indexed page count for DrPete.co:

The first set of pages was indexed fairly quickly, and then the second set started being indexed soon after UserEffect.com was redirected. All in all, we’re seeing a fairly steady upward trend, and that’s what we’re hoping to see. The number is also in the ballpark of sanity (compared to the actual page count) and roughly matched GWT data once it started being reported.

So, what happened to UserEffect.com’s index after the switch?

The timeframe here is shorter, since UserEffect.com was redirected last, but we see a gradual decline in indexation, as expected. Note that the index size plateaus around 60 pages – about 1/4 of the original size. This isn’t abnormal – low-traffic and unlinked pages (or those with deep links) are going to take a while to clear out. This is a long-term process. Don’t panic over the absolute numbers – what you want here is a downward trend on the old domain accompanied by a roughly equal upward trend on the new domain.

The fact that UserEffect.com didn’t bottom out is definitely worth monitoring, but this timespan is too short for the plateau to be a major concern. The next step would be to dig into these specific pages and look for a pattern.

(3) Monitor Rankings

The old domain is dropping out of the index, and the new domain is taking its place, but we still don’t know why the new site is taking a traffic hit. It’s time to dig into our core keyword rankings.

Historically, UserEffect.com had ranked well for keywords related to “split test calculator” (near #1) and “usability checklist” (in the top 3). While [not provided] makes keyword-level traffic analysis tricky, we also know that the split-test calculator is one of the top trafficked pages on the site, so let’s dig into that one. Here’s the ranking data from Moz Analytics for “split test calculator”:

The new site took over the #1 position from the old site at first, but then quickly dropped down to the #3/#4 ranking. That may not sound like a lot, but given this general keyword category was one of the site’s top traffic drivers, the CTR drop from #1 to #3/#4 could definitely be causing problems.

When you have a specific keyword you can diagnose, it’s worth taking a look at the live SERP, just to get some context. The day after relaunch, I captured this result for “dr. pete”:

Here, the new domain is ranking, but it’s showing the old title tag. This may not be cause for alarm – weird things often happen in the very short term – but in this case we know that I accidentally set up a 302-redirect. There’s some reason to believe that Google didn’t pass full link equity during that period when 301s weren’t implemented.

Let’s look at a domain where the 301s behaved properly. Before the site was inactive, AreYouARealDoctor.com ranked #1 for “are you a real doctor”. Since there was an inactive period, and I dropped the exact-match domain, it wouldn’t be surprising to see a corresponding ranking drop.

In reality, the new site was ranking #1 for “are you a real doctor” within 2 weeks of 301-redirecting the old domain. The graph is just a horizontal line at #1, so I’m not going to bother you with it, but here’s a current screenshot (incognito):

Early on, I also spot-checked this result, and it wasn’t showing the strange title tag crossover that UserEffect.com pages exhibited. So, it’s very likely that the 302-redirects caused some problems.

Of course, these are just a couple of keywords, but I hope it provides a starting point for you to understand how to methodically approach this problem. There’s no use crying over spilled milk, and I’m not going to fire myself, so let’s move on to checking any other errors that I might have missed.

(4) Check Errors (404s, etc.)

A good first stop for unexpected errors is the “Crawl Errors” report in Google Webmaster Tools (Crawl > Crawl Errors). This is going to take some digging, especially if you’ve deliberately 404’ed some content. Over the couple of weeks after re-launch, I spotted the following problems:

The old site had a “/blog” directory, but the new site put the blog right on the home-page and had no corresponding directory. Doh. Hey, do as I say, not as I do, ok? Obviously, this was a big blunder, as the old blog home-page was well-trafficked.

The other two errors here are smaller but easy to correct. MinimalTalent.com had a “/free” directory that housed downloads (mostly PDFs). I missed it, since my other sites used a different format. Luckily, this was easy to remap.

The last error is a weird looking URL, and there are other similar URLs in the 404 list. This is where site knowledge is critical. I custom-designed a URL shortener for UserEffect.com and, in some cases, people linked to those URLs. Since those URLs didn’t exist in the site architecture, I missed them. This is where digging deep into historical traffic reports and your top-linked pages is critical. In this case, the fix isn’t easy, and I have to decide whether the loss is worth the time.

What About the New EMD?

My goal here wasn’t to rank better for “Dr. Pete,” and finally unseat Dr. Pete’s Marinades, Dr. Pete the Sodastream flavor (yes, it’s hilarious – you can stop sending me your grocery store photos), and 172 dentists. Ok, it mostly wasn’t my goal. Of course, you might be wondering how switching to an EMD worked out.

In the short term, I’m afraid the answer is “not very well.” I didn’t track ranking for “Dr. Pete” and related phrases very often before the switch, but it appears that ranking actually fell in the short-term. Current estimates have me sitting around page 4, even though my combined link profile suggests a much stronger position. Here’s a look at the ranking history for “dr pete” since relaunch (from Moz Analytics):

There was an initial drop, after which the site evened out a bit. This less-than-impressive plateau could be due to the bad 302s during transition. It could be Google evaluating a new EMD and multiple redirects to that EMD. It could be that the prevalence of natural anchor text with “Dr. Pete” pointing to my site suddenly looked unnatural when my domain name switched to DrPete.co. It could just be that this is going to take time to shake out.

If there’s a lesson here (and, admittedly, it’s too soon to tell), it’s that you shouldn’t rush to buy an EMD in 2015 in the wild hope of instantly ranking for that target phrase. There are so many factors involved in ranking for even a moderately competitive term, and your domain is just one small part of the mix.

So, What Did We Learn?

I hope you learned that I should’ve taken my own advice and planned a bit more carefully. I admit that this was a side project and it didn’t get the attention it deserved. The problem is that, even when real money is at stake, people rush these things and hope for the best. There’s a real cheerleading mentality when it comes to change – people want to take action and only see the upside.

Ultimately, in a corporate or agency environment, you can’t be the one sour note among the cheering. You’ll be ignored, and possibly even fired. That’s not fair, but it’s reality. What you need to do is make sure the work gets done right and people go into the process with eyes wide open. There’s no room for shortcuts when you’re moving to a new domain.

That said, a domain change isn’t a death sentence, either. Done right, and with sensible goals in mind – balancing not just SEO but broader marketing and business objectives – a domain migration can be successful, even across multiple sites.

To sum up: Plan, plan, plan, monitor, monitor, monitor, and try not to panic.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


How to Defeat Duplicate Content – Next Level

Posted by EllieWilkinson

Welcome to the third installment of Next Level! In the previous Next Level blog post, we shared a workflow showing you how to take on your competitors using Moz tools. We’re continuing the educational series with several new videos all about resolving duplicate content. Read on and level up!


Dealing with duplicate content can feel a bit like doing battle with your site’s evil doppelgänger—confusing and tricky to defeat! But identifying and resolving duplicates is a necessary part of helping search engines decide on relevant results. In this short video, learn about how duplicate content happens, why it’s important to fix, and a bit about how you can uncover it.

Video: Next Level – Identifying Duplicates (Part 1)

[Quick clarification: Search engines don’t actively penalize duplicate content, per se; they just don’t always understand it as well, which can lead to a drop in rankings. More info here.]
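If you want a quick, do-it-yourself sanity check before reaching for the tools, here’s a simplistic sketch that flags exact duplicates by hashing normalized page text. Real duplicate detection (including Moz’s) is considerably more sophisticated, and the URLs and text here are invented.

    # Simplistic sketch: group URLs whose normalized body text hashes identically.
    # This only catches exact duplicates; real detection is far more sophisticated.
    import hashlib
    from collections import defaultdict

    def fingerprint(text):
        normalized = " ".join(text.lower().split())  # crude: lowercase, collapse whitespace
        return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

    pages = {  # url -> extracted page text (placeholders)
        "https://example.com/shoes":         "Red shoes on sale",
        "https://example.com/shoes?ref=nav": "Red  shoes on sale",
        "https://example.com/hats":          "Hats for winter",
    }

    groups = defaultdict(list)
    for url, text in pages.items():
        groups[fingerprint(text)].append(url)

    for urls in groups.values():
        if len(urls) > 1:
            print("Possible duplicates:", urls)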

Now that you have a better idea of how to identify those dastardly duplicates, let’s get rid of ’em once and for all. Watch this next video to review how to use Moz Analytics to find and fix duplicate content using three common solutions. (You’ll need a Moz Pro subscription to use Moz Analytics. If you aren’t yet a Moz Pro subscriber, you can always try out the tools with a 30-day free trial.)

Workflow summary

Here’s a review of the three common solutions to conquering duplicate content:

  1. 301 redirect. Using Open Site Explorer, check which page has the higher Page Authority (PA), then set up a 301 redirect from the duplicate page to the original page. This will ensure that they no longer compete with one another in the search results. Wondering what a 301 redirect is and how to do it? Read more about redirection here.
  2. Rel=canonical. A rel=canonical tag passes the same amount of ranking power as a 301 redirect, and there’s a bonus: it often takes less development time to implement! Add this tag to the HTML head of a web page to tell search engines that it should be treated as a copy of the “canon,” or original, page:
    <head> <link rel="canonical" href="http://moz.com/blog/" /> </head>

    If you’re curious, you can read more about canonicalization here.

  3. noindex, follow. Add the values “noindex, follow” to the meta robots tag to tell search engines not to include the duplicate pages in their indexes, but to crawl their links. This works really well with paginated content or if you have a system set up to tag or categorize content (as with a blog). Here’s what it should look like:
    <head> <meta name="robots" content="noindex, follow" /> </head>

    If you’re looking to block the Moz crawler, Rogerbot, you can use the robots.txt file if you prefer—he’s a good robot, and he’ll obey! More about meta robots (and robots.txt) here.

Can’t get enough of duplicate content? Want to become a duplicate content connoisseur? This last video explains more about how Moz finds duplicates, if you’re curious. And you can read even more over at the Moz Developer Blog.

We’d love to hear about your techniques for defeating duplicates! Chime in below in the comments.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Location is Everything: Local Rankings in Moz Analytics

Posted by MatthewBrown

Today we are thrilled to launch local rankings as a feature in Moz Analytics, which gives our customers the ability to assign geo-locations to their tracked keywords. If you’re a Moz Analytics customer and are ready to jump right in, here’s where you can find the new feature within the application:

Not a Moz Analytics customer? You can take the new features for a free spin…

One of the biggest SEO developments of the last several years is how frequently Google is returning localized organics across a rapidly increasing number of search queries. It’s not just happening for “best pizza in Portland” (the answer to that is Apizza Scholls, by the way). Searches like “financial planning” and “election guide” now trigger Google’s localization algorithm:

local search results election guide

This type of query underscores the need to track rankings on a local level. I’m searching for a non-localized keyword (“election guide”), but Google recognizes I’m searching from Portland, Oregon so they add the localization layer to the result.

Local tends to get lost in the shuffle of zoo animal updates we’ve seen from Google in the last couple of years, but search marketers are coming around to realize the 2012 Venice update was one of the most important changes Google made to the search landscape. It certainly didn’t seem like a huge deal when it launched; here’s how Google described Venice as part of the late lamented monthly search product updates they used to provide:

  • Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.

Seems innocent enough, right? What the Venice update actually kicked off was a long-term relationship between local search results (what we see in Google local packs and map results) and the organic search results that, once upon a time, existed on their own. “Localized organics,” as they are known, have been increasingly altering the organic search landscape for keywords that normally triggered “generic” or national rankings. If you haven’t already read it, Mike Ramsey’s article on how to adjust for the Venice update remains one of the best strategic looks at the algorithm update.

This jump in localized organic results has prompted both marketers and business owners to track rankings at the local level. An increasing number of Moz customers have been requesting the ability to add locations to their keywords since the 2012 Venice update, and this is likely due to Google expanding the queries which trigger a localized result. You asked for it, and today we’re delivering. Our new local rankings feature allows our customers to track keywords for any city, state, or ZIP/postal code.

Geo-located searches

We can now return rankings based on a location you specify, just like I set my search to Portland in the example above. This is critical for monitoring the health of your local search campaigns, as Google continues to fold the location layer into the organic results. Here’s how it looks in Moz Analytics:

tracking local keyword ranking

A keyword with a location specified counts against your keyword limit in Moz Analytics just like any other keyword.

The location being tracked will also be displayed in your rankings reports as well as on the keyword analysis page:

local keyword difficulty

The local rankings feature allows you to enter your desired tracking location by city, state, neighborhood, or ZIP/postal code. We provide neighborhood-level granularity via dropdown for the United States, United Kingdom, Canada, and Australia. The dropdown will also provide city-level listings for other countries. If the location you want isn’t on the list, you can also enter it in the text box. Fair warning: We cannot guarantee the accuracy of rankings in mythical locations like Westeros or Twin Peaks, or mythical spellings like Pordland or Los Andules.

An easy way to get started with the new feature is to look at keywords you are already tracking, and find the ones that have an obvious local intent for searchers. Then add the neighborhood or city you are targeting for the most qualified searchers.

What’s next?

We will be launching local rankings functionality within the Moz Local application in the first part of 2015, which will provide needed visibility to folks who are mainly concerned with Local SEO. We’re also working on functionality to allow users to easily add geo-modifiers to their tracked keywords, so we can provide rankings for “health club Des Moines” alongside tracking rankings for “health clubs” in the 50301 zip code.

Right now this feature works with all Google engines (we’ll be adding Bing and Yahoo! later). We’ll also be keeping tabs on Google’s advancements on the local front so we can provide our customers with the best data on their local visibility.

Please let us know what you think in the comments below! Customer feedback, suggestions, and comments were instrumental in both the design and prioritization of this feature.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Conquer Your Competition with these Three Moz Tools – Next Level

Posted by EllieWilkinson

Welcome to the second edition of Next Level! In the first Next Level blog post, the Success Team and Help Team here at Moz created 10 video walkthroughs to help you “power up” your knowledge of the Moz tools. We’re continuing the educational series with a new video and a workflow showing you how to take on your competitors using Moz. Read on and level up!


For SEOs, the battle to rank highest in the search results often comes down to survival of the fittest. But if you know how to size up your competition, you can gain the upper hand and become king of the jungle! Come on a SERP-fari in this Next Level video and try these three ways to use the Moz tools to out-hunt all the other lions.

Workflow summary

To review, here’s an outline of the three steps to scoping out the competition!

(You’ll need a Moz Pro subscription to use Keyword Difficulty and Fresh Web Explorer. If you aren’t yet a Moz Pro subscriber, you can always try out the tools with a 30-day free trial.)

  1. After you’ve entered three competitors in your Moz Analytics campaign settings, head over to the Keyword Difficulty tool to get a detailed look at the search results for keywords you’re targeting. Don’t forget to run a full SERP analysis report for even more data!
  2. Next, investigate your competitors’ recent links and brand mentions using Fresh Web Explorer to get some content and link building ideas.
  3. Finally, head over to Followerwonk to find potential Twitter followers to poach from your competitors.

Looking for other resources to help you plan your attack? Here are some that might help. Go get ’em, tiger! (But watch out for zebras…)

If you have other ways of using the Moz tools to rule the jungle, we’d love to hear them! Sound off in the comments below.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Open Site Explorer’s New Link Building Opportunities Section (and a Slight Redesign)

Posted by randfish

Why hello there! You’re looking marvelous today, you really are. And, in other good news, Open Site Explorer has a bit of a new look—and an entirely new section called “Link Opportunities” to help make some link prospecting tasks easier and more automated. Come with me and I’ll show you; it’ll be fun 🙂

The new look

We know a lot of folks liked the old tab structure but we ran out of space. With this redesign we now have the flexibility to add new features and functionality simply by popping in new sections on the left sidebar menu. It’s a little bit more like Moz Analytics, too, and we figure some cohesion between our products is probably wise.

  • New side navigation with plenty of room to grow and add new features (spam scoring and analysis, for example, will be coming in Q4—but shhh… I didn’t actually ask for permission to talk about that yet. I figure begging forgiveness will work.)
  • Improved filtering that lets you slice and dice your link data more easily.
  • Notice how fast the new OSE is? Oh yeah, that’s the stuff 🙂

You can still access the old Open Site Explorer’s design for a few more weeks, but the new features will exist only in the new version.

Introducing the new link opportunities section

Need help finding outreach targets for your link building campaign? We’re introducing three new reports that will help you build a curated list of potential targets. The new reports are available to all Moz Pro subscribers. If you’re a community member, sign up for a Moz Pro Free Trial and you, too, can kick it with the new functionality.


Reclaim links

A filtered view of Top Pages that lets you easily export a ranked list of URLs to fix.


Unlinked mentions

Powered by Freshscape, this report uses Fresh Web Explorer queries to find mentions of a brand or site that don’t include a link. Ping sources that may have talked about your brand, website, people, or products without giving you a link, and you can often encourage/nudge that link into existence (along with the great SEO benefits it brings).
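Once you’ve exported a list of mention URLs, the follow-up check is easy to script. Here’s a minimal sketch that flags pages that mention you but don’t link to your domain; the domain and URLs are placeholders, and it assumes the requests library.

    # Minimal sketch: flag mention pages that don't link back to your domain.
    # Domain and URLs are placeholders; requires the "requests" library.
    import re
    import requests

    MY_DOMAIN = "example.com"
    MENTION_URLS = [
        "https://blog.example.org/favorite-seo-tools",
        "https://news.example.net/weekly-roundup",
    ]

    # Very rough pattern: any href whose quoted value contains your domain.
    link_to_me = re.compile(r'href=["\'][^"\']*' + re.escape(MY_DOMAIN), re.IGNORECASE)

    for url in MENTION_URLS:
        html = requests.get(url, timeout=10).text
        if not link_to_me.search(html):
            print("Unlinked mention worth a friendly outreach email:", url)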


Link intersect

Find pages that are linking to your competitors but not you. By entering two competitive domains (they don’t have to be directly competitive; anyone you think you should be on lists with, or mentioned by the press alongside, is a good candidate), you can see pages that link to those sites but not yours. Getting creative with your targets here can reveal loads of awesome link opportunities.
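Conceptually, the report is doing set math over linking domains. Here’s a tiny sketch of the same idea, with invented domain lists.

    # The core idea behind Link Intersect, shown as plain set operations.
    # The linking-domain lists are invented placeholders.
    links_to_competitor_a = {"citypaper.com", "industryweekly.com", "localnews.com"}
    links_to_competitor_b = {"industryweekly.com", "bigdirectory.com", "localnews.com"}
    links_to_me = {"localnews.com"}

    # Domains linking to both competitors but not to you are likely outreach targets.
    opportunities = (links_to_competitor_a & links_to_competitor_b) - links_to_me
    print(sorted(opportunities))  # ['industryweekly.com']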


This, however, is just the beginning. Be on the lookout for additional insights and opportunities as we improve our link index—we’ve just recently grown the size of Freshscape, which powers Fresh Web Explorer and two of the sections in link opportunities, so you should find lots of good stuff in there, but it can be a challenge. If you’re struggling with query formatting or getting creative around potential opportunities, let us know (in the comments or via Q&A) and we can give you some pointers or maybe find some searches that do the trick.

What about the old OSE?

We changed the workflow a bit and want to make sure you’ve got time to adjust. If you’re cranking through monthly reports or audits and want a more familiar OSE experience, you can switch to OSE Classic for a limited time. Just click on the “View in OSE Classic” link in the top right, and we’ll default to the old version.

But keep in mind new features and enhancements, like improved performance and Link Opportunities, will only be available in the new release. We’ll keep OSE Classic active until December 3rd in case you’re feeling nostalgic.

We’d love your feedback

If you’re using the new OSE and find problems, wish we’d change something, or have a particularly awesome experience, we’d love to hear from you in the comments below, in Q&A, or (especially if your issue is urgent/something broken) via our help team.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
