How to Rid Your Website of Six Common Google Analytics Headaches

Posted by amandaecking

I’ve been in and out of Google Analytics (GA) for the past five or so years agency-side. I’ve seen three different code libraries, dozens of new features and reports roll out, IP addresses stop being reported, and keywords not-so-subtly phased out of the free platform.

Analytics has been a focus of mine for the past year or so, mainly making sure clients get their data right. Right now, our new focus is closed-loop tracking, but that’s a topic for another day. If Google Analytics is your primary (or only) vehicle for website stats and analysis, you need to make sure it’s accurate.

Not having data pulling in or reporting properly is like building a house on a shaky foundation: It doesn’t end well. Usually there are tears.

For some reason, a lot of people, including many of my clients, assume everything is tracking properly in Google Analytics… because Google. But it’s not Google who sets up your analytics. People do that. And people are prone to make mistakes.

I’m going to go through six scenarios where issues are commonly encountered with Google Analytics.

I’ll outline the remedy for each issue, and in the process, show you how to move forward with a diagnosis or resolution.

1. Self-referrals

This is probably one of the areas we’re all familiar with. If you’re seeing a lot of traffic from your own domain, there’s likely a problem somewhere, or you need to extend the default session length in Google Analytics. (For example, a site with lots of long videos or music clips that doesn’t use event tracking; think of TEDx or SoundCloud as equivalents.)

Typically one of the first things I’ll do to help diagnose the problem is include an advanced filter to show the full referrer string. You do this by creating a filter, as shown below:

Filter Type: Custom filter > Advanced
Field A: Hostname
Extract A: (.*)
Field B: Request URI
Extract B: (.*)
Output To: Request URI
Constructor: $A1$B1
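Under the hood, that constructor simply concatenates the two extracted fields, so each hit’s reported URI becomes hostname + path. A quick sketch of the transformation (the hit object shape here is made up for illustration; GA applies this internally):

```javascript
// A rough sketch of what the advanced filter above does to each hit.
// The `hit` object shape is illustrative, not GA's internal format.
function applyFullReferrerFilter(hit) {
  const a = hit.hostname.match(/(.*)/)[1];   // Extract A: (.*) applied to Hostname
  const b = hit.requestUri.match(/(.*)/)[1]; // Extract B: (.*) applied to Request URI
  return Object.assign({}, hit, { requestUri: a + b }); // Constructor: $A1$B1 -> Request URI
}

const hit = { hostname: 'shop.example.com', requestUri: '/checkout' };
console.log(applyFullReferrerFilter(hit).requestUri); // shop.example.com/checkout
```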

You’ll then start seeing the subdomains pulling in. Experience has shown me that if you have a separate subdomain hosted in another location (say, if you work with a separate company and they host and run your mobile site or your shopping cart), it gets treated by Google Analytics as a separate domain. Thus, you’ll need to implement cross-domain tracking. This way, you can narrow down whether or not it’s one particular subdomain that’s creating the self-referrals.

In the example below, we can see all the revenue being attributed to the booking engine (which turned out to be a cross-domain issue), and the site’s own domain is the fourth-largest traffic source:

It’s also a good idea to check the browser and device reports to start narrowing down whether the issue is specific to a particular element. If it’s not, keep digging. Look at the pages pulling the self-referrals and go through the code with a fine-tooth comb, drilling down as much as you can.

2. Unusually low bounce rate

If you have a crazy-low bounce rate, it could be too good to be true. Unfortunately. An unusually low bounce rate could (and probably does) mean that at least some pages of your website have the same Google Analytics tracking code installed twice.

Take a look at your source code, or use Google Tag Assistant (though it does have known bugs) to see if you’ve got GA tracking code installed twice.
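If the site runs Universal Analytics (analytics.js), the browser console offers another check: `ga.getAll()` returns every tracker object created on the page. This small helper is a sketch (it assumes tracker objects expose the documented `get('trackingId')` method) that flags property IDs with more than one tracker attached:

```javascript
// Sketch: given the array from ga.getAll(), return property IDs
// that have more than one tracker attached (a duplicate-code smell).
function findDuplicateTrackers(trackers) {
  const counts = {};
  trackers.forEach(function (t) {
    const id = t.get('trackingId');
    counts[id] = (counts[id] || 0) + 1;
  });
  return Object.keys(counts).filter(function (id) { return counts[id] > 1; });
}

// In the browser console on a Universal Analytics page:
// findDuplicateTrackers(ga.getAll());
```

More than one tracker per property ID isn’t always a bug (some setups intentionally run two), but it’s exactly the pattern that halves bounce rate, so it’s worth a look.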

While I tell clients that having Google Analytics installed twice on the same page can lead to double the pageviews, I’ve not actually encountered that. I usually just say it to scare them into removing the duplicate implementation more quickly. Don’t tell on me.

3. Iframes anywhere

I’ve heard directly from Google engineers and Google Analytics evangelists that Google Analytics does not play well with iframes, and that it will never play nice with this dinosaur technology.

If you track the iframe, you inflate your pageviews, plus you still aren’t tracking everything with 100% clarity.

If you don’t track across iframes, you lose the source/medium attribution and everything becomes a self-referral.

Damned if you do; damned if you don’t.

My advice: Stop using iframes. They’re Netscape-era technology anyway, with rainbow marquees and Comic Sans on top. Interestingly, and unfortunately, a number of booking engines (for hotels) and third-party carts (for ecommerce) still use iframes.

If you have any clients in those verticals, or if you’re in the vertical yourself, check with your provider to see if they use iframes. Or you can check for yourself, by right-clicking as close as you can to the actual booking element:

[Image: iframe-booking.png]
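If right-clicking is awkward, a console check works too. This sketch (assuming a modern browser with the `URL` constructor) flags any iframe served from a hostname other than the page’s own, since those are the ones that will mangle attribution:

```javascript
// Flags an iframe whose src points at a different hostname than the page,
// i.e. a cross-domain iframe likely to produce self-referrals in GA.
function isCrossDomainFrame(frameSrc, pageHost) {
  try {
    return new URL(frameSrc).hostname !== pageHost;
  } catch (e) {
    return false; // relative or srcdoc iframes resolve to the same origin
  }
}

// In the browser console:
// document.querySelectorAll('iframe').forEach(function (f) {
//   console.log(f.src, isCrossDomainFrame(f.src, location.hostname));
// });
```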

There is no neat and tidy way to address iframes with Google Analytics, and usually iframes are not the only complicated element of setup you’ll encounter. I spent eight months dealing with a website on a subfolder, which used iframes and had a cross domain booking system, and the best visibility I was able to get was about 80% on a good day.

Typically, I’d approach diagnosing iframes (if, for some reason, I had absolutely no access to viewing a website or talking to the techs) similarly to diagnosing self-referrals, as self-referrals are one of the biggest symptoms of iframe use.

4. Massive traffic jumps

Massive jumps in traffic don’t typically just happen. (Unless, maybe, you’re Geraldine.) There’s always an explanation—a new campaign launched, you just turned on paid ads for the first time, you’re using content amplification platforms, you’re getting a ton of referrals from that recent press in The New York Times. And if you think it just happened, it’s probably a technical glitch.

I’ve seen inflated pageviews result from everything from tracking installed on iframes and unnecessary implementation of virtual pageviews, to not realizing the tracking code was installed on other microsites for the same property. Oops.

Usually I’ve seen this happen when the tracking code was somewhere it shouldn’t be, so if you’re investigating a situation of this nature, first confirm the Google Analytics code is only in the places it needs to be. Tools like Google Tag Assistant and Screaming Frog can be your BFFs in helping you figure this out.

Also, I suggest bribing the IT department with sugar (or booze) to see if they’ve changed anything lately.

5. Cross-domain tracking

I wish cross-domain tracking with Google Analytics out of the box didn’t require any additional setup. But it does.

If you don’t have it set up properly, things break down quickly, and can be quite difficult to untangle.

The older the GA library you’re using, the harder it is. The easiest setup, by far, is Google Tag Manager with Universal Analytics. Hard-coded Universal Analytics is a bit more difficult because you have to implement autoLink manually and decorate forms, if you’re using them (and you probably are). Beyond that, rather than try and deal with it, I say update your Google Analytics code. Then we can talk.

Where I’ve seen the most murkiness with tracking is when parts of cross-domain tracking are implemented, but not all. If allowLinker isn’t included, or you forget to decorate all the forms, the cookies aren’t passed between domains.
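For reference, a complete hard-coded Universal Analytics setup looks roughly like this; the property ID and booking domain are placeholders. All three pieces (allowLinker, the linker plugin, and link/form decoration) have to be present on both domains:

```javascript
// Hard-coded Universal Analytics cross-domain setup (a sketch).
// 'UA-XXXXXX-Y' and 'booking.example.com' are placeholders.
ga('create', 'UA-XXXXXX-Y', 'auto', { allowLinker: true }); // accept the linker cookie
ga('require', 'linker');                                     // load the linker plugin
ga('linker:autoLink', ['booking.example.com'], false, true); // decorate links AND forms
ga('send', 'pageview');
```

The fourth argument to `linker:autoLink` is what decorates forms; leaving it off is one of the most common ways cookies fail to pass.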

The absolute first place I would start with this would be confirming the cookies are all passing properly at all the right points, forms, links, and smoke signals. I’ll usually use a combination of the Real Time report in Google Analytics, Google Tag Assistant, and GA debug to start testing this. Any debug tool you use will mean you’re playing in the console, so get friendly with it.

6. Internal use of UTM strings

I’ve saved the best for last: internal use of campaign tagging. We may think: I use UTM strings to tag my campaigns externally, and we’ve got this new promotion on site which we’re using a banner ad for. That’s a campaign. Why don’t I tag it with a UTM string?

Step away from the keyboard now. Please.

When you tag internal links with UTM strings, you override the original source/medium. So that visitor who came in through your paid ad and then clicks on the campaign banner has now been manually tagged. You lose the ability to track that they came through on the ad the moment they click on the tagged internal link. Their source and medium are now your internal campaign, not that paid ad you’re spending gobs of money on and have to justify to your manager. See the problem?

I’ve seen at least three pretty spectacular instances of this in the past year, and a number of smaller instances of it. Annie Cushing also talks about the evils of internal UTM tags and the odd prevalence of it. (Oh, and if you haven’t explored her blog, and the amazing spreadsheets she shares, please do.)

One clothing company I worked with tagged all of their homepage offers with UTM strings, which resulted in the loss of visibility for one-third of their audience: One million visits over the course of a year, and $2.1 million in lost revenue.

Let me say that again. One million visits, and $2.1 million. That couldn’t be attributed to an external source/campaign/spend.

Another client I audited included campaign tagging on nearly every navigational element on their website. It still gives me nightmares.

If you want to see if you have any internal UTM strings, head straight to the Campaigns report in Acquisition in Google Analytics, and look for anything like “home” or “navigation” or any language you may use internally to refer to your website structure.

And if you want to see how users are moving through your website, go to the Flow reports. Or if you really, really, really want to know how many people click on that sidebar link, use event tracking. But please, for the love of all things holy (and to keep us analytics lovers from throwing our computers across the room), stop using UTM tagging on your internal links.
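As a sketch of that event-tracking alternative (the selector and the category/action/label values here are made up; adapt them to your own markup and naming scheme):

```javascript
// Instead of ?utm_source=... on an internal link, send an event on click.
// Selector and category/action/label values are illustrative placeholders.
var bannerLink = document.querySelector('.promo-banner a');
bannerLink.addEventListener('click', function () {
  ga('send', 'event', 'internal promo', 'click', 'homepage banner');
});
```

The visitor’s original source/medium stays intact, and the banner clicks still show up in the Events reports.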

Now breathe and smile

Odds are, your Google Analytics setup is fine. If you are seeing any of these issues, though, you have somewhere to start in diagnosing and addressing the data.

We’ve looked at six of the most common points of friction I’ve encountered with Google Analytics and how to start investigating them: self-referrals, bounce rate, iframes, traffic jumps, cross-domain tracking, and internal campaign tagging.

What common data integrity issues have you encountered with Google Analytics? What are your favorite tools to investigate?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Your Daily SEO Fix: Week 5

Posted by Trevor-Klein

We’ve arrived, folks! This is the last installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. If you haven’t been following along, these are each designed to solve a use case that we regularly hear about from Moz community members.

Here’s a quick recap of the previous round-ups in case you missed them:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.
  • Week 3: Compare link metrics in Open Site Explorer, find tweet topics with Followerwonk, create custom reports in Moz Analytics, use Spam Score to identify high-risk links, and get link building opportunities delivered to your inbox.
  • Week 4: Use Fresh Web Explorer to build links, analyze rank progress for a given keyword, use the MozBar to analyze your competitors’ site markup, use the Top Pages report to find content ideas, and find on-site errors with Crawl Test.

We’ve got five new fixes for you in this edition:

  • How to Use the Full SERP Report
  • How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer
  • How to Build Your Link Profile with Link Intersect
  • How to Find Local Citations Using the MozBar
  • Bloopers: How to Screw Up While Filming a Daily SEO Fix

Hope you enjoy them!


Fix 1: How to Use the Full SERP Report

Moz’s Full SERP Report is a detailed report that shows the top ten ranking URLs for a specific keyword and presents the potential ranking signals in an easy-to-view format. In this Daily SEO Fix, Meredith breaks down the report so you can see all the sections and how each is used.



Fix 2: How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer

The Just-Discovered Links report in Open Site Explorer helps you discover recently created links within an hour of them being published. In this fix, Nick shows you how to use the report to view who is linking to you, how they’re doing it, and what they are saying, so you can capitalize on link opportunities while they’re still fresh and join the conversation about your brand.


Fix 3: How to Build Your Link Profile with Link Intersect

The quantity and (more importantly) quality of backlinks to your website make up your link profile, one of the most important elements in SEO and an incredibly important factor in search engine rankings. In this Daily SEO Fix, Tori shows you how to use Moz’s Link Intersect tool to analyze the competition’s backlinks. Plus, learn how to find opportunities to build links and strengthen your own link profile.


Fix 4: How to Find Local Citations Using the MozBar

Citations are mentions of your business and address on webpages other than your own, such as an online yellow pages directory or a local business association page. They are a key component in search engine ranking algorithms, so building consistent and accurate citations for your local business(es) is a key local SEO tactic. In today’s Daily SEO Fix, Tori shows you how to use the MozBar to find local citations around the web.


Bloopers: How to Screw Up While Filming a Daily SEO Fix

We had a lot of fun filming this series, and there were plenty of laughs along the way. Like these ones. =)


Looking for more?

We’ve got more videos in the previous four weeks’ round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2

Your Daily SEO Fix: Week 3

Your Daily SEO Fix: Week 4


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.


Reblogged 3 years ago from tracking.feedpress.it

Your Daily SEO Fix: Week 4

Posted by Trevor-Klein

This week, we’ve got the fourth (and second-to-last) installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. They’re each designed to solve a use case that we regularly hear about from Moz community members.

Here’s a quick recap of the previous round-ups in case you missed them:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.
  • Week 3: Compare link metrics in Open Site Explorer, find tweet topics with Followerwonk, create custom reports in Moz Analytics, use Spam Score to identify high-risk links, and get link building opportunities delivered to your inbox.

In this installment, we’ve got five brand new tutorials:

  • How to Use Fresh Web Explorer to Build Links
  • How to Analyze Rank Progress for a Given Keyword
  • How to Use the MozBar to Analyze Your Competitors’ Site Markup
  • How to Use the Top Pages Report to Find Content Ideas
  • How to Find On-Site Errors with Crawl Test

Hope you enjoy them!

Fix 1: How to Use Fresh Web Explorer to Build Links

If you have unique data or a particularly excellent resource on your site, that content can be a great link magnet. In this Daily SEO Fix, Felicia shows you how to set up alerts in Fresh Web Explorer to track mentions of relevant keyword phrases, find link opportunities, and build links to your content.



Fix 2: How to Analyze Rank Progress for a Given Keyword

Moz’s Rank Tracker tool retrieves search engine rankings for pages and keywords, storing them for easy comparison later. In this fix, James shows you how to use this helpful tool to track keywords, save time, and improve your rankings.


Fix 3: How to Use the MozBar to Analyze Your Competitors’ Site Markup

Schema markup helps search engines better identify what your (and your competitors’) website pages are all about and as a result can lead to a boost to rankings. In this Daily SEO Fix, Jordan shows you how to use the MozBar to analyze the schema markup of the competition and optimize your own site and pages for rich snippets.


Fix 4: How to Use the Top Pages Report to Find Content Ideas

With Moz’s Top Pages report in Open Site Explorer, you can see the pages on your site (and the competition’s sites!) that are top performers. In this fix, Nick shows you how to use the report to analyze your competitors’ content marketing efforts and to inform your own.


Fix 5: How to Find On-Site Errors with Crawl Test

Identifying and understanding any potential errors on your site is crucial to the life of any SEO. In this Daily SEO Fix, Sean shows you how to use the Crawl Test tool in Moz Analytics to pull reports and identify any errors on your site.


Looking for more?

We’ve got more videos in the previous three weeks’ round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2

Your Daily SEO Fix: Week 3


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.


Your Daily SEO Fix: Week 3

Posted by Trevor-Klein

Welcome to the third installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. Each tutorial is designed to solve a use case that we regularly hear about from Moz community members—a need or problem for which you all could use a solution.

If you missed the previous roundups, you can find ’em here:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.

Today, we’ve got a brand-new roundup of the most recent videos:

  • How to Compare Link Metrics in Open Site Explorer
  • How to Find Tweet Topics with Followerwonk
  • How to Create Custom Reports in Moz Analytics
  • How to Use Spam Score to Identify High-Risk Links
  • How to Get Link Building Opportunities Delivered to Your Inbox

Hope you enjoy them!

Fix 1: How to Compare Link Metrics in Open Site Explorer

Not all links are created equal. In this Daily SEO Fix, Chiaryn shows you how to use Open Site Explorer to analyze and compare link metrics for up to five URLs to see which are strongest.



Fix 2: How to Find Tweet Topics with Followerwonk

Understanding what works best for your competitors on Twitter is a great place to start when forming your own Twitter strategy. In this fix, Ellie explains how to identify strong-performing tweets from your competitors and how to use those tweets to shape your own voice and plan.


Fix 3: How to Create Custom Reports in Moz Analytics

In this Daily SEO Fix, Kevin shows you how to create a custom report in Moz Analytics and schedule it to be delivered to your inbox on a daily, weekly, or monthly basis.


Fix 4: How to Use Spam Score to Identify High-Risk Links

Almost every site has a few bad links pointing to it, but lots of highly risky links can have a negative impact on your search engine rankings. In this fix, Tori shows you how to use Moz’s Spam Score metric to identify spammy links.


Fix 5: How to Get Link Building Opportunities Delivered to Your Inbox

Building high-quality links is one of the most important aspects of SEO. In this Daily SEO Fix, Erin shows you how to use Moz Analytics to set up a weekly custom report that will notify you of pages on the web that mention your site but do not include a link, so you can use this info to build more links.


Looking for more?

We’ve got more videos in the previous two weeks’ round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.


Should I Use Relative or Absolute URLs? – Whiteboard Friday

Posted by RuthBurrReedy

It was once commonplace for developers to code relative URLs into a site. There are a number of reasons why that might not be the best idea for SEO, and in today’s Whiteboard Friday, Ruth Burr Reedy is here to tell you all about why.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Let’s discuss some non-philosophical absolutes and relatives

Howdy, Moz fans. My name is Ruth Burr Reedy. You may recognize me from such projects as when I used to be the Head of SEO at Moz. I’m now the Senior SEO Manager at BigWing Interactive in Oklahoma City. Today we’re going to talk about relative versus absolute URLs and why they are important.

At any given time, your website can have several different configurations that might be causing duplicate content issues. You could have just a standard http://www.example.com. That’s a pretty standard format for a website.

But the main sources that we see of domain level duplicate content are when the non-www.example.com does not redirect to the www or vice-versa, and when the HTTPS versions of your URLs are not forced to resolve to HTTP versions or, again, vice-versa. What this can mean is if all of these scenarios are true, if all four of these URLs resolve without being forced to resolve to a canonical version, you can, in essence, have four versions of your website out on the Internet. This may or may not be a problem.

It’s not ideal for a couple of reasons. Number one, duplicate content is a problem because some people think that duplicate content is going to give you a penalty. Duplicate content is not going to get your website penalized in the same way that you might see a spammy link penalty from Penguin. There’s no actual penalty involved. You won’t be punished for having duplicate content.

The problem with duplicate content is that you’re basically relying on Google to figure out what the real version of your website is. Google is seeing the URL from all four versions of your website. They’re going to try to figure out which URL is the real URL and just rank that one. The problem with that is you’re basically leaving that decision up to Google when it’s something that you could take control of for yourself.

There are a couple of other reasons that we’ll go into a little bit later for why duplicate content can be a problem. But in short, duplicate content is no good.

However, just having these URLs not resolve to each other may or may not be a huge problem. When it really becomes a serious issue is when that problem is combined with injudicious use of relative URLs in internal links. So let’s talk a little bit about the difference between a relative URL and an absolute URL when it comes to internal linking.

With an absolute URL, you are putting the entire web address of the page that you are linking to in the link. You’re putting your full domain, everything in the link, including /page. That’s an absolute URL.

However, when coding a website, it’s a fairly common web development practice to instead code internal links with what’s called a relative URL. A relative URL is just /page. Basically what that does is it relies on your browser to understand, “Okay, this link is pointing to a page that’s on the same domain that we’re already on. I’m just going to assume that that is the case and go there.”
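You can see that resolution behavior with the standard `URL` constructor (available in modern browsers and Node): the same relative href lands on whichever base URL the page happens to be on:

```javascript
// A relative href only becomes a full address once the browser resolves it
// against the base URL of the page it appears on.
const link = '/page';

console.log(new URL(link, 'https://www.example.com/category/').href);
// https://www.example.com/page
console.log(new URL(link, 'https://examplestaging.com/category/').href);
// https://examplestaging.com/page
```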

There are a couple of really good reasons to code relative URLs

1) It is much easier and faster to code.

When you are a web developer and you’re building a site with thousands of pages, coding relative versus absolute URLs is a way to be more efficient. You’ll see it happen a lot.

2) Staging environments

Another reason why you might see relative versus absolute URLs is some content management systems — and SharePoint is a great example of this — have a staging environment that’s on its own domain. Instead of being example.com, it will be examplestaging.com. The entire website will basically be replicated on that staging domain. Having relative versus absolute URLs means that the same website can exist on staging and on production, or the live accessible version of your website, without having to go back in and recode all of those URLs. Again, it’s more efficient for your web development team. Those are really perfectly valid reasons to do those things. So don’t yell at your web dev team if they’ve coded relative URLs, because from their perspective it is a better solution.

Relative URLs will also cause your page to load slightly faster. However, in my experience, the SEO benefits of having absolute versus relative URLs in your website far outweigh the teeny-tiny bit longer that it will take the page to load. It’s very negligible. If you have a really, really long page load time, there’s going to be a whole boatload of things that you can change that will make a bigger difference than coding your URLs as relative versus absolute.

Page load time, in my opinion, is not a concern here. However, it is something that your web dev team may bring up with you when you try to address with them the fact that, from an SEO perspective, coding your website with relative versus absolute URLs, especially in the nav, is not a good solution.

There are even better reasons to use absolute URLs

1) Scrapers

If you have all of your internal links as relative URLs, it would be very, very, very easy for a scraper to simply scrape your whole website and put it up on a new domain, and the whole website would just work. That sucks for you, and it’s great for that scraper. But unless you are out there doing public services for scrapers, for some reason, that’s probably not something that you want happening with your beautiful, hardworking, handcrafted website. That’s one reason. There is a scraper risk.

2) Preventing duplicate content issues

But the other reason why it’s very important to have absolute versus relative URLs is that it really mitigates the duplicate content risk presented when you don’t have all of these versions of your website resolving to one version. Google could potentially enter your site on any one of these four pages. They’re the same page to you, but four different pages to Google; the same domain to you, but four different domains to Google.

But they could enter your site, and if all of your URLs are relative, they can then crawl and index your entire domain using whatever format they entered on. Whereas if you have absolute links coded, even if Google enters your site on a non-canonical version like www., every internal link still points explicitly at the version you coded, so Google won’t assume the rest of your pages live at the www. version. That really cuts down on different versions of each page of your website. If you have relative URLs throughout and haven’t fixed this problem, you basically have four different websites.

Again, it’s not always a huge issue. Duplicate content, it’s not ideal. However, Google has gotten pretty good at figuring out what the real version of your website is.

You do want to think about internal linking, when you’re thinking about this. If you have basically four different versions of any URL that anybody could just copy and paste when they want to link to you or when they want to share something that you’ve built, you’re diluting your internal links by four, which is not great. You basically would have to build four times as many links in order to get the same authority. So that’s one reason.

3) Crawl Budget

The other reason why it’s pretty important not to do this is crawl budget.

When we talk about crawl budget, basically what that is, is every time Google crawls your website, there is a finite depth that they will crawl. There’s a finite number of URLs that they will crawl and then they decide, “Okay, I’m done.” That’s based on a few different things. Your site authority is one of them. Your actual PageRank, not toolbar PageRank, but how good Google actually thinks your website is, is a big part of that. But also how complex your site is, how often it’s updated, things like that are also going to contribute to how often and how deep Google is going to crawl your site.

It’s important to remember when we think about crawl budget that, for Google, crawl budget costs actual dollars. One of Google’s biggest expenditures as a company is the money and the bandwidth it takes to crawl and index the Web. All of that energy that’s going into crawling and indexing the Web lives on servers. That bandwidth comes from servers, and that means that using bandwidth costs Google actual real dollars.

So Google is incentivized to crawl as efficiently as possible, because when they crawl inefficiently, it costs them money. If your site is not efficient to crawl, Google is going to save itself some money by crawling it less frequently and crawling fewer pages per crawl. That can mean that if you have a site that’s updated frequently, your site may not be updating in the index as frequently as you’re updating it. It may also mean that Google, while it’s crawling and indexing, may be crawling and indexing a version of your website that isn’t the version that you really want it to crawl and index.

So having four different versions of your website, all of which are completely crawlable to the last page, because you’ve got relative URLs and you haven’t fixed this duplicate content problem, means that Google has to spend four times as much money in order to really crawl and understand your website. Over time they’re going to do that less and less frequently, especially if you don’t have a really high authority website. If you’re a small website, if you’re just starting out, if you’ve only got a medium number of inbound links, over time you’re going to see your crawl rate and frequency impacted, and that’s bad. We don’t want that. We want Google to come back all the time, see all our pages. They’re beautiful. Put them up in the index. Rank them well. That’s what we want. So that’s what we should do.

There are a couple of ways to fix your relative versus absolute URLs problem:

1) Fix what is happening on the server side of your website

You have to make sure that you are forcing all of these different versions of your domain to resolve to one version of your domain. For me, I’m pretty agnostic as to which version you pick. You should probably already have a pretty good idea of which version of your website is the real version, whether that’s www, non-www, HTTPS, or HTTP. From my view, what’s most important is that all four of these versions resolve to one version.

From an SEO standpoint, there is evidence to suggest, and Google has certainly said, that HTTPS is a little bit better than HTTP. From a URL length perspective, I like to not have the www in there because it doesn’t really do anything. It just makes your URLs four characters longer. If you don’t know which one to pick, I would pick HTTPS with no www. But whichever one you pick, what’s really most important is that all of them resolve to one version. You can do that on the server side, and that’s usually pretty easy for your dev team to fix once you tell them that it needs to happen.
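As a sketch of what that server-side fix can look like, here is a minimal nginx configuration that 301-redirects the HTTP and www variants to a single https version. The example.com domain is hypothetical, and it assumes your TLS certificate covers both the www and non-www hostnames:

```nginx
# Redirect all HTTP traffic (www and non-www) to HTTPS non-www.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}

# Redirect HTTPS www to HTTPS non-www.
server {
    listen 443 ssl;
    server_name www.example.com;
    # ssl_certificate / ssl_certificate_key directives go here.
    return 301 https://example.com$request_uri;
}

# The one canonical version that actually serves content.
server {
    listen 443 ssl;
    server_name example.com;
    # ...your site configuration...
}
```

If you’re on Apache instead, the same forcing can be done with mod_rewrite rules in your server config or .htaccess; the important part is that every variant answers with a single 301 hop to the one preferred version.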

2) Fix your internal links

Great. So you fixed it on your server side. Now you need to fix your internal links, and you need to recode them from relative to absolute. This is something that your dev team is not going to want to do because it is time consuming and, from a web dev perspective, not that important. However, you should use resources like this Whiteboard Friday to explain to them that, from an SEO perspective, both from the scraper risk and from a duplicate content standpoint, having those absolute URLs is a high priority and something that should get done.

You’ll need to fix those, especially in your navigational elements. But once you’ve got your nav fixed, also pull out your database or run a Screaming Frog crawl or however you want to discover internal links that aren’t part of your nav, and make sure you’re updating those to be absolute as well.
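If you’d rather script that discovery step than run a full crawl, a quick audit can be done by parsing a page’s HTML and flagging any href that isn’t absolute. Here’s a minimal sketch using only the Python standard library; the sample HTML snippet is hypothetical, and in practice you’d feed in pages from your own crawl:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collects href values and separates relative links from absolute ones."""
    def __init__(self):
        super().__init__()
        self.relative_links = []
        self.absolute_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                # A link only counts as absolute if it has a scheme and a host.
                parsed = urlparse(value)
                if parsed.scheme and parsed.netloc:
                    self.absolute_links.append(value)
                else:
                    self.relative_links.append(value)

# Hypothetical page snippet -- in practice, feed in your crawled HTML.
sample_html = """
<nav>
  <a href="/about">About</a>
  <a href="contact.html">Contact</a>
  <a href="https://example.com/blog">Blog</a>
</nav>
"""

auditor = LinkAuditor()
auditor.feed(sample_html)
print("Relative links to fix:", auditor.relative_links)
```

Anything that lands in `relative_links` is a candidate for recoding to your preferred absolute format.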

Then you’ll do some education with everybody who touches your website saying, “Hey, when you link internally, make sure you’re using the absolute URL and make sure it’s in our preferred format,” because that’s really going to give you the most bang for your buck per internal link. So do some education. Fix your internal links.

Sometimes your dev team is going to say, “No, we can’t do that. We’re not going to recode the whole nav. It’s not a good use of our time,” and sometimes they are right. The dev team has more important things to do. That’s okay.

3) Canonicalize it!

If you can’t get your internal links fixed or if they’re not going to get fixed anytime in the near future, a stopgap or a Band-Aid that you can put on this problem is to canonicalize all of your pages. As you’re changing your server to force all of these different versions of your domain to resolve to one, at the same time you should be implementing the canonical tag on all of the pages of your website to self-canonicalize. On every page, you have a canonical tag saying, “This page right here that they were already on is the canonical version of this page.” Or if there’s another page that’s the canonical version, then obviously you point to that instead.
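As a concrete illustration, a self-referencing canonical tag is a single line in the page’s head; the URL here is hypothetical and shown in the HTTPS, non-www format discussed above:

```html
<head>
  <!-- Tells Google this exact URL is the canonical version of this page. -->
  <link rel="canonical" href="https://example.com/blog/relative-vs-absolute-urls/">
</head>
```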

But having each page self-canonicalize will mitigate both the risk of duplicate content internally and some of the risk posed by scrapers, because when scrapers copy your website and slap it up somewhere else, those canonical tags will often stay in place, and that lets Google know this is not the real version of the website.

In conclusion, relative links, not as good. Absolute links, those are the way to go. Make sure that you’re fixing these very common domain-level duplicate content problems. If your dev team tries to tell you that they don’t want to do this, just tell them I sent you. Thanks guys.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

Your Daily SEO Fix: Week 2

Posted by Trevor-Klein

Last week, we began posting short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. Each tutorial is designed to solve a use case that we regularly hear about from Moz community members—a need or problem for which you all could use a solution.

Today, we’ve got a brand-new roundup of the most recent videos:

  • How to Examine and Analyze SERPs Using New MozBar Features
  • How to Boost Your Rankings through On-Page Optimization
  • How to Check Your Anchor Text Using Open Site Explorer
  • How to Do Keyword Research with OSE and the Keyword Difficulty Tool
  • How to Discover Keyword Opportunities in Moz Analytics

Let’s get right down to business!

Fix 1: How to Examine and Analyze SERPs Using New MozBar Features

The MozBar is a handy tool that helps you access important SEO metrics while you surf the web. In this Daily SEO Fix, Abe shows you how to use this toolbar to examine and analyze SERPs and access keyword difficulty scores for a given page—in a single click.



Fix 2: How to Boost Your Rankings through On-Page Optimization

There are several on-page factors that influence your search engine rankings. In this Daily SEO Fix, Holly shows you how to use Moz’s On-Page Optimization tool to identify pages on your website that could use some love and what you can do to improve them.



Fix 3: How to Check Your Anchor Text Using Open Site Explorer

Dive into OSE with Tori in this Daily SEO Fix to check out the anchor text opportunities for Moz.com. By highlighting all your anchor text you can discover other potential keyword ranking opportunities you might not have thought of before.



Fix 4: How to Do Keyword Research with OSE and the Keyword Difficulty Tool

Studying your competitors can help identify keyword opportunities for your own site. In this Daily SEO Fix, Jacki walks through how to use OSE to research the anchor text for competitors’ websites and how to use the Keyword Difficulty Tool to identify potential expansion opportunities for your site.



Fix 5: How to Discover Keyword Opportunities in Moz Analytics

Digging into the organic traffic that is coming to your site is an easy way to surface potential keyword opportunities. In this Daily SEO Fix, Chiaryn walks through the keyword opportunity tab in Moz Analytics and highlights a quick tip for leveraging that tool.



Looking for more?

We’ve got more videos in last week’s round-up! Check it out here.


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.



Your Daily SEO Fix

Posted by Trevor-Klein

We at the Mozplex have noticed a recurring event. Somewhat regularly, one of our community members—sometimes even a Pro subscriber—will ask us if we know of any tools that’ll do a good job solving for a particular use case. They’ve got a need and are looking for a solution. That solution, it turns out, is available in our own tools—they just never made the connection.

This week, we began a series of video tutorials we’re calling the Moz Daily SEO Fix. The videos are shorter than two minutes each and are designed to offer you solutions to some of the most common problems faced by SEOs and online marketers of all stripes. A new video will be released every weekday for a month, and we’ll post a weekly roundup on Thursday afternoons.

Whether you’re a seasoned veteran of the old SEOmoz days or have never once used a Moz tool, we hope these videos will show you a way to make your marketing life a little easier. =)


Fix 1: How to reclaim links using Open Site Explorer

In today’s Daily SEO Fix, David explains how to use the Open Site Explorer’s top pages tab plus the filter for 4xx and 5xx errors to find the pages on your site with the most potential link equity that are broken and can be redirected. 301’ing these URLs to relevant pages on your site can give your rankings a serious boost.



Fix 2: How to build links using Fresh Web Explorer

In this Daily SEO Fix, Michael shows you how to set up an alert in Fresh Web Explorer for anyone who mentions (or links to) your two biggest competitors but not to you. Monitor your inbox for these alerts and you’ll find new link building opportunities, ripe for the picking.


Fix 3: How to find the best times to tweet using Followerwonk

Finding the best time to tweet is unique for everyone and figuring out what times work best for you is key to maximizing your presence on Twitter. In this Daily SEO Fix, Ellie shows you how to use Followerwonk to find the best times to tweet so your followers don’t miss out on your updates.


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.
