Stop Ghost Spam in Google Analytics with One Filter

Posted by CarloSeo

The spam in Google Analytics (GA) is becoming a serious issue. Due to a deluge of referral spam from social buttons, adult sites, and many, many other sources, people are starting to become overwhelmed by all the filters they are setting up to manage the useless data they are receiving.

The good news is, there is no need to panic. In this post, I’m going to focus on the most common mistakes people make when fighting spam in GA, and explain an efficient way to prevent it.

But first, let’s make sure we understand how spam works. A couple of months ago, Jared Gardner wrote an excellent article explaining what referral spam is, including its intended purpose. He also pointed out some great examples of referral spam.

Types of spam

The spam in Google Analytics can be categorized into two types: ghosts and crawlers.

Ghosts

The vast majority of spam is this type. They are called ghosts because they never access your site. It is important to keep this in mind, as it’s key to creating a more efficient solution for managing spam.

As unusual as it sounds, this type of spam doesn’t have any interaction with your site at all. You may wonder how that is possible since one of the main purposes of GA is to track visits to our sites.

They do it by using the Measurement Protocol, which allows people to send data directly to Google Analytics’ servers. Using this method, and probably randomly generated tracking codes (UA-XXXXX-1) as well, the spammers leave a “visit” with fake data, without even knowing who they are hitting.
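To make the mechanics concrete, here is a minimal, hypothetical Python sketch of the kind of payload a ghost spammer constructs. The parameter names (`v`, `tid`, `cid`, `t`, `dr`, `dp`) are real Measurement Protocol fields, but the tracking ID and referrer values here are made up for illustration.

```python
from urllib.parse import urlencode

def ghost_hit(tracking_id, spam_referrer):
    """Build a Measurement Protocol payload for a fake "visit".

    The spammer never loads the target site; a payload like this is
    sent straight to Google Analytics' collection endpoint.
    """
    params = {
        "v": "1",             # protocol version
        "tid": tracking_id,   # often randomly generated, e.g. UA-XXXXX-1
        "cid": "555",         # arbitrary client ID
        "t": "pageview",      # hit type
        "dr": spam_referrer,  # the fake referrer the spammer wants you to see
        "dp": "/",            # fake page path
    }
    return urlencode(params)

payload = ghost_hit("UA-12345-1", "http://spam-example.com")
```

Because GA will accept any syntactically valid hit for a property ID, nothing in this exchange ever touches your server, which is exactly why server-side defenses can’t see it.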

Crawlers

This type of spam, the opposite of ghost spam, does access your site. As the name implies, these spam bots crawl your pages, ignoring rules like those found in robots.txt that are supposed to stop them from reading your site. When they exit your site, they leave a record in your reports that looks like a legitimate visit.

Crawlers are harder to identify because they know their targets and use real data. But it is also true that new ones seldom appear. So if you detect a referral in your analytics that looks suspicious, researching it on Google or checking it against this list might help you answer the question of whether or not it is spammy.

Most common mistakes made when dealing with spam in GA

I’ve been following this issue closely for the last few months. According to the comments people have made on my articles and conversations I’ve found in discussion forums, there are primarily three mistakes people make when dealing with spam in Google Analytics.

Mistake #1. Blocking ghost spam from the .htaccess file

One of the biggest mistakes people make is trying to block ghost spam from the .htaccess file.

For those who are not familiar with this file, one of its main functions is to allow/block access to your site. Now we know that ghosts never reach your site, so adding them here won’t have any effect and will only add useless lines to your .htaccess file.

Ghost spam usually shows up for a few days and then disappears. As a result, sometimes people think that they successfully blocked it from here when really it’s just a coincidence of timing.

Then when the spammers later return, they get worried because the solution is not working anymore, and they think the spammer somehow bypassed the barriers they set up.

The truth is, the .htaccess file can only effectively block crawlers such as buttons-for-website.com and a few others since these access your site. Most of the spam can’t be blocked using this method, so there is no other option than using filters to exclude them.

Mistake #2. Using the referral exclusion list to stop spam

Another error is trying to use the referral exclusion list to stop the spam. The name may confuse you, but this list is not intended to exclude referrals the way we want to exclude spam; it has other purposes.

For example, when a customer buys something, sometimes they get redirected to a third-party page for payment. After making a payment, they’re redirected back to your website, and GA records that as a new referral. It is appropriate to use the referral exclusion list to prevent this from happening.

If you try to use the referral exclusion list to manage spam, however, the referral part will be stripped since there is no preexisting record. As a result, a direct visit will be recorded, and you will have a bigger problem than the one you started with: you will still have spam, and direct visits are harder to track.

Mistake #3. Worrying that bounce rate changes will affect rankings

When people see that the bounce rate changes drastically because of the spam, they start worrying about the impact that it will have on their rankings in the SERPs.

bounce.png

This is another mistake commonly made. With or without spam, Google doesn’t take into consideration Google Analytics metrics as a ranking factor. Here is an explanation about this from Matt Cutts, the former head of Google’s web spam team.

And if you think about it, Cutts’ explanation makes sense: although many people have GA, not everyone uses it.

Assuming your site has been hacked

Another common concern when people see strange landing pages coming from spam on their reports is that they have been hacked.

landing page

The page that the spam shows on the reports doesn’t exist, and if you try to open it, you will get a 404 page. Your site hasn’t been compromised.

But you do have to make sure the page doesn’t exist, because there are cases (unrelated to spam) where sites suffer a security breach and get injected with pages full of bad keywords to defame the website.

What should you worry about?

Now that we’ve discarded security issues and their effects on rankings, the only thing left to worry about is your data. The fake trail that the spam leaves behind pollutes your reports.

It might have greater or lesser impact depending on your site traffic, but everyone is susceptible to the spam.

Small and midsize sites are the most easily impacted – not only because a big part of their traffic can be spam, but also because usually these sites are self-managed and sometimes don’t have the support of an analyst or a webmaster.

Big sites with a lot of traffic can also be impacted by spam, and although the impact can be insignificant, invalid traffic means inaccurate reports no matter the size of the website. As an analyst, you should be able to explain what’s going on even in the most granular reports.

You only need one filter to deal with ghost spam

Usually it is recommended to add the referral to an exclusion filter after it is spotted. Although this is useful for a quick action against the spam, it has three big disadvantages.

  • Making filters every week for every new spam source detected is tedious and time-consuming, especially if you manage many sites. Plus, by the time you apply the filter and it starts working, you already have some affected data.
  • Some spammers send direct visits along with the referrals. These direct hits won’t be stopped by a referral filter, so even if you are excluding the referral, you will still be receiving invalid traffic, which explains why some people have seen an unusual spike in direct traffic.

Luckily, there is a good way to prevent all these problems. Most of the (ghost) spam works by hitting random GA tracking IDs, meaning the offender doesn’t really know who the target is; for that reason, either the hostname is not set or a fake one is used. (See report below.)

Ghost-Spam.png

You can see that they use some weird names or don’t even bother to set one. Although there are some known names in the list, spammers can easily fake those as well.

On the other hand, valid traffic will always use a real hostname. In most cases, this will be your domain. But it can also come from paid services, translation services, or any other place where you’ve inserted your GA tracking code.

Valid-Referral.png

Based on this, we can make a filter that will include only hits that use real hostnames. This will automatically exclude all hits from ghost spam, whether it shows up as a referral, keyword, or pageview; or even as a direct visit.

To create this filter, you will need to find the report of hostnames. Here’s how:

  1. Go to the Reporting tab in GA
  2. Click on Audience in the lefthand panel
  3. Expand Technology and select Network
  4. At the top of the report, click on Hostname

Valid-list

You will see a list of all hostnames, including the ones that the spam uses. Make a list of all the valid hostnames you find, as follows:

  • yourmaindomain.com
  • blog.yourmaindomain.com
  • es.yourmaindomain.com
  • payingservice.com
  • translatetool.com
  • anotheruseddomain.com

For small to medium sites, this list of hostnames will likely consist of the main domain and a couple of subdomains. After you are sure you got all of them, create a regular expression similar to this one:

yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com

You don’t need to put all of your subdomains in the regular expression. The main domain will match all of them. If you don’t have a view set up without filters, create one now.
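Before pasting the expression into GA, you can sanity-check it against your hostname report. GA filter patterns use partial (unanchored) regex matching, which is exactly why subdomains match without being listed. The Python sketch below mimics that behavior with the example domains from above; the spam hostnames are just typical examples.

```python
import re

# The same expression you would paste into the GA filter pattern box.
pattern = re.compile(
    r"yourmaindomain\.com|anotheruseddomain\.com|"
    r"payingservice\.com|translatetool\.com"
)

hostnames = [
    "yourmaindomain.com",       # main domain: kept
    "blog.yourmaindomain.com",  # subdomain: matches without being listed
    "translatetool.com",        # third-party service carrying your tracking code
    "darodar.com",              # a typical ghost-spam hostname: excluded
    "(not set)",                # ghost spam with no hostname: excluded
]

# An INCLUDE filter keeps only hits whose hostname matches the pattern.
valid = [h for h in hostnames if pattern.search(h)]
```

If a legitimate hostname from your report fails this check, add it to the expression before creating the filter.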

Then create a Custom Filter.

Make sure you select INCLUDE, then select “Hostname” on the filter field, and copy your expression into the Filter Pattern box.

filter

You might want to verify the filter before saving to check that everything is okay. Once you’re ready, save it, and apply the filter to all the views you want (except the view without filters).

This single filter will get rid of future occurrences of ghost spam that use invalid hostnames, and it doesn’t require much maintenance. But it’s important that every time you add your tracking code to any service, you add it to the end of the filter.

Now you should only need to take care of the crawler spam. Since crawlers access your site, you can block them by adding these lines to the .htaccess file:

## STOP REFERRER SPAM
## (RewriteEngine On may already appear elsewhere in your .htaccess; it only needs to be set once)
RewriteEngine On
RewriteCond %{HTTP_REFERER} semalt\.com [NC,OR]
RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]
RewriteRule .* - [F]

It is important to note that this file is very sensitive, and misplacing a single character in it can bring down your entire site. Therefore, make sure you create a backup copy of your .htaccess file prior to editing it.

If you don’t feel comfortable messing around with your .htaccess file, you can alternatively make an expression with all the crawlers and add it to an exclude filter on Campaign Source.
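Either way, the exclude expression is built the same way as the include expression above, just in reverse. As a quick sanity check, here is a small Python sketch that mimics an exclude filter on Campaign Source, using the two crawler domains mentioned earlier:

```python
import re

# Expression for an EXCLUDE filter on Campaign Source.
crawler_pattern = re.compile(r"semalt\.com|buttons-for-website\.com")

sources = ["semalt.com", "buttons-for-website.com", "google", "moz.com"]

# An exclude filter drops hits whose source matches the pattern.
kept = [s for s in sources if not crawler_pattern.search(s)]
```

As new crawler domains show up, append them to the pattern with another `|` alternative, escaping dots as shown.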

Implement these combined solutions, and you will worry much less about spam contaminating your analytics data. This will have the added benefit of freeing up more time for you to spend actually analyzing your valid data.

After stopping spam, you can also get clean reports from the historical data by using the same expressions in an Advanced Segment to exclude all the spam.

Bonus resources to help you manage spam

If you still need more information to help you understand and deal with the spam on your GA reports, you can read my main article on the subject here: http://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/.


In closing, I am eager to hear your ideas on this serious issue. Please share them in the comments below.

(Editor’s Note: All images featured in this post were created by the author.)

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard.

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and others are SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which is a permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
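For what it’s worth, the status the transcript is reaching for is 503 Service Unavailable, ideally paired with a Retry-After header. As a hedged illustration, here is a minimal Python WSGI sketch of a maintenance mode that answers 503 instead of 404; the flag and retry window are placeholders, not a production pattern:

```python
# Minimal WSGI sketch of a "maintenance mode" that answers 503 instead of 404,
# so search crawlers know to come back later rather than dropping pages.
# MAINTENANCE and the retry window are illustrative placeholders.
MAINTENANCE = True

def app(environ, start_response):
    if MAINTENANCE:
        start_response(
            "503 Service Unavailable",
            [("Content-Type", "text/plain"),
             ("Retry-After", "3600")],  # ask crawlers to retry in an hour
        )
        return [b"Down for maintenance; back shortly."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello"]
```

The design point is simply that a 503 says “temporary, come back,” while a 404 says “this page is an error,” which is why Disney’s hour of 404s hurt them.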

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there.
  • Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics, connecting those up and trying to see: Why did we go up or down? Did we have fewer pages being indexed, more pages being indexed, more or fewer pages getting traffic, more or fewer keywords?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like in URL and in anchor, and da, da, da, da. In anchor doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of these. Like, come on. Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you, you have all sorts of errors, and you don’t know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or MozScape, or Ahrefs, Majestic, SEMrush, or the Alchemy API. Those APIs can have powerful things that they can do for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Why We Can’t Do Keyword Research Like It’s 2010 – Whiteboard Friday

Posted by randfish

Keyword Research is a very different field than it was just five years ago, and if we don’t keep up with the times we might end up doing more harm than good. From the research itself to the selection and targeting process, in today’s Whiteboard Friday Rand explains what has changed and what we all need to do to conduct effective keyword research today.

For reference, here’s a still of this week’s whiteboard.

What do we need to change to keep up with the changing world of keyword research?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about keyword research, why it’s changed from the last five, six years and what we need to do differently now that things have changed. So I want to talk about changing up not just the research but also the selection and targeting process.

There are three big areas that I’ll cover here. There’s lots more in-depth stuff, but I think we should start with these three.

1) The AdWords keyword tool hides data!

This is where almost all of us in the SEO world start and oftentimes end with our keyword research. We go to the AdWords Keyword Tool, what used to be the external keyword tool and is now the Keyword Planner inside AdWords. We go inside that tool, and we look at the volume that’s reported, and we sort of record that as, well, it’s not good, but it’s the best we’re going to do.

However, I think there are a few things to consider here. First off, that tool is hiding data. What I mean by that is not that they’re not telling the truth, but they’re not telling the whole truth. They’re not telling nothing but the truth, because those rounded-off numbers that you always see, you know that those are inaccurate. Anytime you’ve bought keywords, you’ve seen that the impression count never matches the count that you see in the AdWords tool. It’s not usually massively off, but it’s often off by a good degree, and the only thing it’s great for is telling relative volume from one term to another.

But because AdWords hides data essentially by saying like, “Hey, you’re going to type in . . .” Let’s say I’m going to type in “college tuition,” and Google knows that a lot of people search for how to reduce college tuition, but that doesn’t come up in the suggestions because it’s not a commercial term, or they don’t think that an advertiser who bids on that is going to do particularly well and so they don’t show it in there. I’m giving an example. They might indeed show that one.

But because that data is hidden, we need to go deeper. We need to go beyond and look at things like Google Suggest and related searches, which are down at the bottom. We need to start conducting customer interviews and staff interviews, which hopefully have always been part of your brainstorming process but really need to be now. Then you can apply that to AdWords. You can apply that to Suggest and related searches.

The beautiful thing is, once you gather these terms from places like forums, communities, and discussion boards, seeing what terms and phrases people are using, you can collect all this stuff up, plug it back into AdWords, and now they will tell you how much volume they’ve got. So you take that “how to lower college tuition” term, you plug it into AdWords, and they will show you a number, a non-zero number. They were just hiding it in the suggestions because they thought, “Hey, you probably don’t want to bid on that. That won’t bring you a good ROI.” So you’ve got to be careful with that, especially when it comes to SEO kinds of keyword research.

2) Building separate pages for each term or phrase doesn’t make sense

It used to be the case that we built separate pages for every single term and phrase that was in there, because we wanted to have the maximum keyword targeting that we could. So it didn’t matter to us that college scholarship and university scholarships were essentially people looking for exactly the same thing, just using different terminology. We would make one page for one and one page for the other. That’s not the case anymore.

Today, we need to group by the same searcher intent. If two searchers are searching for two different terms or phrases but both of them have exactly the same intent, they want the same information, they’re looking for the same answers, their query is going to be resolved by the same content, we want one page to serve those, and that’s changed up a little bit of how we’ve done keyword research and how we do selection and targeting as well.

3) Build your keyword consideration and prioritization spreadsheet with the right metrics

Everybody’s got an Excel version of this, because I think there’s just no awesome tool out there yet that everyone loves and that solves this problem for us, and Excel is very, very flexible. So we go into Excel, we put in our keyword and the volume, and a lot of times we almost stop there. We record keyword volume, then maybe value to the business, and then we prioritize.

What are all these new columns you’re showing me, Rand? Well, here is what I’m seeing sophisticated, modern SEOs, the more advanced agencies and the more advanced in-house practitioners, add to the keyword process.

Difficulty

A lot of folks have done this, but difficulty helps us say, “Hey, this has a lot of volume, but it’s going to be tremendously hard to rank.”

The difficulty score that Moz uses and attempts to calculate is a weighted average of the top 10 domain authorities. It also uses page authority, so it’s kind of a weighted stack out of the two. If you’re seeing very, very challenging pages, very challenging domains to get in there, it’s going to be super hard to rank against them. The difficulty is high. For all of these ones it’s going to be high because college and university terms are just incredibly lucrative.

That difficulty can help bias you against chasing after terms and phrases for which you are very unlikely to rank, at least early on. If you feel like, “Hey, I already have a powerful domain. I can rank for everything I want. I am the thousand-pound gorilla in my space,” great. Go after the difficulty of your choice, but this helps prioritize.
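As a rough illustration of the idea above, a difficulty score can be sketched as a weighted blend of the domain and page authorities of the current top 10 results. Moz’s exact formula isn’t public, so the position weighting and the 60/40 blend below are illustrative assumptions, not the real calculation:

```python
# Sketch of a keyword difficulty score: a weighted blend of the domain
# and page authorities of the top 10 ranking results. The 0.6/0.4 split
# and the extra weight on higher positions are assumptions for
# illustration only.

def difficulty(top10):
    """top10: list of (domain_authority, page_authority) tuples,
    ordered from position 1 to position 10, each on a 0-100 scale."""
    # Give higher-ranked results slightly more weight: position 1
    # counts for more than position 10.
    weights = [1.0 - 0.05 * i for i in range(len(top10))]
    total = sum(weights)
    da = sum(w * d for w, (d, _) in zip(weights, top10)) / total
    pa = sum(w * p for w, (_, p) in zip(weights, top10)) / total
    return 0.6 * da + 0.4 * pa  # assumed blend of the two averages

# A hypothetical SERP for a lucrative scholarship term: strong domains
# throughout, so difficulty comes out high.
serp = [(92, 85), (88, 80), (85, 76), (83, 70), (80, 68),
        (78, 65), (75, 60), (72, 58), (70, 55), (68, 50)]
print(round(difficulty(serp), 1))
```

The output lands near the top of the 0-100 scale, which matches the intuition that college and university terms are hard to crack.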

Opportunity

This is actually very rarely used, but I think sophisticated marketers are using it extremely intelligently. Essentially what they’re saying is, “Hey, if you look at a set of search results, sometimes there are two or three ads at the top instead of just the ones on the sidebar, and that’s biasing some of the click-through rate curve.” Sometimes there’s an instant answer or a Knowledge Graph or a news box or images or video, or all these kinds of things that search results can be marked up with, that are not just the classic 10 web results. Unfortunately, if you’re building a spreadsheet like this and treating every single search result like it’s just 10 blue links, well you’re going to lose out. You’re missing the potential opportunity and the opportunity cost that comes with ads at the top or all of these kinds of features that will bias the click-through rate curve.

So what I’ve seen some really smart marketers do is essentially build some kind of a framework to say, “Hey, you know what? When we see that there’s a top ad and an instant answer, we’re saying the opportunity if I was ranking number 1 is not 10 out of 10. I don’t expect to get whatever the average traffic for the number 1 position is. I expect to get something considerably less than that. Maybe something around 60% of that, because of this instant answer and these top ads.” So I’m going to mark this opportunity as a 6 out of 10.

There are 2 top ads here, so I’m giving this a 7 out of 10. This has two top ads and then it has a news block below the first position. So again, I’m going to reduce that click-through rate. I think that’s going down to a 6 out of 10.

You can get more and less scientific and specific with this. Click-through rate curves are imperfect by nature because we truly can’t measure exactly how those things change. However, I think smart marketers can make some good assumptions from general click-through rate data, which there are several resources out there on that to build a model like this and then include it in their keyword research.

This does mean that you have to run a query for every keyword you’re thinking about, but you should be doing that anyway. You want to get a good look at who’s ranking in those search results and what kind of content they’re building. If you’re running a keyword difficulty tool, you are already getting something like that.
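The opportunity scoring described above can be sketched as a simple deduction model: start from 10/10 for a “pure” page of ten blue links and subtract points for each SERP feature that siphons clicks away from the organic results. The specific penalty values below are illustrative assumptions tuned to the examples in the transcript, not measured click-through-rate impacts:

```python
# Sketch of opportunity scoring: 10/10 for ten plain blue links, with
# deductions per SERP feature. Penalty values are assumptions chosen to
# reproduce the transcript's examples, not real CTR measurements.

FEATURE_PENALTY = {
    "top_ad": 1.5,          # per ad block shown above the organic results
    "instant_answer": 2.5,
    "knowledge_graph": 2.0,
    "news_block": 1.0,
    "image_pack": 1.0,
    "video_results": 1.5,
}

def opportunity(features):
    """features: list of feature names observed on the SERP
    (repeat "top_ad" once per ad shown above the results)."""
    score = 10.0
    for f in features:
        score -= FEATURE_PENALTY.get(f, 0.0)
    return max(round(score), 0)

# The examples from the transcript:
print(opportunity(["top_ad", "instant_answer"]))        # -> 6 out of 10
print(opportunity(["top_ad", "top_ad"]))                # -> 7 out of 10
print(opportunity(["top_ad", "top_ad", "news_block"]))  # -> 6 out of 10
```

You can make the table more or less scientific, as noted above; the value is in applying it consistently across every keyword you research.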

Business value

This is a classic one. Business value is essentially saying, “What’s it worth to us if visitors come through with this search term?” You can get that from bidding through AdWords. That’s the most sort of scientific, mathematically sound way to get it. Then, of course, you can also get it through your own intuition. It’s better to start with your intuition than nothing if you don’t already have AdWords data or you haven’t started bidding, and then you can refine your sort of estimate over time as you see search visitors visit the pages that are ranking, as you potentially buy those ads, and those kinds of things.

You can get more sophisticated around this. I think a 10 point scale is just fine. You could also use a one, two, or three there, that’s also fine.

Requirements or Options

Then I don’t exactly know what to call this column. I can’t remember who showed me the spreadsheet that had it in there. I think they called it Optional Data or Additional SERPs Data, but I’m going to call it Requirements or Options. Requirements because this is essentially saying, “Hey, if I want to rank in these search results, am I seeing that the top two or three are all video? Oh, they’re all video. They’re all coming from YouTube. If I want to be in there, I’ve got to be video.”

Or something like, “Hey, I’m seeing that most of the top results have been produced or updated in the last six months. Google appears to be biasing to very fresh information here.” So, for example, if I were searching for “university scholarships Cambridge 2015,” well, guess what? Google probably wants to bias toward results that are either from the official page on Cambridge’s website or articles from this year about getting into that university and the scholarships that are available or offered. I saw that in these search results: for both the college and university scholarship terms, a significant number of the SERPs appeared to require a freshness bump. You can often spot this because the date is shown ahead of the description, and the date is very fresh, sometime in the last six months or a year.

Prioritization

Then finally I can build my prioritization. So based on all the data I had here, I essentially said, “Hey, you know what? These are not 1 and 2. This is actually 1A and 1B, because these are the same concepts. I’m going to build a single page to target both of those keyword phrases.” I think that makes good sense. Someone who is looking for college scholarships, university scholarships, same intent.

I am giving it a slight prioritization, 1A versus 1B, and the reason I do this is because I always have one keyword phrase that I’m leaning on a little more heavily. Because Google isn’t perfect around this, the search results will be a little different. I want to bias to one versus the other. In this case, since I’m targeting university more than college, my title tag might say something like “college and university scholarships” so that university and scholarships are nicely together, near the front of the title, that kind of thing. Then 1B, 2, 3.
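Pulling the columns together, the spreadsheet’s prioritization step can be sketched as a single score per keyword. The formula below is an illustrative assumption (there is no standard one); the point is simply that prioritization should weigh volume, difficulty, opportunity, and business value together rather than sorting on volume alone:

```python
# Sketch of the keyword spreadsheet's prioritization column. The scoring
# formula is an illustrative assumption: log-scaled volume, discounted
# by difficulty, multiplied by opportunity and business value.

import math

keywords = [
    # term, monthly volume, difficulty (0-100), opportunity (0-10), value (0-10)
    ("university scholarships", 12000, 84, 6, 9),
    ("college scholarships",    11000, 86, 6, 9),
    ("how to lower college tuition", 900, 41, 8, 7),
]

def priority(volume, difficulty, opportunity, value):
    # Log-scale volume so a huge head term doesn't swamp everything else.
    return math.log10(volume) * (100 - difficulty) * opportunity * value

ranked = sorted(keywords, key=lambda k: priority(*k[1:]), reverse=True)
for term, *metrics in ranked:
    print(f"{term}: {priority(*metrics):.0f}")
```

Note how the lower-volume, lower-difficulty long-tail term comes out on top here, which is exactly the kind of reprioritization the difficulty and opportunity columns are meant to produce.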

This is kind of the way that modern SEOs are building a more sophisticated process with better data, more inclusive data that helps them select the right kinds of keywords and prioritize to the right ones. I’m sure you guys have built some awesome stuff. The Moz community is filled with very advanced marketers, probably plenty of you who’ve done even more than this.

I look forward to hearing from you in the comments. I would love to chat more about this topic, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

Why Good Unique Content Needs to Die – Whiteboard Friday

Posted by randfish

We all know by now that not just any old content is going to help us rank in competitive SERPs. We often hear people talking about how it takes “good, unique content.” That’s the wrong bar. In today’s Whiteboard Friday, Rand talks about where we should be aiming, and how to get there.


Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about something that I really have a problem with in the SEO world, and that is the phrase “good, unique content.” I’ll tell you why this troubles me so much. It’s because I get so many emails, I hear so many times at conferences and events with people I meet, with folks I talk to in the industry saying, “Hey, we created some good, unique content, but we don’t seem to be performing well in search.” My answer back to that is always that is not the bar for entry into SEO. That is not the bar for ranking.

The content quality scale

So I made this content quality scale to help illustrate what I’m talking about here. You can see that it starts all the way up at 10x, and down here I’ve got “Panda Invasion.” That’s quality so low that Google Panda is coming for your site. It’s going to knock you out of the rankings. It’s going to penalize you, because your content is thin and largely useless.

Then you go up a little bit, and it’s like, well four out of five searchers find it pretty bad. They clicked the Back button. Maybe one out of five is thinking, “Well, this is all right. This solves my most basic problems.”

Then you get one level higher than that, and you have good, unique content, which I think many folks think of as where they need to get to. It’s essentially, hey, it’s useful enough. It answers the searcher’s query. It’s unique from any other content on the Web. If you read it, you wouldn’t vomit. It’s good enough, right? Good, unique content.

Problem is almost everyone can get here. They really can. It’s not a high bar, a high barrier to entry to say you need good, unique content. In fact, it can scale. So what I see lots of folks doing is they look at a search result or a set of search results in their industry. Say you’re in travel and vacations, and you look at these different countries and you’re going to look at the hotels or recommendations in those countries and then see all the articles there. You go, “Yeah, you know what, I think we could do something as good as what’s up there or almost.” Well, okay, that puts you in the range. That’s good, unique content.

But in my opinion, the minimum bar today for modern SEO is a step higher, and that is as good as the best in the search results on the search results page. If you can’t consistently say, “We’re the best result that a searcher could find in the search results,” well then, guess what? You’re not going to have an opportunity to rank. It’s much, much harder to get into those top 10 positions, page 1, page 2 positions than it was in the past because there are so many ranking signals that so many of these websites have already built up over the last 5, 10, 15 years that you need to go above and beyond.

Really, where I want folks to go and where I always expect content from Moz to go is here, and that is 10x, 10 times better than anything I can find in the search results today. If I don’t think I can do that, then I’m not going to try and rank for those keywords. I’m just not going to pursue it. I’m going to pursue content in areas where I believe I can create something 10 times better than the best result out there.

What changed?

Why is this? What changed? Well, a bunch of things actually.

  • User experience became a much bigger element in the ranking algorithms, and that’s direct influences, things that we’ve talked about here on Whiteboard Friday before like pogo-sticking, and lots of indirect ones like the links that you earn based on the user experience that you provide and Google rendering pages, Google caring about load speed and device rendering, mobile friendliness, all these kinds of things.
  • Earning links overtook link building. It used to be you put out a page and you built a bunch of links to it. Now that doesn’t so much work anymore because Google is very picky about the links that it’s going to consider. If you can’t earn links naturally, not only can you not get links fast enough and not get good ones, but you also are probably earning links that Google doesn’t even want to count or may even penalize you for. It’s nearly impossible to earn links with just good, unique content. If there’s something better out there on page one of the search results, why would they even bother to link to you? Someone’s going to do a search, and they’re going to find something else to link to, something better.
  • Third, the rise of content marketing over the last five, six years has meant that there’s just a lot more competition. This field is a lot more crowded than it used to be, with many people trying to get to a higher and higher quality bar.
  • Finally, as a result of many of these things, user expectations have gone crazy. Users expect pages to load insanely fast, even on mobile devices, even when their connection’s slow. They expect it to look great. They expect to be provided with an answer almost instantaneously. The quality of results that Google has delivered and the quality of experience that sites like Facebook, which everyone is familiar with, are delivering means that our brains have rewired themselves to expect very fast, very high quality results consistently.

How do we create “10x” content?

So, because of all these changes, we need a process. We need a process to choose, to figure out how we can get to 10x content, not good, unique content, 10x content. A process that I often like to use — this probably is not the only one, but you’re welcome to use it if you find it valuable — is to go, “All right, you know what? I’m going to perform some of these search queries.”

By the way, I would probably perform the search query in two places. One is in Google and their search results, and the other is actually in BuzzSumo, which I think is a great tool for this, where I can see the content that has been most shared. So if you haven’t already, check out BuzzSumo.com.

I might search for something like “Costa Rica ecolodges,” since I might be considering a Costa Rica vacation at some point in the future. I look at these top ranking results, probably the whole top 10, as well as the most shared content on social media.

Then I’m going to ask myself these questions:

  • What questions are being asked and answered by these search results?
  • What sort of user experience is provided? I look at this in terms of speed, in terms of mobile friendliness, in terms of rendering, in terms of layout and design quality, in terms of what’s required from the user to be able to get the information? Is it all right there, or do I need to click? Am I having trouble finding things?
  • What’s the detail and thoroughness of the information that’s actually provided? Is it lacking? Is it great?
  • What about use of visuals? Visual content can often take best in class all the way up to 10x if it’s done right. So I might check out the use of visuals.
  • The quality of the writing.
  • I’m going to look at information and data elements. Where are they pulling from? What are their sources? What’s the quality of that stuff? What types of information is there? What types of information is missing?

In fact, I like to ask, “What’s missing?” a lot.

From this, I can determine like, hey, here are the strengths and weaknesses of who’s getting all of the social shares and who’s ranking well, and here’s the delta between me and them today. This is the way that I can be 10 times better than the best results in there.

If you use this process or a process like this, and you do this type of content auditing and achieve this level of content quality, you have a real shot at rankings. One of the secret reasons for that is the effort axis I have here: at the bottom, “I go to Fiverr, I get a Panda invasion” or “I make the intern write it”; in the middle, “this is going to take a weekend to build”; and at the top, “there’s no way to scale this content.”

This is a super power. When your competitors or other folks in the field look and say, “Hey, there’s no way that we can scale content quality like this. It’s just too much effort. We can’t keep producing it at this level,” well, now you have a competitive advantage. You have something that puts you in a category by yourself and that’s very hard for competitors to catch up to. It’s a huge advantage in search, in social, on the Web as a whole.

All right everyone, hope you’ve enjoyed this edition of Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com


How to Combat 5 of the SEO World’s Most Infuriating Problems – Whiteboard Friday

Posted by randfish

These days, most of us have learned that spammy techniques aren’t the way to go, and we have a solid sense for the things we should be doing to rank higher, and ahead of our often spammier competitors. Sometimes, maddeningly, it just doesn’t work. In today’s Whiteboard Friday, Rand talks about five things that can infuriate SEOs with the best of intentions, why those problems exist, and what we can do about them.


What SEO problems make you angry?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about some of the most infuriating things in the SEO world, specifically five problems that I think plague a lot of folks and some of the ways that we can combat and address those.

I’m going to start with one of the things that really infuriates a lot of new folks in the field, especially folks who are building new and emerging sites and doing SEO on them. You have all of these best practices lists. You might look at a web developer’s cheat sheet or sort of a guide to on-page and on-site SEO. You go, “Hey, I’m doing it. I’ve got my clean URLs, my good, unique content, my solid keyword targeting, schema markup, useful internal links, my XML sitemap, and my fast load speed. I’m mobile friendly, and I don’t have manipulative links.”

Great. “Where are my results? What benefit am I getting from doing all these things, because I don’t see one?” I took a site that was not particularly SEO friendly, maybe it’s a new site, one I just launched or an emerging site, one that’s sort of slowly growing but not yet a power player. I do all this right stuff, and I don’t get SEO results.

This makes a lot of people stop investing in SEO, stop believing in SEO, and stop wanting to do it. I can understand where you’re coming from. But the challenge is not that you’ve done something wrong. It’s that this stuff, all of these things that you do right, especially the things you do right on your own site or from a best practices perspective, they don’t increase rankings. They don’t. That’s not what they’re designed to do.

1) Following best practices often does nothing for new and emerging sites

This stuff, all of these best practices are designed to protect you from potential problems. They’re designed to make sure that your site is properly optimized so that you can perform to the highest degree that you are able. But this is not actually rank boosting stuff unfortunately. That is very frustrating for many folks. So following a best practices list, the idea is not, “Hey, I’m going to grow my rankings by doing this.”

On the flip side, many folks do these things on larger, more well-established sites, sites that have a lot of ranking signals already in place. They’re bigger brands, they have lots of links to them, and they have lots of users and usage engagement signals. You fix this stuff. You fix stuff that’s already broken, and boom, rankings pop up. Things are going well, and more of your pages are indexed. You’re getting more search traffic, and it feels great. This is a challenge, on our part, of understanding what this stuff does, not a challenge on the search engine’s part of not ranking us properly for having done all of these right things.

2) My competition seems to be ranking on the back of spammy or manipulative links

What’s going on? I thought Google had introduced all these algorithms to shut this stuff down. This seems very frustrating. How are they pulling this off? I look at their link profile, and I see a bunch of directories, Web 2.0 sites — I love that the spam world decided that’s what “Web 2.0 sites” means — article sites, private blog networks, and dofollow blogs.

You look at this stuff and you go, “What is this junk? It’s terrible. Why isn’t Google penalizing them for this?” The answer, the right way to think about this and to come at this is: Are these really the reason that they rank? I think we need to ask ourselves that question.

One thing that we don’t know, that we can never know, is: Have these links been disavowed by our competitor here?

I’ve got my HulksIncredibleStore.com and their evil competitor Hulk-tastrophe.com. Hulk-tastrophe has got all of these terrible links, but maybe they disavowed those links and you would have no idea. Maybe they didn’t build those links. Perhaps those links came in from some other place. They are not responsible. Google is not treating them as responsible for it. They’re not actually what’s helping them.

If they are helping, and it’s possible they are, there are still instances where we’ve seen spam propping up sites. No doubt about it.

I think the next logical question is: Are you willing to lose your site or brand? What we almost never see anymore is sites like this, ranking on the back of these things with generally less legitimate links, lasting for two or three or four years. You can see it for a few months, maybe even a year, but this stuff is getting hit hard and getting hit frequently. So unless you’re willing to lose your site, pursuing their links is probably not a strategy.

Then ask: What other signals, potentially links you might not be considering, but also non-link signals, could be helping them rank? I think a lot of us in the SEO world get blinded by link signals, and we forget to look at things like: Do they have a phenomenal user experience? Are they growing their brand? Are they doing offline kinds of things that are influencing online? Are they gaining engagement from other channels that’s then influencing their SEO? Do they have things coming in that I can’t see? If you don’t ask those questions, you can’t really learn from your competitors, and you just feel the frustration.

3) I have no visibility or understanding of why my rankings go up vs down

On my HulksIncredibleStore.com, I’ve got my infinite stretch shorts, which I don’t know why he never wears — he should really buy those — my soothing herbal tea, and my anger management books. I look at my rankings and they kind of jump up all the time, jump all over the place all the time. Actually, this is pretty normal. I think we’ve done some analyses here, and the average page one search results shift is 1.5 or 2 position changes daily. That’s sort of the MozCast dataset, if I’m recalling correctly. That means that, over the course of a week, it’s not uncommon or unnatural for you to be bouncing around four, five, or six positions up, down, and those kind of things.

I think we should understand what can be behind these things. That’s a very simple list. You made changes, Google made changes, your competitors made changes, or searcher behavior has changed in terms of volume, in terms of what they were engaging with, what they’re clicking on, what their intent behind searches are. Maybe there was just a new movie that came out and in one of the scenes Hulk talks about soothing herbal tea. So now people are searching for very different things than they were before. They want to see the scene. They’re looking for the YouTube video clip and those kind of things. Suddenly Hulk’s soothing herbal tea is no longer directing as well to your site.

So changes like these can happen, and we can’t understand all of them. I think it’s up to us to determine the degree of analysis and action that will actually provide a return on investment. Looking at these day over day or week over week and throwing up our hands in frustration probably provides very little return. Looking over the long term and saying, “Hey, over the last 6 months, we can observe 26 weeks of ranking change data, and we can see that in aggregate we are now ranking higher and for more keywords than we were previously, so we’re going to continue pursuing this strategy. This is the set of keywords on which we’ve fallen the most, and here are the factors we’ve identified that are consistent across that group,” that kind of aggregate view can give us some real positive ROI. Comparing one or two keywords, one week versus the next, probably very little ROI.
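The aggregate view described above can be sketched as a comparison of the early and late halves of a multi-week window of ranking snapshots. The data shape here (weekly dictionaries of keyword-to-position, with hypothetical HulksIncredibleStore.com keywords) is an illustrative assumption:

```python
# Sketch of aggregate ranking analysis: instead of reacting to daily
# bounces, compare average rank and keywords-ranking counts between the
# first and second halves of a multi-week window.

from statistics import mean

def aggregate_trend(weekly_snapshots):
    """weekly_snapshots: list of dicts mapping keyword -> position,
    oldest week first. Returns (avg_rank_change, keyword_count_change);
    a negative rank change means we moved up on average."""
    half = len(weekly_snapshots) // 2
    early, late = weekly_snapshots[:half], weekly_snapshots[half:]

    def summarize(weeks):
        ranks = [pos for wk in weeks for pos in wk.values()]
        kw_counts = [len(wk) for wk in weeks]
        return mean(ranks), mean(kw_counts)

    early_rank, early_kws = summarize(early)
    late_rank, late_kws = summarize(late)
    return late_rank - early_rank, late_kws - early_kws

# Four weeks of hypothetical snapshots, oldest first.
weeks = [
    {"soothing herbal tea": 9, "anger management books": 14},
    {"soothing herbal tea": 11, "anger management books": 12},
    {"soothing herbal tea": 7, "anger management books": 10, "stretch shorts": 18},
    {"soothing herbal tea": 6, "anger management books": 9, "stretch shorts": 15},
]

rank_delta, kw_delta = aggregate_trend(weeks)
print(rank_delta, kw_delta)  # negative rank delta = ranking higher overall
```

With a 26-week window, the same comparison smooths out the daily one-or-two-position churn and shows whether the strategy is working in aggregate.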

4) I cannot influence or affect change in my organization because I cannot accurately quantify, predict, or control SEO

That’s true, especially with things like keyword not provided and certainly with the inaccuracy of data that’s provided to us through Google’s Keyword Planner inside of AdWords, for example, and the fact that no one can really control SEO, not fully anyway.

You get up in front of your team, your board, your manager, your client and you say, “Hey, if we don’t do these things, traffic will suffer,” and they go, “Well, you can’t be sure about that, and you can’t perfectly predict it. Last time you told us something, something else happened. So because the data is imperfect, we’d rather spend money on channels that we can perfectly predict, that we can very effectively quantify, and that we can very effectively control.” That is understandable. I think that businesses have a lot of risk aversion naturally, and so wanting to spend time and energy and effort in areas that you can control feels a lot safer.

Some ways to get around this are, first off, know your audience. If you know who you’re talking to in the room, you can often determine the things that will move the needle for them. For example, I find that many managers, boards, and executives are much more influenced by competitive pressures than they are by, “We won’t do as well as we did before,” or, “We’re losing out on this potential opportunity.” Saying that is less powerful than saying, “This competitor, who I know we care about and track ourselves against, is capturing this traffic, and here’s how they’re doing it.”

Show multiple scenarios. Many of the SEO presentations that I see and have seen and still see from consultants and from in-house folks come with kind of a single, “Hey, here’s what we predict will happen if we do this or what we predict will happen if we don’t do this.” You’ve got to show multiple scenarios, especially when you know you have error bars because you can’t accurately quantify and predict. You need to show ranges.

So instead of this, I want to see: What happens if we do it a little bit? What happens if we really overinvest? What happens if Google makes a much bigger change on this particular factor than we expect or our competitors do a much bigger investment than we expect? How might those change the numbers?

Then I really do like bringing case studies, especially if you’re a consultant, but even in-house there are so many case studies in SEO on the Web today that you can almost always find someone who’s analogous or nearly analogous and show some of their data, some of the results that they’ve seen. Places like SEMrush, a tool that offers competitive intelligence around rankings, can be great for that. You can show, hey, this media site in our sector made these changes. Look at the delta of keywords they were ranking for versus ours over the next six months. Correlation is not causation, but showing those kinds of things can be a powerful influencer.

Then last, but not least, any time you’re going to get up like this and present to a group around these topics, if you possibly can, try to talk one-on-one with the participants before the meeting actually happens. I have found it almost universally the case that if you get into a group setting without having had the discussions beforehand, the “What are your concerns? What do you think is not valid about this data? Hey, I want to run this by you and get your thoughts before we go to the meeting” conversations, people can gang up and pile on. One person says, “Hey, I don’t think this is right,” and everybody in the room kind of looks around and goes, “Yeah, I also don’t think that’s right.” Then it just turns into warfare and conflict that you don’t want or need. If you address those things beforehand, then you can include the data, the presentations, and the “I don’t know the answer to this, and I know this is important to so-and-so” in that presentation or discussion. It can be hugely helpful, the difference between winning and losing with that.

5) Google is biasing to big brands. It feels hopeless to compete against them

A lot of people are feeling this hopelessness, hopelessness in SEO about competing against them. I get that pain. In fact, I’ve felt that very strongly for a long time in the SEO world, and I think the trend has only increased. This comes from all sorts of stuff. Brands now have the little dropdown next to their search result listing. There are these brand and entity connections. As Google is using answers and knowledge graph more and more, it’s feeling like those entities are having a bigger influence on where things rank and where they’re visible and where they’re pulling from.

User and usage behavior signals on the rise means that big brands, who have more of those signals, tend to perform better. Brands in the knowledge graph, brands growing links without any effort, they’re just growing links because they’re brands and people point to them naturally. Well, that is all really tough and can be very frustrating.

I think you have a few choices on the table. First off, you can choose to compete with brands where they can’t or won’t. So this is areas like we’re going after these keywords that we know these big brands are not chasing. We’re going after social channels or people on social media that we know big brands aren’t. We’re going after user generated content because they have all these corporate requirements and they won’t invest in that stuff. We’re going after content that they refuse to pursue for one reason or another. That can be very effective.

You better be building, growing, and leveraging your competitive advantage. Whenever you build an organization, you’ve got to say, “Hey, here’s who is out there. This is why we are uniquely better or a uniquely better choice for this set of customers than these other ones.” If you can leverage that, you can generally find opportunities to compete and even to win against big brands. But those things have to become obvious, they have to become well-known, and you need to essentially build some of your brand around those advantages, or they’re not going to give you help in search. That includes media, that includes content, that includes any sort of press and PR you’re doing. That includes how you do your own messaging, all of these things.

Third, you can choose to serve a market or a customer that they don’t or won’t. That can be a powerful way to go about search, because usually search is bifurcated by the customer type. There will be slightly different forms of search queries that are entered by different kinds of customers, and you can pursue one of those that isn’t pursued by the competition.

Last, but not least, I think for everyone in SEO we all realize we’re going to have to become brands ourselves. That means building the signals that are typically associated with brands — authority, recognition from an industry, recognition from a customer set, awareness of our brand even before a search has happened. I talked about this in a previous Whiteboard Friday, but I think because of these things, SEO is becoming a channel that you benefit from as you grow your brand rather than the channel you use to initially build your brand.

All right, everyone. Hope these have been helpful in combating some of these infuriating, frustrating problems and that we’ll see some great comments from you guys. I hope to participate in those as well, and we’ll catch you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

How Much Has Link Building Changed in Recent Years?

Posted by Paddy_Moogan

I get asked this question a lot. It’s mainly asked by people who are considering buying my link building book and want to know whether it’s still up to date. This is understandable given that the first edition was published in February 2013 and our industry has a deserved reputation for always changing.

I find myself giving the same answer, even though I’ve been asked it probably dozens of times in the last two years—“not that much”. I don’t think this is solely due to the book itself standing the test of time, although I’ll happily take a bit of credit for that 🙂 I think it’s more a sign of our industry as a whole not changing as much as we’d like to think.

I started to question whether I was right, and honestly, it’s one of the reasons it has taken me over two years to release the second edition of the book.

So I posed this question to a group of friends not so long ago, some via email and some via a Facebook group. I was expecting to be called out by many of them because my position was that in reality, it hasn’t actually changed that much. The thing is, many of them agreed and the conversations ended with a pretty long thread with lots of insights. In this post, I’d like to share some of them, share what my position is and talk about what actually has changed.

My personal view

Link building hasn’t changed as much as we think it has.

The core principles of link building haven’t changed. The signals around link building have changed, but mainly around new machine learning developments that have indirectly affected what we do. One thing that has definitely changed is the mindset of SEOs (and now clients) towards link building.

I think the last big change to link building came in April 2012 when Penguin rolled out. This genuinely did change our industry and put to bed a few techniques that should never have worked so well in the first place.

Since then, we’ve seen some things change, but the core principles haven’t changed if you want to build a business that will be around for years to come and not run the risk of being hit by a link related Google update. For me, these principles are quite simple:

  • You need to deserve links – either an asset you create or your product
  • You need to put this asset in front of a relevant audience who have the ability to share it
  • You need consistency – one new asset every year is unlikely to cut it
  • Anything that scales is at risk

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years and it’s still going.

Let’s dive a bit deeper into all of this and I’ll talk about how this relates to link building.

The typical mindset for building links has changed

I think that most SEOs are coming round to the idea that you can’t get away with building low quality links any more, not if you want to build a sustainable, long-term business. Spammy link building still works in the short-term and I think it always will, but it’s much harder than it used to be to sustain websites that are built on spam. The approach is more “churn and burn” and spammers are happy to churn through lots of domains and just make a small profit on each one before moving onto another.

For everyone else, it’s all about the long-term and not putting client websites at risk.

This has led to many SEOs embracing different forms of link building and generally starting to use content as an asset when it comes to attracting links. A big part of me feels that it was actually Penguin in 2012 that drove the rise of content marketing amongst SEOs, but that’s a post for another day…! For today though, this goes some way towards explaining the trend we see below.

Slowly but surely, I’m seeing clients come to my company already knowing that low quality link building isn’t what they want. It’s taken a few years after Penguin for it to filter down to client / business owner level, but it’s definitely happening. This is a good thing but unfortunately, the main reason for this is that most of them have been burnt in the past by SEO companies who have built low quality links without giving thought to building good quality ones too.

I have no doubt that it’s this change in mindset which has led to trends like this:

The thing is, I don’t think this was by choice.

Let’s be honest. A lot of us used the kind of link building tactics that Google no longer like because they worked. I don’t think many SEOs were under the illusion that it was genuinely high quality stuff, but it worked and it was far less risky to do than it is today. Unless you were super-spammy, the low-quality links just worked.

Fast forward to a post-Penguin world, and things are far more risky. For me, it’s because of this that we see trends like the above. As an industry, we had the easiest link building methods taken away from us and we’re left with fewer options. One of the main options is content marketing which, if you do it right, can lead to good quality links and importantly, the types of links you won’t be removing in the future. Get it wrong and you’ll lose budget and lose the trust of your boss or client in the power of content when it comes to link building.

There are still plenty of other methods to build links and sometimes we can forget this. Just look at this epic list from Jon Cooper. Even with this many tactics still available to us, it’s hard work. Way harder than it used to be.

My summary here is that as an industry, our mindset has shifted but it certainly wasn’t a voluntary shift. If the tactics that Penguin targeted still worked today, we’d still be using them.

A few other opinions…

“I definitely think too many people want the next easy win. As someone surfing the edge of what Google is bringing our way, here’s my general take—SEO, in broad strokes, is changing a lot, *but* any given change is more and more niche and impacts fewer people. What we’re seeing isn’t radical, sweeping changes that impact everyone, but a sort of modularization of SEO, where we each have to be aware of what impacts our given industries, verticals, etc.”

Dr. Pete

 

“I don’t feel that techniques for acquiring links have changed that much. You can either earn them through content and outreach or you can just buy them. What has changed is the awareness of ‘link building’ outside of the SEO community. This makes link building / content marketing much harder when pitching to journalists and even more difficult when pitching to bloggers.

Link building has to be more integrated with other channels and struggles to work in its own environment unless supported by brand, PR and social. Having other channels supporting your link development efforts also creates greater search signals and more opportunity to reach a bigger audience which will drive a greater ROI.”

Carl Hendy

 

SEO has grown up in terms of more mature staff and SEOs becoming more ingrained into businesses so there is a smarter (less pressure) approach. At the same time, SEO has become more integrated into marketing and has made marketing teams and decision makers more intelligent in strategies and not pushing for the quick win. I’m also seeing that companies who used to rely on SEO and building links have gone through IPOs and the need to build 1000s of links per quarter has rightly reduced.

Danny Denhard

Signals that surround link building have changed

There is no question about this one in my mind. I actually wrote about this last year in my previous blog post where I talked about signals such as anchor text and deep links changing over time.

Many of the people I asked felt the same, here are some quotes from them, split out by the types of signal.

Domain level link metrics

I think domain level links have become increasingly important compared with page level factors, i.e. you can get a whole site ranking well off the back of one insanely strong page, even with sub-optimal PageRank flow from that page to the rest of the site.

Phil Nottingham

I’d agree with Phil here and this is what I was getting at in my previous post on how I feel “deep links” will matter less over time. It’s not just about domain level links here, it’s just as much about the additional signals available for Google to use (more on that later).

Anchor text

I’ve never liked anchor text as a link signal. I mean, who actually uses exact match commercial keywords as anchor text on the web?

SEOs. 🙂

Sure there will be natural links like this, but honestly, I struggle with the idea that it took Google so long to start turning down the dial on commercial anchor text as a ranking signal. They are starting to turn it down though, slowly but surely. Don’t get me wrong, it still matters and it still works. But like pure link spam, the barrier is a lot lower now in terms of what constitutes too much.

Rand feels that they matter more than we’d expect and I’d mostly agree with this statement:

Exact match anchor text links still have more power than you’d expect—I think Google still hasn’t perfectly sorted what is “brand” or “branded query” from generics (i.e. they want to start ranking a new startup like meldhome.com for “Meld” if the site/brand gets popular, but they can’t quite tell the difference between that and https://moz.com/learn/seo/redirection getting a few manipulative links that say “redirect”)

Rand Fishkin

What I do struggle with though, is that Google still haven’t figured this out and that short-term, commercial anchor text spam is still so effective. Even for a short burst of time.

I don’t think link building as a concept has changed loads—but I think links as a signal have, mainly because of filters and penalties but I don’t see anywhere near the same level of impact from coverage anymore, even against 18 months ago.

Paul Rogers

New signals have been introduced

It isn’t just about established signals changing though, there are new signals too and I personally feel that this is where we’ve seen the most change in Google algorithms in recent years—going all the way back to Panda in 2011.

With Panda, we saw a new level of machine learning where it almost felt like Google had found a way of incorporating human reaction / feelings into their algorithms. They could then run this against a website and answer questions like the ones included in this post. Things such as:

  • “Would you be comfortable giving your credit card information to this site?”
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?”
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?”

It is a touch scary that Google was able to run machine learning against answers to questions like this and write an algorithm to predict the answers for any given page on the web. They have though and this was four years ago now.

Since then, they’ve made various moves to utilize machine learning and AI to build out new products and improve their search results. For me, this was one of the biggest changes and it went pretty unnoticed by our industry, well, until Hummingbird came along. I feel pretty sure that we have Ray Kurzweil to thank for at least some of that.

There seems to be more weight on theme/topic related to sites, though it’s hard to tell if this is mostly link based or more user/usage data based. Google is doing a good job of ranking sites and pages that don’t earn the most links but do provide the most relevant/best answer. I have a feeling they use some combination of signals to say “people who perform searches like this seem to eventually wind up on this website—let’s rank it.” One of my favorite examples is the Audubon Society ranking for all sorts of birding-related searches with very poor keyword targeting, not great links, etc. I think user behavior patterns are stronger in the algo than they’ve ever been.

– Rand Fishkin

Leading on from what Rand has said, it’s becoming more and more common to see search results that just don’t make sense if you look at the link metrics—but are a good result.

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years and it’s still going.

Edit: since drafting this post, Tom Anthony released this excellent blog post on his views on the future of search and the shift to data-driven results. I’d recommend reading that as it approaches this whole area from a different perspective and I feel that an off-shoot of what Tom is talking about is the impact on link building.

You may be asking at this point, what does machine learning have to do with link building?

Everything. Because as strong as links are as a ranking signal, Google want more signals and user signals are far, far harder to manipulate than established link signals. Yes it can be done—I’ve seen it happen. There have even been a few public tests done. But it’s very hard to scale and I’d venture a guess that only the top 1% of spammers are capable of doing it, let alone maintaining it for a long period of time. When I think about the process for manipulation here, I actually think we go a step beyond spammers towards hackers and more cut-and-dried illegal activity.

For link building, this means that traditional methods of manipulating signals are going to become less and less effective as these user signals become stronger. For us as link builders, it means we can’t keep searching for that silver bullet or the next method of scaling link building just for an easy win. The fact is that scalable link building is always going to be at risk from penalization from Google—I don’t really want to live a life where I’m always worried about my clients being hit by the next update. Even if Google doesn’t catch up with a certain method, machine learning and user data mean that these methods may naturally become less effective and cost efficient over time.

There are of course other things such as social signals that have come into play. I certainly don’t feel like these are a strong ranking factor yet, but with deals like this one between Google and Twitter being signed, I wouldn’t be surprised if that ever-growing dataset is used at some point in organic results. The one advantage that Twitter has over Google is its breaking-news freshness. Twitter is still way quicker at breaking news than Google is—140 characters in a tweet is far quicker than Google News! Google know this, which is why I feel they’ve pulled this partnership back into existence after a couple of years apart.

There is another important point to remember here and it’s nicely summarised by Dr. Pete:

“At the same time, as new signals are introduced, these are layers not replacements. People hear social signals or user signals or authorship and want it to be the link-killer, because they already fucked up link-building, but these are just layers on top of on-page and links and all of the other layers. As each layer is added, it can verify the layers that came before it, and what you need isn’t the magic signal but a combination of signals that generally matches what Google expects to see from real, strong entities. So, links still matter, but they matter in concert with other things, which basically means it’s getting more complicated and, frankly, a bit harder. Of course, no one wants to hear that.”

– Dr. Pete

The core principles have not changed

This is the crux of everything for me. With all the changes listed above, the key is that the core principles around link building haven’t changed. I could even argue that Penguin didn’t change the core principles because the techniques that Penguin targeted should never have worked in the first place. I won’t argue this too much though because even Google advised website owners to build directory links at one time.

You need an asset

You need to give someone a reason to link to you. Many won’t do it out of the goodness of their heart! One of the most effective ways to do this is to develop a content asset and use this as your reason to make people care. Once you’ve made someone care, they’re more likely to share the content or link to it from somewhere.

You need to promote that asset to the right audience

I really dislike the stance that some marketers take when it comes to content promotion—build great content and links will come.

No. Sorry, but for the vast majority of us, that’s simply not true. The exceptions are people who skydive from space or have huge existing audiences to leverage.

You simply have to spend time promoting your content or your asset for it to get shares and links. It is hard work and sometimes you can spend a long time on it and get little return, but it’s important to keep working at it until you’re at a point where you have two things:

  • A big enough audience where you can almost guarantee at least some traffic to your new content along with some shares
  • Enough strong relationships with relevant websites who you can speak to when new content is published and stand a good chance of them linking to it

Getting to this point is hard—but that’s kind of the point. There are various hacks you can use along the way but it will take time to get right.

You need consistency

Leading on from the previous point: it takes time and hard work to get links to your content—the types of links that stand the test of time and that you’re not going to be removing in 12 months’ time anyway! This means that you need to keep pushing content out and getting better each and every time. This isn’t to say you should just churn content out for the sake of it, far from it. I am saying that with each piece of content you create, you will learn to do at least one thing better the next time. Try to give yourself the leverage to do this.

Anything scalable is at risk

Scalable link building is exactly what Google has been trying to crack down on for the last few years. Penguin was the biggest move and hit some of the most scalable tactics we had at our disposal. When you scale something, you often lose some level of quality, which is exactly what Google doesn’t want when it comes to links. If you’re still relying on tactics that could fall into the scalable category, I think you need to be very careful and just look at the trend in the types of links Google has been penalizing to understand why.

The part Google plays in this

To finish up, I want to briefly talk about the part that Google plays in all of this and shaping the future they want for the web.

I’ve always tried to steer clear of arguments involving the idea that Google is actively pushing FUD into the community. I’ve preferred to concentrate more on things I can actually influence and change with my clients rather than what Google is telling us all to do.

However, for the purposes of this post, I want to talk about it.

General paranoia has increased. My bet is there are some companies out there carrying out zero specific linkbuilding activity through worry.

Dan Barker

Dan’s point is a very fair one and just a day or two after reading this in an email, I came across a page related to a client’s target audience that said:

“We are not publishing guest posts on SITE NAME any more. All previous guest posts are now deleted. For more information, see www.mattcutts.com/blog/guest-blogging/.”

I’ve reworded this as to not reveal the name of the site, but you get the point.

This is silly. Honestly, so silly. They are a good site, publish good content, and had good editorial standards. Yet they have ignored all of their own policies, hard work, and objectives to follow a blog post from Matt. I’m 100% confident that it wasn’t sites like this one that Matt was talking about in this blog post.

This is, of course, from the publishers’ angle rather than the link builders’ angle, but it does go to show the effect that statements from Google can have. Google know this so it does make sense for them to push out messages that make their jobs easier and suit their own objectives—why wouldn’t they? In a similar way, what did they do when they were struggling to classify at scale which links are bad vs. good and they didn’t have a big enough web spam team? They got us to do it for them 🙂

I’m mostly joking here, but you see the point.

The most recent infamous mobilegeddon update, discussed here by Dr. Pete, is another example of Google pushing out messages that ultimately scared a lot of people into action. Although to be fair, I think that despite the apparent small impact so far, the broad message from Google is a very serious one.

Because of this, I think we need to remember that Google does have their own agenda and many shareholders to keep happy. I’m not in the camp of believing everything that Google puts out is FUD, but I’m much more sensitive and questioning of the messages now than I’ve ever been.

What do you think? I’d love to hear your feedback and thoughts in the comments.


Why the Links You’ve Built Aren’t Helping Your Page Rank Higher – Whiteboard Friday

Posted by randfish

Link building can be incredibly effective, but sometimes a lot of effort can go into earning links with absolutely no improvement in rankings. Why? In today’s Whiteboard Friday, Rand shows us four things we should look at in these cases, helping us hone our link building skills and make the process more effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about why link building sometimes fails.

So I’ve got an example here. I’m going to do a search for artificial sweeteners. Let’s say I’m working for these guys, ScienceMag.org. Well, this is actually in position 10. I put it in position 3 here, but I see that I’m in position 10. I think to myself, “Man, if I could get higher up on this page, that would be excellent. I’ve already produced the content. It’s on my domain. Like, Google seems to have indexed it fine. It’s performing well enough to appear on page one, granted at the bottom of page one, for this competitive query. Now I want to move my rankings up.”

So a lot of SEOs, naturally and historically, for a long time have thought, “I need to build more links to that page. If I can get more links pointing to this page, I can move up the rankings.” Granted, there are some other ways to do that too, and we’ve discussed those in previous Whiteboard Fridays. But links are one of the big ones that people use.

I think one of the challenges that we encounter is sometimes we invest that effort. We go through the process of that outreach campaign, talking to bloggers and other news sites and looking at where our link sources are coming from and trying to get some more of those. It just doesn’t seem to do anything. The link building appears to fail. It’s like, man, I’ve got all these nice links and no new results. I didn’t move up at all. I am basically staying where I am, or maybe I’m even falling down. Why is that? Why does link building sometimes work so well and so clearly and obviously, and sometimes it seems to do nothing at all?

What are some possible reasons link acquisition efforts may not be effective?

Oftentimes if you get a fresh set of eyes on it, an outside SEO perspective, they can do this audit, and they’ll walk through a lot of this stuff and help you realize, “Oh yeah, that’s probably why.” These are things that you might need to change strategically or tactically as you approach this problem. But you can do this yourself as well by looking at why a link building campaign, why a link building effort, for a particular page, might not be working.

1) Not the right links

First one, it’s not the right links. By “not the right links,” I mean a wide range of things, even broader than what I’ve listed here. But a lot of times that could mean low domain diversity. Yeah, you’re getting new links, but they’re coming from all the same places that you always get links from. Google, potentially, maybe views that as not particularly worthy of moving you up the rankings, especially around competitive queries.

It might be trustworthiness of source. So maybe they’re saying “Yeah, you got some links, but they’re not from particularly trustworthy places.” Tied into that maybe we don’t think or we’re sure that they’re not editorial. Maybe we think they’re paid, or we think they’re promotional in some way rather than being truly editorially given by this independent resource.

They might not come from a site or from a page that has the authority that’s necessary to move you up. Again, particularly for competitive queries, sometimes low-value links are just that. They’re not going to move the needle, especially not like they used to three, four, five or six years ago, where really just a large quantity of links, even from diverse domains, even if they were crappy links on crappy pages on relatively crappy or unknown websites would move the needle, not so much anymore. Google is seeing a lot more about these things.

Where else does the source link to? Is that source pointing to other stuff that is potentially looking manipulative to Google and so they discounted the outgoing links from that particular domain or those sites or those pages on those sites?

They might look at the relevance and say, “Hey, you know what? Yeah, you got linked to by some technology press articles. That doesn’t really have anything to do with artificial sweeteners, this topic, this realm, or this region.” So you’re not getting the same result. Now we’ve shown that off-topic links can oftentimes move the rankings, but in particular areas (and health, in fact, may be one of those) Google might be more topically sensitive to where the links are coming from than in other places.

Location on page. So I’ve got a page here and maybe all of my links are coming from a bunch of different domains, but it’s always in the right sidebar and it’s always in this little feed section. So Google’s saying, “Hey, that’s not really an editorial endorsement. That’s just them showing all the links that come through your particular blog feed or a subscription that they’ve got to your content or whatever it is promotionally pushing out. So we’re not going to count it that way.” Same thing a lot of times with footer links. Doesn’t work quite as well. If you’re being honest with yourself, you really want those in content links. Generally speaking, those tend to perform the best.

Or uniqueness. So they might look and they might say, “Yeah, you’ve got a ton of links from people who are republishing your same article and then just linking back to it. That doesn’t feel to us like an editorial endorsement, and so we’re just going to treat those copies as if those links didn’t exist at all.” But the links themselves may not actually be the problem. I think this can be a really important topic if you’re doing link acquisition auditing, because sometimes people get too focused on, “Oh, it must be something about the links that we’re getting.” That’s not always the case actually.

2) Not the right content

Sometimes it’s not the right content. So that could mean things like it’s temporally focused versus evergreen. So for different kinds of queries, Google interprets the intent of the searchers to be different. So it could be that when they see a search like “artificial sweeteners,” they say, “Yeah, it’s great that you wrote this piece about this recent research that came out. But you know what, we’re actually thinking that searchers are going to want in the top few results something that’s evergreen, that contains all the broad information that a searcher might need around this particular topic.”

That speaks to it might not answer the searchers’ questions. You might think, “Well, I’m answering a great question here.” The problem is, yeah, you’re answering one. Searchers may have many questions that they’re asking around a topic, and Google is looking for something comprehensive, something that doesn’t mean a searcher clicks your result and then says, “Well, that was interesting, but I need more from a different result.” They’re looking for the one true result, the one true answer that tells them, “Hey, this person is very happy with these types of results.”

It could be poor user experience causing people to bounce back. That could be speed things, UI things, layout things, browser support things, multi-device support things. It might not use language formatting or text that people or engines can interpret as on the topic. Perhaps this is way over people’s heads, far too scientifically focused, most searchers can’t understand the language, or the other way around. It’s a highly scientific search query and a very advanced search query and your language is way dumbed down. Google isn’t interpreting that as on-topic. All the Hummingbird and topic modeling kind of things that they have say this isn’t for them.

Or it might not match the expectations of searchers. This is distinct and different from searchers' questions. A searcher's question is, "I want to know how artificial sweeteners might affect me." Expectations might be, "I expect to learn this kind of information. I expect to find out these things." For example, if you go down a rabbit hole of "artificial sweeteners will make your skin shiny," searchers are like, "Well, that doesn't meet my expectation. I don't think that's right." Even if you have some data around that, it's not what they were expecting to find. They might bounce back, engines might not interpret you as on-topic, etc. So there are lots of content kinds of things.

3) Not the right domain

Then there are also domain issues. You might not have the right domain. Your domain might not be associated with the topic or content that Google and searchers are expecting. So they see Mayo Clinic, they see MedicineNet, and they go, "ScienceMag? Do they do health information? I don't think they do. I'm not sure if that's an appropriate one." It might be perceived as spammy or manipulative by Google, more probably than by searchers, even if it isn't. Or searchers just won't click your brand for that content. This is a very frustrating one, because we have seen a ton of times when search behavior is biased by the brand itself, by what's in this green text here, the domain name or the brand name that Google might show there. That's very frustrating, but it means that you need to build brand affinity between that topic, that keyword, and what's in searchers' heads.

4) Accessibility or technical issues

Then finally, there could be some accessibility or technical issues. Usually when that's the case, you will notice pretty easily, because the page will have an error, it won't show the content properly, or the cached version will be an issue. That's a rare one, but you might want to check for it as well.
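If you want to check for these technical issues in bulk rather than page by page, a quick script can help. Here's a minimal sketch (not from the video; the `audit_page` and `fetch_and_audit` helper names, the `expected_phrase` parameter, and the user-agent string are my own assumptions) that flags pages returning a non-200 status or missing their key content in the raw HTML, which is a rough proxy for what a crawler sees before any JavaScript runs:

```python
# Hypothetical sketch: flag landing pages with the kinds of technical
# issues described above (errors, content not showing properly).
from urllib.request import urlopen, Request
from urllib.error import URLError, HTTPError


def audit_page(html, status, expected_phrase):
    """Return a list of problems found for one fetched page."""
    problems = []
    if status != 200:
        problems.append(f"non-200 status: {status}")
    if expected_phrase.lower() not in html.lower():
        # Content absent from raw HTML may be rendered only client-side,
        # which some crawlers will not see.
        problems.append("expected content missing from raw HTML")
    return problems


def fetch_and_audit(url, expected_phrase):
    """Fetch a URL and audit it; network failures count as problems."""
    try:
        req = Request(url, headers={"User-Agent": "audit-sketch"})
        resp = urlopen(req, timeout=10)
        html = resp.read().decode("utf-8", "replace")
        return audit_page(html, resp.status, expected_phrase)
    except HTTPError as e:
        return [f"HTTP error: {e.code}"]
    except URLError as e:
        return [f"unreachable: {e.reason}"]
```

Running `fetch_and_audit` over the landing pages in your link campaign and printing any non-empty result lists gives you a fast first pass before digging into caching or rendering problems manually.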

But hopefully, using this kind of an audit system, you can figure out why a link building campaign, a link building effort isn’t working to move the needle on your rankings.

With that, we will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

