What Do You Do When You Lose Organic Traffic to Google SERP Features?

Posted by Emily.Potter

Google’s increasing dominance of their own search engine results pages (SERPs) has kicked up a lot of panic and controversy in the SEO industry. As Barry Adams pointed out on Twitter recently, this move by Google is not exactly new, but it does feel like Google has suddenly placed their foot on the accelerator:

Follow that Twitter thread and you’ll see the sort of back-and-forth these changes have started to create. Is this an ethical move by Google? Did you deserve the business they’re taking in the first place? Will SEO soon be dead? Or can we do what we’ve always done and adapt our strategies in smart, agile ways?

It’s hard to stay positive when Google takes a stab at you, as it did with this move on Ookla:

But regardless of how you feel about what’s happening, local packs, featured snippets, and SERP features promoting Google properties like News, Images, Flights, Videos, and Maps are riding a train that has no plans of stopping.

To give you an idea of how rapidly these changes are occurring, the image below shows what the SERP rankings looked like in November 2016 for one of our client’s key head terms:

And this image is the SERP for the same keyword by early December 2017 (our client is in green):

Check out MozCast’s Feature Graph if you want to see the percentage of queries specific features are appearing on.

Who is this blog post for?

You’re likely reading this blog post because you noticed your organic traffic has dropped and you suspect it could be Google tanking you.

Traffic drops tend to come about from four main causes: a drop in rankings, a decrease in search volume, ranking for fewer keywords, or SERP features and/or advertising depressing your CTRs.

If you have not already done a normal traffic drop analysis and ruled out the first three causes, then your time is better spent doing that first. But if you have done a traffic drop analysis and reached the conclusion that you’re likely to be suffering from a change in SERP features, then keep reading.

But I’m too lazy to do a full analysis

Aside from ruling everything else out, another strong indication that SERP features are to blame is a significant drop in clicks in Google Search Console (either broadly or especially for specific queries) while average ranking stays static and impressions hold nearly constant.

I’ll keep harping on about this point, but make sure that you check clicks vs impressions for both mobile and desktop. Do this both broadly and for specific key head terms.

When you spend most of your day working on a desktop computer, it’s easy to forget how much mobile actually dominates the scene. On desktop, the impact these features have on traffic is not as drastic; but on a mobile device, it’s not uncommon for organic listings to appear only after around four full scrolls.

From there, the steps to dealing with a Google-induced traffic drop are roughly as follows:

  1. Narrow down your traffic drop to the introduction of SERP features or an increase in paid advertising
  2. Figure out what feature(s) you are being hit by
  3. Gain hard evidence from SEO tools and performance graphs
  4. Adapt your SEO strategy accordingly

That covers step one, so let’s move on.

Step 2.0: Figure out which feature(s) you are being hit by

For a comprehensive list of all the different enhanced results that appear on Google, Overthink Group has documented them here. To figure out which one is impacting you, follow the steps below.

Step 2.1

Based on your industry, you probably already have an idea of which features you’re most vulnerable to.

  • Are you an e-commerce website? Google Shopping and paid advertising will be a likely candidate.
  • Do you tend to generate a lot of blog traffic? Look at who owns the featured snippets on your most important queries.
  • Are you a media company? Check and see if you are getting knocked out of top news results.
  • Do you run a listings site? Maybe you’re being knocked by sponsored listings or Google Jobs.

Step 2.2

From there, sanity-check this by spot-checking the SERPs for a couple of the keywords you’re concerned about to get a sense of what changed. If you roughly know what you’re looking for when you dig into the data, it will be easier to spot. This works well for SERP features, but a change in the amount of paid advertising will be harder to spot this way.

Once again, be sure to do this on both mobile and desktop. What may look insignificant from your office computer screen could be showing you a whole different story on your mobile device.

Step 3.0: Gain hard evidence from SEO tools and performance graphs

Once you have a top level idea of what has changed, you need to confirm it with SEO tools. If you have access to one, a historical rank tracking tool will be the most efficient way to dig into how your SERPs are evolving. I most frequently use STAT, but other great tools for this are Moz’s SERP features report, SEOmonitor, and SEMRush.

Using one of these tools, look back at historical data (either broadly or for specific important keywords) and find the date the SERP feature appeared if you can. Once you have this date, line it up with a dip in your organic traffic or other performance metric. If there’s a match, you can be pretty confident that’s to blame.

For example, here’s what this analysis looked like for one of our clients on a keyword with a regional search volume of 49,500. They were hit hard on mobile, first by the appearance of a local pack, then by an events snippet 10 days later.

This was the clicks and impression data for the head term on mobile from Google Search Console:

This case is another strong reminder that when you’re analyzing these changes, you must check both mobile and desktop. Features like knowledge panels are much more intrusive on mobile devices than on desktop, so while you may not see a dramatic change in your desktop traffic, you may on mobile.

For this client we improved their structured data so that they showed up in the event snippet instead, and were able to recover a good portion of the lost traffic.
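As an illustration of the kind of structured data involved, event markup is typically added as JSON-LD using schema.org’s Event type. The sketch below uses entirely made-up names, dates, and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Conference 2018",
  "startDate": "2018-11-02T19:00",
  "endDate": "2018-11-02T23:00",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Example Street",
      "addressLocality": "London",
      "postalCode": "EC1A 1AA",
      "addressCountry": "GB"
    }
  },
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/tickets",
    "price": "30",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Google’s testing tools will flag missing or malformed fields, so validate markup like this before relying on it to surface you in an events snippet.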

How to adapt your SEO strategy

You may not be able to fully recover, but here are some strategies you can use, depending on the SERP feature.

Have you tried bidding to beat Google?

I cover what to do if you’re specifically losing out on organic traffic due to paid advertising (spoiler alert: you’re probably gonna have to pay), but paid advertising can also be used as a tactic to subvert Google SERP features.

For example, Skyscanner has done this by bidding on the query “flights” so they appear above the Google Flights widget:

Accelerated Mobile Pages (AMP)

AMP is a project sponsored by Google to improve the speed of mobile pages. For a lot of these challenges, implementing AMP may be a way to improve your rankings as Google SERPs continue to change.

If you’ve noticed a number of websites with AMP implemented are ranking on the first page of SERPs you care about, it’s likely worth investigating.

If you are a news website, implementing AMP is absolutely a must.
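A valid AMP page follows a fixed skeleton defined by the AMP project. The sketch below shows the required shape (the URLs and title are placeholders; the mandatory `amp-boilerplate` style block is a fixed snippet that should be copied verbatim from the AMP documentation rather than typed from memory):

```html
<!doctype html>
<html ⚡>
  <head>
    <meta charset="utf-8">
    <!-- The AMP runtime, loaded asynchronously, is required on every AMP page -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <title>Example article</title>
    <!-- Canonical link back to the regular (non-AMP) version of the page -->
    <link rel="canonical" href="https://www.example.com/article.html">
    <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
    <!-- Required amp-boilerplate <style> block omitted here for brevity;
         copy it exactly from the official AMP spec -->
  </head>
  <body>
    <h1>Example article</h1>
    <p>Article content goes here.</p>
  </body>
</html>
```

The AMP validator (available in the browser and as a CLI) will reject pages that deviate from this structure, so run it before expecting AMP treatment in the SERPs.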

Featured snippets and PAA boxes

If you’re losing traffic because one of your competitors owns the featured snippets on your SERPs, then you need to optimize your content to win them back. I’ve already written a post on the Distilled blog about tactics for stealing featured snippets, which you can read here.

In summary, though, you have a chance to win a featured snippet if:

  • The ones you’re targeting are pretty volatile or frequently changing hands, as that’s a good indication the owner doesn’t have a strong hold on it
  • If you rank higher than the current owner, as this indicates Google prefers your page; the structure of your content simply needs some tweaking to win the snippet

If you’ve identified some featured snippets you have a good chance of stealing, compare what the current owner has done with their content that you haven’t. Typically it’s things like the text heading the block of content and the format of the content that differentiates a featured snippet owner from your content.
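To make the formatting point concrete, here’s a hedged sketch of the kind of on-page structure that tends to map onto paragraph and list snippets — a question phrased as a heading, a concise summary directly beneath it, and clean numbered steps (the topic and copy are invented for illustration):

```html
<!-- Question phrased as a heading, matching the query -->
<h2>How do I change a flat tire?</h2>

<!-- A concise 40-60 word answer directly under the heading gives
     Google an easily liftable paragraph snippet -->
<p>To change a flat tire, pull over somewhere safe and level, loosen the
lug nuts, jack up the car, swap on the spare, and tighten the nuts in a
star pattern before lowering the car.</p>

<!-- Clean numbered steps map neatly onto list-format featured snippets -->
<ol>
  <li>Pull over to a safe, level spot and turn on your hazards.</li>
  <li>Loosen the lug nuts before jacking up the car.</li>
  <li>Jack up the car and remove the flat.</li>
  <li>Mount the spare and hand-tighten the nuts.</li>
  <li>Lower the car and fully tighten the nuts in a star pattern.</li>
</ol>
```

Comparing your page’s heading and content format against the current snippet owner’s, as described above, usually reveals which of these structural elements you’re missing.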

Local packs

At SearchLove London 2018, Rob Bucci shared data from STAT on local packs and search intent. Local SEO is a big area that I can’t cover fully here, but if you’re losing traffic because a local pack has appeared that you’re not featured in, then you need to try to optimize your Google My Business listing for the local pack if you can. For more in-depth instruction on how to get featured in a local pack, read here.

Unfortunately, it may just not be possible for you to be featured, but if it’s a query you have a chance of appearing in the local pack for, you first need to get set up on Google My Business with a link to your website.

Once you have Google My Business set up, make sure the contact and address information is correct.

Reviews are incredibly important for anyone competing within a local pack: not just high ratings, but also the number of reviews you’ve received. You should also consider creating Google Posts. In a lot of spaces this feature has yet to be taken advantage of, which means you may be able to get a jumpstart on your competitors.
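While the local pack itself is driven by your Google My Business listing, keeping matching LocalBusiness structured data on your site helps keep your name, address, and phone details consistent across Google’s sources. A sketch with entirely fictional business details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://www.example.com/",
  "telephone": "+44-20-7946-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 Example Lane",
    "addressLocality": "London",
    "postalCode": "N1 9GU",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Sa 08:00-18:00",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "182"
  }
}
</script>
```

The key discipline is that the name, address, and phone number here match your Google My Business listing exactly; mismatched NAP data is a common local SEO problem.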

Paid advertising

More queries are seeing paid advertisements now, and more ads are appearing per query, as this Moz post details.

If you’re losing traffic because a competitor has set up a PPC campaign and started to bid on keywords you’re ranking well for, then you may need to consider overbidding on these queries if they’re important to you.

Unfortunately, there’s no real secret here: either you gotta pay or you’re going to have to shift your focus to other target queries.

If you haven’t already included structured data on your website, you need to, as it will help you stand out on SERPs with lots of advertising. Wrapped into this is the need to get good reviews for your brand and for your products.

Google Shopping

Similar to paid advertising, if the appearance of Google Shopping sponsored ads has taken over your SERPs, you should consider whether it’s worth you building your own Google Shopping campaign.

Again, structured data will be an important tactic to employ here as well. If you’re competing with Google Shopping ads, you’re competing with product listings that have images, prices, and reviews directly in the SERP results to draw in users. You should have the same.
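For a product page, that usually means Product structured data carrying the image, price, and rating that Shopping ads display. A hedged JSON-LD sketch with hypothetical product details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "image": "https://www.example.com/images/trail-shoe.jpg",
  "description": "Lightweight trail running shoe with a grippy outsole.",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "213"
  }
}
</script>
```

With this in place, your organic listing can carry the same price, star-rating, and stock signals that the sponsored results use to draw the eye.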

Look into implementing your pages in Accelerated Mobile Pages (AMP), a project sponsored by Google. Not only has Google shown that it favors AMP pages, but better site speed will also lead to better conversion rates for your site.

To see if implementing AMP may be beneficial to your business, you can read some case studies of other businesses that have done so here.

Knowledge panels and carousels

Knowledge panels such as the one below appear for broad informational searches, and rarely on highly converting keywords. While they are arguably the most imposing of all the SERP features, unless you’re a content site or CelebrityNetWorth.com, they probably only steal some of your less valuable traffic.

If you’re losing clicks to knowledge panels, it’s likely happening on queries that can be satisfied by quick answers, and therefore from users who might have bounced from your site anyway. You won’t beat a knowledge panel at quick answers, but you can optimize your content to satisfy related longer-tail queries for which users will still scroll down to the organic listings.

Create in-depth content that answers these questions and make sure that you have strong title tags and meta descriptions for these pages so you can have a better chance of standing out in the SERP.

In some cases, knowledge panels may be something you can exploit for your branded search queries. There’s no guaranteed way to get your content featured in a knowledge panel, and the information presented in them does not come from your site, so they can’t be “won” in the same way as a featured snippet.

To get into a knowledge panel, you can try using structured data markup or try to get your brand on Wikipedia if you haven’t already. The Knowledge Graph relies heavily on existing databases like Wikipedia that users directly contribute to, so developing more Wikipedia articles for your brand and any personal brands associated with it can be one avenue to explore.
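The structured data markup usually suggested for this purpose is Organization markup with `sameAs` links pointing Google at your Wikipedia article and official profiles. A sketch with invented names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Ltd",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://en.wikipedia.org/wiki/Example_Ltd",
    "https://twitter.com/example",
    "https://www.facebook.com/example"
  ]
}
</script>
```

This doesn’t guarantee a panel, as noted above, but it helps Google connect your site to the entities it already has in the Knowledge Graph.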

Search Engine Journal has some tips on how to implement both of these strategies and more in their blog post here.

Google Jobs

Google Jobs has taken up huge amounts of organic real estate from listing sites. It will be tough to compete, but there are strategies you can employ, especially if you run a niche job board site.

Shifting your digital strategy to integrate more paid advertising (so you can sit above Google) and to generate content in other areas, like news websites and advice boards, can help.

For more details on how to employ some of these strategies, you can read Search Engine Journal’s Google Jobs survival tips.
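One further route worth noting: if you can’t beat the Google Jobs widget, your individual listings can at least appear inside it, which is driven by JobPosting structured data on each listing page. A hedged sketch with hypothetical role and salary details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "JobPosting",
  "title": "Junior Front-End Developer",
  "datePosted": "2018-11-01",
  "validThrough": "2018-12-01",
  "employmentType": "FULL_TIME",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Ltd",
    "sameAs": "https://www.example.com/"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Manchester",
      "addressCountry": "GB"
    }
  },
  "baseSalary": {
    "@type": "MonetaryAmount",
    "currency": "GBP",
    "value": {
      "@type": "QuantitativeValue",
      "value": 28000,
      "unitText": "YEAR"
    }
  }
}
</script>
```

Appearing inside the widget isn’t the same as winning back your old blue link, but for a job board it keeps you in the click path rather than out of it entirely.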

To conclude

Look, I’d be lying to you if I said this was good news for us SEOs. It’s not. Organic is going to get more and more difficult. But it’s not all doom and gloom. As Rand Fishkin noted in his BrightonSEO speech this September, if we create intelligent SEO strategies with an eye towards the future, then we have the opportunity to be ahead of the curve when the real disruption hits.

We also need to start integrating our SEO strategies with other mediums; we need to be educated on optimizing for social media, paid advertising, and other tactics for raising brand awareness. The more adaptable and diverse your online marketing strategies are, the better.

Google will always be getting smarter, which just means we have to get smarter too.

To quote Jayson DeMers,

“If you define SEO as the ability to manipulate your way to the top of search rankings, then SEO will die. But if you define SEO as the practice of improving a website’s visibility in search results, then SEO will never die; it will only continue to evolve.”

Search, like nearly every other industry today, will continue to come against dramatic unanticipated changes in the future. Yet search will also only continue to grow in importance. It may become increasingly more difficult to manipulate your way to the top of search results, but there will always be a need to try, and Google will continue to reward content that serves its users well.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 1 month ago from tracking.feedpress.it

The Meta Referrer Tag: An Advancement for SEO and the Internet

Posted by Cyrus-Shepard

The movement to make the Internet more secure through HTTPS brings several useful advancements for webmasters. In addition to security improvements, HTTPS promises future technological advances and potential SEO benefits for marketers.

HTTPS in search results is rising. Recent MozCast data from Dr. Pete shows nearly 20% of first page Google results are now HTTPS.

Sadly, HTTPS also has its downsides.

Marketers run into their first challenge when they switch regular HTTP sites over to HTTPS. Technically challenging, the switch typically involves routing your site through a series of 301 redirects. Historically, these types of redirects have been associated with a loss of link equity (thought to be around 15%), which can lead to a loss in rankings. This can offset any SEO advantage that Google claims for switching.

Ross Hudgens perfectly summed it up in this tweet:

Many SEOs have anecdotally shared stories of HTTPS sites performing well in Google search results (and our soon-to-be-published Ranking Factors data seems to support this). However, the short-term effect of a large migration can be hard to take. When Moz recently switched to HTTPS to provide better security to our logged-in users, we saw an 8-9% dip in our organic search traffic.

Problem number two is the subject of this post. It involves the loss of referral data. Typically, when one site sends traffic to another, information is sent that identifies the originating site as the source of traffic. This invaluable data allows people to see where their traffic is coming from, and helps spread the flow of information across the web.

SEOs have long used referrer data for a number of beneficial purposes. Oftentimes, people will link back or check out the site sending traffic when they see the referrer in their analytics data. Spammers know this works, as evidenced by the recent increase in referrer spam:

This process stops when traffic flows from an HTTPS site to a non-secure HTTP site. In this case, no referrer data is sent. Webmasters can’t know where their traffic is coming from.

Here’s how referral data to my personal site looked when Moz switched to HTTPS. I lost all visibility into where my traffic came from.

It’s (not provided) all over again!

Enter the meta referrer tag

While we can’t solve the ranking challenges imposed by switching a site to HTTPS, we can solve the loss of referral data, and it’s actually super-simple.

Almost completely unknown to most marketers, the relatively new meta referrer tag (it’s actually been around for a few years) was designed to help out in these situations.

Better yet, the tag allows you to control how your referrer information is passed.

The meta referrer tag works with most browsers to pass referrer information in a manner defined by the user. Traffic remains encrypted and all the benefits of using HTTPS remain in place, but now you can pass referrer data to all websites, even those that use HTTP.

How to use the meta referrer tag

What follows are extremely simplified instructions for using the meta referrer tag. For more in-depth understanding, we highly recommend referring to the W3C working draft of the spec.

The meta referrer tag is placed in the <head> section of your HTML, and references one of five states, which control how browsers send referrer information from your site. The five states are:

  1. None: Never pass referral data
    <meta name="referrer" content="none">
    
  2. None When Downgrade: Sends referrer information to secure HTTPS sites, but not insecure HTTP sites
    <meta name="referrer" content="none-when-downgrade">
    
  3. Origin Only: Sends only the scheme, host, and port (i.e. the origin) as the referrer, stripped of the full URL path, i.e. https://moz.com/example.html would simply send https://moz.com
    <meta name="referrer" content="origin">
    

  4. Origin When Cross-Origin: Sends the full URL as the referrer when the target has the same scheme, host, and port (i.e. the same origin), regardless of whether it’s HTTP or HTTPS, while sending origin-only referral information to external sites. (Note: there is a typo in the official spec; future versions should be “origin-when-cross-origin”)
    <meta name="referrer" content="origin-when-crossorigin">
    
  5. Unsafe URL: Always passes the URL string as a referrer. Note if you have any sensitive information contained in your URL, this isn’t the safest option. By default, URL fragments, username, and password are automatically stripped out.
    <meta name="referrer" content="unsafe-url">
    

The meta referrer tag in action

By clicking the link below, you can get a sense of how the meta referrer tag works.

Check Referrer

Boom!

We’ve set the meta referrer tag for Moz to “origin”, which means when we link out to another site, we pass our scheme, host, and port. The end result is you see https://moz.com as the referrer, stripped of the full URL path (/meta-referrer-tag).

My personal site typically receives several visits per day from Moz. Here’s what my analytics data looked like before and after we implemented the meta referrer tag.

For simplicity and security, most sites may want to implement the “origin” state, but there are drawbacks.

One negative side effect was that as soon as we implemented the meta referrer tag, our AdRoll analytics, which we use for retargeting, stopped working. It turns out that AdRoll uses our referrer information for analytics, but the meta referrer tag “origin” state meant that the only URL they ever saw reported was https://moz.com.
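One way to soften side effects like this, in browsers that support the newer per-element form of the Referrer Policy spec, is to keep a site-wide meta policy but override it on specific links with the `referrerpolicy` attribute. A sketch (the URLs are placeholders, and older browsers will simply ignore the attribute):

```html
<!-- Site-wide default, set once in <head> -->
<meta name="referrer" content="origin">

<!-- Per-link override: this link sends no referrer at all -->
<a href="http://example.com/" referrerpolicy="no-referrer">Example</a>

<!-- Per-link override: this link sends the full URL, even cross-origin -->
<a href="https://partner.example.com/" referrerpolicy="unsafe-url">Partner page</a>
```

In a situation like the AdRoll one, the same idea lets you grant fuller referrer data only where a specific integration depends on it, without loosening the policy for every outbound link.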

Conclusion

We love the meta referrer tag because it keeps information flowing on the Internet. It’s the way the web is supposed to work!

It helps marketers and webmasters see exactly where their traffic is coming from. It encourages engagement, communication, and even linking, which can lead to improvements in SEO.

Reblogged 3 years ago from tracking.feedpress.it

UX, Content Quality, and SEO – Whiteboard Friday

Posted by EricEnge

Editor’s note: Today we’re featuring back-to-back episodes of Whiteboard Friday from our friends at Stone Temple Consulting. Make sure to also check out the first episode, “Becoming Better SEO Scientists” from Mark Traphagen.

User experience and the quality of your content have an incredibly broad impact on your SEO efforts. In this episode of Whiteboard Friday, Stone Temple’s Eric Enge shows you how paying attention to your users can benefit your position in the SERPs.

For reference, here’s a still of this week’s whiteboard.
Click on it to open a high resolution image in a new tab!

Video transcription

Hi, Mozzers. I’m Eric Enge, CEO of Stone Temple Consulting. Today I want to talk to you about one of the most underappreciated aspects of SEO, and that is the interaction between user experience, content quality, and your SEO rankings and traffic.

I’m going to take you through a little history first. You know, we all know about the Panda algorithm update that came out on February 23, 2011, and of course more recently we have the search quality update that came out on May 19, 2015. Our Panda friend had 27 different updates that we know of along the way. So a lot of stuff has gone on, but we need to realize that that is not where it all started.

The link algorithm from the very beginning was about search quality. Links allowed Google to have an algorithm that gave better results than the other search engines of their day, which were dependent on keywords. These things however, that I’ve just talked about, are still just the tip of the iceberg. Google goes a lot deeper than that, and I want to walk you through the different things that it does.

So consider for a moment, you have someone search on the phrase “men’s shoes” and they come to your website.

What is it that they want when they come to your website? Do they want sneakers, sandals, dress shoes? Well, those are sort of the obvious things that they might want. But you need to think a little bit more about what the user really wants to know before they buy from you.

First of all, there has to be a way to buy. By the way, affiliate sites don’t have ways to buy. So the line of thinking I’m talking about might not work out so well for affiliate sites and works better for people who can actually sell the product directly. But in addition to a way to buy, they might want a privacy policy. They might want to see an About Us page. They might want to be able to see your phone number. These are all different kinds of things that users look for when they arrive on the pages of your site.

So as we think about this, what is it that we can do to do a better job with our websites? Well, first of all, lose the focus on keywords. Don’t get me wrong, keywords haven’t gone entirely away. But the pages where we overemphasize one particular keyword over another or related phrases are long gone, and you need to have a broader focus on how you approach things.

User experience is now a big deal. You really need to think about how users are interacting with your page and how that shows your overall page quality. Think about the percent satisfaction. If I send a hundred users to your page from my search engine, how many of those users are going to be happy with the content or the products or everything that they see with your page? You need to think through the big picture. So at the end of the day, this impacts the content on your page, to be sure, but more than that it impacts the design and the related items that you have on the page.

So let me just give you an example of that. I looked at one page recently that was for a flower site. It was a page about annuals on that site, and that page had no link to their perennials page. Well, okay, a fairly good percentage of people who arrive on a page about annuals are also going to want to have perennials as something they might consider buying. So that page was probably coming across as a poor user experience. So these related items concepts are incredibly important.

Then the links on your page are actually a way to get to some of those related items, so those are really important as well. What are the related products that you link to?

Finally, it really impacts everything you do with your page design. You need to move past the old-fashioned way of thinking about SEO and into the era of: How am I doing at satisfying all the people who come to the pages of my site?

Thank you, Mozzers. Have a great day.

Video transcription by Speechpad.com

Reblogged 3 years ago from tracking.feedpress.it

The Importance of Being Different: Creating a Competitive Advantage With Your USP

Posted by TrentonGreener

“The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

While this quote has been credited to everyone from Francis Phillip Wernig (under the pseudonym Alan Ashley-Pitt) to Einstein himself, the powerful message does not lose its substance no matter whom you choose to credit. There is a very important yet often overlooked effect of not heeding this warning, one which can be applied to all aspects of life. From love and happiness to business and marketing, copying what your competitors are doing and failing to forge your own path can be a detrimental mistake.

While as marketers we are all acutely aware of the importance of differentiation, we’ve been trained for the majority of our lives to seek out the norm.

We spend the majority of our adolescent lives trying desperately not to be different. No one has ever been picked on for being too normal or not being different enough. We would beg our parents to buy us the same clothes little Jimmy or little Jamie wore. We’d want the same backpack and the same bike everyone else had. With the rise of the cell phone and later the smartphone, on hands and knees, we begged and pleaded for our parents to buy us the Razr, the StarTAC (bonus points if you didn’t have to Google that one), and later the iPhone. Did we truly want these things? Yes, but not just because they were cutting edge and nifty. We desired them because the people around us had them. We didn’t want to be the last to get these devices. We didn’t want to be different.

Thankfully, as we mature we begin to realize the fallacy that is trying to be normal. We start to become individuals and learn to appreciate that being different is often seen as beautiful. However, while we begin to celebrate being different on a personal level, it does not always translate into our business or professional lives.

We unconsciously and naturally seek out the normal, and if we want to be different—truly different in a way that creates an advantage—we have to work for it.

The truth of the matter is, anyone can be different. In fact, we all are very different. Even identical twins with the same DNA will often have starkly different personalities. As a business, the real challenge lies in being different in a way that is relevant, valuable to your audience, and creates an advantage.

“Strong products and services are highly differentiated from all other products and services. It’s that simple. It’s that difficult.” – Austin McGhie, Brand Is a Four Letter Word

Let’s explore the example of Revel Hotel & Casino. Revel is a 70-story luxury casino in Atlantic City that was built in 2012. There is simply not another casino of the same class in Atlantic City, but there might be a reason for this. Even if you’re not familiar with the city, a quick jump onto Atlantic City’s tourism website reveals that of the five hero banners that rotate, not one specifically mentions gambling, but three reference the boardwalk. This is further illustrated when exploring their internal linking structure. The beaches, boardwalk, and shopping all appear before a single mention of casinos. There simply isn’t as much of a market for high-end gamblers in the Atlantic City area; in the States, Las Vegas serves that role. So while Revel has a unique advantage, their ability to attract customers to their resort has not resulted in profitable earnings reports. In Q2 2012, Revel had a gross operating loss of $35.177M, and in Q3 2012 that increased to $36.838M.

So you need to create a unique selling proposition (also known as unique selling point and commonly referred to as a USP), and your USP needs to be valuable to your audience and create a competitive advantage. Sounds easy enough, right? Now for the kicker. That advantage needs to be as sustainable as physically possible over the long term.

“How long will it take our competitors to duplicate our advantage?”

You really need to explore this question and the possible solutions your competitors could utilize to play catch-up or duplicate what you’ve done. Look no further than Google vs Bing to see this in action. No company out there is going to just give up because your USP is so much better; most will pivot or adapt in some way.

Let’s look at a Seattle-area coffee company of which you may or may not be familiar. Starbucks has tried quite a few times over the years to level-up their tea game with limited success, but the markets that Starbucks has really struggled to break into are the pastry, breads, dessert, and food markets.

Other stores had more success in these markets, and they thought that high-quality teas and bakery items were the USPs that differentiated them from the Big Bad Wolf that is Starbucks. And while they were right to think that their brick house would save them from the Big Bad Wolf for some time, this fable doesn’t end with the Big Bad Wolf in a boiling pot.

Never underestimate your competitor’s ability to be agile, specifically when overcoming a competitive disadvantage.

If your competitor can’t beat you by making a better product or service internally, they can always choose to buy someone who can.

After months of courting, on June 4th, 2012 Starbucks announced that they had come to an agreement to purchase La Boulange in order to “elevate core food offerings and build a premium, artisanal bakery brand.” If you’re a small-to-medium sized coffee shop and/or bakery that even indirectly competed with Starbucks, a new challenger approaches. And while those tea shops momentarily felt safe within the brick walls that guarded their USP, on the final day of that same year, the Big Bad Wolf huffed and puffed and blew a stack of cash all over Teavana, making Teavana a wholly-owned subsidiary of Starbucks for the low, low price of $620M.

Sarcasm aside, this does a great job of illustrating the ability of companies—especially those with deep pockets—to be agile, and demonstrates that they often have an uncanny ability to overcome your company’s competitive advantage. In seven months, Starbucks went from a minor player in these markets to having all the tools they need to dominate tea and pastries. Have you tried their raspberry pound cake? It’s phenomenal.

Why does this matter to me?

Ok, we get it. We need to be different, and in a way that is relevant, valuable, defensible, and sustainable. But I’m not the CEO, or even the CMO. I cannot effect change on a company level; why does this matter to me?

I’m a firm believer that you effect change no matter what the name plate on your desk may say. Sure, you may not be able to call an all-staff meeting today and completely change the direction of your company tomorrow, but you can effect change on the parts of the business you do touch. No matter your title or area of responsibility, you need to know your company’s, client’s, or even a specific piece of content’s USP, and you need to ensure it is applied liberally to all areas of your work.

Look at this example SERP for “Mechanics”:

While yes, this search is very likely to be local-sensitive, that doesn’t mean you can’t stand out. Every single AdWords result, save one, has only the word “Mechanics” in the headline. (While the top-of-page ad is pulling description line 1 into the heading, the actual headline is still only “Mechanics.”) But even the one headline that is different doesn’t do a great job of illustrating the company’s USP. Mechanics at home? Whose home? Mine or theirs? I’m a huge fan of Steve Krug’s “Don’t Make Me Think,” and in this scenario there are too many questions I need answered before I’m willing to click through. “Mechanics; We Come To You” or even “Traveling Mechanics” illustrates this point much more clearly, and still fits within the 25-character limit for the headline.

If you’re an AdWords user, no matter how big or small your monthly spend may be, take a look at your top 10-15 keywords by volume and evaluate how well you’re differentiating yourself from the other brands in your industry. Test ad copy that draws attention to your USP and reap the rewards.

Now while this is simply an AdWords text ad example, the same concept can be applied universally across all of marketing.

Title tags & meta descriptions

As we alluded to above, not only do companies have USPs, but individual pieces of content can, and should, have their own USP. Use your title tag and meta description to illustrate what differentiates your piece of content from the competition, and do so in a way that attracts the searcher’s click. If you have already established a strong brand within a specific niche, great! Use it to your advantage. It’s much more likely, though, that you are competing against a strong brand. In those scenarios, ask yourself, “What makes our content different from theirs?” The answer you come up with is your content’s USP. Call attention to that in your title tag and meta description, and watch the CTR climb.

I encourage you to hop into your own site’s analytics and look at your top 10-15 organic landing pages and see how well you differentiate yourself. Even if you’re hesitant to negatively affect your inbound gold mines by changing the title tags, run a test and change up your meta description to draw attention to your USP. In an hour’s work, you just may make the change that pushes you a little further up those SERPs.

Branding

Let’s break outside the world of digital marketing and look at the world of branding. Tom’s Shoes competes against some heavy hitters in Nike, Adidas, Reebok, and Puma just to name a few. While Tom’s can’t hope to compete against the marketing budgets of these companies in a fair fight, they instead chose to take what makes them different, their USP, and disseminate it every chance they get. They have labeled themselves “The One for One” company. It’s in their homepage’s title tag, in every piece of marketing they put out, and it smacks you in the face when you land on their site. They even use the call-to-action “Get Good Karma” throughout their site.

Now as many of us may know, partially because of the scandal it created in late 2013, Tom’s is not actually a non-profit organization. No matter how you feel about the matter, this marketing strategy has created a positive effect on their bottom line. Fast Company conservatively estimated their revenues in 2013 at $250M, with many estimates being closer to the $300M mark. Not too bad of a slice of the pie when competing against the powerhouses Tom’s does.

Wherever you stand on this issue, Tom’s Shoes has done a phenomenal job of differentiating their brand from the big hitters in their industry.

Know your USP and disseminate it every chance you get.

This is worth repeating. Know your USP and disseminate it every chance you get, whether that be in title tags, ad copy, on-page copy, branding, or any other segment of your marketing campaigns. Online or offline, be different. And remember the quote that we started with, “The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

The amount of marketing knowledge that can be taken from this one simple statement is astounding. Heed the words, stand out from the crowd, and you will have success.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

How to Rid Your Website of Six Common Google Analytics Headaches

Posted by amandaecking

I’ve been in and out of Google Analytics (GA) for the past five or so years agency-side. I’ve seen three different code libraries, dozens of new different features and reports roll out, IP addresses stop being reported, and keywords not-so-subtly phased out of the free platform.

Analytics has been a focus of mine for the past year or so—mainly, making sure clients get their data right. Right now, our new focus is closed loop tracking, but that’s a topic for another day. If you’re using Google Analytics, and only Google Analytics for the majority of your website stats, or it’s your primary vehicle for analysis, you need to make sure it’s accurate.

Not having data pulling in or reporting properly is like building a house on a shaky foundation: It doesn’t end well. Usually there are tears.

For some reason, a lot of people, including many of my clients, assume everything is tracking properly in Google Analytics… because Google. But it’s not Google who sets up your analytics. People do that. And people are prone to make mistakes.

I’m going to go through six scenarios where issues are commonly encountered with Google Analytics.

I’ll outline the remedy for each issue, and in the process, show you how to move forward with a diagnosis or resolution.

1. Self-referrals

This is probably one of the areas we’re all familiar with. If you’re seeing a lot of traffic from your own domain, there’s likely a problem somewhere—or you need to extend the default session length in Google Analytics. (For example, if you have a lot of long videos or music clips and don’t use event tracking; a website like TEDx or SoundCloud would be a good equivalent.)

Typically one of the first things I’ll do to help diagnose the problem is include an advanced filter to show the full referrer string. You do this by creating a filter, as shown below:

Filter Type: Custom filter > Advanced
Field A: Hostname
Extract A: (.*)
Field B: Request URI
Extract B: (.*)
Output To: Request URI
Constructor: $A1$B1
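If it helps to see what that constructor produces, here’s a minimal Python sketch that mimics the filter’s logic. The function name and sample values are illustrative only, not part of Google Analytics itself:

```python
import re

def full_referrer(hostname: str, request_uri: str) -> str:
    """Mimic the GA advanced filter above: capture the hostname ($A1)
    and the request URI ($B1), then output their concatenation."""
    a = re.match(r"(.*)", hostname).group(1)      # Extract A: (.*)
    b = re.match(r"(.*)", request_uri).group(1)   # Extract B: (.*)
    return a + b                                  # Constructor: $A1$B1

# With the filter applied, a pageview on a subdomain shows up like this:
print(full_referrer("shop.example.com", "/cart/checkout"))
# shop.example.com/cart/checkout
```

Seeing the hostname prepended to every Request URI is what makes stray subdomains jump out in the reports.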

You’ll then start seeing the subdomains pulling in, which lets you narrow down whether or not one particular subdomain is creating the self-referrals. Experience has shown me that if you have a separate subdomain hosted in another location (say, if you work with a separate company and they host and run your mobile site or your shopping cart), it gets treated by Google Analytics as a separate domain, and you’ll need to implement cross-domain tracking.

In this example below, we can see all the revenue is being reported to the booking engine (which ended up being cross domain issues) and their own site is the fourth largest traffic source:

It’s also a good idea to check the browser and device reports to start narrowing down whether the issue is specific to a particular element. If it’s not, keep digging. Look at pages pulling the self-referrals and go through the code with a fine-tooth comb, drilling down as much as you can.

2. Unusually low bounce rate

If you have a crazy-low bounce rate, it could be too good to be true. Unfortunately. An unusually low bounce rate could (and probably does) mean that at least some pages of your website have the same Google Analytics tracking code installed twice.

Take a look at your source code, or use Google Tag Assistant (though it does have known bugs) to see if you’ve got GA tracking code installed twice.
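If you’d rather not eyeball the source by hand, a quick hypothetical script like this can count how many GA property IDs appear in a page’s HTML. It assumes classic `UA-` style property IDs; the sample markup is invented:

```python
import re

def count_ga_property_ids(html: str) -> int:
    """Count occurrences of UA-style Google Analytics property IDs
    in a page's source. More than one hit for the same property
    usually means the tracker is installed twice."""
    return len(re.findall(r"UA-\d{4,10}-\d{1,4}", html))

# Hypothetical page with the snippet pasted in twice:
page = (
    '<script>ga("create","UA-12345-1","auto")</script>'
    '<script>ga("create","UA-12345-1","auto")</script>'
)
print(count_ga_property_ids(page))  # 2
```

Anything greater than 1 per property is worth a closer look at how the tag got onto the page.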

While I tell clients that having Google Analytics installed twice on the same page can lead to double the pageviews, I’ve not actually encountered that; I usually just say it to scare them into removing the duplicate implementation more quickly. Don’t tell on me.

3. Iframes anywhere

I’ve heard directly from Google engineers and Google Analytics evangelists that Google Analytics does not play well with iframes, and that it will never play nice with this dinosaur technology.

If you track the iframe, you inflate your pageviews, plus you still aren’t tracking everything with 100% clarity.

If you don’t track across iframes, you lose the source/medium attribution and everything becomes a self-referral.

Damned if you do; damned if you don’t.

My advice: Stop using iframes. They’re Netscape-era technology anyway, with rainbow marquees and Comic Sans on top. Interestingly, and unfortunately, a number of booking engines (for hotels) and third-party carts (for ecommerce) still use iframes.

If you have any clients in those verticals, or if you’re in the vertical yourself, check with your provider to see if they use iframes. Or you can check for yourself, by right-clicking as close as you can to the actual booking element:

iframe-booking.png
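As a rough sketch of that “check for yourself” step, Python’s standard-library HTML parser can list every iframe source on a page. The booking URL below is made up for illustration:

```python
from html.parser import HTMLParser

class IframeFinder(HTMLParser):
    """Collect the src of every <iframe> on a page -- a quick way to
    confirm whether a booking engine or cart is embedded in a frame."""

    def __init__(self):
        super().__init__()
        self.iframes = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            self.iframes.append(dict(attrs).get("src", ""))

finder = IframeFinder()
finder.feed('<div><iframe src="https://booking.example.com/widget"></iframe></div>')
print(finder.iframes)  # ['https://booking.example.com/widget']
```

If a third-party booking or cart URL shows up in that list, you know GA is going to need special handling.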

There is no neat and tidy way to address iframes with Google Analytics, and usually iframes are not the only complicated element of setup you’ll encounter. I spent eight months dealing with a website on a subfolder, which used iframes and had a cross domain booking system, and the best visibility I was able to get was about 80% on a good day.

Typically, I’d approach diagnosing iframes (if, for some reason, I had absolutely no access to viewing a website or talking to the techs) similarly to diagnosing self-referrals, as self-referrals are one of the biggest symptoms of iframe use.

4. Massive traffic jumps

Massive jumps in traffic don’t typically just happen. (Unless, maybe, you’re Geraldine.) There’s always an explanation—a new campaign launched, you just turned on paid ads for the first time, you’re using content amplification platforms, you’re getting a ton of referrals from that recent press in The New York Times. And if you think it just happened, it’s probably a technical glitch.

I’ve seen everything from inflated pageviews caused by tracking on iframes and unnecessary implementation of virtual pageviews, to not realizing the tracking code was installed on other microsites for the same property. Oops.

Usually I’ve seen this happen when the tracking code was somewhere it shouldn’t be, so if you’re investigating a situation of this nature, first confirm the Google Analytics code is only in the places it needs to be. Tools like Google Tag Assistant and Screaming Frog can be your BFFs in helping you figure this out.

Also, I suggest bribing the IT department with sugar (or booze) to see if they’ve changed anything lately.

5. Cross-domain tracking

I wish cross-domain tracking with Google Analytics out of the box didn’t require any additional setup. But it does.

If you don’t have it set up properly, things break down quickly, and can be quite difficult to untangle.

The older the GA library you’re using, the harder it is. The easiest setup, by far, is Google Tag Manager with Universal Analytics. Hard-coded Universal Analytics is a bit more difficult because you have to implement autoLink manually and decorate forms, if you’re using them (and you probably are). Beyond that, rather than try and deal with it, I say update your Google Analytics code. Then we can talk.

Where I’ve seen the most murkiness with tracking is when parts of cross domain tracking are implemented, but not all. For some reason, if allowLinker isn’t included, or you forget to decorate all the forms, the cookies aren’t passed between domains.

The absolute first place I would start with this would be confirming the cookies are all passing properly at all the right points, forms, links, and smoke signals. I’ll usually use a combination of the Real Time report in Google Analytics, Google Tag Assistant, and GA debug to start testing this. Any debug tool you use will mean you’re playing in the console, so get friendly with it.
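One quick, scriptable version of that check: when autoLink is working, links to the other domain get decorated with a `_ga` query parameter that carries the client ID. A small sketch of that check, with hypothetical URLs:

```python
from urllib.parse import urlparse, parse_qs

def is_decorated(url: str) -> bool:
    """Return True if a cross-domain link carries the _ga linker
    parameter that Universal Analytics uses to pass the client ID."""
    return "_ga" in parse_qs(urlparse(url).query)

# A properly decorated outbound link vs. a bare one:
print(is_decorated("https://booking.example.com/?_ga=2.123456789.1.1"))  # True
print(is_decorated("https://booking.example.com/"))                      # False
```

Clicking through to the second domain and not seeing `_ga` on the URL is usually the first sign the linker isn’t firing.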

6. Internal use of UTM strings

I’ve saved the best for last. Internal use of campaign tagging. We may think, oh, I use Google to tag my campaigns externally, and we’ve got this new promotion on site which we’re using a banner ad for. That’s a campaign. Why don’t I tag it with a UTM string?

Step away from the keyboard now. Please.

When you tag internal links with UTM strings, you override the original source/medium. So that visitor who came in through your paid ad and then who clicks on the campaign banner has now been manually tagged. You lose the ability to track that they came through on the ad the moment they click on the tagged internal link. Their source and medium is now your internal campaign, not that paid ad you’re spending gobs of money on and have to justify to your manager. See the problem?

I’ve seen at least three pretty spectacular instances of this in the past year, and a number of smaller instances of it. Annie Cushing also talks about the evils of internal UTM tags and the odd prevalence of it. (Oh, and if you haven’t explored her blog, and the amazing spreadsheets she shares, please do.)

One clothing company I worked with tagged all of their homepage offers with UTM strings, which resulted in the loss of visibility for one-third of their audience: One million visits over the course of a year, and $2.1 million in lost revenue.

Let me say that again. One million visits, and $2.1 million. That couldn’t be attributed to an external source/campaign/spend.

Another client I audited included campaign tagging on nearly every navigational element on their website. It still gives me nightmares.

If you want to see if you have any internal UTM strings, head straight to the Campaigns report in Acquisition in Google Analytics, and look for anything like “home” or “navigation” or any language you may use internally to refer to your website structure.

And if you want to see how users are moving through your website, go to the Flow reports. Or if you really, really, really want to know how many people click on that sidebar link, use event tracking. But please, for the love of all things holy (and to keep us analytics lovers from throwing our computers across the room), stop using UTM tagging on your internal links.
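If you want a programmatic sweep as well, a rough sketch like this flags links that point back at your own domain and carry UTM parameters. The domain and sample links are invented for illustration:

```python
from urllib.parse import urlparse, parse_qs

SITE = "www.example.com"  # hypothetical domain

def internal_utm_links(links):
    """Flag links that point at your own site *and* carry UTM
    parameters -- the internal tagging this section warns against.
    Relative links (empty netloc) count as internal."""
    flagged = []
    for href in links:
        parsed = urlparse(href)
        is_internal = parsed.netloc in ("", SITE)
        has_utm = any(k.startswith("utm_") for k in parse_qs(parsed.query))
        if is_internal and has_utm:
            flagged.append(href)
    return flagged

sample = [
    "/sale?utm_source=home&utm_medium=banner",   # internal + tagged: bad
    "https://www.example.com/about",             # internal, untagged: fine
    "https://partner.example.org/?utm_source=newsletter",  # external: fine
]
print(internal_utm_links(sample))  # ['/sale?utm_source=home&utm_medium=banner']
```

Feed it the hrefs from a crawl of your own site and anything it returns is overwriting real source/medium data.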

Now breathe and smile

Odds are, your Google Analytics setup is fine. If you are seeing any of these issues, though, you have somewhere to start in diagnosing and addressing the data.

We’ve looked at six of the most common points of friction I’ve encountered with Google Analytics and how to start investigating them: self-referrals, bounce rate, iframes, traffic jumps, cross domain tracking and internal campaign tagging.

What common data integrity issues have you encountered with Google Analytics? What are your favorite tools to investigate?

10 Reasons you might lose Trust Flow or Citation Flow

It is entirely possible that sites and/or pages of the web can suddenly seem to lose Trust Flow and/or Citation Flow. If you do not yet understand these metrics, then you might first like to watch this short video on Understanding Flow Metrics. [If you are looking at this near the date of publishing,…

The post 10 Reasons you might lose Trust Flow or Citation Flow appeared first on Majestic Blog.

Reblogged 3 years ago from blog.majestic.com


Why We Can’t Do Keyword Research Like It’s 2010 – Whiteboard Friday

Posted by randfish

Keyword Research is a very different field than it was just five years ago, and if we don’t keep up with the times we might end up doing more harm than good. From the research itself to the selection and targeting process, in today’s Whiteboard Friday Rand explains what has changed and what we all need to do to conduct effective keyword research today.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

What do we need to change to keep up with the changing world of keyword research?

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about keyword research, why it’s changed from the last five, six years and what we need to do differently now that things have changed. So I want to talk about changing up not just the research but also the selection and targeting process.

There are three big areas that I’ll cover here. There’s lots more in-depth stuff, but I think we should start with these three.

1) The Adwords keyword tool hides data!

This is where almost all of us in the SEO world start and oftentimes end with our keyword research. We go to the AdWords Keyword Tool, what used to be the external keyword tool and now is the Keyword Planner inside AdWords. We go inside that tool, and we look at the volume that’s reported and we sort of record that as, well, it’s not good, but it’s the best we’re going to do.

However, I think there are a few things to consider here. First off, that tool is hiding data. What I mean by that is not that they’re not telling the truth, but they’re not telling the whole truth. They’re not telling nothing but the truth, because those rounded-off numbers that you always see, you know that those are inaccurate. Anytime you’ve bought keywords, you’ve seen that the impression count never matches the count that you see in the AdWords tool. It’s not usually massively off, but it’s often off by a good degree, and the only thing it’s great for is telling relative volume from one term to another.

But because AdWords hides data essentially by saying like, “Hey, you’re going to type in . . .” Let’s say I’m going to type in “college tuition,” and Google knows that a lot of people search for how to reduce college tuition, but that doesn’t come up in the suggestions because it’s not a commercial term, or they don’t think that an advertiser who bids on that is going to do particularly well and so they don’t show it in there. I’m giving an example. They might indeed show that one.

But because that data is hidden, we need to go deeper. We need to go beyond and look at things like Google Suggest and related searches, which are down at the bottom. We need to start conducting customer interviews and staff interviews, which hopefully has always been part of your brainstorming process but really needs to be now. Then you can apply that to AdWords. You can apply that to suggest and related.

The beautiful thing is once you gather these terms from places like forums, communities, and discussion boards, and see what terms and phrases people are using, you can collect all this stuff up, plug it back into AdWords, and now they will tell you how much volume they’ve got. So you take that how to reduce college tuition term, you plug it into AdWords, and they will show you a number, a non-zero number. They were just hiding it in the suggestions because they thought, “Hey, you probably don’t want to bid on that. That won’t bring you a good ROI.” So you’ve got to be careful with that, especially when it comes to SEO kinds of keyword research.

2) Building separate pages for each term or phrase doesn’t make sense

It used to be the case that we built separate pages for every single term and phrase that was in there, because we wanted to have the maximum keyword targeting that we could. So it didn’t matter to us that college scholarship and university scholarships were essentially people looking for exactly the same thing, just using different terminology. We would make one page for one and one page for the other. That’s not the case anymore.

Today, we need to group by the same searcher intent. If two searchers are searching for two different terms or phrases but both of them have exactly the same intent, they want the same information, they’re looking for the same answers, their query is going to be resolved by the same content, we want one page to serve those, and that’s changed up a little bit of how we’ve done keyword research and how we do selection and targeting as well.
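One hedged way to approximate “same searcher intent” in code is to group keywords whose top-ranking URLs mostly overlap, since Google tends to return similar results for queries it considers equivalent. This sketch assumes you’ve already fetched each keyword’s top results; the 50% threshold and the sample data are illustrative assumptions:

```python
def group_by_intent(serps):
    """Group keywords that share most of their top-ranking URLs -- a
    common proxy for 'same searcher intent'. `serps` maps each keyword
    to the set of top-10 URLs it returns (assumed, pre-fetched input)."""
    groups = []
    for kw, urls in serps.items():
        for group in groups:
            rep_urls = serps[group[0]]  # compare against the group's first keyword
            overlap = len(urls & rep_urls) / max(len(urls | rep_urls), 1)
            if overlap >= 0.5:  # half the results shared -> treat as same intent
                group.append(kw)
                break
        else:
            groups.append([kw])  # no match: start a new intent group
    return groups

serps = {
    "college scholarships":    {"a.com", "b.com", "c.com", "d.com"},
    "university scholarships": {"a.com", "b.com", "c.com", "e.com"},
    "student loans":           {"x.com", "y.com", "z.com", "w.com"},
}
print(group_by_intent(serps))
# [['college scholarships', 'university scholarships'], ['student loans']]
```

Each resulting group is a candidate for a single page serving all of its keywords.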

3) Build your keyword consideration and prioritization spreadsheet with the right metrics

Everybody’s got an Excel version of this, because I think there’s just no awesome tool out there that everyone loves yet that kind of solves this problem for us, and Excel is very, very flexible. So we go into Excel, we put in our keyword, the volume, and then a lot of times we almost stop there. We did keyword volume and then like value to the business and then we prioritize.

What are all these new columns you’re showing me, Rand? Well, this is what I’m seeing sophisticated, modern SEOs at the more advanced agencies, and the more advanced in-house practitioners, add to the keyword process.

Difficulty

A lot of folks have done this, but difficulty helps us say, “Hey, this has a lot of volume, but it’s going to be tremendously hard to rank.”

The difficulty score that Moz uses and attempts to calculate is a weighted average of the top 10 domain authorities. It also uses page authority, so it’s kind of a weighted stack out of the two. If you’re seeing very, very challenging pages, very challenging domains to get in there, it’s going to be super hard to rank against them. The difficulty is high. For all of these ones it’s going to be high because college and university terms are just incredibly lucrative.

That difficulty can help bias you against chasing after terms and phrases for which you are very unlikely to rank, at least early on. If you feel like, “Hey, I already have a powerful domain. I can rank for everything I want. I am the thousand pound gorilla in my space,” great. Go after the difficulty of your choice, but this helps prioritize.

Opportunity

This is actually very rarely used, but I think sophisticated marketers are using it extremely intelligently. Essentially what they’re saying is, “Hey, if you look at a set of search results, sometimes there are two or three ads at the top instead of just the ones on the sidebar, and that’s biasing some of the click-through rate curve.” Sometimes there’s an instant answer or a Knowledge Graph or a news box or images or video, or all these kinds of things that search results can be marked up with, that are not just the classic 10 web results. Unfortunately, if you’re building a spreadsheet like this and treating every single search result like it’s just 10 blue links, well you’re going to lose out. You’re missing the potential opportunity and the opportunity cost that comes with ads at the top or all of these kinds of features that will bias the click-through rate curve.

So what I’ve seen some really smart marketers do is essentially build some kind of a framework to say, “Hey, you know what? When we see that there’s a top ad and an instant answer, we’re saying the opportunity if I was ranking number 1 is not 10 out of 10. I don’t expect to get whatever the average traffic for the number 1 position is. I expect to get something considerably less than that. Maybe something around 60% of that, because of this instant answer and these top ads.” So I’m going to mark this opportunity as a 6 out of 10.

There are 2 top ads here, so I’m giving this a 7 out of 10. This has two top ads and then it has a news block below the first position. So again, I’m going to reduce that click-through rate. I think that’s going down to a 6 out of 10.

You can get more and less scientific and specific with this. Click-through rate curves are imperfect by nature because we truly can’t measure exactly how those things change. However, I think smart marketers can make some good assumptions from general click-through rate data, which there are several resources out there on that to build a model like this and then include it in their keyword research.
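A model like that can be as simple as a lookup of per-feature penalties. This sketch reproduces the example scores above; the penalty values are assumptions for illustration, not measured click-through data:

```python
def opportunity(features):
    """Score a SERP's organic opportunity out of 10: start from a
    perfect 10 and subtract an assumed penalty for each feature that
    siphons clicks away from the organic results."""
    penalties = {
        "top_ad": 1.5,          # each ad above the organic results
        "instant_answer": 3.0,  # featured snippet / instant answer
        "news_block": 1.0,      # news box below position 1
    }
    score = 10.0 - sum(penalties.get(f, 0.0) for f in features)
    return max(int(score + 0.5), 1)  # round half up, floor at 1

# The three examples from the text:
print(opportunity(["top_ad", "instant_answer"]))           # 6
print(opportunity(["top_ad", "top_ad"]))                   # 7
print(opportunity(["top_ad", "top_ad", "news_block"]))     # 6
```

Tune the penalty table against whatever click-through rate data you trust; the structure matters more than the exact numbers.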

This does mean that you have to run a query for every keyword you’re thinking about, but you should be doing that anyway. You want to get a good look at who’s ranking in those search results and what kind of content they’re building. If you’re running a keyword difficulty tool, you are already getting something like that.

Business value

This is a classic one. Business value is essentially saying, “What’s it worth to us if visitors come through with this search term?” You can get that from bidding through AdWords. That’s the most sort of scientific, mathematically sound way to get it. Then, of course, you can also get it through your own intuition. It’s better to start with your intuition than nothing if you don’t already have AdWords data or you haven’t started bidding, and then you can refine your sort of estimate over time as you see search visitors visit the pages that are ranking, as you potentially buy those ads, and those kinds of things.

You can get more sophisticated around this. I think a 10 point scale is just fine. You could also use a one, two, or three there, that’s also fine.

Requirements or Options

Then I don’t exactly know what to call this column. I can’t remember who showed me their spreadsheet with this column in it. I think they called it Optional Data or Additional SERPs Data, but I’m going to call it Requirements or Options. Requirements because this is essentially saying, “Hey, if I want to rank in these search results, am I seeing that the top two or three are all video? Oh, they’re all video. They’re all coming from YouTube. If I want to be in there, I’ve got to be video.”

Or something like, “Hey, I’m seeing that most of the top results have been produced or updated in the last six months. Google appears to be biasing to very fresh information here.” So, for example, if I were searching for “university scholarships Cambridge 2015,” well, guess what? Google probably wants to bias to show results that are either from the official page on Cambridge’s website or articles from this year about getting into that university and the scholarships that are available or offered. I saw those in two of these search results: both the college and university scholarships queries had a significant number of SERPs where a freshness bump appeared to be required. You can see that a lot because the date will be shown ahead of the description, and the date will be very fresh, sometime in the last six months or a year.

Prioritization

Then finally I can build my prioritization. So based on all the data I had here, I essentially said, “Hey, you know what? These are not 1 and 2. This is actually 1A and 1B, because these are the same concepts. I’m going to build a single page to target both of those keyword phrases.” I think that makes good sense. Someone who is looking for college scholarships, university scholarships, same intent.

I am giving it a slight prioritization, 1A versus 1B, and the reason I do this is because I always have one keyword phrase that I’m leaning on a little more heavily. Because Google isn’t perfect around this, the search results will be a little different. I want to bias to one versus the other. In this case, since I’m targeting university more heavily than college, my title tag might say something like college and university scholarships so that university and scholarships are nicely together, near the front of the title, that kind of thing. Then 1B, 2, 3.
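Pulling the columns together, a spreadsheet-style prioritization can be approximated with a simple scoring function. The weighting below is a hypothetical blend, not a standard formula, and the keyword data is invented:

```python
def priority_score(volume, difficulty, opportunity, value):
    """Hypothetical blend of the columns discussed above: scale raw
    volume by the opportunity fraction (out of 10), weight by business
    value (out of 10), and discount by ranking difficulty."""
    return volume * (opportunity / 10) * value / max(difficulty, 1)

# keyword: (volume, difficulty, opportunity, business value) -- invented
keywords = {
    "university scholarships": (9800, 88, 6, 9),
    "college scholarships":    (12000, 85, 7, 9),
    "scholarship essay tips":  (2400, 55, 9, 5),
}
ranked = sorted(keywords, key=lambda k: priority_score(*keywords[k]), reverse=True)
print(ranked)
# ['college scholarships', 'university scholarships', 'scholarship essay tips']
```

However you weight it, forcing every keyword through the same formula is what makes the prioritization column defensible rather than gut feel.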

This is kind of the way that modern SEOs are building a more sophisticated process with better data, more inclusive data that helps them select the right kinds of keywords and prioritize to the right ones. I’m sure you guys have built some awesome stuff. The Moz community is filled with very advanced marketers, probably plenty of you who’ve done even more than this.

I look forward to hearing from you in the comments. I would love to chat more about this topic, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com
