How Much Has Link Building Changed in Recent Years?

Posted by Paddy_Moogan

I get asked this question a lot. It’s mainly asked by people who are considering buying my link building book and want to know whether it’s still up to date. This is understandable given that the first edition was published in February 2013 and our industry has a deserved reputation for always changing.

I find myself giving the same answer, even though I’ve been asked it probably dozens of times in the last two years: “not that much”. I don’t think this is solely due to the book itself standing the test of time, although I’ll happily take a bit of credit for that 🙂 I think it’s more a sign of our industry as a whole not changing as much as we’d like to think.

I started to question whether I was right, and honestly, that’s one of the reasons it has taken me over two years to release the second edition of the book.

So I posed this question to a group of friends not so long ago, some via email and some via a Facebook group. I was expecting to be called out by many of them because my position was that in reality, it hasn’t actually changed that much. The thing is, many of them agreed and the conversations ended with a pretty long thread with lots of insights. In this post, I’d like to share some of them, share what my position is and talk about what actually has changed.

My personal view

Link building hasn’t changed as much as we think it has.

The core principles of link building haven’t changed. The signals around link building have changed, but mainly around new machine learning developments that have indirectly affected what we do. One thing that has definitely changed is the mindset of SEOs (and now clients) towards link building.

I think the last big change to link building came in April 2012 when Penguin rolled out. This genuinely did change our industry and put to bed a few techniques that should never have worked so well in the first place.

Since then, we’ve seen some things change, but the core principles haven’t changed if you want to build a business that will be around for years to come and not run the risk of being hit by a link related Google update. For me, these principles are quite simple:

  • You need to deserve links – either an asset you create or your product
  • You need to put this asset in front of a relevant audience who have the ability to share it
  • You need consistency – one new asset every year is unlikely to cut it
  • Anything that scales is at risk

For me, the move towards user data driving search results, plus machine learning, has been the biggest change we’ve seen in recent years, and it’s still going.

Let’s dive a bit deeper into all of this and I’ll talk about how this relates to link building.

The typical mindset for building links has changed

I think that most SEOs are coming round to the idea that you can’t get away with building low quality links any more, not if you want to build a sustainable, long-term business. Spammy link building still works in the short-term and I think it always will, but it’s much harder than it used to be to sustain websites that are built on spam. The approach is more “churn and burn” and spammers are happy to churn through lots of domains and just make a small profit on each one before moving onto another.

For everyone else, it’s all about the long-term and not putting client websites at risk.

This has led to many SEOs embracing different forms of link building and generally starting to use content as an asset when it comes to attracting links. A big part of me feels that it was actually Penguin in 2012 that drove the rise of content marketing amongst SEOs, but that’s a post for another day…! For today though, this goes some way towards explaining the trend we see below.

Slowly but surely, I’m seeing clients come to my company already knowing that low quality link building isn’t what they want. It’s taken a few years after Penguin for it to filter down to client / business owner level, but it’s definitely happening. This is a good thing but unfortunately, the main reason for this is that most of them have been burnt in the past by SEO companies who have built low quality links without giving thought to building good quality ones too.

I have no doubt that it’s this change in mindset which has led to trends like this:

The thing is, I don’t think this was by choice.

Let’s be honest. A lot of us used the kind of link building tactics that Google no longer like because they worked. I don’t think many SEOs were under the illusion that it was genuinely high quality stuff, but it worked and it was far less risky to do than it is today. Unless you were super-spammy, the low-quality links just worked.

Fast forward to a post-Penguin world and things are far riskier. For me, it’s because of this that we see trends like the above. As an industry, we had the easiest link building methods taken away from us and we’re left with fewer options. One of the main options is content marketing which, if you do it right, can lead to good quality links and, importantly, the types of links you won’t be removing in the future. Get it wrong and you’ll lose budget and lose the trust of your boss or client in the power of content when it comes to link building.

There are still plenty of other methods to build links and sometimes we can forget this. Just look at this epic list from Jon Cooper. Even with this many tactics still available to us, it’s hard work. Way harder than it used to be.

My summary here is that as an industry, our mindset has shifted but it certainly wasn’t a voluntary shift. If the tactics that Penguin targeted still worked today, we’d still be using them.

A few other opinions…

I definitely think too many people want the next easy win. As someone surfing the edge of what Google is bringing our way, here’s my general take: SEO, in broad strokes, is changing a lot, *but* any given change is more and more niche and impacts fewer people. What we’re seeing isn’t radical, sweeping changes that impact everyone, but a sort of modularization of SEO, where we each have to be aware of what impacts our given industries, verticals, etc.

Dr. Pete

I don’t feel that techniques for acquiring links have changed that much. You can either earn them through content and outreach or you can just buy them. What has changed is the awareness of “link building” outside of the SEO community. This makes link building / content marketing much harder when pitching to journalists and even more difficult when pitching to bloggers.

Link building has to be more integrated with other channels and struggles to work in its own environment unless supported by brand, PR and social. Having other channels supporting your link development efforts also creates greater search signals and more opportunity to reach a bigger audience, which will drive a greater ROI.

Carl Hendy

SEO has grown up in terms of more mature staff and SEOs becoming more ingrained into businesses so there is a smarter (less pressure) approach. At the same time, SEO has become more integrated into marketing and has made marketing teams and decision makers more intelligent in strategies and not pushing for the quick win. I’m also seeing that companies who used to rely on SEO and building links have gone through IPOs and the need to build 1000s of links per quarter has rightly reduced.

Danny Denhard

Signals that surround link building have changed

There is no question about this one in my mind. I actually wrote about this last year in my previous blog post where I talked about signals such as anchor text and deep links changing over time.

Many of the people I asked felt the same, here are some quotes from them, split out by the types of signal.

Domain level link metrics

I think domain level links have become increasingly important compared with page level factors, i.e. you can get a whole site ranking well off the back of one insanely strong page, even with sub-optimal PageRank flow from that page to the rest of the site.

Phil Nottingham

I’d agree with Phil here and this is what I was getting at in my previous post on how I feel “deep links” will matter less over time. It’s not just about domain level links here, it’s just as much about the additional signals available for Google to use (more on that later).

Anchor text

I’ve never liked anchor text as a link signal. I mean, who actually uses exact match commercial keywords as anchor text on the web?

SEOs. 🙂

Sure, there will be natural links like this, but honestly, I struggle with the idea that it took Google so long to start turning down the dial on commercial anchor text as a ranking signal. They are starting to turn it down though, slowly but surely. Don’t get me wrong, it still matters and it still works. But like pure link spam, the barrier is a lot lower now in terms of what constitutes too much.

Rand feels that they matter more than we’d expect and I’d mostly agree with this statement:

Exact match anchor text links still have more power than you’d expect—I think Google still hasn’t perfectly sorted what is “brand” or “branded query” from generics (i.e. they want to start ranking a new startup like meldhome.com for “Meld” if the site/brand gets popular, but they can’t quite tell the difference between that and https://moz.com/learn/seo/redirection getting a few manipulative links that say “redirect”)

Rand Fishkin

What I do struggle with, though, is that Google still haven’t figured this out and that short-term, commercial anchor text spam is still so effective, even if only for a short burst of time.

I don’t think link building as a concept has changed loads—but I think links as a signal have, mainly because of filters and penalties but I don’t see anywhere near the same level of impact from coverage anymore, even against 18 months ago.

Paul Rogers

New signals have been introduced

It isn’t just about established signals changing though, there are new signals too and I personally feel that this is where we’ve seen the most change in Google algorithms in recent years—going all the way back to Panda in 2011.

With Panda, we saw a new level of machine learning where it almost felt like Google had found a way of incorporating human reaction / feelings into their algorithms. They could then run this against a website and answer questions like the ones included in this post. Things such as:

  • “Would you be comfortable giving your credit card information to this site?”
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?”
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?”

It is a touch scary that Google was able to run machine learning against answers to questions like this and write an algorithm to predict the answers for any given page on the web. They have, though, and that was four years ago now.
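
To make the idea concrete, here’s a deliberately tiny sketch in Python, and emphatically not Google’s actual system: given pages that human raters have scored against Panda-style questions, a model learns to predict the answers for pages nobody has rated. Every feature and number below is invented for illustration.

```python
# A toy sketch, NOT Google's system: train on pages that humans have rated
# against Panda-style quality questions, then predict quality for unrated pages.
from sklearn.linear_model import LogisticRegression

# Hypothetical per-page features: [word count, ad density, spelling errors, HTTPS]
X_rated = [
    [1800, 0.05, 0, 1],   # in-depth, few ads, clean -> raters said "high quality"
    [300, 0.40, 12, 0],   # thin, ad-heavy, sloppy  -> raters said "low quality"
    [1200, 0.10, 1, 1],
    [450, 0.35, 8, 0],
]
y_rated = [1, 0, 1, 0]    # 1 = raters answered "yes" to the quality questions

model = LogisticRegression().fit(X_rated, y_rated)

# Score a page that no human has ever rated
print(model.predict_proba([[900, 0.20, 3, 1]]))  # [P(low quality), P(high quality)]
```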

Since then, they’ve made various moves to utilize machine learning and AI to build out new products and improve their search results. For me, this was one of the biggest changes, and it went pretty unnoticed by our industry. Well, until Hummingbird came along, that is. I feel pretty sure that we have Ray Kurzweil to thank for at least some of that.

There seems to be more weight on theme/topic related to sites, though it’s hard to tell if this is mostly link based or more user/usage data based. Google is doing a good job of ranking sites and pages that don’t earn the most links but do provide the most relevant/best answer. I have a feeling they use some combination of signals to say “people who perform searches like this seem to eventually wind up on this website—let’s rank it.” One of my favorite examples is the Audubon Society ranking for all sorts of birding-related searches with very poor keyword targeting, not great links, etc. I think user behavior patterns are stronger in the algo than they’ve ever been.

Rand Fishkin

Leading on from what Rand has said, it’s becoming more and more common to see search results that just don’t make sense if you look at the link metrics—but are a good result.

For me, the move towards user data driving search results, plus advances in machine learning, has been the biggest change we’ve seen in recent years, and it’s still going.

Edit: since drafting this post, Tom Anthony released this excellent blog post on his views on the future of search and the shift to data-driven results. I’d recommend reading that as it approaches this whole area from a different perspective and I feel that an off-shoot of what Tom is talking about is the impact on link building.

You may be asking at this point, what does machine learning have to do with link building?

Everything. Because as strong as links are as a ranking signal, Google want more signals, and user signals are far, far harder to manipulate than established link signals. Yes, it can be done—I’ve seen it happen. There have even been a few public tests done. But it’s very hard to scale, and I’d venture a guess that only the top 1% of spammers are capable of doing it, let alone maintaining it for a long period of time. When I think about the process for manipulation here, I actually think we go a step beyond spammers towards hackers and more clear-cut illegal activity.

For link building, this means that traditional methods of manipulating signals are going to become less and less effective as these user signals become stronger. For us as link builders, it means we can’t keep searching for that silver bullet or the next method of scaling link building just for an easy win. The fact is that scalable link building is always going to be at risk from penalization from Google—I don’t really want to live a life where I’m always worried about my clients being hit by the next update. Even if Google doesn’t catch up with a certain method, machine learning and user data mean that these methods may naturally become less effective and cost efficient over time.

There are of course other things, such as social signals, that have come into play. I certainly don’t feel like these are a strong ranking factor yet, but with deals like this one between Google and Twitter being signed, I wouldn’t be surprised if that ever-growing dataset is used at some point in organic results. The one advantage that Twitter has over Google is its breaking-news freshness. Twitter is still way quicker at breaking news than Google is—140 characters in a tweet is far quicker than Google News! Google know this, which is why I feel they’ve pulled this partnership back into existence after a couple of years apart.

There is another important point to remember here and it’s nicely summarised by Dr. Pete:

At the same time, as new signals are introduced, these are layers, not replacements. People hear social signals or user signals or authorship and want it to be the link-killer, because they already fucked up link-building, but these are just layers on top of on-page and links and all of the other layers. As each layer is added, it can verify the layers that came before it, and what you need isn’t the magic signal but a combination of signals that generally matches what Google expects to see from real, strong entities. So, links still matter, but they matter in concert with other things, which basically means it’s getting more complicated and, frankly, a bit harder. Of course, no one wants to hear that.

Dr. Pete

The core principles have not changed

This is the crux of everything for me. With all the changes listed above, the key is that the core principles around link building haven’t changed. I could even argue that Penguin didn’t change the core principles because the techniques that Penguin targeted should never have worked in the first place. I won’t argue this too much though because even Google advised website owners to build directory links at one time.

You need an asset

You need to give someone a reason to link to you. Many won’t do it out of the goodness of their heart! One of the most effective ways to do this is to develop a content asset and use this as your reason to make people care. Once you’ve made someone care, they’re more likely to share the content or link to it from somewhere.

You need to promote that asset to the right audience

I really dislike the stance that some marketers take when it comes to content promotion—build great content and links will come.

No. Sorry, but for the vast majority of us, that’s simply not true. The exceptions are people who skydive from space or have huge existing audiences to leverage.

You simply have to spend time promoting your content or your asset for it to get shares and links. It is hard work, and sometimes you can spend a long time on it and get little return, but it’s important to keep working at it until you’re at a point where you have two things:

  • A big enough audience where you can almost guarantee at least some traffic to your new content along with some shares
  • Enough strong relationships with relevant websites who you can speak to when new content is published and stand a good chance of them linking to it

Getting to this point is hard—but that’s kind of the point. There are various hacks you can use along the way but it will take time to get right.

You need consistency

Leading on from the previous point: it takes time and hard work to get links to your content—the types of links that stand the test of time and that you won’t be removing in 12 months’ time anyway! This means that you need to keep pushing content out and getting better each and every time. This isn’t to say you should just churn content out for the sake of it, far from it. I am saying that with each piece of content you create, you will learn to do at least one thing better the next time. Try to give yourself the leverage to do this.

Anything scalable is at risk

Scalable link building is exactly what Google has been trying to crack down on for the last few years. Penguin was the biggest move and hit some of the most scalable tactics we had at our disposal. When you scale something, you often lose some level of quality, which is exactly what Google doesn’t want when it comes to links. If you’re still relying on tactics that could fall into the scalable category, I think you need to be very careful and just look at the trend in the types of links Google has been penalizing to understand why.

The part Google plays in this

To finish up, I want to briefly talk about the part that Google plays in all of this and shaping the future they want for the web.

I’ve always tried to steer clear of arguments involving the idea that Google is actively pushing FUD into the community. I’ve preferred to concentrate more on things I can actually influence and change with my clients rather than what Google is telling us all to do.

However, for the purposes of this post, I want to talk about it.

General paranoia has increased. My bet is there are some companies out there carrying out zero specific link building activity through worry.

Dan Barker

Dan’s point is a very fair one and just a day or two after reading this in an email, I came across a page related to a client’s target audience that said:

“We are not publishing guest posts on SITE NAME any more. All previous guest posts are now deleted. For more information, see www.mattcutts.com/blog/guest-blogging/.”

I’ve reworded this as to not reveal the name of the site, but you get the point.

This is silly. Honestly, so silly. They are a good site, publish good content, and had good editorial standards. Yet they have ignored all of their own policies, hard work, and objectives to follow a blog post from Matt. I’m 100% confident that it wasn’t sites like this one that Matt was talking about in this blog post.

This is, of course, from the publishers’ angle rather than the link builders’ angle, but it does go to show the effect that statements from Google can have. Google know this, so it does make sense for them to push out messages that make their jobs easier and suit their own objectives—why wouldn’t they? In a similar way, what did they do when they were struggling to classify at scale which links are bad vs. good and didn’t have a big enough web spam team? They got us to do it for them 🙂

I’m mostly joking here, but you see the point.

The recent, infamous mobilegeddon update, discussed here by Dr. Pete, is another example of Google pushing out messages that ultimately scared a lot of people into action. Although, to be fair, I think that despite the apparent small impact so far, the broad message from Google is a very serious one.

Because of this, I think we need to remember that Google does have their own agenda and many shareholders to keep happy. I’m not in the camp of believing everything that Google puts out is FUD, but I’m much more sensitive and questioning of the messages now than I’ve ever been.

What do you think? I’d love to hear your feedback and thoughts in the comments.


How to Create Boring-Industry Content that Gets Shared

Posted by ronell-smith

If you think creating content for boring industries is tough, try creating content for an expensive product that’ll be sold in a so-called boring industry. Such was the problem faced by Mike Jackson, head of sales for a large Denver-based company that was debuting a line of new high-end products for the fishing industry in 2009.

After years of pestering the executives of his traditional, non-flashy company to create a line of products that could be sold to anglers looking to buy premium items, he finally had his wish: a product so expensive that only a small percentage of anglers could afford it.


What looked like being boxed into a corner was actually part of the plan.

When asked how he could ever put his neck on the line for a product he’d find tough to sell and even tougher to market, he revealed his brilliant plan.

“I don’t need to sell one million of [these products] a year,” he said. “All I need to do is sell a few hundred thousand, which won’t be hard. And as far as marketing, that’s easy: I’m ignoring the folks who’ll buy the items. I’m targeting professional anglers, the folks the buyers are influenced by. If the pros, the influencers, talk about and use the products, people will buy them.”

Such was my first introduction to how it’s often wise to ignore who’ll buy the product in favor of marketing to those who’ll help you market and sell the product.

These influencers are a sweet spot in product marketing, and they are largely ignored by many brands.

Looking at content for boring industries all wrong

A few months back, I received a message in Google Plus that really piqued my interest: “What’s the best way to create content for my boring business? Just kidding. No one will read it, nor share information from a painter anyway.”

I went from being dismayed to disheartened. Dismayed because the business owner hadn’t yet found a way to connect with his prospects through meaningful content. Disheartened because he seemed to have given up trying.

You can successfully create content for boring industries. Doing so requires nothing out of the ordinary from what you’d normally do to create content for any industry. That’s the good news.

The bad news: creating successful content for boring industries requires you to think beyond content and SEO, focusing heavily on content strategy and outreach.

Successfully creating content for boring industries—or any industry, for that matter—comes down to who’ll share it and who’ll link to it, not who’ll read it, a point nicely summed up in this tweet:

So when businesses struggle with creating content for their respective industries, the culprits are typically easy to find:

  • They lack clarity on who they are creating content for (e.g., content strategy, personas)
  • There are no specific goals (e.g., traffic, links, conversions, etc.) assigned regarding the content, so measuring its effectiveness is impossible
  • They’re stuck in neutral, thinking viral content is the only option, while ignoring the value of content amplification (e.g., PR/outreach)

Alone, these three elements are bad; taken together, though, they spell doom for your brand.

(Image: content does not equal amplification)

If you lack clarity on who you’re creating content for, the best you can hope for is that sometimes you’ll create and share information members of your audience find useful, but you likely won’t be able to reach or engage them with the needed frequency to make content marketing successful.

Goals, or the lack thereof, are the real bugaboo of content creation. The problem is even worse for boring industries, where the pressure is on to deliver a content vehicle interesting enough to simply gain attention, much less earn engagement.

For all the hype about viral content, it’s dismaying that so few marketers are honest on the topic: it’s typically hard to create, impossible to predict, and usually has very, very little connection to conversions for most businesses.

What I’ve found is that businesses, regardless of category, struggle to create worthwhile content, leading me to believe there is no boring industry content, only content that’s boring.

“Whenever we label content as ‘boring,’ we’re really admitting we have no idea how to approach marketing something,” says Builtvisible’s Richard Baxter.

Now that we know what the impediments are to producing content for any industry, including boring industries, it’s time to tackle the solution.

Develop a link earning mindset

There are lots of articles on the web regarding how to create content for boring industries, some of which have appeared on this very blog.

But, to my mind, the one issue they all suffer from is they all focus on what content should be created, not (a) what content is worthy of promotion, (b) how to identify those who could help with promotion, and (c) how to earn links from boring industry content. (Remember, much of the content that’s read is never shared; much of what’s shared is never read in its entirety; and some of the most linked-to content is neither heavily shared nor heavily read.)

This is why content creators in boring industries should scrap their notions of having the most-read and most-shared content, shifting their focus to creating content that can earn links in addition to generating traffic and social signals to the site.

After all, links and conversions are the main priorities for most businesses sharing content online, including so-called local businesses.


(Image courtesy of the 2014 Moz Local Search Ranking Factors Survey)

If you’re ready to create link-earning, traffic-generating content for your boring-industry business, follow the tips from the fictitious example of RZ’s Auto Repair, a Dallas, Texas, automobile shop.

With the Dallas-Fort Worth market being large and competitive, RZ’s has narrowed its specialty to storm repair, mainly hail damage, which is huge in the area. Even with the narrowed focus, however, they still have stiff competition from the major players in the vertical, including MAACO.

What the brand does have in its favor, however, is a solid website and a strong freelance copywriter to help produce content.

Remember those three problems we mentioned above: lack of goals, lack of clarity, and lack of focus on amplification? We’ll now turn each of them around to drive our main objectives of traffic, links, and conversions.

Setting the right goals

For RZ, this is easy: he needs sales and new business (e.g., qualified leads and conversions), but he knows he must be patient, since using paid media is not in the cards.

Therefore, he sits down with his partner, and they come up with what seems like the top five workable, important goals:

  1. Increased traffic on the website – He’s noticed that when traffic increases, so does his business.
  2. More phone calls – If they get a customer on the phone, the chances of closing the sale are around 75%.
  3. One blog per week on the site – The more often he blogs, the more web traffic, visits and phone calls increase.
  4. Links from some of the businesses in the area – He’s no dummy. He knows the importance of links, which are that much better when they come from a large company that could send him business.
  5. Develop relationships with small and midsize non-competing businesses in the area for cross promotions, events and the like.

Know the audience


Too many businesses create cute blogs that might generate traffic but do nothing for sales. RZ isn’t falling for this trap. He’s all about identifying the audience who’s likely to do business with him.

Luckily, his secretary is a meticulous record keeper, allowing him to build a reasonable profile of his target persona based on past clients.

  • 21-35 years old
  • Drives a truck that’s less than four years old
  • Has an income of $45,000-$59,000
  • Employed by a corporation with more than 500 employees
  • Active on social media, especially Facebook and Twitter
  • Consumes most of their information online
  • Typically referred by a friend or a co-worker

This information will prove invaluable as he goes about creating content. Most important, these nuggets create a clearer picture of how he should go about looking for people and/or businesses to amplify his content.

PR and outreach: Your amplification engines

Armed with his goals and the knowledge of his audience, RZ can now focus on outreach for amplification, thinking along the lines of…

  • Who/what influences his core audience?
  • What could he offer them by way of content to earn their help?
  • What content would they find valuable enough to share and link to?
  • What challenges do they face that he could help them with?
  • How could his brand set itself apart from any other business looking for help from these potential outreach partners?

Putting it all together

Being the savvy businessperson he is, RZ pulls his small staff together and they put their thinking caps on.

Late spring through early fall is prime hail storm season in Dallas. The season accounts for 80% of his yearly business. (The other 20% is fender benders.) Also, they realize, many of the storms happen in the late afternoon/early evening, when people are on their way home from work and are stuck in traffic, or when they duck into the grocery store or hit the gym after work.

What’s more, says one of the staffers, often a huge group of clients will come at once, owing to having been parked in the same lot when a storm hits.

Eureka!


That’s when RZ bolts out of his chair with the idea that could put his business on the map: Let’s create content for businesses getting a high volume of after-work traffic—sit-down restaurants, gyms, grocery stores, etc.

The businesses would be offering something of value to their customers, who’ll learn about precautions to take in the event of a hail storm, and RZ would have willing amplifiers for his content.

Content is only as boring as your outlook

First, RZ visits the handful of local businesses he’d like to partner with (skipping this step is a fatal mistake too many content creators make). The key here, however, is that he smartly makes them aware that he’s done his homework and is eager to help their patrons while making them aware of his service.

This is an integral part of outreach: there must be a clear benefit to the would-be benefactor.

After RZ learns that several of the businesses are amenable to sharing his business’s helpful information, he takes the next step and asks what form the content should take. For now, all he can get them to promote is a glossy one-sheeter, “How To Protect Your Vehicle Against Extensive Hail Damage,” that the biggest gym in the area will promote via a small display at the check-in in return for a 10% coupon for customers.

Three of the five others he talked to also agreed to promote the one-sheeter, though each said they’d be willing to promote other content investments provided they added value for their customers.

The untold truth about creating content for boring industries

When business owners reach out to me about putting together a content strategy for their boring brand, I make two things clear from the start:

  1. There are no boring brands. Those two words are a cop out. No matter what industry you serve, there are hordes of people who use the products or services and are quite smitten with them.
  2. What they see as boring, I see as an opportunity.

In almost every case, they want to discuss some form of big content piece that’s sure to draw eyes and engagement, and that might even lead to a few links. Sure, I say, if you have tons of money to spend.


(Amazing piece of interactive content created by BuiltVisible)

Assuming you don’t have money to burn, and you want a plan you can replicate easily over time, try what I call the 1-2-1 approach for monthly blog content:

1: A strong piece of local content (goal: organic reach, topical relevance, local SEO)

2: Two pieces of evergreen content (goal: traffic)

1: A link-worthy asset (goal: links)

This plan is not very hard at all to pull off, provided you have your ear to the street in the local market; have done your keyword research, identifying several long-tail keywords you have the ability to rank for; and are willing to continue with outreach.

What it does is allow the brand to create content with enough frequency to attain significance with the search engines, while also developing the habit of sharing, promoting and amplifying content as well. For example, all of the posts would be shared on Twitter, Google Plus, and Facebook. (Don’t sleep on paid promotion via Facebook.)

Also, for the link-worthy asset, there would be outreach in advance of its creation, then amplification, and continued promotion from the company and those who’ve agreed to support the content.

Create a winning trifecta: Outreach, promotion and amplification

To RZ’s credit, he didn’t dawdle, getting right to work creating worthwhile content via the 1-2-1 method:

1: “The Worst Places in Dallas to be When a Hail Storm Hits”
2: “Can Hail Damage Cause Structural Damage to Your Car?” and “Should You Buy a Car Damaged by Hail?”
1: “Big as Hail!” contest

This contest idea came from the owner of a large local gym. RZ’s will give $500 to the local homeowner who sends in the largest piece of hail, as judged by Facebook fans, during the season. In return, the gym will promote the contest at its multiple locations, link to the content promotion page on RZ’s website, and share images of its fans holding large pieces of hail via social media.

What does the gym get in return? A catchy slogan (it’s similar to “big as hell,” popular gym parlance) to market around during the hail season.

It’s a win-win for everyone involved, especially RZ.

He gets a link, but most importantly, he realizes how to create content that nails each one of his goals. You can do the same. All it takes is a change in mindset: away from pure content creation, and toward outreach, promotion, and amplification.

Summary

While the story of RZ’s is entirely fictional, it is based on techniques I’ve used with other small and midsize businesses. The keys, I’ve found, are to get away from thinking about your industry/brand as being boring, even if it is, and to marshal the resources to find the audience who’ll benefit from your content and, most important, identify the influencers who’ll promote and amplify it.

What are your thoughts?


Moving 5 Domains to 1: An SEO Case Study

Posted by Dr-Pete

People often ask me if they should change domain names, and I always shudder just a little. Changing domains is a huge, risky undertaking, and too many people rush into it seeing only the imaginary upside. The success of the change also depends wildly on the details, and it’s not the kind of question anyone should be asking casually on social media.

Recently, I decided that it was time to find a new permanent home for my personal and professional blogs, which had gradually spread out over 5 domains. I also felt my main domain was no longer relevant to my current situation, and it was time for a change. So, ultimately I ended up with a scenario that looked like this:

The top three sites were active, with UserEffect.com being my former consulting site and blog (and relatively well-trafficked). The bottom two sites were both inactive and were both essentially gag sites. My one-pager, AreYouARealDoctor.com, did previously rank well for “are you a real doctor”, so I wanted to try to recapture that.

I started migrating the 5 sites in mid-January, and I’ve been tracking the results. I thought it would be useful to see how this kind of change plays out, in all of the gory details. As it turns out, nothing is ever quite “textbook” when it comes to technical SEO.

Why Change Domains at All?

The rationale for picking a new domain could fill a month’s worth of posts, but I want to make one critical point – changing domains should be about your business goals first, and SEO second. I did not change domains to try to rank better for “Dr. Pete” – that’s a crap shoot at best. I changed domains because my old consulting brand (“User Effect”) no longer represented the kind of work I do and I’m much more known by my personal brand.

That business case was strong enough that I was willing to accept some losses. We went through a similar transition here, from SEOmoz.org to Moz.com. That was a difficult transition that cost us some SEO ground, especially short-term, but our core rationale was grounded in the business and where it’s headed. Don’t let an SEO pipe dream lead you into a risky decision.

Why did I pick a .co domain? I did it for the usual reason – the .com was taken. For a project of this type, where revenue wasn’t on the line, I didn’t have any particular concerns about .co. The evidence on how top-level domains (TLDs) impact ranking is tough to tease apart (so many other factors correlate with .com’s), and Google’s attitude tends to change over time, especially if new TLDs are abused. Anecdotally, though, I’ve seen plenty of .co’s rank, and I wasn’t concerned.

Step 1 – The Boring Stuff

It is absolutely shocking how many people build a new site, slap up some 301s, pull the switch, and hope for the best. It’s less shocking how many of those people end up in Q&A a week later, desperate and bleeding money.


Planning is hard work, and it’s boring – get over it.

You need to be intimately familiar with every page on your existing site(s), and, ideally, you should make a list. Not only do you have to plan for what will happen to each of these pages, but you’ll need that list to make sure everything works smoothly later.

In my case, I decided it might be time to do some housekeeping – the User Effect blog had hundreds of posts, many outdated and quite a few just not very good. So, I started with the easy data – recent traffic. I’m sure you’ve seen this Google Analytics report (Behavior > Site Content > All Pages):

Since I wanted to focus on recent activity, and none of the sites had much new content, I restricted myself to a 3-month window (Q4 of 2014). Of course, I looked much deeper than the top 10, but the principle was simple – I wanted to make sure the data matched my intuition and that I wasn’t cutting off anything important. This helped me prioritize the list.
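
If you’d rather not eyeball the report, a short script can quantify how concentrated your traffic is. This is a minimal sketch that assumes you’ve exported the All Pages report to CSV with “Page” and “Pageviews” columns; your export’s column names may differ.

```python
# A minimal sketch, assuming an Analytics "All Pages" export saved as CSV
# with "Page" and "Pageviews" columns (an assumption about your export).
import pandas as pd

df = pd.read_csv("all-pages-q4.csv")
df = df.sort_values("Pageviews", ascending=False).reset_index(drop=True)

# Cumulative share of traffic: how few pages carry most of the visits?
df["cum_share"] = df["Pageviews"].cumsum() / df["Pageviews"].sum()
print(df.loc[:9, ["Page", "Pageviews", "cum_share"]])
```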

Of course, from an SEO standpoint, I also didn’t want to lose content that had limited traffic but solid inbound links. So, I checked my “Top Pages” report in Open Site Explorer:

Since the bulk of my main site was a blog, the top trafficked and top linked-to pages fortunately correlated pretty well. Again, this is only a way to prioritize. If you’re dealing with sites with thousands of pages, you need to work methodically through the site architecture.

I’m going to say something that makes some SEOs itchy – it’s ok not to move some pages to the new site. It’s even ok to let some pages 404. In Q4, UserEffect.com had traffic to 237 URLs. The top 10 pages accounted for 91.9% of that traffic. I strongly believe that moving domains is a good time to refocus a site and concentrate your visitors and link equity on your best content. More is not better in 2015.

Letting go of some pages also means that you’re not 301-redirecting a massive number of old URLs to a new home-page. This can look like a low-quality attempt to consolidate link-equity, and at large scale it can raise red flags with Google. Content worth keeping should exist on the new site, and your 301s should have well-matched targets.
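
One way to keep targets well-matched is to maintain an explicit old-to-new mapping and generate your server rules from it, so nothing silently falls through to the home-page. Here’s a minimal sketch assuming an Apache server; the paths and URLs are placeholders, not my actual redirect map.

```python
# A minimal sketch, assuming an Apache server: generate one well-matched
# 301 rule per kept page from a hand-curated mapping. Pages deliberately
# left out of the map will simply 404. All URLs here are placeholders.
redirect_map = {
    "/blog/split-test-calculator/": "https://drpete.co/split-test-calculator/",
    "/blog/usability-checklist/": "https://drpete.co/usability-checklist/",
}

with open("redirects.conf", "w") as f:
    for old_path, new_url in redirect_map.items():
        f.write(f"Redirect 301 {old_path} {new_url}\n")  # mod_alias syntax
```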

In one case, I had a blog post that had a decent trickle of traffic due to ranking for “50,000 push-ups,” but the post itself was weak and the bounce rate was very high:

The post was basically just a placeholder announcing that I’d be attempting this challenge, but I never recapped anything after finishing it. So, in this case, I rewrote the post.

Of course, this process was repeated across the 3 active sites. The 2 inactive sites only constituted a handful of total pages. In the case of AreYouARealDoctor.com, I decided to turn the previous one-pager into a new page on the new site. That way, I had a very well-matched target for the 301-redirect, instead of simply mapping the old site to my new home-page.

I’m trying to prove a point – this is the amount of work I did for a handful of sites that were mostly inactive and producing no current business value. I don’t need consulting gigs and these sites produce no direct revenue, and yet I still considered this process worth the effort.

Step 2 – The Big Day

Eventually, you’re going to have to make the move, and in most cases, I prefer ripping off the bandage. Of course, doing something all at once doesn’t mean you shouldn’t be careful.

The biggest problem I see with domain switches (even if they’re 1-to-1) is that people rely on data that can take weeks to evaluate, like rankings and traffic, or directly checking Google’s index. By then, a lot of damage is already done. Here are some ways to find out quickly if you’ve got problems…

(1) Manually Check Pages

Remember that list you were supposed to make? It’s time to check it, or at least spot-check it. Someone needs to physically go to a browser and make sure that each major section of the site and each important individual page is resolving properly. It doesn’t matter how confident your IT department/guy/gal is – things go wrong.

(2) Manually Check Headers

Just because a page resolves, it doesn’t mean that your 301-redirects are working properly, or that you’re not firing some kind of 17-step redirect chain. Check your headers. There are tons of free tools, but lately I’m fond of URI Valet. Guess what – I screwed up my primary 301-redirects. One of my registrar transfers wasn’t working, so I had to have a setting changed by customer service, and I inadvertently ended up with 302s (Pro tip: Don’t change registrars and domains in one step):
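
This kind of spot-check is also easy to script. Here’s a minimal sketch using Python’s requests library that flags anything other than a clean, single-hop 301; the URLs are placeholders for your own list.

```python
# A minimal sketch: fetch each old URL and flag anything that isn't a
# clean, single-hop 301 redirect. URLs below are placeholders.
import requests

old_urls = [
    "http://usereffect.com/blog/",
    "http://areyouarealdoctor.com/",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds each intermediate redirect response, in order
    hops = [(r.status_code, r.headers.get("Location")) for r in resp.history]
    if len(hops) == 1 and hops[0][0] == 301:
        print(f"OK    {url} -> {resp.url}")
    else:
        print(f"CHECK {url}: hops={hops}, final={resp.url} ({resp.status_code})")
```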

Don’t think that because you’re an “expert”, your plan is foolproof. Mistakes happen, and because I caught this one I was able to correct it fairly quickly.

(3) Submit Your New Site

You don’t need to submit your site to Google in 2015, but now that Google Webmaster Tools allows it, why not do it? The primary argument I hear is “well, it’s not necessary.” True, but direct submission has one advantage – it’s fast.

To be precise, Google Webmaster Tools separates the process into “Fetch” and “Submit to index” (you’ll find this under “Crawl” > “Fetch as Google”). Fetching will quickly tell you if Google can resolve a URL and retrieve the page contents, which alone is pretty useful. Once a page is fetched, you can submit it, and you should see something like this:

This isn’t really about getting indexed – it’s about getting nearly instantaneous feedback. If Google has any major problems with crawling your site, you’ll know quickly, at least at the macro level.

(4) Submit New XML Sitemaps

Finally, submit a new set of XML sitemaps in Google Webmaster Tools, and preferably tiered sitemaps. While it’s a few years old now, Rob Ousbey has a great post on the subject of XML sitemap structure. The basic idea is that, if you divide your sitemap into logical sections, it’s going to be much easier to diagnose what kinds of pages Google is indexing and where you’re running into trouble.
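
As a rough illustration of that tiered structure (the section file names here are made up), you can generate a sitemap index that points at one child sitemap per logical section:

```python
# A rough illustration of a tiered setup: one child sitemap per logical
# section, tied together by an index file, so indexation problems can be
# traced back to a section. File names and domain are placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
children = ["sitemap-blog.xml", "sitemap-tools.xml", "sitemap-pages.xml"]

index = ET.Element("sitemapindex", xmlns=NS)
for filename in children:
    entry = ET.SubElement(index, "sitemap")
    ET.SubElement(entry, "loc").text = f"https://drpete.co/{filename}"

ET.ElementTree(index).write("sitemap_index.xml",
                            encoding="utf-8", xml_declaration=True)
```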

A couple of pro tips on sitemaps – first, keep your old sitemaps active temporarily. This is counterintuitive to some people, but unless Google can crawl your old URLs, they won’t see and process the 301-redirects and other signals. Let the old accounts stay open for a couple of months, and don’t cut off access to the domains you’re moving.

Second (I learned this one the hard way), make sure that your Google Webmaster Tools site verification still works. If you use file uploads or meta tags and don’t move those files/tags to the new site, GWT verification will fail and you won’t have access to your old accounts. I’d recommend using a more domain-independent solution, like verifying with Google Analytics. If you lose verification, don’t panic – your data won’t be instantly lost.

Step 3 – The Waiting Game

Once you’ve made the switch, the waiting begins, and this is where many people start to panic. Even executed perfectly, it can take Google weeks or even months to process all of your 301-redirects and reevaluate a new domain’s capacity to rank. You have to expect short term fluctuations in ranking and traffic.

During this period, you’ll want to watch a few things – your traffic, your rankings, your indexed pages (via GWT and the site: operator), and your errors (such as unexpected 404s). Traffic will recover the fastest, since direct traffic is immediately carried through redirects, but ranking and indexation will lag, and errors may take time to appear.

(1) Monitor Traffic

I’m hoping you know how to check your traffic, but actually trying to determine what your new levels should be and comparing any two days can be easier said than done. If you launch on a Friday, and then Saturday your traffic goes down on the new site, that’s hardly cause for panic – your traffic probably always goes down on Saturday.

In this case, I redirected the individual sites over about a week, but I’m going to focus on UserEffect.com, as that was the major traffic generator. That site was redirected in full on January 21st, and the Google Analytics data for January for the old site looked like this:

So far, so good – traffic bottomed out almost immediately. Of course, losing traffic is easy – the real question is what’s going on with the new domain. Here’s the graph for January for DrPete.co:

This one’s a bit trickier – the first spike, on January 16th, is when I redirected the first domain. The second spike, on January 22nd, is when I redirected UserEffect.com. Both spikes are meaningless – I announced these re-launches on social media and got a short-term traffic burst. What we really want to know is where traffic is leveling out.

Of course, there isn’t a lot of history here, but a typical day for UserEffect.com in January was about 1,000 pageviews. The traffic to DrPete.co after it leveled out was about half that (500 pageviews). It’s not a complete crisis, but we’re definitely looking at a short-term loss.

Obviously, I’m simplifying the process here – for a large, ecommerce site you’d want to track a wide range of metrics, including conversion metrics. Hopefully, though, this illustrates the core approach. So, what am I missing out on? In this day of [not provided], tracking down a loss can be tricky. Let’s look for clues in our other three areas…

(2) Monitor Indexation

You can get a broad sense of your indexed pages from Google Webmaster Tools, but this data often lags real-time and isn’t very granular. Despite its shortcomings, I still prefer the site: operator. Generally, I monitor a domain daily – any one measurement has a lot of noise, but what you’re looking for is the trend over time. Here’s the indexed page count for DrPete.co:

The first set of pages was indexed fairly quickly, and then the second set started being indexed soon after UserEffect.com was redirected. All in all, we’re seeing a fairly steady upward trend, and that’s what we’re hoping to see. The number is also in the ballpark of sanity (compared to the actual page count) and roughly matched GWT data once it started being reported.

So, what happened to UserEffect.com’s index after the switch?

The timeframe here is shorter, since UserEffect.com was redirected last, but we see a gradual decline in indexation, as expected. Note that the index size plateaus around 60 pages – about 1/4 of the original size. This isn’t abnormal – low-traffic and unlinked pages (or those with deep links) are going to take a while to clear out. This is a long-term process. Don’t panic over the absolute numbers – what you want here is a downward trend on the old domain accompanied by a roughly equal upward trend on the new domain.

The fact that UserEffect.com didn’t bottom out is definitely worth monitoring, but this timespan is too short for the plateau to be a major concern. The next step would be to dig into these specific pages and look for a pattern.

(3) Monitor Rankings

The old domain is dropping out of the index, and the new domain is taking its place, but we still don’t know why the new site is taking a traffic hit. It’s time to dig into our core keyword rankings.

Historically, UserEffect.com had ranked well for keywords related to “split test calculator” (near #1) and “usability checklist” (in the top 3). While [not provided] makes keyword-level traffic analysis tricky, we also know that the split-test calculator is one of the top trafficked pages on the site, so let’s dig into that one. Here’s the ranking data from Moz Analytics for “split test calculator”:

The new site took over the #1 position from the old site at first, but then quickly dropped down to the #3/#4 ranking. That may not sound like a lot, but given this general keyword category was one of the site’s top traffic drivers, the CTR drop from #1 to #3/#4 could definitely be causing problems.
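
Some back-of-the-envelope math shows why that drop matters. The CTR curve below is an assumption for illustration only; published curves vary widely by study and query type.

```python
# Back-of-the-envelope only: the CTR curve is an assumption for
# illustration (published curves vary widely), not measured data.
ctr = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07}
searches_per_day = 1000

clicks_at_1 = searches_per_day * ctr[1]
clicks_at_3_4 = searches_per_day * (ctr[3] + ctr[4]) / 2  # bouncing between #3/#4

print(f"~{clicks_at_1:.0f} clicks/day at #1 vs ~{clicks_at_3_4:.0f} at #3-#4")
# -> roughly 300 vs 85: a two-spot "drop" can cost well over half
#    the traffic on a head term.
```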

When you have a specific keyword you can diagnose, it’s worth taking a look at the live SERP, just to get some context. The day after relaunch, I captured this result for “dr. pete”:

Here, the new domain is ranking, but it’s showing the old title tag. This may not be cause for alarm – weird things often happen in the very short term – but in this case we know that I accidentally set up a 302-redirect. There’s some reason to believe that Google didn’t pass full link equity during that period when 301s weren’t implemented.

Let’s look at a domain where the 301s behaved properly. Before the site was inactive, AreYouARealDoctor.com ranked #1 for “are you a real doctor”. Since there was an inactive period, and I dropped the exact-match domain, it wouldn’t be surprising to see a corresponding ranking drop.

In reality, the new site was ranking #1 for “are you a real doctor” within 2 weeks of 301-redirecting the old domain. The graph is just a horizontal line at #1, so I’m not going to bother you with it, but here’s a current screenshot (incognito):

Early on, I also spot-checked this result, and it wasn’t showing the strange title tag crossover that UserEffect.com pages exhibited. So, it’s very likely that the 302-redirects caused some problems.

Of course, these are just a couple of keywords, but I hope it provides a starting point for you to understand how to methodically approach this problem. There’s no use crying over spilled milk, and I’m not going to fire myself, so let’s move on to checking any other errors that I might have missed.

(4) Check Errors (404s, etc.)

A good first stop for unexpected errors is the “Crawl Errors” report in Google Webmaster Tools (Crawl > Crawl Errors). This is going to take some digging, especially if you’ve deliberately 404’ed some content. Over the couple of weeks after re-launch, I spotted the following problems:

The old site had a “/blog” directory, but the new site put the blog right on the home-page and had no corresponding directory. Doh. Hey, do as I say, not as I do, ok? Obviously, this was a big blunder, as the old blog home-page was well-trafficked.

The other two errors here are smaller but easy to correct. MinimalTalent.com had a “/free” directory that housed downloads (mostly PDFs). I missed it, since my other sites used a different format. Luckily, this was easy to remap.

The last error is a weird looking URL, and there are other similar URLs in the 404 list. This is where site knowledge is critical. I custom-designed a URL shortener for UserEffect.com and, in some cases, people linked to those URLs. Since those URLs didn’t exist in the site architecture, I missed them. This is where digging deep into historical traffic reports and your top-linked pages is critical. In this case, the fix isn’t easy, and I have to decide whether the loss is worth the time.
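
When the 404 list is long, grouping it by first path segment makes patterns like a lost /blog directory or old shortener URLs stand out. A minimal sketch, assuming the crawl errors export is a CSV with a “URL” column (an assumption about the export format):

```python
# A minimal sketch: group 404s from a crawl-errors export by their first
# path segment so patterns (a lost /blog, an old URL shortener) stand out.
# The CSV layout (a "URL" column) is an assumption about your export.
import csv
from collections import Counter
from urllib.parse import urlparse

counts = Counter()
with open("crawl-errors.csv", newline="") as f:
    for row in csv.DictReader(f):
        path = urlparse(row["URL"]).path.strip("/")
        counts["/" + path.split("/")[0] if path else "/"] += 1

for segment, n in counts.most_common(10):
    print(f"{n:4d}  {segment}")
```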

What About the New EMD?

My goal here wasn’t to rank better for “Dr. Pete,” and finally unseat Dr. Pete’s Marinades, Dr. Pete the Sodastream flavor (yes, it’s hilarious – you can stop sending me your grocery store photos), and 172 dentists. Ok, it mostly wasn’t my goal. Of course, you might be wondering how switching to an EMD worked out.

In the short term, I’m afraid the answer is “not very well.” I didn’t track ranking for “Dr. Pete” and related phrases very often before the switch, but it appears that ranking actually fell in the short-term. Current estimates have me sitting around page 4, even though my combined link profile suggests a much stronger position. Here’s a look at the ranking history for “dr pete” since relaunch (from Moz Analytics):

There was an initial drop, after which the site evened out a bit. This less-than-impressive plateau could be due to the bad 302s during transition. It could be Google evaluating a new EMD and multiple redirects to that EMD. It could be that the prevalence of natural anchor text with “Dr. Pete” pointing to my site suddenly looked unnatural when my domain name switched to DrPete.co. It could just be that this is going to take time to shake out.

If there’s a lesson here (and, admittedly, it’s too soon to tell), it’s that you shouldn’t rush to buy an EMD in 2015 in the wild hope of instantly ranking for that target phrase. There are so many factors involved in ranking for even a moderately competitive term, and your domain is just one small part of the mix.

So, What Did We Learn?

I hope you learned that I should’ve taken my own advice and planned a bit more carefully. I admit that this was a side project and it didn’t get the attention it deserved. The problem is that, even when real money is at stake, people rush these things and hope for the best. There’s a real cheerleading mentality when it comes to change – people want to take action and only see the upside.

Ultimately, in a corporate or agency environment, you can’t be the one sour note among the cheering. You’ll be ignored, and possibly even fired. That’s not fair, but it’s reality. What you need to do is make sure the work gets done right and people go into the process with eyes wide open. There’s no room for shortcuts when you’re moving to a new domain.

That said, a domain change isn’t a death sentence, either. Done right, and with sensible goals in mind – balancing not just SEO but broader marketing and business objectives – a domain migration can be successful, even across multiple sites.

To sum up: Plan, plan, plan, monitor, monitor, monitor, and try not to panic.


Everything You Need to Know About Mobile App Search

Posted by Justin_Briggs

Mobile isn’t the future. It’s the present. Mobile apps are not only changing how we interact with devices and websites, they’re changing the way we search. Companies are creating meaningful experiences on mobile-friendly websites and apps, which in turn create new opportunities to get in front of users.

I’d like to explore the growth of mobile app search and its current opportunities to gain visibility and drive engagement.

Rise of mobile app search

The growth of mobile device usage has driven a significant lift in app-related searches. This is giving rise to mobile app search as a vertical within traditional universal search.

While it has been clear for some time that mobile search is important, Google has emphasized that importance recently, continuing to push mobile-friendly labels in SERPs and likely increasing the weight of mobile-friendliness as a ranking factor.

The future of search marketing involves mobile, and it will not be limited to optimizing HTML webpages, creating responsive designs, and optimizing UX. Mobile SEO is a world where apps, knowledge graph, and conversational search are front and center.

For the top 10 leading properties online, 34% of visitors are mobile-only (comScore data), and, anecdotally, we’re seeing similar numbers with our clients, if not more.

Mobile device and app growth

It’s also worth noting that 72% of mobile engagement relies on apps rather than browsers. Looking at teen usage, apps are increasingly dominant. Additionally, 55% of teens use voice search more than once per day.

If you haven’t read it, grab some coffee and read A Teenagers View on Social Media, written by a 19-year-old who gives his perspective on online behavior. Reading between the lines reveals a number of subtle shifts in behavior. I noticed that every time I expected him to say “website,” he said “application.” In fact, he referenced applications 15 times, and it is the primary way he describes social networks.

This means that one of the fastest-growing segments of mobile users cannot be marketed to by optimizing HTML webpages alone, requiring search marketers to expand their skills into app optimization.

The mobile app pack

This shift is giving rise to the mobile app pack and app search results, which are triggered on searches from mobile devices in instances of high mobile app intent. Think of these as being similar to local search results. Considering mobile searcher behavior, these listings dominate user attention.

Mobile app search results and mobile app pack

As with local search, mobile app search can reorder traditional results, completely push them down, or integrate app listings with traditional web results.

You can test on your desktop using a user-agent switcher, or by searching on your iOS or Android device.

There are slight differences between iPhone and Android mobile app results:

iOS and Android mobile search result listing

From what I’ve seen, mobile app listings trigger more frequently, and with more results, on Android search results when compared to iOS. Additionally, iOS mobile app listings are represented as a traditional website result listing, while mobile app listings on Android are more integrated.

Some of the differences also come from the differences in app submission guidelines on the two major stores, the Apple App Store and Google Play.

Overview of differences in mobile app results

  1. Title – Google uses the app listing page’s HTML title (which is the app’s title). iOS app titles can exceed the 55-62 characters Google typically displays, which causes wrapping and truncation, just as with a traditional result. Android app title requirements are shorter, so titles are typically shorter on Android mobile app listings.
  2. URL – iOS mobile app listings display the iTunes URL to the App Store as part of the search result.
  3. Icon – iOS icons are square and Android icons have rounded corners.
  4. Design – Android results stand out more, with an “Apps” headline above the pack and a link to Google Play at the end.
  5. App store content – The other differences show up in the copy, ratings, and reviews on each app store.

Ranking in mobile app search results

Ranking in mobile app search results is a combination of App Store Optimization (ASO) and traditional SEO. The on-page factors are dependent upon your app listing, so optimization starts with having solid ASO. If you’re not familiar with ASO, it’s the process of optimizing your app listing for internal app store search.

Basics of ASO

Ranking in the Apple App Store and in Google Play is driven by two primary factors: keyword alignment and app performance. Text fields in the app store listing, such as title, description, and keyword list, align the app with a particular set of keywords. Performance metrics including download velocity, app ratings, and reviews determine how well the app will rank for each of those keywords. (Additionally, the Google Play algorithm may include external, web-based performance metrics like citations and links as ranking factors.)

App store ranking factors

Mobile app listing optimization

While I won’t explore ASO in depth here, as it’s very similar to traditional SEO, optimizing app listings is primarily a function of keyword targeting.

Tools like Sensor Tower, MobileDevHQ, and App Annie can help you with mobile app keyword research. However, keep in mind that mobile app search listings show up in universal search, so it’s important to leverage traditional keyword research tools like the AdWords Tool or Google Trends.

While there are similarities with ASO, optimizing for these mobile app search listings on the web has some slight differences.

Differences between ASO & mobile app SEO targeting

  1. Titles – While the Apple App Store allows relatively long titles, Google truncates them to its preview length in organic search. Titles should be optimized with Google search in mind, in addition to the app store. Several apps aggressively target keywords in their titles, but use caution, as keyword stuffing could hurt the app’s performance in Google.
  2. Description – The app description on the App Store may not be a factor in internal search, but it will impact external app search results. Leverage keyword targeting best practices when writing your iOS app description, as well as your Android app description.
  3. Device and platform keywords – When targeting for app store search, it is not as important to target terms related to the OS or device. However, these terms can help visibility in external search. Include device and OS terms, such as Android, Samsung Note, iOS, iPad, and iPhone.

App performance optimization

Outside of content optimization, Google looks at the performance of the app. On the Android side, Google has direct access to this data; for iOS, it has to rely on publicly available information.

App performance factors

  • Number of ratings
  • Average rating score
  • Content and sentiment analysis of reviews
  • Downloads / installs
  • Engagement and retention
  • Internal links on app store

For iOS, the primary public metrics are ratings and reviews. However, app performance can be inferred using the App Store’s ranking charts and search results, which can be leveraged as proxies of these performance metrics.


The following objectives will have the greatest influence on your mobile app search ranking:

  1. Increase your average rating number
  2. Increase your number of ratings
  3. Increase downloads

For app ratings and reviews, leverage platforms like Apptentive to improve your ratings. They are very effective at driving positive ratings. Additionally, paid tactics are a great way to drive install volume and are one area where paid budget capacity could directly influence organic results in Google. Anecdotally, both app stores use rating numbers (typically above or below 4 stars) to make decisions around promoting an app, either through merchandising spots or co-branded campaigns. I suspect this is being used as a general cut-off for what is displayed in universal results. Increasing your rating above 4 stars should improve the likelihood you’ll appear in mobile app search results.

Lastly, think of merchandising and rankings in terms of internal linking structures. The more visible you are inside of the app store, the more visibility you have in external search.

App web performance optimization

Finally, since we’re talking about Google rankings, factors like links, citations, and social shares matter. You should be conducting content marketing, PR, and outreach for your app. Focus on merchandising your app on your own site, as well as increasing coverage of your app (linking to the app store page). The basics of link optimization apply here.

App indexation – drive app engagement

Application search is not limited to driving installs via app search results. With app indexing, you can leverage your desktop/mobile website visibility in organic search to drive engagement with those who have your app installed. Google can discover and expose content deep inside your app directly in search results. This means that when a user clicks on your website in organic search, it can open your app directly, taking them to that exact piece of content in your app, instead of opening your website.

App indexation fundamentally changes technical SEO, extending SEO from server and webpage setup to the setup and optimization of applications.

App indexation on Google

This also fundamentally changes search. Your most avid and engaged user may choose to no longer visit your website. For example, on my Note 4, when I click a link to a site of a brand that I have an app installed for, Google gives me the option not only to open in the app, but to set opening the app as a default behavior.

If a user chooses to open your site in your app, they may never visit your site from organic search again.

App indexation is currently limited to Android devices, but there is evidence to suggest that iOS support is already in the works and will be released soon. There have been hints for some time, and markup is showing up in the wild suggesting that Google is actively working with Apple and select brands to develop iOS app indexing.

URI optimization for apps

The first step in creating an indexable app is to set up your app to support deep links. Deep links are URIs that are understood by your app and will open up a specific piece of content. They are effectively URLs for applications.
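To make this concrete, here’s the general anatomy of the android-app:// URI format that appears in the code samples later in this post (the package name and URL are hypothetical examples, not a real app):

android-app://{package_name}/{scheme}/{host_and_path}
android-app://com.example.android/http/example.com/gizmos

The intent filter described below is what tells Android that your app can handle the http://example.com/gizmos portion of this URI.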

Once this URI is supported, a user can be sent to deep content in the app. These URIs can be discovered as alternates to your desktop site’s URLs, similar to how separate-site mobile URLs are defined as alternates for the desktop site. In the proper context (on an Android device with the app installed), Google can direct a user to the app instead of the website.

Setting this up requires working with your app developer to implement changes inside the app as well as working with your website developers to add references on your desktop site.

Adding intent filters

Android has documented the technical setup of deep links in detail, but it starts with setting up intent filters in an app’s Android manifest file. This is done with the following code.

<activity android:name="com.example.android.GizmosActivity"
          android:label="@string/title_gizmos" >
    <intent-filter android:label="@string/filter_title_viewgizmos">
        <action android:name="android.intent.action.VIEW" />
        <data android:scheme="http"
              android:host="example.com"
              android:pathPrefix="/gizmos" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
    </intent-filter>
</activity>

This intent filter dictates the technical optimization of your app URIs for app indexation and defines the elements used in the URI example above.

  • The <intent-filter> element should be added for activities that should be launchable from search results.
  • The <action> element specifies the ACTION_VIEW intent action so that the intent filter can be reached from Google Search.
  • The <data> tag represents a URI format that resolves to the activity. At minimum, the <data> tag must include the android:scheme attribute. (A custom-scheme variant is sketched after this list.)
  • Include the BROWSABLE category. The BROWSABLE category is required in order for the intent filter to be accessible from a web browser. Without it, clicking a link in a browser cannot resolve to your app. The DEFAULT category is optional, but recommended. Without this category, the activity can be started only with an explicit intent, using your app component name.
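For apps that also support a custom URI scheme (like the gizmos:// URIs that appear in the noindex example later in this post), an additional intent filter can declare it. A minimal sketch, assuming a hypothetical gizmos scheme:

<intent-filter android:label="@string/filter_title_viewgizmos">
    <action android:name="android.intent.action.VIEW" />
    <!-- Hypothetical custom scheme, matching URIs like gizmos://hidden_path -->
    <data android:scheme="gizmos" />
    <category android:name="android.intent.category.DEFAULT" />
    <category android:name="android.intent.category.BROWSABLE" />
</intent-filter>

Custom schemes work for deep linking, but the http scheme shown above maps more cleanly to your website’s URLs for app indexing purposes.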

Testing deep links

Google has created tools to help test your deep link setup. You can use Google’s Deep Link Test Tool to test your app behavior with deep links on your phone. Additionally, you can create an HTML page with an intent:// link in it.

For example:

<a href="intent://example.com/page-1#Intent;scheme=http;package=com.example.android;end;">http://example.com/page-1</a>

This link would open up deep content inside the app from the HTML page.
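If you prefer the command line, Android’s deep link documentation also describes launching a URI directly with adb. A quick sketch, using the same hypothetical package name and URL as above:

adb shell am start -W -a android.intent.action.VIEW \
    -d "http://example.com/page-1" com.example.android

If the intent filter is set up correctly, this command should open the app directly to that piece of content.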

App URI crawl and discovery

Once an app has deep link functionality, the next step is to ensure that Google can discover these URIs as part of its traditional desktop crawling.

Ways to get apps crawled

  1. Rel=”alternate” in HTML head
  2. ViewAction with Schema.org
  3. Rel=”alternate” in XML Sitemap

Implementing all three will create clear signals, but at minimum you should add the rel=”alternate” tag to the HTML head of your webpages.

Effectively, think of the app URI as being similar to a mobile site URL when setting up a separate-site mobile site for SEO. The mobile deep link is an alternative way to view a webpage on your site. You map a piece of content on your site to a corresponding piece of content inside the app.

Before you get started, be sure to verify your website and app following the guidelines here. This will verify your app in Google Play Developer Console and Google Webmaster Tools.

#1: Rel=”alternate” in HTML head

On an example page, such as example.com/page-1, you would add the following code to the head of the document. Again, very similar to separate-site mobile optimization.

<html>
<head>
    ...
    <link rel="alternate" href="android-app://com.example.android/http/example.com/page-1" />
    ...
</head>
<body>
</body>
</html>
#2: ViewAction with Schema.org

Additionally, you can reference the deep link using Schema.org and JSON-LD by using a ViewAction.

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebPage",
  "@id": "http://example.com/gizmos",
  "potentialAction": {
    "@type": "ViewAction",
    "target": "android-app://com.example.android/http/example.com/gizmos"
  }
}
</script>
#3: Rel=”alternate” in XML sitemap

Lastly, you can reference the alternate URL in your XML Sitemaps, similar to using the rel=”alternate” for mobile sites.

<?xml version="1.0" encoding="UTF-8" ?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://example.com/page-1</loc>
    <xhtml:link rel="alternate" href="android-app://com.example.android/http/example.com/page-1" />
  </url>
  ...
</urlset>

Once these are in place, Google can discover the app URI and provide your app as an alternative way to view content found in search.

Bot control and robots noindex for apps

There may be instances where there is content within your app that you do not want indexed in Google. A good example of this might be content or functionality that is built out on your site, but has not yet been developed in your app. This would create an inferior experience for users. The good news is that we can block indexation with a few updates to the app.

First, add the following to your app resource directory (res/xml/noindex.xml).

<?xml version="1.0" encoding="utf-8"?>
<search-engine xmlns:android="http://schemas.android.com/apk/res/android">
  <noindex uri="http://example.com/gizmos/hidden_uri"/>
  <noindex uriPrefix="http://example.com/gizmos/hidden_prefix"/>
  <noindex uri="gizmos://hidden_path"/>
  <noindex uriPrefix="gizmos://hidden_prefix"/>
</search-engine>

As you can see above, you can block an individual URI or define a URI prefix to block entire folders.

Once this has been added, you need to update the AndroidManifest.xml file to denote that you’re using noindex.xml to block indexation.

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.android.Gizmos">
  <application>
    <activity android:name="com.example.android.GizmosActivity"
              android:label="@string/title_gizmos" >
      <intent-filter android:label="@string/filter_title_viewgizmos">
        <action android:name="android.intent.action.VIEW"/>
        ...
      </intent-filter>
    </activity>
    <meta-data android:name="search-engine" android:resource="@xml/noindex"/>
  </application>
  <uses-permission android:name="android.permission.INTERNET"/>
</manifest>

App indexing API to drive re-engagement

In addition to URI discovery via desktop crawl, your mobile app can integrate Google’s App Indexing API, which communicates with Google when users take actions inside your app. This sends information to Google about what users are viewing in the app. This is an additional method for deep link discovery and has some benefits.
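As a rough sketch of what this integration looks like (based on the AppIndex API in Google Play Services at the time of writing; treat the class and method names as subject to change, and the package name and URIs as the same hypothetical examples used earlier):

import android.net.Uri;
import android.os.Bundle;
import com.google.android.gms.appindexing.AppIndex;
import com.google.android.gms.common.api.GoogleApiClient;

public class GizmosActivity extends android.app.Activity {

    private GoogleApiClient mClient;
    // Hypothetical deep link and web URL for this screen
    private static final Uri APP_URI =
            Uri.parse("android-app://com.example.android/http/example.com/gizmos");
    private static final Uri WEB_URL = Uri.parse("http://example.com/gizmos");

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mClient = new GoogleApiClient.Builder(this).addApi(AppIndex.API).build();
    }

    @Override
    protected void onStart() {
        super.onStart();
        mClient.connect();
        // Notify Google that the user is viewing this piece of deep content
        AppIndex.AppIndexApi.view(mClient, this, APP_URI, "Gizmos", WEB_URL, null);
    }

    @Override
    protected void onStop() {
        // Notify Google that the view has ended, then disconnect
        AppIndex.AppIndexApi.viewEnd(mClient, this, APP_URI);
        mClient.disconnect();
        super.onStop();
    }
}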

The primary benefit is the ability to appear in autocomplete. This can drive re-engagement through Google Search query autocompletions, providing access to inner pages in apps.

App auto suggest

Again, be sure to verify your website and app following the guidelines here. This will verify your app in Google Play Developer Console and Google Webmaster Tools.

App actions with knowledge graph

The next, and most exciting, evolution of search is leveraging actions. These will be powerful when combined with voice search, allowing search engines to take action on behalf of users, turning spoken language into executed actions.

App indexing allows you to take advantage of actions by allowing Google to not only launch an app, but execute actions inside of the app. Order me a pizza? Schedule my meeting? Drive my car? Ok, Google.

App actions work via entity detection and the application of the knowledge graph, allowing search engines to understand actions, words, ideas and objects. With that understanding, they can build an action graph that allows them to define common actions by entity type.

Here is a list of actions currently supported by Schema.org.

For example, a ListenAction can be used to play a song in a music app. This can be achieved with the following markup.

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "MusicGroup",
  "name": "Weezer",
  "potentialAction": {
    "@type": "ListenAction",
    "target": "android-app://com.spotify.music/http/we.../listen"
  }
}
</script>
Once this is implemented, these app actions can begin to appear in search results and the knowledge graph.

deep links in app search results

Overview of mobile app search opportunities

In summary, there are five primary ways to increase visibility and engagement for your mobile app in traditional organic search efforts.

Mobile apps in search results

The growth of mobile search is transforming how we define technical SEO, moving beyond front-end and back-end optimization of websites into the realm of structured data and application development. As app indexing expands to include iOS, I expect the possibilities and opportunities associated with indexing applications, and their corresponding actions, to grow extensively.

For those with Android apps, app indexing is a potential leapfrog-style opportunity to get ahead of competitors who are dominant in traditional desktop search. Those with iOS apps should start by optimizing their app listings, while preparing to implement indexation, as I suspect it’ll be released for iOS this year.

Have you been leveraging traditional organic search to drive visibility and engagement for apps? Share your experiences in the comments below.


5 Years of SEO Changes and Better Goal-Setting – Videos from MozTalk 2

Posted by CharleneKate

Have you ever noticed how Rand is often speaking at conferences all around the world? Well, we realized that those of us here in Seattle rarely get to see him. So we started MozTalks, a free event here at the MozPlex.

It normally runs 2-3 hours with multiple speakers, one of whom is Rand. The event is hosted at the Moz HQ and offers time for mingling, appetizers, refreshments, and, of course, swag. The series is still evolving as we continue to test out new ideas (maybe taking the show on the road), so be on the lookout for updates.

The world of marketing and SEO continues to change, but are you changing with it? Staying on the cutting edge should always be a priority, especially since early adoption has proven more beneficial than ever. Sticking with what works isn’t enough anymore, and marketing isn’t just about analyzing our successes and failures or understanding our returns and losses. It’s about what we do next.

In the presentations below, Rand and Dr. Pete will dive deep into where metrics serve us best, as well as what really works to drive traffic and what’s better left behind.

Rand: What Changed? A Brief Look at How SEO has Evolved over the Last 5 Years

2014-11-18 – MozTalk – Rand – What Changed?

Dr. Pete: From Lag to Lead: Actionable Analytics

2014-11-18 – MozTalk – Dr. Pete – From Lag to Lead

Top takeaways

We asked both presenters for a few of their top takeaways from their talks, and they’ve got some gems. Here’s what they had to say:

From Rand

  • Keyword matching has become intent matching, which doesn’t mean we should avoid using keywords, but it does mean we need to change the way we determine which pages to build, which to canonicalize, and how to structure our sites and content.
  • The job title “SEO” may be limiting the influence we have, and we may need broader authority to impact SEO in the modern era. The onus is on marketers to make teams, clients, and execs aware of these new requirements, so they understand what we need to do in order to grow search traffic.
  • Webspam has gone from Google’s problem to our problem. The onus is on marketers to stay wary and up-to-date with how Google is seeing their links and their site.

From Dr. Pete

  • As content marketers, we can’t afford to see only the forest or the trees. We have to understand a wide variety of metrics, and combine them in new and insightful ways.
  • We have to stop looking backward using lag goals like “Get 100,000 Likes in Q4.” They aren’t actionable, and succeed or fail, we have no way to repeat success. We have to focus on objectives that drive specific, measurable actions.

Missed the previous talk?

The first MozTalk featured Rand and his wife Geraldine, known in the blogosphere as The Everywhereist. Rand covered what bloggers need to know about SEO, and Geraldine talked about how to make your blog audience fall in love with you. Check them both out here:

Need-to-Know SEO and Making Your Blog Audience Fall in Love – Videos from MozTalk 1

Join us for the next one

Our next free MozTalk is set for Thursday, April 2nd, and we’re still finalizing plans. We’ll be sure to post the videos on this blog for those of you who can’t make it, but if you’re in town, keep your eyes open for more details. We hope to see you there!
