The Importance of Being Different: Creating a Competitive Advantage With Your USP

Posted by TrentonGreener

“The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

While this quote has been credited to everyone from Francis Phillip Wernig, writing under the pseudonym Alan Ashley-Pitt, to Einstein himself, the powerful message does not lose its substance no matter whom you choose to credit. There is a very important yet often overlooked consequence of not heeding this advice, one that applies to all aspects of life: from love and happiness to business and marketing, copying what your competitors are doing and failing to forge your own path can be a detrimental mistake.

While as marketers we are all acutely aware of the importance of differentiation, we’ve been trained for the majority of our lives to seek out the norm.

We spend the majority of our adolescent lives trying desperately not to be different. No one has ever been picked on for being too normal or not being different enough. We would beg our parents to buy us the same clothes little Jimmy or little Jamie wore. We’d want the same backpack and the same bike everyone else had. With the rise of the cell phone and later the smartphone, on hands and knees, we begged and pleaded for our parents to buy us the Razr, the StarTAC (bonus points if you didn’t have to Google that one), and later the iPhone. Did we truly want these things? Yes, but not just because they were cutting edge and nifty. We desired them because the people around us had them. We didn’t want to be the last to get these devices. We didn’t want to be different.

Thankfully, as we mature we begin to realize the fallacy that is trying to be normal. We start to become individuals and learn to appreciate that being different is often seen as beautiful. However, while we begin to celebrate being different on a personal level, it does not always translate into our business or professional lives.

We unconsciously and naturally seek out the normal, and if we want to be different—truly different in a way that creates an advantage—we have to work for it.

The truth of the matter is, anyone can be different. In fact, we all are very different. Even identical twins with the same DNA will often have starkly different personalities. As a business, the real challenge lies in being different in a way that is relevant, valuable to your audience, and creates an advantage.

“Strong products and services are highly differentiated from all other products and services. It’s that simple. It’s that difficult.” – Austin McGhie, Brand Is a Four Letter Word

Let’s explore the example of Revel Hotel & Casino. Revel is a 70-story luxury casino in Atlantic City that was built in 2012. There is simply not another casino of the same class in Atlantic City, but there might be a reason for this. Even if you’re not familiar with the city, a quick jump onto Atlantic City’s tourism website reveals that of the five hero banners that rotate, not one specifically mentions gambling, while three reference the boardwalk. This is further illustrated by their internal linking structure: the beaches, boardwalk, and shopping all appear before a single mention of casinos. There simply isn’t much of a market for high-end gamblers in the Atlantic City area; in the States, Las Vegas serves that role. So while Revel has a unique advantage, their ability to attract customers to their resort has not resulted in profitable earnings reports. In Q2 2012, Revel posted a gross operating loss of $35.177M, and in Q3 2012 that loss increased to $36.838M.

So you need to create a unique selling proposition (also known as unique selling point and commonly referred to as a USP), and your USP needs to be valuable to your audience and create a competitive advantage. Sounds easy enough, right? Now for the kicker. That advantage needs to be as sustainable as physically possible over the long term.

“How long will it take our competitors to duplicate our advantage?”

You really need to explore this question and the possible solutions your competitors could utilize to play catch-up or duplicate what you’ve done. Look no further than Google vs Bing to see this in action. No company out there is going to just give up because your USP is so much better; most will pivot or adapt in some way.

Let’s look at a Seattle-area coffee company with which you may or may not be familiar. Starbucks has tried quite a few times over the years to level up their tea game, with limited success, but the markets Starbucks has really struggled to break into are pastries, breads, desserts, and food.

Other stores had more success in these markets, and they thought that high-quality teas and bakery items were the USPs that differentiated them from the Big Bad Wolf that is Starbucks. And while they were right to think that their brick house would save them from the Big Bad Wolf for some time, this fable doesn’t end with the Big Bad Wolf in a boiling pot.

Never underestimate your competitor’s ability to be agile, specifically when overcoming a competitive disadvantage.

If your competitor can’t beat you by making a better product or service internally, they can always choose to buy someone who can.

After months of courting, on June 4th, 2012, Starbucks announced that they had come to an agreement to purchase La Boulange in order to “elevate core food offerings and build a premium, artisanal bakery brand.” If you’re a small-to-medium-sized coffee shop and/or bakery that even indirectly competed with Starbucks, a new challenger approaches. And while those tea shops momentarily felt safe within the brick walls that guarded their USP, on the final day of that same year the Big Bad Wolf huffed and puffed and blew a stack of cash all over Teavana, making Teavana a wholly owned subsidiary of Starbucks for the low, low price of $620M.

Sarcasm aside, this does a great job of illustrating the ability of companies—especially those with deep pockets—to be agile, and demonstrates that they often have an uncanny ability to overcome your company’s competitive advantage. In seven months, Starbucks went from a minor player in these markets to having all the tools they need to dominate tea and pastries. Have you tried their raspberry pound cake? It’s phenomenal.

Why does this matter to me?

Ok, we get it. We need to be different, and in a way that is relevant, valuable, defensible, and sustainable. But I’m not the CEO, or even the CMO. I cannot effect change on a company level; why does this matter to me?

I’m a firm believer that you can effect change no matter what the nameplate on your desk says. Sure, you may not be able to call an all-staff meeting today and completely change the direction of your company tomorrow, but you can effect change in the parts of the business you do touch. No matter your title or area of responsibility, you need to know your company’s, your client’s, or even a specific piece of content’s USP, and you need to ensure it is applied liberally to all areas of your work.

Look at this example SERP for “Mechanics”:

While yes, this search is very likely to be locally sensitive, that doesn’t mean you can’t stand out. Every single AdWords result, save one, has only the word “Mechanics” in the headline. (While the top-of-page ad is pulling description line 1 into the heading, the actual headline is still only “Mechanics.”) But even the one headline that is different doesn’t do a great job of illustrating the company’s USP. Mechanics at home? Whose home? Mine or theirs? I’m a huge fan of Steve Krug’s “Don’t Make Me Think,” and in this scenario there are too many questions I need answered before I’m willing to click through. “Mechanics; We Come To You” or even “Traveling Mechanics” illustrates the point much more clearly, and still fits within the 25-character headline limit.

If you’re an AdWords user, no matter how big or small your monthly spend may be, take a look at your top 10-15 keywords by volume and evaluate how well you’re differentiating yourself from the other brands in your industry. Test ad copy that draws attention to your USP and reap the rewards.

Now while this is simply an AdWords text ad example, the same concept can be applied universally across all of marketing.

Title tags & meta descriptions

As we alluded to above, not only do companies have USPs, but individual pieces of content can, and should, have their own. Use your title tag and meta description to illustrate what differentiates your piece of content from the competition, and do so in a way that attracts the searcher’s click. If you have already established a strong brand within a specific niche, great! Use it to your advantage. It’s much more likely, though, that you are competing against a strong brand. In those scenarios, ask yourself, “What makes our content different from theirs?” The answer you come up with is your content’s USP. Call attention to it in your title tag and meta description, and watch the CTR climb.
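
For instance, a hypothetical snippet (the copy here is invented purely to show the pattern of leading with the differentiator):

    <title>Traveling Mechanics | We Come to You, Anywhere in Seattle</title>
    <meta name="description" content="Skip the shop. Certified mechanics repair your car at your home or office, with same-day appointments and a 12-month warranty.">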

I encourage you to hop into your own site’s analytics and look at your top 10-15 organic landing pages and see how well you differentiate yourself. Even if you’re hesitant to negatively affect your inbound gold mines by changing the title tags, run a test and change up your meta description to draw attention to your USP. In an hour’s work, you just may make the change that pushes you a little further up those SERPs.

Branding

Let’s break outside the world of digital marketing and look at the world of branding. Tom’s Shoes competes against some heavy hitters in Nike, Adidas, Reebok, and Puma, just to name a few. Tom’s can’t hope to match the marketing budgets of these companies in a fair fight, so instead they take what makes them different, their USP, and disseminate it every chance they get. They have labeled themselves “The One for One” company. It’s in their homepage’s title tag, it’s in every piece of marketing they put out, and it smacks you in the face when you land on their site. They even use the call-to-action “Get Good Karma” throughout their site.

Now, as many of us may know, partially because of the scandal it created in late 2013, Tom’s is not actually a non-profit organization. However you feel about that, this marketing strategy has had a positive effect on their bottom line. Fast Company conservatively estimated their 2013 revenues at $250M, with many estimates closer to the $300M mark. Not a bad slice of the pie when competing against the powerhouses Tom’s does.

Wherever you stand on this issue, Tom’s Shoes has done a phenomenal job of differentiating their brand from the big hitters in their industry.

Know your USP and disseminate it every chance you get.

This is worth repeating. Know your USP and disseminate it every chance you get, whether that be in title tags, ad copy, on-page copy, branding, or any other segment of your marketing campaigns. Online or offline, be different. And remember the quote that we started with, “The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

The amount of marketing knowledge that can be taken from this one simple statement is astounding. Heed the words, stand out from the crowd, and you will have success.


It’s Your Turn: Now Accepting Community Speaker Pitches for MozCon 2015

Posted by EricaMcGillivray

Yep, it’s that time of year, friends. Time to submit your online marketing talk pitch for MozCon 2015. I’m super excited this year as we’ll have 6 community speaker slots! That’s right—you all are so amazing that we want to see more from you.

The basic details:

  • To submit, just fill out the form below.
  • Talks must be about online marketing and are only 15 minutes in length.
  • Submissions close on Sunday, April 12 at 5pm PDT.
  • Decisions are final and will be made in late April.
  • All presentations must adhere to the MozCon Code of Conduct.
  • You must attend MozCon in person, July 13-15 in Seattle.

If you are selected, you will get the following:

  • 15 minutes on the MozCon stage to share with our audience, plus 5 minutes of Q&A.
  • A free ticket to MozCon. (If you already purchased yours, we’ll either refund or transfer the ticket to someone else.)
  • Four nights of lodging covered by us at our partner hotel.
  • A reimbursement for your travel (flight, train, car, etc.), up to $500 domestic and $750 international.
  • A free ticket for you to give to anyone you would like and a code for $300 off another ticket.
  • An invitation for you and your significant other to join us for the speakers’ dinner.

We work with you!

Pitching for a community speaker slot can feel intimidating. A lot of times, our ideas feel old hat, like they’ve been done a million times before. (When I say “our” here, I mean “mine.”)

At MozCon, we work with every single speaker to ensure your presentation is the best it can be. Matt Roney and I dedicate ourselves to helping you. Seriously, you get our personal cell phone numbers. Don’t get me wrong—you do the heavy lifting and the incredible work. But we set up calls, review sessions, and even take you up on the stage pre-MozCon to ensure that you feel awesome about your talk.


We’re happy to help, including:

  • Calls to discuss and refine your topic.
  • Assistance honing topic title and description.
  • Reviews of outlines and drafts (as many as you want!).
  • Best practices and guidance for slide decks, specifically for our stage.
  • A comprehensive, step-by-step guide for show flow.
  • Serving as an audience for practicing your talk.
  • Reviewing your final deck.
  • Sunday night pre-MozCon tour of the stage to meet our A/V crew, see your presentation on the screens, and test the clicker.
  • A dedicated crew to make your A/V outstanding.
  • Anything else we can do to make you successful.

Most of the above are required as part of the speaker process, so even those of you who don’t always ask for help (again, talking about myself here) will be sure to get it. We want you to know that anyone, regardless of experience or level of knowledge, can submit and present a great talk at MozCon. One of our past community speakers, Zeph Snapp, wrote a great post about his experiences with our process and at the show.


For great proposals:

  • Make sure to check out the confirmed MozCon 2015 topics from our other speakers so you don’t overlap.
  • Read about what makes a great pitch.
  • For extra jazz, include links to videos of you doing public speaking and your slide deck work in the optional fields.
  • Follow the guidelines. Yes, the word counts are limited on purpose. Do not submit links to Google Docs, etc. for more information. Tricky submissions will be disqualified.

While I can’t give direct pitch coaching—it would be unfair to others—I’m happy to answer your questions in the comments.

Submissions are reviewed by a selection committee at Moz, so multiple people look at and give their opinions on each pitch. The first run-through looks at pitches without speaker information attached to them in order to give an unbiased look at topics. Around 50% of pitches are weeded out here. The second run-through includes speaker bio information in order to get a more holistic view of the speaker and what your talk might be like in front of 1,400 people.

Everyone who submits a community speaker pitch will be informed either way. If your submission doesn’t make it and you’re wondering why, we can talk further over email; there’s always next year.

Finally, a big thank you to our wonderful community speakers from past MozCons, including Stephanie Beadell, Mark Traphagen, Zeph Snapp, Justin Briggs, Darren Shaw, Dana Lookadoo, Fabio Ricotta, Jeff McRitchie, Sha Menz, Mike Arnesen, A. Litsa, and Kelsey Libert, who’ve all been so amazing.


Still need to confirm you’ll join us?

Buy your ticket!


Teach Google About Your Entities by Using Topical Hubs – Whiteboard Friday

Posted by gfiorelli1

I’m not so sure it’s correct to say, as is so common lately, that today’s SEO is something new, especially with regard to on-site SEO. Many of the things that are necessary today were also necessary in the past: a well-designed information architecture, a great navigation structure, good internal linking, etc. We should talk instead of a new emphasis we must give to some factors as old as SEO itself. Today I’ll talk about one of these factors, topical hubs, which, although important in the past, is even more so today with Hummingbird and the increasing weight Google gives to semantics and the thematic consistency of sites.

[Disclaimer about my accent in the video: I swear, my English is not so bad, even if it really sounds Italian; just the idea that I was in Seattle shooting a WBF stressed every cell in my body].

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Hola, Moz fans. I’m Gianluca Fiorelli. Finally, you are going to see my face and not just my picture or avatar.

I’m here not to talk about how to snap faces, but about topical hubs. 

What are topical hubs? We are going to discover it. 


Why are we talking about topical hubs? We are going to talk about them because of Hummingbird. We know that Hummingbird is not a really well-understood algorithm, but it has really changed how Google works.

One thing we know is that it is simplifying the results [SERPs]. 

One thing that is not working anymore, and that was a real goldmine for SEO, is targeting long, long tails. You can maybe remember many sites targeting millions of pages at every kind of long query possible. This is not so anymore, because Hummingbird has simplified everything: if query A, query B, and query C mean the same as query D, Google will always show the SERPs for query D.

In order to optimize your site for this new semantic understanding that Google has of queries, especially conversational queries, we must understand that we have to think in entities and not in keywords. We have to think about the connections between the entities, and we have to be really sure about the context of the content we are creating.

All three of these things will then guide our keyword research.

How can we do this?

We should start our job not from keywords but from entities. 

These are a few tools we can use: the Freebase APIs, which means using a Google source directly (as Freebase is Google’s), or the AlchemyAPI, which can make our job easier.

There are also other tools, like ConceptNet, Yahoo Glimmer, and Bottlenose. I suggest Bottlenose if you are going to create or craft a site about something really mainstream whose concepts stem especially from social, because it also takes care of entity recognition in social media.

There is RelFinder, a really nice free tool that relies on DBpedia, the structured database extracted from Wikipedia.
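
As a sketch of the first option: at the time, a Freebase search was a single HTTP call returning JSON with the matching entities and their relevance scores (the query and limit parameters here are just illustrative):

    https://www.googleapis.com/freebase/v1/search?query=pizza&limit=10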

From there, using these tools, we can understand a lot. For instance, let’s say we are talking about pizza, because we are a pizzeria (I’m Italian).

Using these tools, we can understand which concepts are related to pizza: what kind of pizza (thin, crunchy, regular, with tomatoes, without tomatoes, Neapolitan or Romana, so many kinds), but also the history of pizza, because Pizza Margherita was named after an Italian queen.

We can also discover that pizza is related to geography: pizza is Italian, but the World Championship of Acrobatic Pizza (which is a sport) is Spanish.

We can understand many, many entities, many, many facts around the concept of pizza that can populate our site about pizzas.

Let’s say that we are a pizzeria. We have a local site, and we are maybe in Tribeca. We shouldn’t just focus ourselves on the entity search of “pizzas,” but we should start also thinking about entity searches for entities related to Tribeca, so New York Movie Festival, Robert De Niro, etc.

Once we have all of these entities, we should start thinking about the ontology we can extract from them: how to group the entities and create the categories for our site.

The categories of a site are, substantially, our topical hubs.

Moving to another kind of website, let’s think of a classic real estate classifieds site.

Every classifieds site usually has a homepage, then category and product pages. People always ask, “How can we make our category pages rank?”

Consider them to be topical hubs. 

A good model for a topical hub is a microsite. We just have to think of our site as if it were a composition of microsites, all contextually connected.

So the category page in this case should be considered as a new site all about, for instance, Tribeca or all about Harlem, or Capitol Hill in Seattle, or any other neighborhood if we are talking about real estate.

From there, once we have decided on our categories, we can start doing the keyword research, but using a trick, and we must credit Dan Shure for the tip: find keywords related to the entities.

What Dan Shure suggests is this: go to Keyword Planner and, instead of entering a few keywords to retrieve new ones, enter the Wikipedia page of the entity related to the content you are going to optimize. Google will start suggesting keyword groups, and those keyword groups are all related to a specific subset of the entity we are talking about.

So we can start optimizing our page, our content hub, with the keywords Google itself extracts from the best sources for entities (Freebase or Wikipedia). In doing so, we create a page that is really well optimized on the keyword side, but also on the entity side, because all of the keywords we are using are keywords that Google relates to specific entities.

But that’s not all, because when we talk about topical hubs we have to talk, again, about context, and context is not just writing the classic old SEO text. It’s also giving value to the category page.

So if we have done a good audience analysis, maybe we understand that Capitol Hill has a certain demographic, and we can organize the content on the hub page around that demographic. We will have our text about the neighborhood and our initial listings, but maybe we can also show, for instance, whether the neighborhood is well liked; or, if the demographic is young families with two kids, we can add value the way Zillow does: schools in or close to the neighborhood, parks close to the neighborhood, where to go to eat in the neighborhood, or landmarks in the neighborhood.

All of this content, which is adding value for the user, is also adding contextual value and semantic value for Google.

One tip: when you are optimizing a page, especially a category page, let’s say the Capitol Hill, Seattle category page of your real estate site, mark up the words “Capitol Hill” with the Schema.org property sameAs, and point that sameAs at the Wikipedia page for Capitol Hill (or, if that doesn’t exist, at another authoritative web page about Capitol Hill). You are telling Google that your page is about exactly that entity.
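
A minimal sketch of that markup in microdata (the Place type and the URLs are illustrative; the tip itself only specifies the sameAs property):

    <div itemscope itemtype="http://schema.org/Place">
      <h1 itemprop="name">Capitol Hill, Seattle</h1>
      <!-- ties this category page to an entity Google already knows -->
      <link itemprop="sameAs" href="https://en.wikipedia.org/wiki/Capitol_Hill,_Seattle">
    </div>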

So when we have all of these things, we can start thinking about the content we can create, which is contextually relevant both to our entity search (we did a keyword search related to the entities) and also to the audience analysis we did.

So, returning to my pizzeria: we know we can start creating recipes and tag them with Recipe microdata. We can create videos and mark them up with the VideoObject schema. We can do short-form content, and especially we can try long-form content, tag it with the Article schema, and try to be included in the in-depth articles box. We can start writing guides. We can start thinking about UGC and Q&A.

We can especially try to create things about the location where we are based, which in my pizzeria’s case is Tribeca: creating a news board to discuss what’s happening in Tribeca and what the people of Tribeca are doing. And if we are lucky, we can also try newsjacking, which we know is really powerful.

For instance, do you remember Oscar night, when the guy with the pizza came on stage? Well, maybe we could do something similar in Tribeca, because there’s a film festival there. Maybe during the red carpet show our person goes up to all of the celebrities and starts giving them pizza, or at least a Coke.

By doing these things we are creating something really thought out in a semantic way, because we are targeting our site to all of the entities related to our micro-topic. We have it optimized on a keyword level and on a semantic search level, and we have created it by crossing our entity research with our audience research.

We’re creating content which responds both to our audience and to Google.

And doing so, we are not going to need to create millions of pages targeting long, long tails. 

We just need really strong topical hubs from which content stems, content able to respond properly to all the queries we were targeting before.

I hope you enjoyed this Whiteboard Friday.

And, again, I beg your pardon for my accent (luckily you have the transcript).

Video transcription by Speechpad.com


Experiment: We Removed a Major Website from Google Search, for Science!

Posted by Cyrus-Shepard

The folks at Groupon surprised us earlier this summer when they reported the results of an experiment that showed that up to 60% of direct traffic is organic.

In order to accomplish this, Groupon de-indexed their site, effectively removing themselves from Google search results. That’s crazy talk!

Of course, we knew we had to try this ourselves.

We rolled up our sleeves and chose to de-index Followerwonk, both for its consistent Google traffic and its good analytics setup—that way we could properly measure everything. We were also confident we could quickly bring the site back into Google’s results, which minimized the business risks.

(We discussed de-indexing our main site moz.com, but… no soup for you!)

We wanted to measure and test several things:

  1. How quickly will Google remove a site from its index?
  2. How much of our organic traffic is actually attributed as direct traffic?
  3. How quickly can you bring a site back into search results using the URL removal tool?

Here’s what happened.

How to completely remove a site from Google

The fastest, simplest, and most direct method to completely remove an entire site from Google search results is by using the URL removal tool.

We also understood, via statements from Google engineers, that using this method gave us the biggest chance of bringing the site back, with little risk. Other methods of de-indexing, such as using meta robots NOINDEX, might have taken weeks to take effect and caused recovery to take months.
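
For reference, that slower meta robots route is a single tag in the <head> of every page; a minimal sketch of the alternative we did not use:

    <!-- Asks search engines to drop this page from their index. Google only
         acts on it after recrawling the page, which is why recovery is slow. -->
    <meta name="robots" content="noindex">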

CAUTION: Removing any URLs from a search index is potentially very dangerous, and should be taken very seriously. Do not try this at home; you will not pass go, and will not collect $200!


After submitting the request, Followerwonk URLs started disappearing from Google search results within 2-3 hours.

The information needs to propagate across data centers around the globe, so the effect can be delayed in some areas. In fact, for the entire duration of the test, organic Google traffic continued to trickle in and never dropped to zero.

The effect on direct vs. organic traffic

In the Groupon experiment, they found that when they lost organic traffic, they actually lost a bunch of direct traffic as well. The Groupon conclusion was that a large amount of their direct traffic was actually organic—up to 60% on “long URLs.”

At first glance, the overall amount of direct traffic to Followerwonk didn’t change significantly, even when organic traffic dropped.

In fact, we could find no discrepancy in direct traffic outside the expected range.

I ran this by our contacts at Groupon, who said this wasn’t totally unexpected. You see, in their experiment they saw the biggest drop in direct traffic on long URLs, defined as URLs long enough to sit in a subfolder, like https://followerwonk.com/bio/?q=content+marketer.

For Followerwonk, the vast majority of traffic goes to the homepage and a handful of other URLs. This means we didn’t have a statistically significant sample size of long URLs to judge the effect. For the long URLs we were able to measure, the results were nebulous. 

Conclusion: While we can’t confirm the Groupon results with our outcome, we can’t discount them either.

It’s quite likely that a portion of your organic traffic is attributed as direct. This is because different browsers, operating systems, and user privacy settings can block referral information from reaching your website.

Bringing your site back from death

After waiting 2 hours, we deleted the request. Within a few hours, all traffic returned to normal. Whew!

Does Google need to recrawl the pages?

If the time period is short enough, and you used the URL removal tool, apparently not.

In the case of Followerwonk, Google removed over 300,000 URLs from its search results, and made them all reappear in mere hours. This suggests that the domain wasn’t completely removed from Google’s index, but only “masked” from appearing for a short period of time.

What about longer periods of de-indexation?

In both the Groupon and Followerwonk experiments, the sites were only de-indexed for a short period of time, and bounced back quickly.

We wanted to find out what would happen if you de-indexed a site for a longer period, like two and a half days.

I couldn’t convince the team to remove any of our sites from Google search results for a few days, so I chose a smaller personal site that I often subject to merciless SEO experiments.

In this case, I de-indexed the site and didn’t remove the request until three days later. Even with this longer period, all URLs returned within just a few hours of cancelling the URL removal request.

In the chart below, we revoked the URL removal request on Friday the 25th. The next two days were Saturday and Sunday, both lower traffic days.

Test #2: De-index a personal site for 3 days

Likely, the URLs were still in Google’s index, so we didn’t have to wait for them to be recrawled. 

Here’s another shot of organic traffic before and after the second experiment.

For longer removal periods, a few weeks for example, I speculate that Google might drop the URLs semi-permanently from the index, and that re-inclusion would take much longer.

What we learned

  1. While a portion of your organic traffic may be attributed as direct (due to browsers, privacy settings, etc.), in our case the effect on direct traffic was negligible.
  2. If you accidentally de-index your site using Google Webmaster Tools, in most cases you can quickly bring it back to life by deleting the request.
  3. Re-inclusion happens quickly, even after we removed a site for over two days. Longer than that, the outcome is unknown, and you could have problems getting all the pages of your site indexed again.

Further reading

Moz community member Adina Toma wrote an excellent YouMoz post on the re-inclusion process using the same technique, with some excellent tips for other, more extreme situations.

Big thanks to Peter Bray for volunteering Followerwonk for testing. You are a brave man!


Calculating Estimated ROI for a Specific Site & Body of Keywords

Posted by shannonskinner

One of the biggest challenges for SEO is proving its worth. We all know it’s valuable, but it’s important to convey its value in terms that key stakeholders (up to and including CEOs) understand. To do that, I put together a process to calculate an estimate of ROI for implementing changes to keyword targeting.

In this post, I will walk through that process, so hopefully you can do the same for your clients (or as an in-house SEO to get buy-in), too!

Overview

  1. Gather your data
    1. Keyword Data
    2. Strength of your Preferred URLs
    3. Competition URLs by Keyword
    4. Strength of Competition URLs
  2. Analyze the Data by Keyword
  3. Calculate your potential opportunity

What you need

There are quite a few parts to this recipe, and while the calculation part is pretty easy, gathering the data to throw in the mix is the challenging part. I’ll list each section here, including the components of each, and then we can go through how to retrieve each of them. 

  • Keyword data

    • list of keywords
    • search volumes for each keyword
    • preferred URLs on the site for which you’re estimating ROI
    • current rank
    • current ranking URL
  • Strength of your preferred URLs

    • De-duplicated list of preferred URLs
    • Page Authorities for each preferred URL
    • BONUS: External & Internal Links for each URL. You can include any measure you like here, as long as it’s something that can be compared (i.e. a number).
  • Where the competition sits

    • For each keyword, the sites that are ranking 1-10 in search currently
  • Strength of the competition

    • De-duplicated list of competing URLs
    • Page Authorities and Domain Authorities for each competing URL
    • BONUS: External & Internal Links for each competing URL. Include any measure you’ve included on the Strength of Your Preferred URLs list.


How to get what you need


There has been quite a lot written about keyword research, so I won’t go into too much detail here. For the Keyword data list, the important thing is to get whatever keywords you’d like to assess into a spreadsheet, and include all the information listed above. You’ll have to select the preferred URLs based on what you think the strongest-competing and most appropriate URL would be for each keyword. 


For the Preferred URLs list, you’ll want to use the data that’s in your keyword data under the preferred URL.

  1. Copy the preferred URL data from your Keyword Data into a new tab. 
  2. Use the Remove Duplicates tool (Data > Data Tools in Excel) to remove any duplicated URLs.

Once you have the list of de-duplicated preferred URLs, you’ll need to pull the data from Open Site Explorer for these URLs. I prefer using the Moz API with SEOTools. You’ll have to install it to use it for Excel, or if you’d like to take a stab at using it in Google Docs, there are some resources available for that. Unfortunately, with the most recent update to Google Spreadsheets, I’ve had some difficulty with this method, so I’ve gone with Excel for now. 

Once you’ve got SEOTools installed, you can make the call “=MOZ_URLMetrics_toFit([enter your cells])”. This should give you a list of URL titles, canonical URLs, External & Internal links, as well as a few other metrics and DA/PA. 
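
For example, assuming your de-duplicated URLs sit in A2:A100 (the range is illustrative):

    =MOZ_URLMetrics_toFit(A2:A100)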


For the Where the competition sits list, you’ll first need to perform a search for each of your keywords. Obviously, you could do this manually; or, if you have exportable data from a keyword ranking tool and you’ve been tracking the keywords you’d like to look at, you could use either of those methods. If you don’t have those, you can use the hacky method that I did: basically, use the ImportXML command in Google Spreadsheets to grab the top ranking URLs for each query.

I’ve put a sample version of this together, which you can access here. A few caveats: you should be able to run many searches in a row (I had about 850, and they ran fine), but Google will block your IP address if you run too many. I also found that I needed to copy my results out as values into a different spreadsheet once I’d gotten them, because they timed out relatively quickly. You can put them straight into the Excel spreadsheet you’re building for the ROI calculations (you’ll need them there anyway!).
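
The heart of that hacky method is one formula per query. Here’s a sketch (the XPath is illustrative, and it breaks whenever Google changes its SERP markup, which is often):

    =IMPORTXML("https://www.google.com/search?q=" & ENCODEURL(A2) & "&num=10", "//h3/a/@href")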


From this list, you can pull each URL into a single column and de-duplicate as explained for the preferred URLs list, generating the Strength of the Competition list. Then run the same SEOTools for Excel analysis you ran on the preferred URLs to generate the same data for these URLs.


Making your data work for you

Once you’ve got these lists, you can use some VLOOKUP magic to pull in the information you need. I used the Where the competition sits list as the foundation of my work.

From there, I pulled in the corresponding preferred URL and its Page Authority, as well as the PAs and DAs for each URL currently ranking 1-10. I was then able to calculate an average PA and DA for each query and compare the page I want to rank against those averages. From that, given that I’d already determined these were relevant pages, I estimated the chances that the page I wanted to rank could rank with better keyword targeting.
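
A sketch of those lookups, with a hypothetical layout: the Strength of the Competition tab holds URL / PA / DA in columns A:C, and B2 holds one of the ranking URLs:

    PA of a ranking URL:   =VLOOKUP(B2, 'Competition Strength'!A:C, 2, FALSE)
    DA of a ranking URL:   =VLOOKUP(B2, 'Competition Strength'!A:C, 3, FALSE)
    Average PA for query:  =AVERAGE(C2:L2)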

Here’s where things get interesting. You can be rather conservative and only sum the search volumes of keywords you’re fairly confident your site can rank for, which is my preferred method. That’s because I use this method primarily to determine whether I’m on the right track, and whether these recommendations are really worth the time it takes to implement them. So I’m going to move forward counting only the search volumes of terms I think I’m quite competitive for, AND that I’m not yet ranking for on page 1.


Now, you want to move to your analytics data in order to calculate a few things: 

  • Conversion Rate
  • Average order value
  • Previous year’s revenue (for the section you’re looking at)

I’ve set up my sample data in this spreadsheet that you can refer to or use to make your own calculations. 

Each of the assumptions can be adjusted depending on the actual site data, or using estimates. I’m using very very generic overall CTR estimates, but you can select which you’d like and get as granular as you want! The main point for me is really getting to two numbers that I can stand by as pretty good estimates: 

  • Annual Impact (Revenue $$)
  • Increase in Revenue ($$) from last year

This is because, for higher-up folks, money talks. Obviously, this won’t be something you can promise, but it gives them a metric they understand, so they can really wrap their heads around the value you’re potentially bringing to the table if the changes you’re recommending are made.
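
As a sketch of the underlying arithmetic, with an invented column layout (B = monthly search volume, C = estimated CTR at the target position, D = conversion rate, E = average order value):

    Estimated annual revenue for one keyword:
      =B2*C2*D2*E2*12

    Annual impact across the whole body of keywords:
      =SUMPRODUCT(B2:B100, C2:C100, D2:D100, E2:E100)*12

For example, 10,000 monthly searches at a 20% CTR and a 2% conversion rate with a $50 average order value works out to 2,000 visits, 40 orders, and about $2,000 a month, or roughly $24,000 a year. Every one of those numbers is an assumption to replace with your own data.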

There are some great tools for estimating this kind of stuff on a smaller scale, but for a massive body of keyword data, hopefully you will find this process useful as well. Let me know what you think, and I’d love to see what parts anyone else can streamline or make even more efficient. 
