The Importance of Being Different: Creating a Competitive Advantage With Your USP

Posted by TrentonGreener

“The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

While this quote has been credited to everyone from Francis Phillip Wernig, under the pseudonym Alan Ashley-Pitt, to Einstein himself, the powerful message does not lose its substance no matter whom you choose to credit. There is a very important yet often overlooked consequence of not heeding this warning, one which applies to all aspects of life. From love and happiness to business and marketing, copying what your competitors are doing and failing to forge your own path can be a detrimental mistake.

While as marketers we are all acutely aware of the importance of differentiation, we’ve been trained for the majority of our lives to seek out the norm.

We spend the majority of our adolescent lives trying desperately not to be different. No one has ever been picked on for being too normal or not being different enough. We would beg our parents to buy us the same clothes little Jimmy or little Jamie wore. We’d want the same backpack and the same bike everyone else had. With the rise of the cell phone and later the smartphone, on hands and knees, we begged and pleaded for our parents to buy us the Razr, the StarTAC (bonus points if you didn’t have to Google that one), and later the iPhone. Did we truly want these things? Yes, but not just because they were cutting edge and nifty. We desired them because the people around us had them. We didn’t want to be the last to get these devices. We didn’t want to be different.

Thankfully, as we mature we begin to realize the fallacy that is trying to be normal. We start to become individuals and learn to appreciate that being different is often seen as beautiful. However, while we begin to celebrate being different on a personal level, it does not always translate into our business or professional lives.

We unconsciously and naturally seek out the normal, and if we want to be different—truly different in a way that creates an advantage—we have to work for it.

The truth of the matter is, anyone can be different. In fact, we all are very different. Even identical twins with the same DNA will often have starkly different personalities. As a business, the real challenge lies in being different in a way that is relevant, valuable to your audience, and creates an advantage.

“Strong products and services are highly differentiated from all other products and services. It’s that simple. It’s that difficult.” – Austin McGhie, Brand Is a Four Letter Word

Let’s explore the example of Revel Hotel & Casino. Revel is a 70-story luxury casino in Atlantic City that was built in 2012. There is simply not another casino of the same class in Atlantic City, but there might be a reason for this. Even if you’re not familiar with the city, a quick jump onto Atlantic City’s tourism website reveals that of the five hero banners that rotate, not one specifically mentions gambling, but three reference the boardwalk. This is further illustrated when exploring their internal linking structure. The beaches, boardwalk, and shopping all appear before a single mention of casinos. There simply isn’t as much of a market for high-end gamblers in the Atlantic City area; in the States, Las Vegas serves that role. So while Revel has a unique advantage, their ability to attract customers to their resort has not resulted in profitable earnings reports. In Q2 2012, Revel had a gross operating loss of $35.177M, and in Q3 2012 that increased to $36.838M.

So you need to create a unique selling proposition (also known as unique selling point and commonly referred to as a USP), and your USP needs to be valuable to your audience and create a competitive advantage. Sounds easy enough, right? Now for the kicker. That advantage needs to be as sustainable as physically possible over the long term.

“How long will it take our competitors to duplicate our advantage?”

You really need to explore this question and the possible solutions your competitors could utilize to play catch-up or duplicate what you’ve done. Look no further than Google vs Bing to see this in action. No company out there is going to just give up because your USP is so much better; most will pivot or adapt in some way.

Let’s look at a Seattle-area coffee company with which you may or may not be familiar. Starbucks has tried quite a few times over the years to level up their tea game with limited success, but the markets that Starbucks has really struggled to break into are the pastry, bread, dessert, and food markets.

Other stores had more success in these markets, and many believed that high-quality teas and bakery items were the USPs that differentiated them from the Big Bad Wolf that is Starbucks. And while they were right to think that their brick house would save them from the Big Bad Wolf for some time, this fable doesn’t end with the Big Bad Wolf in a boiling pot.

Never underestimate your competitor’s ability to be agile, specifically when overcoming a competitive disadvantage.

If your competitor can’t beat you by making a better product or service internally, they can always choose to buy someone who can.

After months of courting, on June 4th, 2012 Starbucks announced that they had come to an agreement to purchase La Boulange in order to “elevate core food offerings and build a premium, artisanal bakery brand.” If you’re a small-to-medium sized coffee shop and/or bakery that even indirectly competed with Starbucks, a new challenger approaches. And while those tea shops momentarily felt safe within the brick walls that guarded their USP, on the final day of that same year, the Big Bad Wolf huffed and puffed and blew a stack of cash all over Teavana, making Teavana a wholly-owned subsidiary of Starbucks for the low, low price of $620M.

Sarcasm aside, this does a great job of illustrating the ability of companies—especially those with deep pockets—to be agile, and demonstrates that they often have an uncanny ability to overcome your company’s competitive advantage. In seven months, Starbucks went from a minor player in these markets to having all the tools they need to dominate tea and pastries. Have you tried their raspberry pound cake? It’s phenomenal.

Why does this matter to me?

Ok, we get it. We need to be different, and in a way that is relevant, valuable, defensible, and sustainable. But I’m not the CEO, or even the CMO. I cannot effect change on a company level; why does this matter to me?

I’m a firm believer that you can effect change no matter what the nameplate on your desk may say. Sure, you may not be able to call an all-staff meeting today and completely change the direction of your company tomorrow, but you can effect change on the parts of the business you do touch. No matter your title or area of responsibility, you need to know your company’s, client’s, or even a specific piece of content’s USP, and you need to ensure it is applied liberally to all areas of your work.

Look at this example SERP for “Mechanics”:

While yes, this search is very likely to be local-sensitive, that doesn’t mean you can’t stand out. Every single AdWords result, save one, has only the word “Mechanics” in the headline. (While the top of page ad is pulling description line 1 into the heading, the actual headline is still only “Mechanics.”) But even the one headline that is different doesn’t do a great job of illustrating the company’s USP. Mechanics at home? Whose home? Mine or theirs? I’m a huge fan of Steve Krug’s “Don’t Make Me Think,” and in this scenario there are too many questions I need answered before I’m willing to click through. “Mechanics; We Come To You” or even “Traveling Mechanics” illustrates this point much more clearly, and still fits within the 25-character limit for the headline.
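As a quick sanity check, headline lengths are easy to validate in bulk before uploading ad copy. A minimal sketch (the 25-character limit is the classic AdWords headline limit discussed above; this is not an AdWords API call):

```python
def fits_adwords_headline(headline, limit=25):
    """Return True if the headline fits the classic 25-character
    AdWords headline limit (spaces and punctuation count)."""
    return len(headline) <= limit

# The rewrites suggested above both fit:
for h in ["Mechanics; We Come To You", "Traveling Mechanics"]:
    print(h, len(h), fits_adwords_headline(h))
```

Running your whole ad group through a check like this takes seconds and catches the headlines that would be rejected or truncated.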

If you’re an AdWords user, no matter how big or small your monthly spend may be, take a look at your top 10-15 keywords by volume and evaluate how well you’re differentiating yourself from the other brands in your industry. Test ad copy that draws attention to your USP and reap the rewards.

Now while this is simply an AdWords text ad example, the same concept can be applied universally across all of marketing.

Title tags & meta descriptions

As we alluded to above, not only do companies have USPs, but individual pieces of content can, and should, have their own USP. Use your title tag and meta description to illustrate what differentiates your piece of content from the competition, and do so in a way that attracts the searcher’s click. If you have already established a strong brand within a specific niche, great! Use it to your advantage. It’s much more likely, though, that you are competing against a strong brand, and in these scenarios ask yourself, “What makes our content different from theirs?” The answer you come up with is your content’s USP. Call attention to that in your title tag and meta description, and watch the CTR climb.

I encourage you to hop into your own site’s analytics and look at your top 10-15 organic landing pages and see how well you differentiate yourself. Even if you’re hesitant to negatively affect your inbound gold mines by changing the title tags, run a test and change up your meta description to draw attention to your USP. In an hour’s work, you just may make the change that pushes you a little further up those SERPs.

Branding

Let’s break outside the world of digital marketing and look at the world of branding. Tom’s Shoes competes against some heavy hitters in Nike, Adidas, Reebok, and Puma just to name a few. While Tom’s can’t hope to compete against the marketing budgets of these companies in a fair fight, they instead chose to take what makes them different, their USP, and disseminate it every chance they get. They have labeled themselves “The One for One” company. It’s in their homepage’s title tag, in every piece of marketing they put out, and it smacks you in the face when you land on their site. They even use the call-to-action “Get Good Karma” throughout their site.

Now as many of us may know, partially because of the scandal it created in late 2013, Tom’s is not actually a non-profit organization. No matter how you feel about the matter, this marketing strategy has created a positive effect on their bottom line. Fast Company conservatively estimated their revenues in 2013 at $250M, with many estimates being closer to the $300M mark. Not too bad of a slice of the pie when competing against the powerhouses Tom’s does.

Wherever you stand on this issue, Tom’s Shoes has done a phenomenal job of differentiating their brand from the big hitters in their industry.

Know your USP and disseminate it every chance you get.

This is worth repeating. Know your USP and disseminate it every chance you get, whether that be in title tags, ad copy, on-page copy, branding, or any other segment of your marketing campaigns. Online or offline, be different. And remember the quote that we started with, “The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

The amount of marketing knowledge that can be taken from this one simple statement is astounding. Heed the words, stand out from the crowd, and you will have success.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


I Can’t Drive 155: Meta Descriptions in 2015

Posted by Dr-Pete

For years now, we (and many others) have been recommending keeping your Meta Descriptions shorter than about 155-160 characters. For months, people have been sending me examples of search snippets that clearly broke that rule, like this one (on a search for “hummingbird food”):

For the record, this one clocks in at 317 characters (counting spaces). So, I set out to discover if these long descriptions were exceptions to the rule, or if we need to change the rules. I collected the search snippets across the MozCast 10K, which resulted in 92,669 snippets. All of the data in this post was collected on April 13, 2015.

The Basic Data

The minimum snippet length was zero characters. There were 69 zero-length snippets, but most of these were the new generation of answer boxes, which appear organic but don’t have a snippet. To put it another way, these were misidentified as organic by my code. The other 0-length snippets were local one-boxes that appeared as organic but had no snippet, such as this one for “chichen itza”:

These zero-length snippets were removed from further analysis, but considering that they only accounted for 0.07% of the total data, they didn’t really impact the conclusions either way. The shortest legitimate, non-zero snippet was 7 characters long, on a search for “geek and sundry”, and appears to have come directly from the site’s meta description:

The maximum snippet length that day (this is a highly dynamic situation) was 372 characters. The winner appeared on a search for “benefits of apple cider vinegar”:

The average length of all of the snippets in our data set (not counting zero-length snippets) was 143.5 characters, and the median length was 152 characters. Of course, this can be misleading, since some snippets are shorter than the limit and others are being artificially truncated by Google. So, let’s dig a bit deeper.

The Bigger Picture

To get a better idea of the big picture, let’s take a look at the display length of all 92,600 snippets (with non-zero length), split into 20-character buckets (0-20, 21-40, etc.):

Most of the snippets (62.1%) cut off as expected, right in the 141-160 character bucket. Of course, some snippets were shorter than that, and didn’t need to be cut off, and some broke the rules. About 1% (1,010) of the snippets in our data set measured 200 or more characters. That’s not a huge number, but it’s enough to take seriously.
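For anyone who wants to reproduce this kind of analysis on their own snippet data, the bucketing itself is straightforward. A sketch with made-up lengths (only the bucket logic reflects the method above; the sample numbers are illustrative):

```python
from collections import Counter

def bucket_lengths(lengths, width=20):
    """Group snippet lengths into fixed-width buckets (1-20, 21-40, ...).
    Returns a Counter keyed by each bucket's upper bound."""
    buckets = Counter()
    for n in lengths:
        if n <= 0:
            continue  # zero-length snippets were excluded from the analysis
        upper = ((n - 1) // width + 1) * width
        buckets[upper] += 1
    return buckets

# Hypothetical sample of display lengths:
sample = [7, 143, 152, 158, 160, 161, 317, 372]
print(bucket_lengths(sample))
```

The same function with `width=5` or `width=10` produces the narrower bins used in the zoomed-in charts.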

That 141-160 character bucket is dwarfing everything else, so let’s zoom in a bit on the cut-off range, and just look at snippets in the 120-200 character range (in this case, by 5-character bins):

Zooming in, the bulk of the snippets are displaying at lengths between about 146-165 characters. There are plenty of exceptions to the 155-160 character guideline, but for the most part, they do seem to be exceptions.

Finally, let’s zoom in on the rule-breakers. This is the distribution of snippets displaying 191+ characters, bucketed in 10-character bins (191-200, 201-210, etc.):

Please note that the Y-axis scale is much smaller than in the previous 2 graphs, but there is a pretty solid spread, with a decent chunk of snippets displaying more than 300 characters.

Without looking at every original meta description tag, it’s very difficult to tell exactly how many snippets have been truncated by Google, but we do have a proxy. Snippets that have been truncated end in an ellipsis (…), which rarely appears at the end of a natural description. In this data set, more than half of all snippets (52.8%) ended in an ellipsis, so we’re still seeing a lot of meta descriptions being cut off.
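The ellipsis proxy is also easy to compute over a snippet set. A minimal sketch (the sample snippets are invented for illustration):

```python
def truncated_share(snippets):
    """Estimate the share of snippets that were truncated, using a
    trailing ellipsis as a proxy for a Google cut-off."""
    if not snippets:
        return 0.0
    cut = sum(1 for s in snippets if s.rstrip().endswith(("…", "...")))
    return cut / len(snippets)

# Hypothetical sample: two of the four end in an ellipsis
print(truncated_share([
    "Learn how to make hummingbird food at home …",
    "A complete guide to apple cider vinegar.",
    "Chickenpox is a highly contagious ...",
    "Geek & Sundry",
]))
```

It is only a proxy: a rare natural description could end in an ellipsis, but at this scale the error is negligible.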

I should add that, unlike titles/headlines, it isn’t clear whether Google is cutting off snippets by pixel width or character count, since that cut-off is done on the server-side. In most cases, Google will cut before the end of the second line, but sometimes they cut well before this, which could suggest a character-based limit. They also cut off at whole words, which can make the numbers a bit tougher to interpret.

The Cutting Room Floor

There’s another difficulty with telling exactly how many meta descriptions Google has modified – some edits are minor, and some are major. One minor edit is when Google adds some additional information to a snippet, such as a date at the beginning. Here’s an example (from a search for “chicken pox”):

With the date (and minus the ellipsis), this snippet is 164 characters long, which suggests Google isn’t counting the added text against the length limit. What’s interesting is that the rest comes directly from the meta description on the site, except that the site’s description starts with “Chickenpox.” and Google has removed that keyword. As a human, I’d say this matches the meta description, but a bot has a very hard time telling a minor edit from a complete rewrite.

Another minor rewrite occurs in snippets that start with search result counts:

Here, we’re at 172 characters (with spaces and minus the ellipsis), and Google has even let this snippet roll over to a third line. So, again, it seems like the added information at the beginning isn’t counting against the length limit.

All told, 11.6% of the snippets in our data set had some kind of Google-generated data, so this type of minor rewrite is pretty common. Even if Google honors most of your meta description, you may see small edits.

Let’s look at our big winner, the 372-character description. Here’s what we saw in the snippet:

Jan 26, 2015 – Health• Diabetes Prevention: Multiple studies have shown a correlation between apple cider vinegar and lower blood sugar levels. … • Weight Loss: Consuming apple cider vinegar can help you feel more full, which can help you eat less. … • Lower Cholesterol: … • Detox: … • Digestive Aid: … • Itchy or Sunburned Skin: … • Energy Boost:1 more items

So, what about the meta description? Here’s what we actually see in the tag:

Were you aware of all the uses of apple cider vinegar? From cleansing to healing, to preventing diabetes, ACV is a pantry staple you need in your home.

That’s a bit more than just a couple of edits. So, what’s happening here? Well, there’s a clue on that same page, where we see yet another rule-breaking snippet:

You might be wondering why this snippet is any more interesting than the other one. If you could see the top of the SERP, you’d know why, because it looks something like this:

Google is automatically extracting list-style data from these pages to fuel the expansion of the Knowledge Graph. In one case, that data is replacing a snippet and going directly into an answer box, but they’re performing the same translation even for some other snippets on the page.

So, does every 2nd-generation answer box yield long snippets? After 3 hours of inadvisable MySQL queries, I can tell you that the answer is a resounding “probably not”. You can have 2nd-gen answer boxes without long snippets and you can have long snippets without 2nd-gen answer boxes, but there does appear to be a connection between long snippets and the Knowledge Graph in some cases.

One interesting connection is that Google has begun bolding keywords that seem like answers to the query (and not just synonyms for the query). Below is an example from a search for “mono symptoms”. There’s an answer box for this query, but the snippet below is not from the site in the answer box:

Notice the bolded words – “fatigue”, “sore throat”, “fever”, “headache”, “rash”. These aren’t synonyms for the search phrase; these are actual symptoms of mono. This data isn’t coming from the meta description, but from a bulleted list on the target page. Again, it appears that Google is trying to use the snippet to answer a question, and has gone well beyond just matching keywords.

Just for fun, let’s look at one more, where there’s no clear connection to the Knowledge Graph. Here’s a snippet from a search for “sons of anarchy season 4”:

This page has no answer box, and the information extracted is odd at best. The snippet bears little or no resemblance to the site’s meta description. The number string at the beginning comes out of a rating widget, and some of the text isn’t even clearly available on the page. This seems to be an example of Google acknowledging IMDb as a high-authority site and desperately trying to match any text they can to the query, resulting in a Frankenstein’s snippet.

The Final Verdict

If all of this seems confusing, that’s probably because it is. Google is taking a lot more liberties with snippets these days: to better match queries, to add details they feel are important, and to help build and support the Knowledge Graph.

So, let’s get back to the original question – is it time to revise the 155(ish) character guideline? My gut feeling is: not yet. To begin with, the vast majority of snippets are still falling in that 145-165 character range. In addition, the exceptions to the rule are not only atypical situations, but in most cases those long snippets don’t seem to represent the original meta description. In other words, even if Google does grant you extra characters, they probably won’t be the extra characters you asked for in the first place.

Many people have asked: “How do I make sure that Google shows my meta description as is?” I’m afraid the answer is: “You don’t.” If this is very important to you, I would recommend keeping your description below the 155-character limit, and making sure that it’s a good match to your target keyword concepts. I suspect Google is going to take more liberties with snippets over time, and we’re going to have to let go of our obsession with having total control over the SERPs.
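If you do want to stay under the limit yourself, trimming at a whole word (as Google does) takes only a few lines. A minimal sketch, assuming a 155-character target:

```python
def trim_description(text, limit=155):
    """Trim a meta description to at most `limit` characters,
    cutting at a whole word rather than mid-word."""
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit + 1)
    if cut == -1:
        cut = limit  # no space found; hard-cut as a last resort
    return text[:cut].rstrip()
```

This doesn’t guarantee Google will show your description verbatim, but it at least removes truncation as a reason for a rewrite.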


Moving 5 Domains to 1: An SEO Case Study

Posted by Dr-Pete

People often ask me if they should change domain names, and I always shudder just a little. Changing domains is a huge, risky undertaking, and too many people rush into it seeing only the imaginary upside. The success of the change also depends wildly on the details, and it’s not the kind of question anyone should be asking casually on social media.

Recently, I decided that it was time to find a new permanent home for my personal and professional blogs, which had gradually spread out over 5 domains. I also felt my main domain was no longer relevant to my current situation, and it was time for a change. So, ultimately I ended up with a scenario that looked like this:

The top three sites were active, with UserEffect.com being my former consulting site and blog (and relatively well-trafficked). The bottom two sites were both inactive and were both essentially gag sites. My one-pager, AreYouARealDoctor.com, did previously rank well for “are you a real doctor”, so I wanted to try to recapture that.

I started migrating the 5 sites in mid-January, and I’ve been tracking the results. I thought it would be useful to see how this kind of change plays out, in all of the gory details. As it turns out, nothing is ever quite “textbook” when it comes to technical SEO.

Why Change Domains at All?

The rationale for picking a new domain could fill a month’s worth of posts, but I want to make one critical point – changing domains should be about your business goals first, and SEO second. I did not change domains to try to rank better for “Dr. Pete” – that’s a crap shoot at best. I changed domains because my old consulting brand (“User Effect”) no longer represented the kind of work I do and I’m much more known by my personal brand.

That business case was strong enough that I was willing to accept some losses. We went through a similar transition here from SEOmoz.org to Moz.com. That was a difficult transition that cost us some SEO ground, especially short-term, but our core rationale was grounded in the business and where it’s headed. Don’t let an SEO pipe dream lead you into a risky decision.

Why did I pick a .co domain? I did it for the usual reason – the .com was taken. For a project of this type, where revenue wasn’t on the line, I didn’t have any particular concerns about .co. The evidence on how top-level domains (TLDs) impact ranking is tough to tease apart (so many other factors correlate with .com’s), and Google’s attitude tends to change over time, especially if new TLDs are abused. Anecdotally, though, I’ve seen plenty of .co’s rank, and I wasn’t concerned.

Step 1 – The Boring Stuff

It is absolutely shocking how many people build a new site, slap up some 301s, pull the switch, and hope for the best. It’s less shocking how many of those people end up in Q&A a week later, desperate and bleeding money.


Planning is hard work, and it’s boring – get over it.

You need to be intimately familiar with every page on your existing site(s), and, ideally, you should make a list. Not only do you have to plan for what will happen to each of these pages, but you’ll need that list to make sure everything works smoothly later.

In my case, I decided it might be time to do some housekeeping – the User Effect blog had hundreds of posts, many outdated and quite a few just not very good. So, I started with the easy data – recent traffic. I’m sure you’ve seen this Google Analytics report (Behavior > Site Content > All Pages):

Since I wanted to focus on recent activity, and none of the sites had much new content, I restricted myself to a 3-month window (Q4 of 2014). Of course, I looked much deeper than the top 10, but the principle was simple – I wanted to make sure the data matched my intuition and that I wasn’t cutting off anything important. This helped me prioritize the list.

Of course, from an SEO standpoint, I also didn’t want to lose content that had limited traffic but solid inbound links. So, I checked my “Top Pages” report in Open Site Explorer:

Since the bulk of my main site was a blog, the top trafficked and top linked-to pages fortunately correlated pretty well. Again, this is only a way to prioritize. If you’re dealing with sites with thousands of pages, you need to work methodically through the site architecture.

I’m going to say something that makes some SEOs itchy – it’s ok not to move some pages to the new site. It’s even ok to let some pages 404. In Q4, UserEffect.com had traffic to 237 URLs. The top 10 pages accounted for 91.9% of that traffic. I strongly believe that moving domains is a good time to refocus a site and concentrate your visitors and link equity on your best content. More is not better in 2015.

Letting go of some pages also means that you’re not 301-redirecting a massive number of old URLs to a new home-page. This can look like a low-quality attempt to consolidate link-equity, and at large scale it can raise red flags with Google. Content worth keeping should exist on the new site, and your 301s should have well-matched targets.
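One simple way to enforce well-matched targets is an explicit redirect map, where anything you’ve deliberately retired returns a 410 instead of being funneled to the home page. A sketch with hypothetical paths (this is illustration, not my actual rewrite rules):

```python
# Hypothetical old-path to new-path map; anything unlisted was
# deliberately retired and should return 410, not redirect home.
REDIRECT_MAP = {
    "/blog/50000-push-ups": "/blog/50000-push-ups",
    "/consulting": "/about",
    "/": "/",
}

def resolve_old_url(old_path):
    """Return an (HTTP status, target) pair for a request to the old domain."""
    if old_path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[old_path]
    return 410, None  # gone on purpose, not an error to fix
```

In practice the same map would be expressed as server rewrite rules, but keeping it as data first makes the mapping easy to review and test before launch.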

In one case, I had a blog post that had a decent trickle of traffic due to ranking for “50,000 push-ups,” but the post itself was weak and the bounce rate was very high:

The post was basically just a placeholder announcing that I’d be attempting this challenge, but I never recapped anything after finishing it. So, in this case, I rewrote the post.

Of course, this process was repeated across the 3 active sites. The 2 inactive sites only constituted a handful of total pages. In the case of AreYouARealDoctor.com, I decided to turn the previous one-pager into a new page on the new site. That way, I had a very well-matched target for the 301-redirect, instead of simply mapping the old site to my new home-page.

I’m trying to prove a point – this is the amount of work I did for a handful of sites that were mostly inactive and producing no current business value. I don’t need consulting gigs and these sites produce no direct revenue, and yet I still considered this process worth the effort.

Step 2 – The Big Day

Eventually, you’re going to have to make the move, and in most cases, I prefer ripping off the bandage. Of course, doing something all at once doesn’t mean you shouldn’t be careful.

The biggest problem I see with domain switches (even if they’re 1-to-1) is that people rely on data that can take weeks to evaluate, like rankings and traffic, or directly checking Google’s index. By then, a lot of damage is already done. Here are some ways to find out quickly if you’ve got problems…

(1) Manually Check Pages

Remember that list you were supposed to make? It’s time to check it, or at least spot-check it. Someone needs to physically go to a browser and make sure that each major section of the site and each important individual page is resolving properly. It doesn’t matter how confident your IT department/guy/gal is – things go wrong.

(2) Manually Check Headers

Just because a page resolves, it doesn’t mean that your 301-redirects are working properly, or that you’re not firing some kind of 17-step redirect chain. Check your headers. There are tons of free tools, but lately I’m fond of URI Valet. Guess what – I screwed up my primary 301-redirects. One of my registrar transfers wasn’t working, so I had to have a setting changed by customer service, and I inadvertently ended up with 302s (Pro tip: Don’t change registrars and domains in one step):

Don’t think that because you’re an “expert”, your plan is foolproof. Mistakes happen, and because I caught this one I was able to correct it fairly quickly.
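Catching this kind of mistake can also be automated: given the chain of status codes from any header-checking tool, flagging 302s and multi-hop chains is trivial. A sketch (the status sequences below are illustrative, not my actual results):

```python
def audit_redirect_chain(hops):
    """Inspect a redirect chain, given as a list of HTTP status codes
    ending with the final response. Flags non-301 redirects, chains
    longer than one hop, and bad final responses."""
    problems = []
    redirects = hops[:-1]
    if any(code != 301 for code in redirects):
        problems.append("non-301 redirect in chain")
    if len(redirects) > 1:
        problems.append("redirect chain longer than one hop")
    if hops[-1] != 200:
        problems.append("final URL does not return 200")
    return problems

print(audit_redirect_chain([301, 200]))       # the clean case
print(audit_redirect_chain([302, 301, 200]))  # the mistake described above
```

Run over your full URL list, a check like this surfaces accidental 302s in minutes rather than weeks.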

(3) Submit Your New Site

You don’t need to submit your site to Google in 2015, but now that Google Webmaster Tools allows it, why not do it? The primary argument I hear is “well, it’s not necessary.” True, but direct submission has one advantage – it’s fast.

To be precise, Google Webmaster Tools separates the process into “Fetch” and “Submit to index” (you’ll find this under “Crawl” > “Fetch as Google”). Fetching will quickly tell you if Google can resolve a URL and retrieve the page contents, which alone is pretty useful. Once a page is fetched, you can submit it, and you should see something like this:

This isn’t really about getting indexed – it’s about getting nearly instantaneous feedback. If Google has any major problems with crawling your site, you’ll know quickly, at least at the macro level.

(4) Submit New XML Sitemaps

Finally, submit a new set of XML sitemaps in Google Webmaster Tools, and preferably tiered sitemaps. While it’s a few years old now, Rob Ousbey has a great post on the subject of XML sitemap structure. The basic idea is that, if you divide your sitemap into logical sections, it’s going to be much easier to diagnose what kinds of pages Google is indexing and where you’re running into trouble.

A couple of pro tips on sitemaps – first, keep your old sitemaps active temporarily. This is counterintuitive to some people, but unless Google can crawl your old URLs, they won’t see and process the 301-redirects and other signals. Let the old accounts stay open for a couple of months, and don’t cut off access to the domains you’re moving.

Second (I learned this one the hard way), make sure that your Google Webmaster Tools site verification still works. If you use file uploads or meta tags and don’t move those files/tags to the new site, GWT verification will fail and you won’t have access to your old accounts. I’d recommend using a more domain-independent solution, like verifying with Google Analytics. If you lose verification, don’t panic – your data won’t be instantly lost.

Step 3 – The Waiting Game

Once you’ve made the switch, the waiting begins, and this is where many people start to panic. Even executed perfectly, it can take Google weeks or even months to process all of your 301-redirects and reevaluate a new domain’s capacity to rank. You have to expect short term fluctuations in ranking and traffic.

During this period, you’ll want to watch a few things – your traffic, your rankings, your indexed pages (via GWT and the site: operator), and your errors (such as unexpected 404s). Traffic will recover the fastest, since direct traffic is immediately carried through redirects, but ranking and indexation will lag, and errors may take time to appear.

(1) Monitor Traffic

I’m hoping you know how to check your traffic, but actually trying to determine what your new levels should be and comparing any two days can be easier said than done. If you launch on a Friday, and then Saturday your traffic goes down on the new site, that’s hardly cause for panic – your traffic probably
always goes down on Saturday.

In this case, I redirected the individual sites over about a week, but I’m going to focus on UserEffect.com, as that was the major traffic generator. That site was redirected in full on January 21st, and the Google Analytics data for January for the old site looked like this:

So far, so good – traffic bottomed out almost immediately. Of course, losing traffic is easy – the real question is what’s going on with the new domain. Here’s the graph for January for DrPete.co:

This one’s a bit trickier – the first spike, on January 16th, is when I redirected the first domain. The second spike, on January 22nd, is when I redirected UserEffect.com. Both spikes are meaningless – I announced these re-launches on social media and got a short-term traffic burst. What we really want to know is where traffic is leveling out.

Of course, there isn’t a lot of history here, but a typical day for UserEffect.com in January was about 1,000 pageviews. The traffic to DrPete.co after it leveled out was about half that (500 pageviews). It’s not a complete crisis, but we’re definitely looking at a short-term loss.

Obviously, I’m simplifying the process here – for a large, ecommerce site you’d want to track a wide range of metrics, including conversion metrics. Hopefully, though, this illustrates the core approach. So, what am I missing out on? In this day of [not provided], tracking down a loss can be tricky. Let’s look for clues in our other three areas…

(2) Monitor Indexation

You can get a broad sense of your indexed pages from Google Webmaster Tools, but this data often lags real-time and isn’t very granular. Despite its shortcomings, I still prefer
the site: operator. Generally, I monitor a domain daily – any one measurement has a lot of noise, but what you’re looking for is the trend over time. Here’s the indexed page count for DrPete.co:

The first set of pages was indexed fairly quickly, and then the second set started being indexed soon after UserEffect.com was redirected. All in all, we’re seeing a fairly steady upward trend, and that’s what we’re hoping to see. The number is also in the ballpark of sanity (compared to the actual page count) and roughly matched GWT data once it started being reported.
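As a sketch of that trend-over-time idea, you can log each day’s site: count and fit a simple slope; a single measurement is noisy, but the sign of the slope tells you whether indexation is trending up or down. The dates and counts below are invented for illustration, not the actual DrPete.co numbers:

```python
from datetime import date

# Hypothetical daily "site:" counts, collected by hand (date, indexed pages).
# Any one measurement has a lot of noise; the slope over time is what matters.
counts = [
    (date(2015, 1, 16), 40),
    (date(2015, 1, 20), 55),
    (date(2015, 1, 24), 110),
    (date(2015, 1, 28), 150),
]

def trend(samples):
    """Least-squares slope: indexed pages gained (or lost) per day."""
    xs = [(d - samples[0][0]).days for d, _ in samples]
    ys = [c for _, c in samples]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

print("rising" if trend(counts) > 0 else "falling")  # → rising
```

Run the same check on the old domain’s counts and you’d want the opposite sign: a falling slope there, matched by a rising one here.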

So, what happened to UserEffect.com’s index after the switch?

The timeframe here is shorter, since UserEffect.com was redirected last, but we see a gradual decline in indexation, as expected. Note that the index size plateaus around 60 pages – about 1/4 of the original size. This isn’t abnormal – low-traffic and unlinked pages (or those with deep links) are going to take a while to clear out. This is a long-term process. Don’t panic over the absolute numbers – what you want here is a downward trend on the old domain accompanied by a roughly equal upward trend on the new domain.

The fact that UserEffect.com didn’t bottom out is definitely worth monitoring, but this timespan is too short for the plateau to be a major concern. The next step would be to dig into these specific pages and look for a pattern.

(3) Monitor Rankings

The old domain is dropping out of the index, and the new domain is taking its place, but we still don’t know why the new site is taking a traffic hit. It’s time to dig into our core keyword rankings.

Historically, UserEffect.com had ranked well for keywords related to “split test calculator” (near #1) and “usability checklist” (in the top 3). While [not provided] makes keyword-level traffic analysis tricky, we also know that the split-test calculator is one of the top trafficked pages on the site, so let’s dig into that one. Here’s the ranking data from Moz Analytics for “split test calculator”:

The new site took over the #1 position from the old site at first, but then quickly dropped down to the #3/#4 ranking. That may not sound like a lot, but given this general keyword category was one of the site’s top traffic drivers, the CTR drop from #1 to #3/#4 could definitely be causing problems.

When you have a specific keyword you can diagnose, it’s worth taking a look at the live SERP, just to get some context. The day after relaunch, I captured this result for “dr. pete”:

Here, the new domain is ranking, but it’s showing the old title tag. This may not be cause for alarm – weird things often happen in the very short term – but in this case we know that I accidentally set up a 302-redirect. There’s some reason to believe that Google didn’t pass full link equity during that period when 301s weren’t implemented.
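For reference, a domain-wide permanent redirect in Apache looks something like the sketch below; the R=301 flag is exactly what an accidental 302 setup omits or gets wrong. This is an illustration of the technique, not the actual config used on these sites:

```apache
# .htaccess on the old domain: send every request to the new domain,
# preserving the path, as a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?usereffect\.com$ [NC]
RewriteRule ^(.*)$ http://drpete.co/$1 [R=301,L]
```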

Let’s look at a domain where the 301s behaved properly. Before the site was inactive, AreYouARealDoctor.com ranked #1 for “are you a real doctor”. Since there was an inactive period, and I dropped the exact-match domain, it wouldn’t be surprising to see a corresponding ranking drop.

In reality, the new site was ranking #1 for “are you a real doctor” within 2 weeks of 301-redirecting the old domain. The graph is just a horizontal line at #1, so I’m not going to bother you with it, but here’s a current screenshot (incognito):

Early on, I also spot-checked this result, and it wasn’t showing the strange title tag crossover that UserEffect.com pages exhibited. So, it’s very likely that the 302-redirects caused some problems.

Of course, these are just a couple of keywords, but I hope it provides a starting point for you to understand how to methodically approach this problem. There’s no use crying over spilled milk, and I’m not going to fire myself, so let’s move on to checking any other errors that I might have missed.

(4) Check Errors (404s, etc.)

A good first stop for unexpected errors is the “Crawl Errors” report in Google Webmaster Tools (Crawl > Crawl Errors). This is going to take some digging, especially if you’ve deliberately 404’ed some content. Over the couple of weeks after re-launch, I spotted the following problems:

The old site had a “/blog” directory, but the new site put the blog right on the home page and had no corresponding directory. Doh. Hey, do as I say, not as I do, ok? Obviously, this was a big blunder, as the old blog home page was well-trafficked.

The other two errors here are smaller but easy to correct. MinimalTalent.com had a “/free” directory that housed downloads (mostly PDFs). I missed it, since my other sites used a different format. Luckily, this was easy to remap.

The last error is a weird looking URL, and there are other similar URLs in the 404 list. This is where site knowledge is critical. I custom-designed a URL shortener for UserEffect.com and, in some cases, people linked to those URLs. Since those URLs didn’t exist in the site architecture, I missed them. This is where digging deep into historical traffic reports and your top-linked pages is critical. In this case, the fix isn’t easy, and I have to decide whether the loss is worth the time.
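Where old paths have no one-to-one match on the new site, explicit rules can remap them. A sketch in Apache, with hypothetical target paths:

```apache
# Remap sections that moved rather than carried over directly.
RedirectMatch 301 ^/blog/?$ http://drpete.co/
RedirectMatch 301 ^/free/(.*)$ http://drpete.co/downloads/$1
```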

What About the New EMD?

My goal here wasn’t to rank better for “Dr. Pete,” and finally unseat Dr. Pete’s Marinades, Dr. Pete the Sodastream flavor (yes, it’s hilarious – you can stop sending me your grocery store photos), and 172 dentists. Ok, it mostly wasn’t my goal. Of course, you might be wondering how switching to an EMD worked out.

In the short term, I’m afraid the answer is “not very well.” I didn’t track ranking for “Dr. Pete” and related phrases very often before the switch, but it appears that ranking actually fell in the short-term. Current estimates have me sitting around page 4, even though my combined link profile suggests a much stronger position. Here’s a look at the ranking history for “dr pete” since relaunch (from Moz Analytics):

There was an initial drop, after which the site evened out a bit. This less-than-impressive plateau could be due to the bad 302s during transition. It could be Google evaluating a new EMD and multiple redirects to that EMD. It could be that the prevalence of natural anchor text with “Dr. Pete” pointing to my site suddenly looked unnatural when my domain name switched to DrPete.co. It could just be that this is going to take time to shake out.

If there’s a lesson here (and, admittedly, it’s too soon to tell), it’s that you shouldn’t rush to buy an EMD in 2015 in the wild hope of instantly ranking for that target phrase. There are so many factors involved in ranking for even a moderately competitive term, and your domain is just one small part of the mix.

So, What Did We Learn?

I hope you learned that I should’ve taken my own advice and planned a bit more carefully. I admit that this was a side project and it didn’t get the attention it deserved. The problem is that, even when real money is at stake, people rush these things and hope for the best. There’s a real cheerleading mentality when it comes to change – people want to take action and only see the upside.

Ultimately, in a corporate or agency environment, you can’t be the one sour note among the cheering. You’ll be ignored, and possibly even fired. That’s not fair, but it’s reality. What you need to do is make sure the work gets done right and people go into the process with eyes wide open. There’s no room for shortcuts when you’re moving to a new domain.

That said, a domain change isn’t a death sentence, either. Done right, and with sensible goals in mind – balancing not just SEO but broader marketing and business objectives – a domain migration can be successful, even across multiple sites.

To sum up: Plan, plan, plan, monitor, monitor, monitor, and try not to panic.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from tracking.feedpress.it

The Most Important Link Penalty Removal Tool: Your Mindset

Posted by Eric Enge

Let’s face it. Getting slapped by a manual link penalty, or by the Penguin algorithm, really stinks. Once this has happened to you, your business is in a world of hurt. Worse still is the fact that you can’t get clear information from Google on which of your links are the bad ones. In today’s post, I am going to focus on the number one reason why people fail to get out from under these types of problems, and how to improve your chances of success.

The mindset

Success begins, continues, and ends with the right mindset. A large percentage of people I see who go through a link cleanup process are not aggressive enough about cleaning up their links. They worry about preserving some of that hard-won link juice they obtained over the years.

You have to start by understanding what a link cleanup process looks like, and just how long it can take. Some of the people I have spoken with have gone through a process like this one:

link removal timeline

In this fictitious timeline example, we see someone who spends four months working on trying to recover, and at the end of it all, they have not been successful.
A lot of time and money have been spent, and they have nothing to show for it. Then, the people at Google get frustrated and send them a message that basically tells them they are not getting it. At this point, they have no idea when they will be able to recover. The result is that the complete process might end up taking six months or more.

In contrast, imagine someone who is far more aggressive in removing and disavowing links. They are so aggressive that 20 percent of the links they cut out are actually ones that Google has not currently judged as being bad. They also start on March 9, and by April 30, the penalty has been lifted on their site.

Now they can begin rebuilding their business, five or more months sooner than the person who does not take as aggressive an approach. Yes, they cut out some links that Google was not currently penalizing, but this is a small price to pay for getting your penalty cleared five months sooner. In addition, with this mindset-based approach, the 20 percent of links we cut out were probably not helping much anyway, and Google might well take action on them in the future.

Now that you understand the approach, it’s time to make the commitment. You have to make the decision that you are going to do whatever it takes to get this done, and that getting it done means cutting hard and deep, because that’s what will get you through it the fastest. Once you’ve got your head on straight about what it will take and have summoned the courage to go through with it, then and only then, you’re ready to do the work. Now let’s look at what that work entails.

Obtaining link data

We use four sources of data for links:

  1. Google Webmaster Tools
  2. Open Site Explorer
  3. Majestic SEO
  4. ahrefs

You will want to pull in data from all four of these sources, get them into one list, and then dedupe them to create a master list. Focus only on followed links as well, as nofollowed links are not an issue. The overall process is shown here:

pulling a link set
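The merge-and-dedupe step can be sketched as follows. The tool exports in reality differ in format; the rows and URLs here are invented, and the point is just the normalize-then-set logic, including dropping nofollowed links:

```python
from urllib.parse import urlsplit

# Hypothetical link exports from GWT, Open Site Explorer, Majestic, and ahrefs.
# Each row is (source URL, followed?). Real exports vary by tool.
gwt      = [("http://example.com/page?a=1", True)]
ose      = [("http://EXAMPLE.com/page?a=1", True), ("http://spam.biz/x", True)]
majestic = [("http://blog.example.com/post", False)]   # nofollow: ignore
ahrefs   = [("http://spam.biz/x", True)]               # duplicate of OSE row

def normalize(url):
    """Lowercase the host and strip trailing slashes so duplicates collapse."""
    p = urlsplit(url)
    return (p.scheme + "://" + p.netloc.lower() + p.path.rstrip("/")
            + (("?" + p.query) if p.query else ""))

master = sorted({normalize(u)
                 for source in (gwt, ose, majestic, ahrefs)
                 for u, followed in source if followed})
print(master)  # → ['http://example.com/page?a=1', 'http://spam.biz/x']
```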

One other simplification is also possible at this stage. Once you have obtained a list of the followed links, there is another thing you can do to dramatically simplify your life.
You don’t need to look at every single link.

You do need to look at a small sampling of links from every domain that links to you. Chances are that this is a significantly smaller quantity of links to look at than all links. If a domain has 12 links to you, and you look at three of them, and any of those are bad, you will need to disavow the entire domain anyway.
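That per-domain sampling step is easy to mechanize. A sketch, with made-up URLs, that groups the master list by linking domain and pulls up to three links from each for manual review:

```python
from collections import defaultdict
from urllib.parse import urlsplit

# Hypothetical master list of followed inbound links. Rather than reviewing
# every URL, review up to three per linking domain: if any sample is bad,
# the whole domain gets disavowed anyway.
links = [
    "http://forum.example.net/profile/123",
    "http://forum.example.net/thread/9",
    "http://forum.example.net/thread/10",
    "http://forum.example.net/thread/11",
    "http://niceblog.org/review",
]

by_domain = defaultdict(list)
for url in links:
    by_domain[urlsplit(url).netloc].append(url)

samples = {domain: urls[:3] for domain, urls in by_domain.items()}
for domain, urls in sorted(samples.items()):
    print(domain, len(urls))
```

Note that every domain still appears in `samples`; the sampling shrinks the number of links per domain, never the number of domains.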

I take the time to emphasize this because I’ve seen people with more than 1 million inbound links from 10,000 linking domains. Evaluating 1 million individual links could take a lifetime. Looking at 10,000 domains is not small, but it’s 100 times smaller than 1 million. But here is where the mindset comes in.
Do examine every domain.

This may be a grinding and brutal process, but there is no shortcut available here. What you don’t look at will hurt you. The sooner you start on the entire list, the sooner you will get the job done.

How to evaluate links

Now that you have a list, you can get to work. This is a key part where having the right mindset is critical. The first part of the process is really quite simple. You need to eliminate each and every one of these types of links:

  1. Article directory links
  2. Links in forum comments, or their related profiles
  3. Links in blog comments, or their related profiles
  4. Links from countries where you don’t operate/sell your products
  5. Links from link sharing schemes such as Link Wheels
  6. Any links you know were paid for

Here is an example of a foreign language link that looks somewhat out of place:

foreign language link

For the most part, you should also remove any links you have from web directories. Sure, if you have a link from DMOZ, Business.com, or BestofTheWeb.com, and the most important one or two directories dedicated to your market space, you can probably keep those.

For a decade I have offered people a rule for these types of directories, which is “no more than seven links from directories.” Even the good ones carry little to no value, and the bad ones can definitely hurt you. So there is absolutely no win to be had running around getting links from a bunch of directories, and there is no win in trying to keep them during a link cleanup process.

Note that I am NOT talking about local business directories such as Yelp, CityPages, YellowPages, SuperPages, etc. Those are a different class of directory that you don’t need to worry about. But general-purpose web directories are, generally speaking, poison.

Rich anchor text

Rich anchor text has been the downfall of many a publisher. Here is one of my favorite examples ever of rich anchor text:

The author wanted the link to say “buy cars,” but was too lazy to fit the two words into the same sentence! Of course, you may have many guest posts that you have written that are not nearly as obvious as this one. One great way to deal with that is to take your list of links that you built and sort them by URL and look at the overall mix of anchor text. You know it’s a problem if it looks anything like this:

overly optimized anchor text

The problem with the distribution in the above image is that the percentage of links that are non “rich” in nature is way too small. In the real world, most people don’t conveniently link to you using one of your key money phrases. Some do, but it’s normally a small percentage.
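The sort-and-eyeball check can be approximated with a quick count. The anchors and the “money phrase” list below are invented; the idea is simply to measure what fraction of the profile is exact-match:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a link export. In a natural profile,
# brand names and raw URLs dominate; exact-match money phrases should not.
anchors = ["buy cars", "buy cars", "buy cars online", "example.com",
           "Example Motors", "click here", "buy cars", "http://example.com"]

money_phrases = {"buy cars", "buy cars online"}
counts = Counter(anchors)
rich = sum(n for anchor, n in counts.items() if anchor in money_phrases)
pct_rich = 100.0 * rich / len(anchors)
print(f"{pct_rich:.0f}% exact-match anchors")  # half the profile: a red flag
```

In the real world a healthy exact-match percentage is small; anything approaching the level in this toy data is the kind of distribution the chart above illustrates.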

Other types of bad links

There is no way for me to cover every type of bad link in this post, but here are other types of links, or link scenarios, to be concerned about:

  1. If a large percentage of your links are coming from over on the right rail of sites, or in the footers of sites
  2. If there are sites that give you a site-wide link, or a very large number of links from one domain
  3. Links that come from sites whose IP address is identical in the A block, B block, and C block (read more about what these are here)
  4. Links from crappy sites

The definition of a crappy site may seem subjective, but if a site has not been updated in a while, or its information is of poor quality, or it just seems to have no one who cares about it, you can probably consider it a crappy site. Remember our discussion on mindset. Your objective is to be harsh in cleaning up your links.

In fact, the most important principle in evaluating links is this:
If you can argue that it’s a good link, it’s NOT. You don’t have to argue for good quality links. To put it another way, if they are not obviously good, then out they go!

Quick case study anecdote: I know of someone who really took a major knife to their backlinks. They removed and/or disavowed every link they had that was below a Moz Domain Authority of 70. They did not even try to justify or keep any links with lower DA than that. It worked like a champ. The penalty was lifted. If you are willing to try a hyper-aggressive approach like this one, you can avoid all the work evaluating links I just outlined above. Just get the Domain Authority data for all the links pointing to your site and bring out the hatchet.
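A minimal sketch of that hatchet approach, assuming you’ve already pulled a Domain Authority figure for each linking domain (the domains and DA values here are made up):

```python
# The "hatchet": keep only linking domains at or above a DA threshold,
# and queue everything else for removal/disavow. No per-link evaluation.
domain_authority = {
    "bigsite.com": 85,
    "okblog.net": 55,
    "spamdir.biz": 12,
}

THRESHOLD = 70
to_disavow = sorted(d for d, da in domain_authority.items() if da < THRESHOLD)
print(to_disavow)  # → ['okblog.net', 'spamdir.biz']
```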

No doubt that they ended up cutting out a large number of links that were perfectly fine, but their approach was way faster than doing the complete domain by domain analysis.

Requesting link removals

Why is it that we request link removals? Can’t we just build a
disavow file and submit that to Google? In my experience, for manual link penalties, the answer to this question is no, you can’t. (Note: if you have been hit by Penguin, and not a manual link penalty, you may not need to request link removals.)

Yes, disavowing a link is supposed to tell Google that you don’t want to receive any PageRank, or benefit, from it. However, there is a human element at play here.
Google likes to see that you put some effort into cleaning up the bad links that you have gotten that led to your penalty. The more bad links you have, the more important this becomes.

This does make the process a lot more expensive to get through, but if you approach it with the “whatever it takes” mindset, you dive into the link removal request process and go ahead and get it done.

I usually have people go through three rounds of requests asking people to remove links. This can be a very annoying process for those receiving your request, so you need to be aware of that. Don’t start your email with a line like “Your site is causing mine to be penalized …”, as that’s just plain offensive.

I’d be honest, and tell them “Hey, we’ve been hit by a penalty, and as part of our effort to recover we are trying to get many of the links we have gotten to our site removed. We don’t know which sites are causing the problem, but we’d appreciate your help …”

Note that some people will come back to you and ask for money to remove the link. Just ignore them, and put their domains in your disavow file.

Once you are done with the overall removal requests, and had whatever success you have had, take the rest of the domains and disavow them. There is a complete guide to
creating a disavow file here. The one incremental tip I would add is that you should nearly always disavow entire domains, not just the individual links you see.
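For reference, the disavow file Google accepts is plain text, one entry per line, where a domain: prefix covers the whole domain (which is what the tip above recommends). The domains below are made up:

```text
# Disavow file uploaded via Google's disavow links tool.
# Prefer whole domains over individual URLs.
domain:spamdir.biz
domain:linkwheel-network.info
# A bare URL entry only blocks that exact URL:
http://okblog.net/one-bad-page.html
```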

This is important because even with the four tools we used to get information on as many links as we could, we still only have a subset of the total links. For example, the tools may have only seen one link from a domain, but in fact you have five. If you disavow only the one link, you still have four problem links, and that will torpedo your reconsideration request.

Disavowing the domain is a better-safe-than-sorry step you should take almost every time. As I illustrated at the beginning of this post, adding extra cleanup/reconsideration request loops is very expensive for your business.

The overall process

When all is said and done, the process looks something like this:

link removal process

If you run this process efficiently, and you don’t try to cut corners, you might be able to get out from your penalty in a single pass through the process. If so, congratulations!

What about tools?

There are some fairly well-known tools that are designed to help you with the link cleanup process. These include
Link Detox and Remove’em. In addition, at STC we have developed our own internal tool that we use with our clients.

These tools can be useful in flagging some of your links, but they are not comprehensive—they will help identify some really obvious offenders, but the great majority of links you need to deal with and remove/disavow are not identified. Plan on investing substantial manual time and effort to do the heavy lifting of a comprehensive review of all your links. Remember the “mindset.”

Summary

As I write this post, I have this sense of being heartless because I outline an approach that is often grueling to execute. But consider it tough love. Recovering from link penalties is indeed brutal.
In my experience, the winners are the ones who come with meat cleaver in hand, don’t try to cut corners, and take on the full task from the very start, no matter how extensive an effort it may be.

Does this type of process succeed? You bet. Here is an example of a traffic chart from a successful recovery:

manual penalty recovery graph


A Universal SEO Strategy Audit in 5 Steps – Whiteboard Friday

Posted by randfish

When it comes to building an SEO strategy, many marketers (especially those who don’t spend a significant amount of time with SEO) start off by asking a few key questions. That’s a good start, but only if you’re asking the right questions. In today’s Whiteboard Friday, Rand puts the usual suspects on the chopping block, showing us the five things we should really be looking into when formulating our SEO strategy.

For reference, here’s a still of this week’s whiteboard!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about building an SEO strategy and having a universal set of five questions that can get you there.

So number one: What keywords do you want to rank for?


Number two: How do we get links?

Number three: Site speed. Mobile? Doesn’t even seem like a question.

Number four: What about Penguin and Panda?

Number five: When do I get money?

This is baloney. That’s not a strategy. Some of those point to tactics you might invest in as an SEO, but this is not an SEO strategy. Unfortunately, this is how a lot of conversations about SEO start on teams, with CMOs, with managers, with CEOs, with clients or potential clients, and it’s very frustrating because you can’t truly do a great job with SEO at just the tactical level. If you don’t start with a compelling strategy, doing all of these things is only going to produce a small fraction of the potential return compared to what you’d see if you asked the right questions and set your strategy before beginning an SEO process and nailing down your tactics.

So that’s what I want to go through. I spend a lot of time thinking through these things and analyzing a lot of posts that other people have put up and questions that folks have put in our Q&A system and others, on Quora and other places. I think actually every great SEO strategy that I have ever seen can be distilled down to the answers that come from these five questions.

So number one: What does our organization create that helps solve searchers’ questions or problems? That could be, “Or what will we create in the future?” It might be that you haven’t yet created the thing or things that are going to help solve searchers’ questions or problems. But that thing that you make, that product or service or content that you are making, that expertise that you hold, something about your organization is creating value that if only searchers could access it, they would be immensely thankful.

It is possible, and I have seen plenty of examples of companies that are so new or so much on the cutting edge that they’re producing things that aren’t solving questions people are asking yet. The problem that you’re solving then is not a question. It’s not something that’s being searched for directly. It usually is very indirect. If you’re creating a machine that, let’s say, turns children’s laughter into energy, as they do in the film “Monsters, Inc.”, that is something very new. No one is searching for machine to turn kids laughing into energy. However, many people are searching for alternative energy. They’re searching for broader types of things and concepts. By the way, if you do invent that machine, I think you’ll probably nail a lot of that interest level stuff.

If you have a great answer to this, you can then move on to, “What is the unique value we provide that no one else does?” We talked about unique value previously on Whiteboard Friday. There’s a whole episode you can watch about that. Basically, if everyone else out there is producing X and X+1 and X+2, you’ve either got to be producing X times 10, or you’ve got to be producing Y, something that is highly unique or is unique because it is of such better quality, such greater quality. It does the job so much better than anything else out there. It’s not, “Hey, we’re better than the top ten search results.” It’s, “Why are you ten times better than anything on this list?”

The third question is, “Who’s going to help amplify our message, and why will they do it?” This is essential because SEO has turned from an exercise, where we essentially take content that already exists or create some content that will solve a searcher problem and then try and acquire links to it, or point links to it, or point ranking signals at it, and instead it’s ones where we have to go out and earn those ranking signals. Because we’ve shifted from link building or ranking signal building to ranking signal earning, we better have people who will help amplify our message, the content that we create, the value that we provide, the service or the product, the message about our brand.

If we don’t have those people who, for some reason, care enough about what we’re doing to help share it with others, we’re going to be shouting into a void. We’re going to get no return on the investment of broadcasting our message or reaching out one to one, or sharing on social media, or distributing. It’s not going to work. We need that amplification. There must be some of it, and because we need amplification in order to earn these ranking signals, we need an answer to who.

That who is going to depend highly on your target audience, your target customers, and who influences your target customers, which may be a very different group than other customers just like them. There are plenty of businesses in industries where your customers will be your worst amplifiers because they love you and they don’t want to share you with anyone else. They love whatever product or service you’re providing, and they want to keep you all to themselves. By the way, they’re not on social media, and they don’t do sharing. So you need another level above them. You need press or bloggers or social media sharers, somebody who influences your target audience.

Number four: What is our process for turning visitors from search into customers? If you have no answer to this, you can’t expect to earn search visits and have a positive return on your investment. You’ve got to be building out that funnel that says, “Aha, people have come to us through channel X, search, social media, e-mail, directly visited, referred from some other website, through business development, through conference or trade show, whatever it is. Then they come back to our website. Then they sign up for an e-mail. Then they make a conversion. How does that work? What does our web-marketing funnel look like? How do we take people that visited our site for the first time from search, from a problem or a question that they had that we answered, and now how do they become a customer?” If you don’t have that process yet, you must build it. That’s part of a great SEO strategy. Then optimization of this is often called conversion rate optimization.

The last question, number five: How do we expose what we do that provides value here in a way that engines can easily crawl, index, understand, and show off? This is getting to much more classic SEO stuff. For many companies they have something wonderful that they’ve built, but it’s just a mobile app or a web app that has no physical URL structure that anyone can crawl and be exposed to, or it’s a service based business.

Let’s say it’s legal services firm. How are we going to turn the expertise of our legal team into something that engines can perceive? Maybe we have the answers to these questions, but we need to find some way to show it off, and that’s where content creation comes into play. So we don’t just need content that is good quality content that can be crawled and indexed. It also must be understood, and this ties a little bit to things we’ve talked about in the past around Hummingbird, where it’s clear that the content is on the topic and that it really answers the searchers’ underlying question, not just uses the keywords the searcher is using. Although, using the keywords is still important from a classic SEO perspective.

Then show off that content is, “How do we do a great job of applying rich snippets, of applying schema, of having a very compelling title and description and URL, of getting that ranked highly, of learning what our competitors are doing that we can uniquely differentiate from them in the search results themselves so that we can improve our click-through rates,” all of those kinds of things.

If you answer these five questions, or if your customer, your client, your team, your boss already has great answers to these five questions, then you can start getting pretty tactical and be very successful. If you don’t have answers to these yet, go get them. Make them explicit, not just implicit. Don’t just assume you know what they are. Have them list them. Make sure everyone on the team, everyone in the SEO process has bought into, “Yes, these are the answers to those five questions that we have. Now, let’s go do our tactics.” I think you’ll find you’re far more successful with any type of SEO project or investment.

All right gang, thanks so much for joining us on Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


5 Years of SEO Changes and Better Goal-Setting – Videos from MozTalk 2

Posted by CharleneKate

Have you ever noticed how Rand is often speaking at conferences all around the world? Well, we realized that those of us here in Seattle rarely get to see him. So we started MozTalks, a free event here at the MozPlex.

It normally runs 2-3 hours with multiple speakers, one of whom is Rand. The event is hosted at the Moz HQ and offers time for mingling, appetizers, refreshments, and, of course, swag. The series is still evolving as we continue to test out new ideas (maybe taking the show on the road), so be on the lookout for any updates.

The world of marketing and SEO continues to change, but are you changing with it? Staying on the cutting edge should always be a priority, especially since early adoption has proven more beneficial than ever. Sticking with what works isn’t enough anymore, and marketing isn’t just about analyzing our successes and failures or understanding our returns and losses. It’s about what we do next.

In the presentations below, Rand and Dr. Pete will dive deep into where metrics serve us best, as well as what really works to drive traffic and what’s better left behind.

Rand: What Changed? A Brief Look at How SEO has Evolved over the Last 5 Years

2014-11-18 – MozTalk – Rand – What Changed?

Dr. Pete: From Lag to Lead: Actionable Analytics

2014-11-18 – MozTalk – Dr. Pete – From Lag to Lead

Top takeaways

We asked both presenters for a few of their top takeaways from their talks, and they’ve got some gems. Here’s what they had to say:

From Rand

  • Keyword matching has become intent matching, which doesn’t mean we should avoid using keywords, but it does mean we need to change the way we determine which pages to build, which to canonicalize, and how to structure our sites and content.
  • The job title “SEO” may be limiting the influence we have, and we may need broader authority to impact SEO in the modern era. The onus is on marketers to make teams, clients, and execs aware of these new requirements, so they understand what we need to do in order to grow search traffic.
  • Webspam has gone from Google’s problem to our problem. The onus is on marketers to stay wary and up-to-date with how Google is seeing their links and their site.

From Dr. Pete

  • As content marketers, we can’t afford to see only the forest or the trees. We have to understand a wide variety of metrics, and combine them in new and insightful ways.
  • We have to stop looking backward using lag goals like “Get 100,000 Likes in Q4.” They aren’t actionable, and succeed or fail, we have no way to repeat success. We have to focus on objectives that drive specific, measurable actions.

Missed the previous talk?

The first MozTalk featured Rand and his wife Geraldine, known in the blogosphere as The Everywhereist. Rand covered what bloggers need to know about SEO, and Geraldine talked about how to make your blog audience fall in love with you. Check them both out here:

Need-to-Know SEO and Making Your Blog Audience Fall in Love – Videos from MozTalk 1

Join us for the next one

Our next free MozTalk is set for Thursday, April 2nd, and we’re still finalizing plans. We’ll be sure to post the videos on this blog for those of you who can’t make it, but if you’re in town, keep your eyes open for more details. We hope to see you there!



12 Common Reasons Reconsideration Requests Fail

Posted by Modestos

There are several reasons a reconsideration request might fail. But some of the most common mistakes site owners and inexperienced SEOs make when trying to lift a link-related Google penalty are entirely avoidable. 

Here’s a list of the top 12 most common mistakes made when submitting reconsideration requests, and how you can prevent them.

1. Insufficient link data

This is one of the most common reasons why reconsideration requests fail. This mistake is readily evident each time a reconsideration request gets rejected and the example URLs provided by Google are unknown to the webmaster. Relying only on Webmaster Tools data isn’t enough, as Google has repeatedly said. You need to combine data from as many different sources as possible. 

A good starting point is to collate backlink data, at the very least:

  • Google Webmaster Tools (both latest and sample links)
  • Bing Webmaster Tools
  • Majestic SEO (Fresh Index)
  • Ahrefs
  • Open Site Explorer

If you use any toxic link-detection services (e.g., Linkrisk and Link Detox), then you need to take a few precautions to ensure the following:

  • They are 100% transparent about their backlink data sources
  • They have imported all backlink data
  • You can upload your own backlink data (e.g., Webmaster Tools) without any limitations

If you work on large websites that have tons of backlinks, most of these automated services will very likely process just a fraction of the links, unless you pay for one of their premium packages. If you have direct access to the above data sources, it’s worthwhile to download all backlink data, then manually upload it into your tool of choice for processing. This is the only way to have full visibility over the backlink data that has to be analyzed and reviewed later. Starting with an incomplete data set at this early (yet crucial) stage could seriously hinder the outcome of your reconsideration request.
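As a rough illustration of that collation step, here is a minimal sketch of merging and deduplicating backlink exports in Python (the export structure and the URL normalization are simplified assumptions; real tool exports contain many more fields):

```python
def merge_backlinks(exports):
    """Merge backlink URLs from several tool exports, deduplicating
    case-insensitively (and ignoring trailing slashes) so the same
    link isn't reviewed twice."""
    seen = set()
    merged = []
    for source, urls in exports.items():
        for url in urls:
            key = url.strip().lower().rstrip("/")
            if key and key not in seen:
                seen.add(key)
                merged.append({"url": url.strip(), "first_seen_in": source})
    return merged

# Hypothetical exports from three tools (invented URLs)
exports = {
    "GWT": ["http://example.com/a", "http://example.com/b/"],
    "Majestic": ["http://example.com/b", "http://example.com/c"],
    "Ahrefs": ["http://example.com/c/", "http://example.com/d"],
}
links = merge_backlinks(exports)
print(len(links))  # 4 unique linking URLs across the three exports
```

The point is simply that every source feeds one combined, deduplicated list, so nothing gets reviewed twice and nothing falls through the cracks.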

2. Missing vital legacy information

The more you know about a site’s history and past activities, the better. You need to find out (a) which pages were targeted in the past as part of link building campaigns, (b) which keywords were the primary focus and (c) the link building tactics that were scaled (or abused) most frequently. Knowing enough about a site’s past activities, before it was penalized, can help you home in on the actual causes of the penalty. Also, collect as much information as possible from the site owners.

3. Misjudgement

Misreading your current situation can lead to wrong decisions. One common mistake is to treat the example URLs provided by Google as gospel and try to identify only links with the same patterns. Google provides a very small number of examples of unnatural links. Often, these examples are the most obvious and straightforward ones. However, you should look beyond these examples to fully address the issues and take the necessary actions against all types of unnatural links. 

Google is very clear on the matter: “Please correct or remove all inorganic links, not limited to the samples provided above.”

Another common area of bad judgement is the inability to correctly identify unnatural links. This is a skill that requires years of experience in link auditing, as well as link building. Removing the wrong links won’t lift the penalty, and may also result in further ranking drops and loss of traffic. You must remove the right links.


4. Blind reliance on tools

There are numerous unnatural link-detection tools available on the market, and over the years I’ve had the chance to try out most (if not all) of them. Because I’ve found them all, without exception, very ineffective and inaccurate, I do not rely on any such tools for my day-to-day work. In some cases, a lot of the reported “high risk” links were 100% natural links, and in others, numerous toxic links were completely missed. If you have to manually review all the links anyway to discover the unnatural ones, ensuring you don’t accidentally remove any natural ones, it makes no sense to pay for tools.

If you solely rely on automated tools to identify the unnatural links, you will need a miracle for your reconsideration request to be successful. The only tool you really need is a powerful backlink crawler that can accurately report the current link status of each URL you have collected. You should then manually review all currently active links and decide which ones to remove. 

I could write an entire book on the numerous flaws and bugs I have come across each time I’ve tried some of the most popular link auditing tools. A lot of these issues can be detrimental to the outcome of the reconsideration request. I have seen many reconsideration requests fail because of this. If Google cannot algorithmically identify all unnatural links and must operate entire teams of humans to review sites (and their links), you shouldn’t trust a $99/month service to identify the unnatural links.

If you have an in-depth understanding of Google’s link schemes, you can build your own process to prioritize which links are more likely to be unnatural, as I described in this post (see sections 7 & 8). In an ideal world, you should manually review every single link pointing to your site. Where this isn’t possible (e.g., when dealing with an enormous number of links, or when resources are unavailable), you should at least focus on the links that show the most “unnatural” signals and manually review them.

5. Not looking beyond direct links

When trying to lift a link-related penalty, you need to look into all the links that may be pointing to your site directly or indirectly. Such checks include reviewing all links pointing to other sites that have been redirected to your site, legacy URLs with external inbound links that have been internally redirected, and third-party sites that include cross-domain canonicals to your site. For sites that used to buy and redirect domains in order to increase their rankings, the quickest solution is to get rid of the redirects. Both Majestic SEO and Ahrefs report redirects, but some manual digging usually reveals a lot more.


6. Not looking beyond the first link

All major link intelligence tools, including Majestic SEO, Ahrefs and Open Site Explorer, report only the first link pointing to a given site when crawling a page. This means that if you rely heavily on automated tools to identify links with commercial keywords, the vast majority of them will only take into consideration the first link they discover on a page. If a page on the web links just once to your site, this is no big deal. But if there are multiple links, the tools will miss all but the first one.

For example, if a page has five different links pointing to your site, and the first one includes branded anchor text, these tools will just report the first link. Most of the link-auditing tools will in turn evaluate the link as “natural” and completely miss the other four links, some of which may contain manipulative anchor text. The more links that get missed this way, the more likely it is that your reconsideration request will fail.
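If you want to sanity-check what the tools might be missing, it is straightforward to extract every link to your domain from a page yourself. Here is a minimal sketch using Python’s standard-library HTML parser (the page snippet and domain are invented for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect every href pointing at a target domain, not just the first."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if self.domain in href:
                self.links.append(href)

# Hypothetical page with several links to the same site
html = """
<a href="http://yoursite.example/">Brand Name</a>
<a href="http://yoursite.example/page">cheap widgets</a>
<a href="http://yoursite.example/page2">buy widgets online</a>
<a href="http://other.example/">elsewhere</a>
<a href="http://yoursite.example/page3">best widgets</a>
"""
counter = LinkCounter("yoursite.example")
counter.feed(html)
print(len(counter.links))  # all 4 matching links, not just the first
```

Run against pages your link tools flagged as “natural,” a check like this quickly reveals whether additional, keyword-rich links were silently skipped.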

7. Going too thin

Many SEOs and webmasters (still) feel uncomfortable with the idea of losing links. They cannot accept that links which once helped their rankings are now being devalued and must be removed. There is no point trying to save “authoritative” yet unnatural links out of fear of losing rankings. If the main objective is to lift the penalty, then all unnatural links need to be removed.

Often, in the first reconsideration request, SEOs and site owners tend to go too thin, and in the subsequent attempts start cutting deeper. If you are already aware of the unnatural links pointing to your site, try to get rid of them from the very beginning. I have seen examples of unnatural links provided by Google on PR 9/DA 98 sites. Metrics do not matter when it comes to lifting a penalty. If a link is manipulative, it has to go.

In any case, Google’s decision won’t be based only on the number of links that have been removed. More important in the search giant’s eyes is the quality of the links still pointing to your site. If the remaining links are largely of low quality, the reconsideration request will almost certainly fail.

8. Insufficient effort to remove links

Google wants to see a “good faith” effort to get as many links removed as possible. The higher the percentage of unnatural links removed, the better. Some agencies and SEO consultants tend to rely too much on the use of the disavow tool. However, the disavow tool isn’t a panacea; it should be used as a last resort, for links that are impossible to remove, and only after exhausting all possibilities of physically removing them via the time-consuming (yet necessary) outreach route.

Google is very clear on this:

[Image: Google’s guidance on demonstrating good-faith link removal efforts]

Even if you’re unable to remove all of the links that need to be removed, you must be able to demonstrate that you’ve made several attempts to have them removed, which can have a favorable impact on the outcome of the reconsideration request. Yes, in some cases it might be possible to have a penalty lifted simply by disavowing instead of removing the links, but these cases are rare, and this strategy may backfire in the future. When I reached out to ex-Googler Fili Wiese for some advice on the value of removing toxic links (instead of just disavowing them), his response was very straightforward:

[Image: Fili Wiese’s response]

9. Ineffective outreach

Simply identifying the unnatural links won’t get the penalty lifted unless a decent percentage of the links have been successfully removed. The more communication channels you try, the more likely it is that you reach the webmaster and get the links removed. Sending the same email hundreds or thousands of times is highly unlikely to result in a decent response rate. Trying to remove a link from a directory is very different from trying to get rid of a link appearing in a press release, so you should take a more targeted approach with a well-crafted, personalized email. Link removal request emails must be honest and to the point, or else they’ll be ignored.

Tracking the emails will also help in figuring out which messages have been read, which webmasters might be worth contacting again, or alert you to the need to try an alternative means of contacting webmasters.

Creativity, too, can play a big part in the link removal process. For example, it might be necessary to use social media to reach the right contact. Again, don’t trust automated emails or contact form harvesters. In some cases, these applications will pull in any email address they find on the crawled page (without any guarantee of who the information belongs to). In others, they will completely miss masked email addresses or those appearing in images. If you really want to see that the links are removed, outreach should be carried out by experienced outreach specialists. Unfortunately, there aren’t any shortcuts to effective outreach.

10. Quality issues and human errors

All sorts of human errors can occur when filing a reconsideration request. The most common errors include submitting files that do not exist, files that do not open, files that contain incomplete data, and files that take too long to load. You need to triple-check that the files you are including in your reconsideration request are read-only, and that anyone with the URL can fully access them. 

Poor grammar and language are also bad practice, as they may be interpreted as “poor effort.” You should definitely get the reconsideration request proofread by a couple of people to be sure it is flawless. A poorly written reconsideration request can significantly hinder your overall efforts.

Quality issues can also occur with the disavow file submission. Disavowing at the URL level isn’t recommended because the link(s) you want to get rid of are often accessible to search engines via several URLs you may be unaware of. Therefore, it is strongly recommended that you disavow at the domain or sub-domain level.
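For example, a domain-level disavow file might look like this (the domains are invented for illustration; lines starting with “#” are comments):

```text
# Example disavow file: prefer domain-level entries
domain:spammy-directory.example
domain:paid-links.example
# Avoid single-URL entries like the one below, which can miss
# duplicate URLs carrying the same link:
# http://spammy-directory.example/category/page.html
```

A `domain:` entry covers every URL on that domain, which is exactly why it sidesteps the duplicate-URL problem described above.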

11. Insufficient evidence

How does Google know you have done everything you claim in your reconsideration request? Because you have to prove each claim is valid, you need to document every single action you take, from sent emails and submitted forms, to social media nudges and phone calls. The more information you share with Google in your reconsideration request, the better. This is the exact wording from Google:

“…we will also need to see good-faith efforts to remove a large portion of inorganic links from the web wherever possible.”

12. Bad communication

How you communicate your link cleanup efforts is as essential as the work you are expected to carry out. Not only do you need to explain the steps you’ve taken to address the issues, but you also need to share supportive information and detailed evidence. The reconsideration request is the only chance you have to communicate to Google which issues you have identified, and what you’ve done to address them. Being honest and transparent is vital for the success of the reconsideration request.

There is absolutely no point using the space in a reconsideration request to argue with Google. Some of the unnatural link examples they share may not always be useful (e.g., URLs that include nofollow links, removed links, or even no links at all). But taking an argumentative approach virtually guarantees your request will be denied.

[Image: cropped from a photo by Keith Allison, licensed under Creative Commons]

Conclusion

Getting a Google penalty lifted requires a good understanding of why you have been penalized, a flawless process and a great deal of hands-on work. Performing link audits for the purpose of lifting a penalty can be very challenging, and should only be carried out by experienced consultants. If you are not 100% sure you can take all the required actions, seek out expert help rather than looking for inexpensive (and ineffective) automated solutions. Otherwise, you will almost certainly end up wasting weeks or months of your precious time, and in the end, see your request denied.



The Best of 2014: Top People and Posts from the Moz Blog

Posted by Trevor-Klein

At the end of every year, we compile a list of the very best posts published on the Moz Blog and YouMoz, along with the most popular and prolific members of our community. It’s a really fun way to look back on what happened this year, and an insight-packed view of what really resonates with our readers.

Here’s what we’ve got in store:

  1. Top Moz Blog posts by 1Metric score
  2. Top Moz Blog posts by unique visits
  3. Top YouMoz Blog posts by unique visits
  4. Top Moz Blog posts by number of thumbs up
  5. Top Moz Blog posts by number of comments
  6. Top Moz Blog posts by number of linking root domains
  7. Top comments from our community by number of thumbs up
  8. Top commenters from our community by total number of thumbs up

A huge thanks goes to Dr. Pete Meyers and Cyrus Shepard; their help cut in half the time it took to create this piece.

We hope you enjoy the look back at the past year, and wish you a very happy start to 2015!

1. Top Moz Blog posts by 1Metric score

Earlier this year, we created a new metric to evaluate the success of our blog posts, calling it “the one metric” in a nod to The Lord of the Rings. We even wrote about it on this blog. With the help and feedback of many folks in the community as well as some refinement of our own, we’ve now polished the metric, changed the spelling a bit, applied it retroactively to older posts, and are using it regularly in-house. The following posts are those with the highest scores, representing the 10 posts that saw the most overall success this year. In case there was any doubt, Cyrus really (really) knows what he’s doing.

1. More than Keywords: 7 Concepts of Advanced On-Page SEO
October 21 – Posted by Cyrus Shepard
As marketers, helping search engines understand what our content means is one of our most important tasks. Search engines can’t read pages like humans can, so we incorporate structure and clues as to what our content means. This post explores a series of on-page techniques that not only build upon one another, but can be combined in sophisticated ways.


2. New Title Tag Guidelines & Preview Tool
March 20 – Posted by Dr. Peter J. Meyers
Google’s 2014 redesign had a big impact on search result titles, cutting them off much sooner. This post includes a title preview tool and takes a data-driven approach to finding the new limit.


3. Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird
June 11 – Posted by Marie Haynes
Do you have questions about the Panda algorithm, the Penguin algorithm, or Hummingbird? This guide explains in lay terms what each of these Google algorithm changes is about and how to improve your site so that it looks better in the eyes of the big G.

4. 12 Ways to Increase Traffic From Google Without Building Links
March 11 – Posted by Cyrus Shepard
The job of the Technical SEO becomes more complex each year, but we also have more opportunities now than ever. Here are 12 ways you can improve your rankings without relying on link building.


5. The Most Entertaining Guide to Landing Page Optimization You’ll Ever Read
May 20 – Posted by Oli Gardner
If you’ve ever been bored while reading a blog post, your life just got better. If you’ve ever wanted to learn about conversion rate optimization, and how to design high-converting landing pages, without falling asleep, you’re in the right place. Buckle up, and prepare to be entertained in your learning regions.

6. Illustrated Guide to Advanced On-Page Topic Targeting for SEO
November 17 – Posted by Cyrus Shepard
The concepts of advanced on-page SEO are dizzying: LDA, co-occurrence, and entity salience. The question is “How can I easily incorporate these techniques into my content for higher rankings?” The truth is, you can create optimized pages that rank well without understanding complex algorithms.


7. Panda 4.1 Google Leaked Dos and Don’ts – Whiteboard Friday
December 05 – Posted by Josh Bachynski
Panda is about so much more than good content. Let Josh Bachynski give you the inside information on the highlights of what you should (and should not) be doing.

8. 10 Smart Tips to Leverage Google+ for Increased Web Traffic
April 15 – Posted by Cyrus Shepard
While not everyone has an audience active on Google+, the number of people who interact socially with any Google products on a monthly basis now reportedly exceeds 500 million.

9. The Rules of Link Building – Whiteboard Friday
April 04 – Posted by Cyrus Shepard
Google is increasingly playing the referee in the marketing game, and many marketers are simply leaving instead of playing by the rules. In today’s Whiteboard Friday, Cyrus Shepard takes a time-out to explain a winning strategy.


10. The Myth of Google’s 200 Ranking Factors
September 30 – Posted by Gianluca Fiorelli
Nothing like the “The 200 Google Ranking Factors” actually exists. It is a myth, and those who claim to be able to offer a final list are its prophets. This post explains how the myth was born and the importance of knowing the stages of search engines’ working process.

2. Top Moz Blog posts by unique visits

The heaviest-weighted ingredient in the 1Metric is unique visits, as one of our primary goals for the Moz Blog is to drive traffic to the rest of the site. With that in mind, we thought it interesting to break things down to just this metric and show you just how different this list is from the last one. Of note: Dr. Pete’s post on Google’s new design for title tags is a nod to the power of evergreen content. That post is one that folks can return to over and over as they fiddle with their own title tags, and amassed more than twice the traffic of the post in the #2 slot.


1. New Title Tag Guidelines & Preview Tool
March 20 – Posted by Dr. Peter J. Meyers
Google’s 2014 redesign had a big impact on search result titles, cutting them off much sooner. This post includes a title preview tool and takes a data-driven approach to finding the new limit.


2. The Most Entertaining Guide to Landing Page Optimization You’ll Ever Read
May 20 – Posted by Oli Gardner
If you’ve ever been bored while reading a blog post, your life just got better. If you’ve ever wanted to learn about conversion rate optimization, and how to design high-converting landing pages, without falling asleep, you’re in the right place. Buckle up, and prepare to be entertained in your learning regions.

3. 12 Ways to Increase Traffic From Google Without Building Links
March 11 – Posted by Cyrus Shepard
The job of the Technical SEO becomes more complex each year, but we also have more opportunities now than ever. Here are 12 ways you can improve your rankings without relying on link building.


4. Why Every Business Should Spend at Least $1 per Day on Facebook Ads
February 19 – Posted by Brian Carter
For the last three years I’ve constantly recommended Facebook ads. I recommend them to both B2C and B2B businesses. I recommend them to local theaters and comedians here in Charleston, SC. I recommend them to everyone who wants to grow awareness about anything they’re doing. Here’s why.

5. More than Keywords: 7 Concepts of Advanced On-Page SEO
October 21 – Posted by Cyrus Shepard
As marketers, helping search engines understand what our content means is one of our most important tasks. Search engines can’t read pages like humans can, so we incorporate structure and clues as to what our content means. This post explores a series of on-page techniques that not only build upon one another, but can be combined in sophisticated ways.


6. Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird
June 11 – Posted by Marie Haynes
Do you have questions about the Panda algorithm, the Penguin algorithm, or Hummingbird? This guide explains in lay terms what each of these Google algorithm changes is about and how to improve your site so that it looks better in the eyes of the big G.


7. Make Facebook’s Algorithm Change Work For You, Not Against You
January 23 – Posted by Chad Wittman
Recently, many page admins have been experiencing a significant decrease in Total Reach—specifically, organic reach. For pages that want to keep their ad budget as low as possible, maximizing organic reach is vital. To best understand how to make a change like this work for you, and not against you, we need to examine what happened—and what you can do about it.


8. How to Rank Well in Amazon, the US’s Largest Product Search Engine
June 04 – Posted by Nathan Grimm
The eCommerce SEO community is ignoring a huge opportunity by focusing almost exclusively on Google. Amazon has roughly three times more search volume for products, and this post tells you all about how to rank.


9. Personas: The Art and Science of Understanding the Person Behind the Visit
January 29 – Posted by Michael King
With the erosion of keyword intelligence and the move to strings-not-things for the user, Google is pushing all marketers to focus more on their target audience. This post will teach you how to understand that audience, the future of Google, and how to build data-driven personas step by step.


10. Panda 4.0, Payday Loan 2.0 & eBay’s Very Bad Day
May 21 – Posted by Dr. Peter J. Meyers
Preliminary analysis of the Panda 4.0 and Payday Loan 2.0 updates, major algorithm flux on May 19th, and a big one-day rankings drop for eBay.

3. Top YouMoz Blog posts by unique visits

One of our favorite parts of the Moz community is the YouMoz Blog, where our community members can submit their own posts for potential publishing here on our site. We’re constantly impressed by what we’re sent. These 10 posts all received such high praise that they were promoted to the main Moz Blog, but they all started out as YouMoz posts. 


1. Make Facebook’s Algorithm Change Work For You, Not Against You
January 23 – Posted by Chad Wittman
Recently, many page admins have been experiencing a significant decrease in Total Reach—specifically, organic reach. For pages that want to keep their ad budget as low as possible, maximizing organic reach is vital. To best understand how to make a change like this work for you, and not against you, we need to examine what happened—and what you can do about it.


2. Parallax Scrolling Websites and SEO – A Collection of Solutions and Examples
April 01 – Posted by Carla Dawson
I have observed that there are many articles that say parallax scrolling is not ideal for search engines. Parallax Scrolling is a design technique and it is ideal for search engines if you know how to apply it. I have collected a list of great tutorials and real SEO-friendly parallax websites to help the community learn how to use both techniques together.


3. (Provided): 10 Ways to Prove SEO Value in Google Analytics
February 25 – Posted by Jeff Sauer
We and our clients have relied on keyword reports for so long that we’re now using (not provided) as a crutch. This post offers 10 ways you can use Google Analytics to prove your SEO value now that those keywords are gone.


4. How to Set Up and Use Twitter Lead Generation Cards in Your Tweets for Free!
May 07 – Posted by Dana Tan
Working as an in-house SEO strategist for a small business forces me to get “scrappy” every day with tools and techniques. I’m constantly on the lookout for an opportunity that can help my company market to broader audiences for less money. Here’s how to set up your Twitter Cards for free!


5. 75 Content Starters for Any Industry
February 06 – Posted by Amanda Gallucci
Suffering from blank page anxiety? Before you go on the hunt for inspiration all over the Internet and elsewhere, turn to the resources around you. Realize that you can create exceptional content with what you already have at hand.


6. The Hidden Power of Nofollow Links
June 08 – Posted by Nicole Kohler
For those of us who are trying to earn links for our clients, receiving a nofollow link can feel like a slap in the face. But these links have hidden powers that make them just as important as followed ones. Here’s why nofollow links are more powerful than you might think.


7. A Startling Case Study of Manual Penalties and Negative SEO
March 17 – Posted by Yonatan Dotan
One day in my inbox I found the dreaded notice from Google that our client had a site-wide manual penalty for unnatural inbound links. We quickly set up a call and went through the tooth-rattling ordeal of explaining to our client that they weren’t even ranked for their brand name. Organic traffic dropped by a whopping 94% – and that for a website that gets 66% of its traffic from Google-based organic search.


8. How PornHub Is Bringing its A-Game (SFW)
July 23 – Posted by Javier Sanz
Despite dealing with a sensitive subject, PornHub is doing a great job marketing itself. This (safe-for-work) post takes a closer look at what they are doing.


9. Storytelling Through Data: A New Inbound Marketing & SEO Report Structure
January 07 – Posted by Aaron Friedman
No matter what business you are in, it’s a pretty sure thing that someone is going to want to monitor how efficiently and productively you are working. Being able to show these results over time is crucial to maintaining the health of the long term relationship.

robinparallax

10. The Art of Thinking Sideways: Content Marketing for “Boring” Businesses
April 08 – Posted by Robin Swire
In this article, I’ll examine the art of thinking sideways for one of the slightly more tricky marketing clients I’ve worked with. I hope that this will provide an insight for fellow content marketers and SEOs in similar scenarios.

4. Top Moz Blog posts by number of thumbs up

These 10 posts were well enough received that quite a few readers took the time to engage with them, logging in to give their stamp of approval. Whiteboard Fridays are always a hit, and two of them managed to make this list after having been live for less than a month.

1. More than Keywords: 7 Concepts of Advanced On-Page SEO
October 21 – Posted by Cyrus Shepard
As marketers, helping search engines understand what our content means is one of our most important tasks. Search engines can’t read pages like humans can, so we incorporate structure and clues as to what our content means. This post explores a series of on-page techniques that not only build upon one another, but can be combined in sophisticated ways.

Dr-Pete

2. New Title Tag Guidelines & Preview Tool
March 20 – Posted by Dr. Peter J. Meyers
Google’s 2014 redesign had a big impact on search result titles, cutting them off much sooner. This post includes a title preview tool and takes a data-driven approach to finding the new limit.

randfish

3. Dear Google, Links from YouMoz Don’t Violate Your Quality Guidelines
July 23 – Posted by Rand Fishkin
Recently, Moz contributor Scott Wyden, a photographer in New Jersey, received a warning in his Google Webmaster Tools about some links that violated Google’s Quality Guidelines. One example was from moz.com.

MarieHaynes

4. Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird
June 11 – Posted by Marie Haynes
Do you have questions about the Panda algorithm, the Penguin algorithm, or Hummingbird? This guide explains in lay terms what each of these Google algorithm changes is about and how to improve your site so that it looks better in the eyes of the big G.

randfish

5. Thank You for 10 Incredible Years
October 06 – Posted by Rand Fishkin
It’s been 10 amazing years since Rand started the blog that would turn into SEOmoz and then Moz, and we never could have come this far without you all. You’ll find letters of appreciation from Rand and Sarah in this post (along with a super-cool video retrospective!), and from all of us at Moz, thank you!

6. Illustrated Guide to Advanced On-Page Topic Targeting for SEO
November 17 – Posted by Cyrus Shepard
The concepts of advanced on-page SEO are dizzying: LDA, co-occurrence, and entity salience. The question is “How can I easily incorporate these techniques into my content for higher rankings?” The truth is, you can create optimized pages that rank well without understanding complex algorithms.

josh_bachynski

7. Panda 4.1 Google Leaked Dos and Don’ts – Whiteboard Friday
December 05 – Posted by Josh Bachynski
Panda is about so much more than good content. Let Josh Bachynski give you the inside information on the highlights of what you should (and should not) be doing.

OliGardner

8. The Most Entertaining Guide to Landing Page Optimization You’ll Ever Read
May 20 – Posted by Oli Gardner
If you’ve ever been bored while reading a blog post, your life just got better. If you’ve ever wanted to learn about conversion rate optimization, and how to design high-converting landing pages, without falling asleep, you’re in the right place. Buckle up, and prepare to be entertained in your learning regions.

randfish

9. Does SEO Boil Down to Site Crawlability and Content Quality? – Whiteboard Friday
July 11 – Posted by Rand Fishkin
What does good SEO really mean these days? Rand takes us beyond crawlability and content quality for a peek inside the art and science of the practice.

randfish

10. How to Avoid the Unrealistic Expectations SEOs Often Create – Whiteboard Friday
December 12 – Posted by Rand Fishkin
Making promises about SEO results too often leads to broken dreams and shredded contracts. In today’s Whiteboard Friday, Rand shows us how to set expectations that lead to excitement but help prevent costly misunderstandings.

5. Top Moz Blog posts by number of comments

While the discussions can take a big chunk out of an already busy day, the conversations we get to have with our community members (and the conversations they have with each other) in the comments below our posts are absolutely one of our favorite parts of the blog. These 10 posts garnered quite a bit of discussion (some with a fair amount of controversy), and are fascinating to follow.

1. Take the SEO Expert Quiz and Rule the Internet
May 28 – Posted by Cyrus Shepard
You are master of the keyword. You create 1,000 links with a single tweet. Google engineers ask for your approval before updating their algorithm. You, my friend, are an SEO Expert. Prove it by taking our new SEO Expert Quiz.

2. The Rules of Link Building – Whiteboard Friday
April 04 – Posted by Cyrus Shepard
Google is increasingly playing the referee in the marketing game, and many marketers are simply leaving instead of playing by the rules. In today’s Whiteboard Friday, Cyrus Shepard takes a time-out to explain a winning strategy.

randfish

3. Dear Google, Links from YouMoz Don’t Violate Your Quality Guidelines
July 23 – Posted by Rand Fishkin
Recently, Moz contributor Scott Wyden, a photographer in New Jersey, received a warning in his Google Webmaster Tools about some links that violated Google’s Quality Guidelines. One example was from moz.com.

Dr-Pete

4. New Title Tag Guidelines & Preview Tool
March 20 – Posted by Dr. Peter J. Meyers
Google’s 2014 redesign had a big impact on search result titles, cutting them off much sooner. This post includes a title preview tool and takes a data-driven approach to finding the new limit.

Carla_Dawson

5. SEO Teaching: Should SEO Be Taught at Universities?
October 09 – Posted by Carla Dawson
Despite the popularity and importance of SEO, the field has yet to gain significant traction at the university level other than a few courses here and there offered as part of a broader digital marketing degree. The tide could be turning, however slowly.

6. 12 Ways to Increase Traffic From Google Without Building Links
March 11 – Posted by Cyrus Shepard
The job of the Technical SEO becomes more complex each year, but we also have more opportunities now than ever. Here are 12 ways you can improve your rankings without relying on link building.

evolvingSEO

7. The Broken Art of Company Blogging (and the Ignored Metric that Could Save Us All)
July 22 – Posted by Dan Shure
Company blogging is broken. We’re tricking ourselves into believing they’re successful while ignoring the one signal we have that tells us whether they’re actually working.

MichaelC

8. Real-World Panda Optimization – Whiteboard Friday
August 01 – Posted by Michael Cottam
From the originality of your content to top-heavy posts, there’s a lot that the Panda algorithm is looking for. In today’s Whiteboard Friday, Michael Cottam explains what these things are, and more importantly, what we can do to be sure we get the nod from this particular bear.

EricaMcGillivray

9. Ways to Proactively Welcome Women Into Online Marketing
September 17 – Posted by Erica McGillivray
SEO may be a male-dominated industry, but let’s step out of our biases and work hard to welcome women, and marketers of all stripes, into our community.

10. More than Keywords: 7 Concepts of Advanced On-Page SEO
October 21 – Posted by Cyrus Shepard
As marketers, helping search engines understand what our content means is one of our most important tasks. Search engines can’t read pages like humans can, so we incorporate structure and clues as to what our content means. This post explores a series of on-page techniques that not only build upon one another, but can be combined in sophisticated ways.

6. Top Moz Blog posts by number of linking root domains

What, you thought you’d get to the bottom of the post without seeing a traditional SEO metric? =)

Dr-Pete

1. New Title Tag Guidelines & Preview Tool
March 20 – Posted by Dr. Peter J. Meyers
Google’s 2014 redesign had a big impact on search result titles, cutting them off much sooner. This post includes a title preview tool and takes a data-driven approach to finding the new limit.

Dr-Pete

2. Panda 4.0, Payday Loan 2.0 & eBay’s Very Bad Day
May 21 – Posted by Dr. Peter J. Meyers
Preliminary analysis of the Panda 4.0 and Payday Loan 2.0 updates, major algorithm flux on May 19th, and a big one-day rankings drop for eBay.

iPullRank

3. Personas: The Art and Science of Understanding the Person Behind the Visit
January 29 – Posted by Michael King
With the erosion of keyword intelligence and the move to strings-not-things for the user, Google is pushing all marketers to focus more on their target audience. This post will teach you how to understand that audience, the future of Google, and how to build data-driven personas step by step.

briancarter

4. Why Every Business Should Spend at Least $1 per Day on Facebook Ads
February 19 – Posted by Brian Carter
For the last three years I’ve constantly recommended Facebook ads. I recommend them to both B2C and B2B businesses. I recommend them to local theaters and comedians here in Charleston, SC. I recommend them to everyone who wants to grow awareness about anything they’re doing. Here’s why.

JamesAgate

5. The New Link Building Survey 2014 – Results
July 16 – Posted by James Agate
How has the marketing industry changed its views of link building since last year? James Agate of Skyrocket SEO is back with the results of a brand new survey.

Dr-Pete

6. Google’s 2014 Redesign: Before and After
March 13 – Posted by Dr. Peter J. Meyers
Google’s SERP and ad format redesign may finally be rolling out, after months of testing. Before we lose the old version forever, here’s the before-and-after of every major vertical that’s changed.

7. Google Announces the End of Author Photos in Search: What You Should Know
June 26 – Posted by Cyrus Shepard
Many of us have been constantly advising webmasters to connect their content writers with Google authorship, and it came as a shock when John Mueller announced Google will soon drop authorship photos from regular search results. Let’s examine what this means.

randfish

8. The Greatest Misconception in Content Marketing – Whiteboard Friday
April 25 – Posted by Rand Fishkin
Great content certainly helps business, but it isn’t as simple as “publish, share, convert new customers.” In today’s Whiteboard Friday, Rand explains what’s really going on.

OliGardner

9. The Most Entertaining Guide to Landing Page Optimization You’ll Ever Read
May 20 – Posted by Oli Gardner
If you’ve ever been bored while reading a blog post, your life just got better. If you’ve ever wanted to learn about conversion rate optimization, and how to design high-converting landing pages, without falling asleep, you’re in the right place. Buckle up, and prepare to be entertained in your learning regions.

MarieHaynes

10. Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird
June 11 – Posted by Marie Haynes
Do you have questions about the Panda algorithm, the Penguin algorithm, or Hummingbird? This guide explains in lay terms what each of these Google algorithm changes is about and how to improve your site so that it looks better in the eyes of the big G.

7. Top comments from our community by number of thumbs up

These 10 comments were the most thumbed-up of any on our blogs this year, offering voices of reason that stand out from the crowd. 

MarieHaynes

1. Marie Haynes | July 23
Commented on: 
Dear Google, Links from YouMoz Don’t Violate Your Quality Guidelines

Backlinko

2. Brian Dean | September 30
Commented on: 
The Myth of Google’s 200 Ranking Factors

mpezet

3. Martin Pezet | July 22
Commented on: 
The Broken Art of Company Blogging (and the Ignored Metric that Could Save Us All)

dannysullivan

4. Danny Sullivan | July 23
Commented on: 
Dear Google, Links from YouMoz Don’t Violate Your Quality Guidelines

5. Cyrus Shepard | October 21
Commented on: 
More than Keywords: 7 Concepts of Advanced On-Page SEO

SarahBird

6. Sarah Bird | September 17
Commented on: 
Ways to Proactively Welcome Women Into Online Marketing

randfish

7. Rand Fishkin | July 04
Commented on: 
5 Fashion Hacks for the Modern Male Marketer – Whiteboard Friday

mpezet

8. Martin Pezet | September 30
Commented on: 
The Myth of Google’s 200 Ranking Factors

FangDigitalMarketing

9. Jeff Ferguson | October 24
Commented on: 
Is It Possible to Have Good SEO Simply by Having Great Content – Whiteboard Friday

magicrob

10. Robert Duckers | March 20
Commented on: 
New Title Tag Guidelines & Preview Tool

8. Top commenters from our community by total thumbs up

We calculated this one a bit differently this year. In the past, we’ve shown the top community members by sheer number of comments. We don’t want, however, to imply that being prolific is necessarily good in and of itself. So, we added up all the thumbs-up that each comment on our blogs has received, and figured out which community members racked up the most thumbs over the course of the year. (We’ve intentionally omitted staff members and associates from this list, as they’d stack the deck pretty heavily!)

The graphics to the right of each community member show the number of comments they’ve left on blog posts in 2014 as well as the total number of thumbs up those comments have received.

This list is truly an illustration of how amazing the Moz community is. This site would hardly be anything without all of you, and we so appreciate your involvement on such a regular basis!

SamuelScott

1. Samuel Scott (Moz username: SamuelScott)
MozPoints: 1557 | Rank: 54

paints-n-design

2. Andreas Becker (Moz username: paints-n-design)
MozPoints: 667 | Rank: 148

MarieHaynes

3. Marie Haynes (Moz username: MarieHaynes)
MozPoints: 4706 | Rank: 7

MarkTraphagen

4. Mark Traphagen (Moz username: MarkTraphagen)
MozPoints: 993 | Rank: 102

steviephil

5. Steve Morgan (Moz username: steviephil)
MozPoints: 1249 | Rank: 72

russangular

6. Russ Jones (Moz username: russangular)
MozPoints: 3282 | Rank: 16

mpezet

7. Martin Pezet (Moz username: mpezet)
MozPoints: 464 | Rank: 211

Pixelbypixel

8. Chris Painter (Moz username: Pixelbypixel)
MozPoints: 2707 | Rank: 25

billslawski

9. Bill Slawski (Moz username: billslawski)
MozPoints: 709 | Rank: 140

danatanseo

10. Dana Tan (Moz username: danatanseo)
MozPoints: 4071 | Rank: 11

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from moz.com

The #LocalUp Advanced 2015 Agenda Is Here

Posted by EricaMcGillivray

You may have heard that, in partnership with Local U, we’re putting on a local SEO conference called LocalUp Advanced on Saturday, February 7. We’re super-thrilled to be able to dive more into the local SEO space and bring you top speakers in the field for a one-day knowledge explosion. We’re expecting around 125-150 people at our Seattle headquarters, so this is your chance to really chat with speakers and attendees one-to-one with a huge return on investment.

Moz Pro or Local U Subscribers $699

General Admission $999


LocalUp Advanced 2015 Agenda


8:00-9:00am Breakfast
9:00-9:05am Welcome to LocalUp Advanced 2015! with David Mihm
9:05-9:30am

Pigeons, Packs, & Paid: Google Local 2015 with Dr. Pete Meyers
In the past year, Google shook the local SEO world with the Pigeon update, rolled out an entirely new local pack, and has aggressively dabbled in local advertising. Dr. Pete covers the year in review, how it’s impacted the local landscape, and what to expect in 2015.

Dr. Pete Meyers is the Marketing Scientist for Moz, where he works with the marketing and data science teams on product research and data-driven content. He’s spent the past two years building research tools to monitor Google, including the MozCast project, and he curates the Google Algorithm History.

Pete Meyers

9:30-9:55am

Local Battlegrounds – Tactics, Trenches, and Ghosts with Mike Blumenthal
Join Professor Maps and take a ride in the Way Back Whacky Machine to look at Google’s technologies, tactics, and playbooks used to create, shape, and dominate the local ecosystem in their image. Learn what’s relevant to marketing today and how these changes are shaping Google’s coming battles in the space.

If you’re in Local, then you know Mike Blumenthal, and here is your chance to learn from this pioneer in local SEO, whose years of industry research and documentation have earned him the fond and respectful nickname ‘Professor Maps.’ Mike’s blog has been the go-to spot for local SEOs since the early days of Google Maps. It’s safe to say that there are few people on the planet who know more about this area of marketing than Mike. He’s also the co-founder of GetFiveStars, an innovative review and testimonial software. Additionally, Mike loves biking, x-country skiing, and home cooking.

Mike Blumenthal

9:55-10:10am Q&A with Dr. Peter Meyers and Mike Blumenthal
10:10-10:45am

Going Local with Google with Jade Wang
Learn about local search with Google. We’ll chat about the potential of local search and discuss how business information gets on Google.

If you’ve gone to the Google and Your Business Forum for help (and, of course, you have!), then you know how quickly an answer from Google staffer Jade Wang can clear up even the toughest problems. She has been helping business owners get their information listed on Google since joining the team in 2012.

Jade Wang

10:45-11:05am AM Break
11:05-11:25am

Getting Local Keyword Research and On-page Optimization Right with Mary Bowling
Local keyword data is often difficult to find, analyze, and prioritize. Get tips, tools, and processes for zeroing in on the best terms to target when optimizing your website and directory listings, and learn how and why to structure your website around them.

Mary Bowling’s been specializing in SEO and local search since 2003. She works as a consultant at Optimized!, is a partner at a small agency called Ignitor Digital, is a partner in Local U, and is also a trainer and writer for Search Engine News. Mary spends her days interacting directly with local business owners and understands holistic local needs.

Mary Bowling

11:25-11:50am

Local Content + Scale + Creativity = Awesome with Mike Ramsey
If you are wondering who is crushing it with local content and how you can scale such efforts, then tune in as Mike Ramsey walks through ideas, examples, and lessons he has learned along the way.

Mike Ramsey is the president of Nifty Marketing with offices in Burley and Boise, Idaho. He is also a Partner at Local U and many other ventures. Mike has an awesome wife and three kids who put up with all his talk about search.

Mike Ramsey

11:50am-12:15pm

Review Acquisition Strategies That Work with Darren Shaw
Darren Shaw will walk you through multiple real-world examples of businesses that are killing it with review acquisition. He’ll detail exactly how they manage to get so many more reviews than their competitors and how you can use their methods to improve your own local search visibility.

Darren Shaw is the President and Founder of Whitespark, a company that builds software and provides services to help businesses with local search. He’s widely regarded in the local SEO community as an innovator, one whose years of experience working with massive local data sets have given him uncommon insights into the inner workings of the world of citation-building and local search marketing. Darren has been working on the web for over 16 years and loves everything about local SEO.

Darren Shaw

12:15-12:30pm Q&A with Mary Bowling, Mike Ramsey, and Darren Shaw
12:30-1:30pm Lunch
1:30-1:55pm

The Down-Low on LoMo (Local Mobile) SEO with Cindy Krum
Half of all local searches happen on mobile, and that stat is just growing! Map search results are great, but your mobile site has to be great too. Cindy Krum will review the best practices for making your local site look perfect to mobile users and crawlers alike. No mobile site? No problem: you’ll also get tips for how to make the most of mobile searches without one.

Cindy Krum is the CEO and Founder of MobileMoxie, LLC, a mobile marketing consultancy and host of the most cutting-edge online mobile marketing toolset available today. Cindy is the author of Mobile Marketing: Finding Your Customers No Matter Where They Are, published by Que Publishing.

Cindy Krum

1:55-2:20pm

Thriving in the Mobile Ecosystem with Aaron Weiche
A look into the opportunity of creating and growing the mobile experience between your customers and your brand: one strong enough to delight fingers, change minds, and win hearts.

Aaron Weiche is a digital marketing geek focused on web design, mobile, and search marketing. Aaron is the COO of Spyder Trap in Minneapolis, Local U faculty member, founding board member of MnSearch, and a Local Search Ranking Factors Contributor since 2010.

Aaron Weiche

2:20-2:45pm

Content, Conversations, and Conversions with Will Scott
How local businesses, and the marketers who love them, can use social media to bring home the bacon.

Helping small businesses succeed online since 1994, Will Scott has led teams responsible for thousands of websites, hundreds of thousands of pages in online directories, and millions of visits from search. Today, Will leads nearly 100 professionals at Search Influence putting results first and helping customers successfully market online.

Will Scott

2:45-3:10pm

Segmentation Domination with Ed Reese
Learn how to gain powerful insight by creating creative custom segments in Google Analytics. This session shows several real-world examples in action and walks you through the brainstorming, implementation, and discovery process to utilize segmentation like never before.

Ed Reese leads a talented analytics and usability team at his firm Sixth Man Marketing, is a co-founder of Local U, and an adjunct professor of digital marketing at Gonzaga University. In his free time, he optimizes his foosball and disc golf technique and spends time with his wife and two boys.

Ed Reese

3:10-3:30pm PM Break
3:30-4:00pm

Playing to Your Local Strengths with David Mihm
Historically, local search has been one of the most level playing fields on the web with smaller, nimbler businesses having an advantage as larger enterprises struggled to adapt and keep up. Today, companies of both sizes can benefit from tactics that the other simply can’t leverage. David will share some of the most valuable tactics that scale—and don’t scale—in a presentation packed with actionable takeaways, no matter what size business you work with.

David Mihm is one of the world’s leading practitioners of local search engine marketing. He has created and promoted search-friendly websites for clients of all sizes since the early 2000s. David co-founded GetListed.org, which he sold to Moz in November 2012. Since then, he’s served as our Director of Local Search Marketing, imparting his wisdom everywhere!

David Mihm

4:00-4:25pm

Don’t Just Show Up, Stand Out with Dana DiTomaso
Learn how to destroy your competitors by bringing personality to your marketing. Confront the challenges of making HiPPOs comfortable with a unique voice, maintain brand standards while injecting some fun, and stay at the forefront of your audience’s mind.

Whether at a conference, on the radio, or in a meeting, Dana DiTomaso likes to impart wisdom to help you turn a lot of marketing BS into real strategies to grow your business. After 10+ years and with a focus on local SMBs, she’s seen (almost) everything. In her spare time, Dana drinks tea and yells at the Hamilton Tiger-Cats.

Dana DiTomaso

4:25-4:40pm Q&A with David Mihm and Dana DiTomaso
4:40-5:20pm

Exposing the Non-Obvious Elements of Local Businesses That Dominate on the Web with Rand Fishkin
In some categories and geographies, a local small business wholly dominates the rankings and visibility across channels. What are the secrets to this success, and how can small businesses with remarkable products/services showcase their traits best online? In this presentation, Rand will dig deep into examples and highlight the recurring elements that help the best of the best stand out.

Rand Fishkin is the founder of Moz. Traveler, blogger, social media addict, feminist, and husband.

Rand Fishkin

And if that doesn’t quite tickle your fancy… Workshops!

We’ll also be hosting workshops with our speakers, which are amazing opportunities for you to dig into your specific questions and issues. I know, sometimes I get a little shy about asking questions in front of a crowd or just want to socialize at the after party, so this is a great opportunity to get direct feedback.

Each time slot offers two concurrent workshops (Option A and Option B):
1:30-1:55pm

Reporting Q&A with Ed Reese and Dana DiTomaso
Need help with your reporting? Ed and Dana will make sure you’re on the right track and tracking the right things.

Google My Business Q&A with Jade Wang
Google My Business can be confusing, but Jade Wang is here to lend a hand. She’ll look over your specific problems and help you troubleshoot.

1:55-2:20pm

How to Troubleshoot All Things Local with Mike Blumenthal and Mary Bowling
No Local SEO problem can get by the combined powers of Mike and Mary. This dynamic duo will assist you in diving into your specific questions, problems, and concerns.

Google My Business Q&A with Jade Wang
Google My Business can be confusing, but Jade Wang is here to lend a hand. She’ll look over your specific problems and help you troubleshoot.

2:20-2:45pm

Citation Q&A with David Mihm and Darren Shaw
Getting the right citations for your business can be a powerful boost. David and Darren will show you how to wield citations correctly and creatively for your business.

Google My Business Q&A with Jade Wang
Google My Business can be confusing, but Jade Wang is here to lend a hand. She’ll look over your specific problems and help you troubleshoot.

2:45-3:10pm

Mobile Q&A with Aaron Weiche and Cindy Krum
Local and mobile go hand-in-hand, but mobile implementation, optimization, and perfection can be tricky. Aaron and Cindy will help guide you and your business.

Google My Business Q&A with Jade Wang
Google My Business can be confusing, but Jade Wang is here to lend a hand. She’ll look over your specific problems and help you troubleshoot.


See you in February, friends. And please, don’t hesitate to reach out if you have any questions!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 4 years ago from moz.com
