#thedotties – announcing the shortlist

Our fabulous line-up of judges spent a grueling day analyzing hundreds of entries (more than 350, in fact) for the 11 customer categories making up the 2017 awards.

This year we invited a wide range of industry experts – including ecommerce pro Chloë Thomas; founder of Action Rocket, Elliot Ross; and Sarah Courbet, JET Software’s Head of Marketing – to deliberate on everything from subject lines to use of data.

Some of the judges hard at work

The overall verdict was that the quality of entries was extremely high, which of course made it difficult to draw up the shortlist (the judges did get home eventually!).

So, here goes…

Best subject line

  • WaterAid UK
  • Hawes & Curtis
  • Charlotte Tilbury
  • Beer Hawk
  • Montreal Associates
  • Field & Flower

Best email creative

We decided to combine ‘Best use of copywriting’ with ‘Best email creative’ as the two complement each other. Here’s the shortlist!

  • Cats Protection
  • Neal’s Yard Remedies Ltd.
  • Chinti & Parker
  • Charlotte Tilbury – Day & Night
  • Barbour
  • British Heart Foundation
  • Hawes & Curtis
  • icelolly.com
  • Stroke Association
  • Charlotte Tilbury – Signature 10

Best use of data

  • Shortlist Media
  • All Star Business Solutions
  • English Heritage
  • JoJo Maman Bébé
  • England Hockey
  • Big Green Smile

Best multichannel campaign

  • Shortlist Media
  • The Dune Group
  • Forest Holidays
  • Greene King
  • Barbour
  • Habitat

Best use of dotmailer

  • City and Guilds
  • Macmillan Cancer Support
  • Twinmar Ltd.
  • Action for Children
  • Allstar Business Solutions
  • Shortlist Media

Best B2B campaign

  • BP
  • City Sprint
  • City and Guilds
  • Nationwide
  • Fleetcor
  • Montreal Associates

Best B2C campaign

  • The Dune Group
  • Southampton FC
  • Stroke Association
  • Greene King
  • Cats Protection
  • Ladbrokes Cheltenham

Best ecommerce campaign

  • Interdirect Ltd.
  • Neal’s Yard Remedies Ltd.
  • The Dune Group
  • Monin
  • Alexandra
  • Hawes & Curtis

Best charity campaign

  • Disaster Emergency Committee
  • Stroke Association
  • Macmillan Cancer Support
  • British Heart Foundation
  • Cats Protection

Email marketing team of the year

  • Shortlist Media
  • British Heart Foundation
  • The National Gallery
  • Forest Holidays
  • Smith & Nephew
  • Sika Group
  • Marie Curie

Email marketer of the year

We’ve decided to combine ‘Rising star – Marketer of the year’ with this category. Here’s the shortlist:

  • One4all Gift Cards
  • Public Desire
  • City and Guilds
  • Daisy London
  • Montreal Associates
  • Forest Holidays
  • Action for Children
  • Shortlist Media

 

If you made the shortlist, congratulations! Don’t forget to tell everyone by using #thedotties. We can’t wait to celebrate with you on 13th July. The glittering ceremony is taking place at The Troxy in London, with comedian Russell Kane announcing the winners and The Festival Foxes taking the stage.

If you’ve not yet RSVP’d, don’t miss out – spaces are limited! You can confirm your attendance here.

Please note that ‘Best use of dotmailer’ for APAC and the Americas will be announced soon.


Reblogged 3 weeks ago from blog.dotmailer.com

Announcing the 2017 Local Search Ranking Factors Survey Results

Posted by Whitespark

Since its inception in 2008, David Mihm has been running the Local Search Ranking Factors survey. It is the go-to resource for helping businesses and digital marketers understand what drives local search results and what they should focus on to increase their rankings. This year, David is focusing on his new company, Tidings, a genius service that automatically generates perfectly branded newsletters by pulling in the content from your Facebook page and leading content sources in your industry. While he will certainly still be connected to the local search industry, he’s spending less time on local search research, and has passed the reins to me to run the survey.

David is one of the smartest, nicest, most honest, and most generous people you will ever meet. In so many ways, he has helped direct and shape my career into what it is today. He has mentored me and promoted me by giving me my first speaking opportunities at Local U events, collaborated with me on research projects, and recommended me as a speaker at important industry conferences. And now, he has passed on one of the most important resources in our industry into my care. I am extremely grateful.

Thank you, David, for all that you have done for me personally, and for the local search industry. I am sure I speak for all who know you personally and those who know you through your work in this space; we wish you great success with your new venture!

I’m excited to dig into the results, so without further ado, read below for my observations, or:

Click here for the full results!

Shifting priorities

Here are the results of the thematic factors in 2017, compared to 2015:

| Thematic Factors | 2015 | 2017 | Change |
|---|---|---|---|
| GMB Signals | 21.63% | 19.01% | -12.11% |
| Link Signals | 14.83% | 17.31% | +16.73% |
| On-Page Signals | 14.23% | 13.81% | -2.95% |
| Citation Signals | 17.14% | 13.31% | -22.36% |
| Review Signals | 10.80% | 13.13% | +21.53% |
| Behavioral Signals | 8.60% | 10.17% | +18.22% |
| Personalization | 8.21% | 9.76% | +18.81% |
| Social Signals | 4.58% | 3.53% | -22.89% |

If you look at the Change column, you might get the impression that there were some major shifts in priorities this year, but the Change number doesn’t tell the whole story. Social factors may have seen the biggest drop with a -22.89% change, but a shift in emphasis on social factors from 4.58% to 3.53% isn’t particularly noteworthy.
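For reference, the Change column appears to be the simple relative difference between a theme’s 2017 and 2015 weights. Here’s a minimal Python sketch of that arithmetic; the published figures were presumably computed from unrounded survey data, so recomputing from the rounded percentages drifts by a few hundredths of a point:

```python
# Relative change in each theme's weight from 2015 to 2017.
# Recomputed from the rounded table values, so results may differ
# slightly from the published Change column.
weights = {
    # theme: (2015 weight %, 2017 weight %)
    "GMB Signals": (21.63, 19.01),
    "Link Signals": (14.83, 17.31),
    "Social Signals": (4.58, 3.53),
}

for theme, (w2015, w2017) in weights.items():
    change = (w2017 - w2015) / w2015 * 100
    print(f"{theme}: {change:+.2f}%")  # e.g. GMB Signals: -12.11%
```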

The decreased emphasis on citations, compared to the increased emphasis on link and review factors, reflects this shifting focus, but as I’ll discuss below, citations are still crucial to laying down a proper foundation in local search. We’re just getting smarter about how far you need to go with them.

The importance of proximity

For the past two years, Physical Address in City of Search has been the #1 local pack/finder ranking factor. This makes sense. It’s tough to rank in the local pack of a city that you’re not physically located in.

Well, as of this year’s survey, the new #1 factor is… drumroll please…

Proximity of Address to the Point of Search

This factor has been climbing from position #8 in 2014, to position #4 in 2015, to claim the #1 spot in 2017. I’ve been seeing this factor’s increased importance for at least the past year, and clearly others have noticed as well. As I note in my recent post on proximity, this leads to poor results in most categories. I’m looking for the best lawyer in town, not the closest one. Hopefully we see the dial get turned down on this in the near future.

While Proximity of Address to the Point of Search is playing a stronger role than ever in the rankings, it’s certainly not the only factor impacting rankings. Businesses with higher relevancy and prominence will rank in a wider radius around their business and take a larger percentage of the local search pie. There’s still plenty to be gained from investing in local search strategies.

Here’s how the proximity factors changed from 2015 to 2017:

| Proximity Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Proximity of Address to the Point of Search | #4 | #1 | +3 |
| Proximity of Address to Centroid of Other Businesses in Industry | #20 | #30 | -10 |
| Proximity of Address to Centroid | #16 | #50 | -34 |

While we can see that Proximity to the Point of Search has seen a significant boost to become the new #1 factor, the other proximity factors which we once thought were extremely important have seen a major drop.

I’d caution people against ignoring Proximity of Address to Centroid, though. There is a situation where I think it still plays a role in local rankings. When you’re searching from outside of a city for a key phrase that contains the city name (Ex: Denver plumbers), then I believe Google geo-locates the search to the centroid and Proximity of Address to Centroid impacts rankings. This is important for business categories that are trying to attract searchers from outside of their city, such as attractions and hotels.

Local SEOs love links

Looking through the results and the comments, a clear theme emerges: Local SEOs are all about the links these days.

In this year’s survey results, we’re seeing significant increases for link-related factors across the board:

| Local Pack/Finder Link Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Quality/Authority of Inbound Links to Domain | #12 | #4 | +8 |
| Domain Authority of Website | #6 | #6 | 0 |
| Diversity of Inbound Links to Domain | #27 | #16 | +11 |
| Quality/Authority of Inbound Links to GMB Landing Page URL | #15 | #11 | +4 |
| Quantity of Inbound Links to Domain | #34 | #17 | +17 |
| Quantity of Inbound Links to Domain from Locally Relevant Domains | #31 | #20 | +11 |
| Page Authority of GMB Landing Page URL | #24 | #22 | +2 |
| Quantity of Inbound Links to Domain from Industry-Relevant Domains | #41 | #28 | +13 |
| Product/Service Keywords in Anchor Text of Inbound Links to Domain | n/a | #33 | +17 |
| Location Keywords in Anchor Text of Inbound Links to Domain | #45 | #38 | +7 |
| Diversity of Inbound Links to GMB Landing Page URL | n/a | #39 | +11 |
| Quantity of Inbound Links to GMB Landing Page URL from Locally Relevant Domains | n/a | #48 | +2 |

Google is still leaning heavily on links as a primary measure of a business’ authority and prominence, and the local search practitioners that invest time and resources to secure quality links for their clients are reaping the ranking rewards.

Fun fact: “links” appears 76 times in the commentary.

By comparison, “citations” were mentioned 32 times, and “reviews” were mentioned 45 times.

Shifting priorities with citations

A first glance at all the declining factors in the table below might make you think that, yes, citations have declined in importance, but the situation is more nuanced than that.

| Local Pack/Finder Citation Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Consistency of Citations on The Primary Data Sources | n/a | #5 | n/a |
| Quality/Authority of Structured Citations | #5 | #8 | -3 |
| Consistency of Citations on Tier 1 Citation Sources | n/a | #9 | n/a |
| Quality/Authority of Unstructured Citations (Newspaper Articles, Blog Posts, Gov Sites, Industry Associations) | #18 | #21 | -3 |
| Quantity of Citations from Locally Relevant Domains | #21 | #29 | -8 |
| Prominence on Key Industry-Relevant Domains | n/a | #37 | n/a |
| Quantity of Citations from Industry-Relevant Domains | #19 | #40 | -21 |
| Enhancement/Completeness of Citations | n/a | #44 | n/a |
| Proper Category Associations on Aggregators and Tier 1 Citation Sources | n/a | #45 | n/a |
| Quantity of Structured Citations (IYPs, Data Aggregators) | #14 | #47 | -33 |
| Consistency of Structured Citations | #2 | n/a | n/a |
| Quantity of Unstructured Citations (Newspaper Articles, Blog Posts) | #39 | n/a | -11 |

You’ll notice that there are many “n/a” cells on this table. This is because I made some changes to the citation factors. I elaborate on this in the survey results, but for your quick reference here:

  1. To reflect the reality that you don’t need to clean up your citations on hundreds of sites, Consistency of Structured Citations has been broken down into 4 new factors:
    1. Consistency of Citations on The Primary Data Sources
    2. Consistency of Citations on Tier 1 Citation Sources
    3. Consistency of Citations on Tier 2 Citation Sources
    4. Consistency of Citations on Tier 3 Citation Sources
  2. I added these new citation factors:
    1. Enhancement/Completeness of Citations
    2. Presence of Business on Expert-Curated “Best of” and Similar Lists
    3. Prominence on Key Industry-Relevant Domains
    4. Proper Category Associations on Aggregators and Top Tier Citation Sources

Note that there are now more citation factors showing up, so some of the scores given to citation factors in 2015 are now being split across multiple factors in 2017:

  • In 2015, there were 7 citation factors in the top 50
  • In 2017, there are 10 citation factors in the top 50

That said, overall, I do think that the emphasis on citations has seen some decline (certainly in favor of links), and rightly so. In particular, there is an increasing focus on quality over quantity.

I was disappointed to see that Presence of Business on Expert-Curated “Best of” and Similar Lists didn’t make the top 50. I think this factor can provide a significant boost to a business’ local prominence and, in turn, their rankings. Granted, it’s a challenging factor to directly influence, but I would love to see an agency make a concerted effort to outreach to get their clients listed on these, measure the impact, and do a case study. Any takers?

GMB factors

There is no longer an editable description on your GMB listing, so any factors related to the GMB description field were removed from the survey. This is a good thing, since the field was typically poorly used, or abused, in the past. Google is on record saying that they didn’t use it for ranking, so stuffing it with keywords has always been more likely to get you penalized than to help you rank.

Here are the changes in GMB factors:

| GMB Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Proper GMB Category Associations | #3 | #3 | 0 |
| Product/Service Keyword in GMB Business Title | #7 | #7 | 0 |
| Location Keyword in GMB Business Title | #17 | #12 | +5 |
| Verified GMB Listing | #13 | #13 | 0 |
| GMB Primary Category Matches a Broader Category of the Search Category (e.g. primary category=restaurant & search=pizza) | #22 | #15 | +7 |
| Age of GMB Listing | #23 | #25 | -2 |
| Local Area Code on GMB Listing | #33 | #32 | +1 |
| Association of Photos with GMB Listing | n/a | #36 | +14 |
| Matching Google Account Domain to GMB Landing Page Domain | #36 | n/a | -14 |

While we did see some upward movement in the Location Keyword in GMB Business Title factor, I’m shocked to see that Product/Service Keyword in GMB Business Title did not also go up this year. It is hands-down one of the strongest factors in local pack/finder rankings. Maybe THE strongest, after Proximity of Address to the Point of Search. It seems to me that everyone and their dog is complaining about how effective this is for spammers.

Be warned: if you decide to stuff your business title with keywords, international spam hunter Joy Hawkins will probably hunt your listing down and get you penalized. 🙂

Also, remember what happened back when everyone was spamming links with private blog networks, and then got slapped by the Penguin Update? Google has a complete history of changes to your GMB listing, and they could decide at any time to roll out an update that will retroactively penalize your listing. Is it really worth the risk?

Age of GMB Listing might have dropped two spots, but it was ranked extremely high by Joy Hawkins and Colan Nielsen. They’re both top contributors at the Google My Business forum, and I’m not saying they know something we don’t know, but uh, maybe they know something we don’t know.

Association of Photos with GMB Listing is a factor that I’ve heard some chatter about lately. It didn’t make the top 50 in 2015, but now it’s coming in at #36. Apparently, some Google support people have said it can help your rankings. I suppose it makes sense as a quality consideration: listings with photos might indicate a more engaged business owner. I wonder if it matters whether the photos are uploaded by the business owner, or whether it’s a steady stream of incoming photo uploads from the general public. I can imagine that regularly getting photo uploads from users might signal a popular and important business.

While this factor came in as somewhat benign in the Negative Factors section (#26), No Hours of Operation on GMB Listing might be something to pay attention to, as well. Nick Neels noted in the comments:

Our data showed listings that were incomplete and missing hours of operation were highly likely to be filtered out of the results and lose visibility. As a result, we worked with our clients to gather hours for any listings missing them. Once the hours of operation were uploaded, the listings no longer were filtered.

Behavioral factors

Here are the numbers:

| Behavioral Factors | 2015 | 2017 | Change |
|---|---|---|---|
| Clicks to Call Business | #38 | #35 | +3 |
| Driving Directions to Business Clicks | #29 | #43 | -14 |

Not very exciting, but these numbers do NOT reflect the serious impact that behavioral factors are having on local search rankings and the increased impact they will have in the future. In fact, we’re never going to get numbers that truly reflect the value of behavioral factors, because many of the factors that Google has access to are inaccessible and unmeasurable by SEOs. The best place to get a sense of the impact of these factors is in the comments. When asked about what he’s seeing driving rankings this year, Phil Rozek notes:

There seem to be more “black box” ranking scenarios, which to me suggests that behavioral factors have grown in importance. What terms do people type in before clicking on you? Where do those people search from? How many customers click on you rather than on the competitor one spot above you? If Google moves you up or down in the rankings, will many people still click? I think we’re somewhere past the beginning of the era of mushy ranking factors.

Mike Blumenthal also talks about behavioral factors in his comments:

Google is in a transition period from a web-based linking approach to a knowledge graph semantic approach. As we move towards a mobile-first index, with the lack of linking as a common mobile practice, voice search, and single-response answers, Google needs to develop, and has been developing, ranking factors that are not link-dependent. Content, actual in-store visitations, on-page verifiable truth, third-party validation, and news-worthiness are all becoming increasingly important.

But Google never throws anything away. Citations and links as we have known them will continue to play a part in the ranking algo, but they will be less and less important as Google increases their understanding of entity prominence and the real world.

And David Mihm says:

It’s a very difficult concept to survey about, but the overriding ranking factor in local — across both pack and organic results — is entity authority. Ask yourself, “If I were Google, how would I define a local entity, and once I did, how would I rank it relative to others?” and you’ll have the underlying algorithmic logic for at least the next decade.

    • How widely known is the entity? Especially locally, but oh man, if it’s nationally known, searchers should REALLY know about it.
    • What are people saying about the entity? (It should probably rank for similar phrases)
    • What is the engagement with the entity? Do people recognize it when they see it in search results? How many Gmail users read its newsletter? How many call or visit it after seeing it in search results? How many visit its location?

David touches on this topic in the survey response above, and then goes full BEAST MODE on the future of local rankings in his must-read post on Tidings, The Difference-Making Local Ranking Factor of 2020. (David, thank you for letting me do the Local Search Ranking Factors, but please, don’t ever leave us.)

The thing is, Google has access to so much additional data now through Chrome, Android, Maps, Ads, and Search. They’d be crazy to not use this data to help them understand which businesses are favored by real, live humans, and then rank those businesses accordingly. You can’t game this stuff, folks. In the future, my ranking advice might just be: “Be an awesome business that people like and that people interact with.” Fortunately, David thinks we have until 2020 before this really sets in, so we have a few years left of keyword-stuffing business titles and building anchor text-optimized links. Phew.

To survey or to study? That is not the question

I’m a fan of Andrew Shotland’s and Dan Leibson’s Local SEO Ranking Factors Study. I think that the yearly Local Search Ranking Factors Survey and the yearly (hopefully) Local SEO Ranking Factors Study nicely complement each other. It’s great to see some hard data on what factors correlate with rankings. It confirms a lot of what the contributors to this survey are intuitively seeing impact rankings for their clients.

There are some factors that you just can’t get data for, though, and the number of these “black box” factors will continue to grow over the coming years. Factors such as:

  • Behavioral factors and entity authority, as described above. I don’t think Google is going to give SEOs this data anytime soon.
  • Relevancy. It’s tough to measure a general relevancy score for a business from all the different sources Google could be pulling this data from.
  • Even citation consistency is hard to measure. You can get a general sense of this from tools like Moz Local or Yext, but there is no single citation consistency metric you can use to score businesses by. The ecosystem is too large, too complicated, and too nuanced to get a value for consistency across all the location data that Google has access to.

The survey, on the other hand, aggregates opinions from the people that are practicing and studying local search day in and day out. They do work for clients, test things, and can see what had a positive impact on rankings and what didn’t. They can see that when they built out all of the service pages for a local home renovations company, their rankings across the board went up through increased relevancy for those terms. You can’t analyze these kinds of impacts with a quantitative study like the Local SEO Ranking Factors Study. It takes some amount of intuition and insight, and while the survey approach certainly has its flaws, it does a good job of surfacing those insights.

Going forward, I think there is great value in both the survey to get the general sense of what’s impacting rankings, and the study to back up any of our theories with data — or to potentially refute them, as they may have done with city names in webpage title tags. Andrew and Dan’s empirical study gives us more clues than we had before, so I’m looking forward to seeing what other data sources they can pull in for future editions.

Possum’s impact has been negligible

Other than Proper GMB Category Associations, which is definitely seeing a boost because of Possum, you can look at the results in this section more from the perspective of “this is what people are focusing on more IN GENERAL.” Possum hasn’t made much of an impact on what we do to rank businesses in local. It has simply added another point of failure in cases where a business gets filtered.

One question that’s still outstanding in my mind is: what do you do if you are filtered? Why is one business filtered and not the other? Can you do some work to make your business rank and demote the competitor to the filter? Is it more links? More relevancy? Hopefully someone puts out some case studies soon on how to defeat the dreaded Possum filter (paging Joy Hawkins).

Focusing on More Since Possum

  1. Proximity of Address to the Point of Search
  2. Proper GMB Category Associations
  3. Quality/Authority of Inbound Links to Domain
  4. Quantity of Inbound Links to Domain from Locally Relevant Domains
  5. Click-Through Rate from Search Results

Focusing on Less Since Possum

  1. Proximity of Address to Centroid
  2. Physical Address in City of Search
  3. Proximity of Address to Centroid of Other Businesses in Industry
  4. Quantity of Structured Citations (IYPs, Data Aggregators)
  5. Consistency of Citations on Tier 3 Citation Sources

Foundational factors vs. competitive difference-makers

There are many factors in this survey that I’d consider table stakes. To get a seat at the rankings table, you must at least have these factors in order. Then there are the factors which I’d consider competitive difference-makers. These are the factors that, once you have a seat at the table, will move your rankings beyond your competitors. It’s important to note that you need BOTH. You probably won’t rank with only the foundation unless you’re in an extremely low-competition market, and you definitely won’t rank if you’re missing that foundation, no matter how many links you have.

This year I added a section to try to get a sense of what the local search experts consider foundational factors and what they consider to be competitive difference-makers. Here are the top 5 in these two categories:

| Rank | Foundational | Competitive Difference-Makers |
|---|---|---|
| #1 | Proper GMB Category Associations | Quality/Authority of Inbound Links to Domain |
| #2 | Consistency of Citations on the Primary Data Sources | Quantity of Inbound Links to Domain from Industry-Relevant Domains |
| #3 | Physical Address in City of Search | Quality/Authority of Inbound Links to GMB Landing Page URL |
| #4 | Proximity of Address to the Point of Search (Searcher-Business Distance) | Quantity of Inbound Links to Domain from Locally Relevant Domains |
| #5 | Consistency of Citations on Tier 1 Citation Sources | Quantity of Native Google Reviews (with text) |

I love how you can look at just these 10 factors and pretty much extract the basics of how to rank in local:

“You need to have a physical location in the city you’re trying to rank in, and it’s helpful for it to be close to the searcher. Then, make sure to have the proper categories associated with your listing, and get your citations built out and consistent on the most important sites. Now, to really move the needle, focus on getting links and reviews.”

This is the much over-simplified version, of course, so I suggest you dive into the full survey results for all the juicy details. The amount of commentary from participants is double what it was in 2015, and it’s jam-packed with nuggets of wisdom. Well worth your time.

Got your coffee? Ready to dive in?

Take a look at the full results

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 2 months ago from tracking.feedpress.it

Announcing MozCon Local 2017!

Posted by George-Freitag

Now that location-based searches are growing about 50% faster than any other type of search on mobile, what are you going to do to make sure you’re working on the front lines of this new, local-focused world? Well, you can start by joining us in Seattle for MozCon Local 2017 on February 27–28 for a day full of in-depth workshops from LocalU followed by an all-day conference from the top local speakers and brands.

You’ll come away with another level of understanding related to local strategy, citations, reviews, SEO, local link building, content creation, and more, along with some incredible, tactical advice to get you improving your local game the second you get home (or at least your first day back in the office). Plus, you’ll be able to interact directly with speakers both during Q&A sessions and around the conference, and spend time getting to know your fellow local marketers.

So whether you’re a marketer with a portfolio chock-full of local accounts or a brand with hundreds or thousands of locations, MozCon Local 2017 is where you need to be.

Buy your MozCon Local 2017 ticket!


Some of our great speakers (lots more coming!)

Darren Shaw

Whitespark

Darren Shaw is the president and founder of Whitespark, a company that builds software and provides services to help businesses with local search. He’s widely regarded in the local SEO community as an innovator, one whose years of experience working with massive local data sets have given him uncommon insights into the inner workings of the world of citation-building and local search marketing. Darren has been working on the web for over 16 years and loves everything about local SEO.

Mike Blumenthal

GetFiveStars

Mike grew up sweeping floors in his family retail business at age 7 and saw the challenges of local marketing up close from an early age. Before co-founding GetFiveStars.com and LocalU.org, he had been doing what we now know as Local SEO since 2005 and writing at his blog Understanding Google Local since 2006. He loves researching and understanding the issues that confront bricks-and-mortar storefronts and helping owners, agencies, and franchises tackle the challenges of the ever-changing local marketing world.

Heather Physioc

VML

Heather Physioc is Assoc. Director of Organic Search at global digital ad agency VML, performing search engine optimization services for multinational brands like Electrolux/Frigidaire, Colgate-Palmolive, Hill’s Pet Nutrition, Bridgestone, Wendy’s, and Bayer Animal Health. She has worked in digital marketing for 10 years. Physioc earned her Bachelor of Journalism in Strategic Communication (Advertising) from the University of Missouri, and is currently pursuing an Executive Master of Business Administration from Rockhurst University. She has spoken at AACS, WordCamp, KCSEMA, SEMPO Cities, PRSA Mid-Missouri and Omaha, TEDxKCWomen, and more.

Willys DeVoll

Google

Willys DeVoll is a content strategist for Google My Business and a member of the AdWords Content Strategy and Development team. He has also worked as a technical writer and content developer on Google for Work. In the past, DeVoll worked for Major League Baseball Advanced Media in communications, and at the Center for Spatial and Textual Analysis, where he contributed to research in the Literary Lab.

Rand Fishkin

Moz

Rand Fishkin uses the ludicrous title, Wizard of Moz. He’s founder and former CEO of Moz, co-author of a pair of books on SEO, and co-founder of Inbound.org.

MozCon Local 2017 takes place at the Hyatt in downtown Seattle. In addition to coming home with a ton of knowledge, you’ll also be coming home with some great swag to show off! Monday’s workshops will have a snack break and networking time, and for Tuesday’s conference your ticket includes breakfast, lunch, and two snack breaks. FINALLY, on the last night we’ll have a networking party so you can meet speakers, thought leaders, Mozzers, and other attendees. Networking without the ‘net!

We’re expecting around 200 people to join us, including speakers, Mozzers, and Local U staff. MozCon Local sold out last year, and we expect this year to sell out, as well, so you’ll want to buy your ticket now!

Purchase your ticket now!


Our best early-bird prices:

Local U Workshop + MozCon Local Conference – Monday & Tuesday, February 27–28, 2017

$748 (regular $1,048) for Early Bird Moz Subscriber & Local U Forum Members

$1,148 (regular $1,498) for Early Bird General Admission

MozCon Local Conference – Tuesday, February 28, 2017

$399 (regular $599) for Early Bird Moz Subscribers & Local U Forum Members

$699 (regular $899) for Early Bird General Admission


Reblogged 7 months ago from tracking.feedpress.it

Moving 5 Domains to 1: An SEO Case Study

Posted by Dr-Pete

People often ask me if they should change domain names, and I always shudder just a little. Changing domains is a huge, risky undertaking, and too many people rush into it seeing only the imaginary upside. The success of the change also depends wildly on the details, and it’s not the kind of question anyone should be asking casually on social media.

Recently, I decided that it was time to find a new permanent home for my personal and professional blogs, which had gradually spread out over 5 domains. I also felt my main domain was no longer relevant to my current situation, and it was time for a change. So, ultimately I ended up with a scenario that looked like this:

The top three sites were active, with UserEffect.com being my former consulting site and blog (and relatively well-trafficked). The bottom two sites were both inactive and were both essentially gag sites. My one-pager, AreYouARealDoctor.com, did previously rank well for “are you a real doctor”, so I wanted to try to recapture that.

I started migrating the 5 sites in mid-January, and I’ve been tracking the results. I thought it would be useful to see how this kind of change plays out, in all of the gory details. As it turns out, nothing is ever quite “textbook” when it comes to technical SEO.

Why Change Domains at All?

The rationale for picking a new domain could fill a month’s worth of posts, but I want to make one critical point – changing domains should be about your business goals first, and SEO second. I did not change domains to try to rank better for “Dr. Pete” – that’s a crap shoot at best. I changed domains because my old consulting brand (“User Effect”) no longer represented the kind of work I do and I’m much more known by my personal brand.

That business case was strong enough that I was willing to accept some losses. We went through a similar transition here from SEOmoz.org to Moz.com. That was a difficult transition that cost us some SEO ground, especially short-term, but our core rationale was grounded in the business and where it’s headed. Don’t let an SEO pipe dream lead you into a risky decision.

Why did I pick a .co domain? I did it for the usual reason – the .com was taken. For a project of this type, where revenue wasn’t on the line, I didn’t have any particular concerns about .co. The evidence on how top-level domains (TLDs) impact ranking is tough to tease apart (so many other factors correlate with .com’s), and Google’s attitude tends to change over time, especially if new TLDs are abused. Anecdotally, though, I’ve seen plenty of .co’s rank, and I wasn’t concerned.

Step 1 – The Boring Stuff

It is absolutely shocking how many people build a new site, slap up some 301s, pull the switch, and hope for the best. It’s less shocking how many of those people end up in Q&A a week later, desperate and bleeding money.


Planning is hard work, and it’s boring – get over it.

You need to be intimately familiar with every page on your existing site(s), and, ideally, you should make a list. Not only do you have to plan for what will happen to each of these pages, but you’ll need that list to make sure everything works smoothly later.

In my case, I decided it might be time to do some housekeeping – the User Effect blog had hundreds of posts, many outdated and quite a few just not very good. So, I started with the easy data – recent traffic. I’m sure you’ve seen this Google Analytics report (Behavior > Site Content > All Pages):

Since I wanted to focus on recent activity, and none of the sites had much new content, I restricted myself to a 3-month window (Q4 of 2014). Of course, I looked much deeper than the top 10, but the principle was simple – I wanted to make sure the data matched my intuition and that I wasn’t cutting off anything important. This helped me prioritize the list.

Of course, from an SEO standpoint, I also didn’t want to lose content that had limited traffic but solid inbound links. So, I checked my “Top Pages” report in Open Site Explorer:

Since the bulk of my main site was a blog, the top trafficked and top linked-to pages fortunately correlated pretty well. Again, this is only a way to prioritize. If you’re dealing with sites with thousands of pages, you need to work methodically through the site architecture.

I’m going to say something that makes some SEOs itchy – it’s ok not to move some pages to the new site. It’s even ok to let some pages 404. In Q4, UserEffect.com had traffic to 237 URLs. The top 10 pages accounted for 91.9% of that traffic. I strongly believe that moving domains is a good time to refocus a site and concentrate your visitors and link equity on your best content. More is not better in 2015.
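If you want to quantify that concentration before deciding what to keep, a few lines of scripting will do it. Here’s a minimal sketch, assuming a CSV export of the Google Analytics “All Pages” report with “Page” and “Pageviews” columns (the filename is a placeholder):

```python
import csv

def top_n_share(path, n=10):
    """Share of total pageviews captured by the top n pages."""
    with open(path, newline="") as f:
        rows = [(r["Page"], int(r["Pageviews"].replace(",", "")))
                for r in csv.DictReader(f)]
    rows.sort(key=lambda r: r[1], reverse=True)
    total = sum(v for _, v in rows)
    return sum(v for _, v in rows[:n]) / total * 100

# Prints something like: "Top 10 pages: 91.9% of pageviews"
print(f"Top 10 pages: {top_n_share('all_pages.csv'):.1f}% of pageviews")
```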

Letting go of some pages also means that you’re not 301-redirecting a massive number of old URLs to a new home-page. This can look like a low-quality attempt to consolidate link-equity, and at large scale it can raise red flags with Google. Content worth keeping should exist on the new site, and your 301s should have well-matched targets.

In one case, I had a blog post that had a decent trickle of traffic due to ranking for “50,000 push-ups,” but the post itself was weak and the bounce rate was very high:

The post was basically just a placeholder announcing that I’d be attempting this challenge, but I never recapped anything after finishing it. So, in this case, I rewrote the post.

Of course, this process was repeated across the 3 active sites. The 2 inactive sites only constituted a handful of total pages. In the case of AreYouARealDoctor.com, I decided to turn the previous one-pager into a new page on the new site. That way, I had a very well-matched target for the 301-redirect, instead of simply mapping the old site to my new home-page.

I’m trying to prove a point – this is the amount of work I did for a handful of sites that were mostly inactive and producing no current business value. I don’t need consulting gigs and these sites produce no direct revenue, and yet I still considered this process worth the effort.

Step 2 – The Big Day

Eventually, you’re going to have to make the move, and in most cases, I prefer ripping off the bandage. Of course, doing something all at once doesn’t mean you shouldn’t be careful.

The biggest problem I see with domain switches (even if they’re 1-to-1) is that people rely on data that can take weeks to evaluate, like rankings and traffic, or directly checking Google’s index. By then, a lot of damage is already done. Here are some ways to find out quickly if you’ve got problems…

(1) Manually Check Pages

Remember that list you were supposed to make? It’s time to check it, or at least spot-check it. Someone needs to physically go to a browser and make sure that each major section of the site and each important individual page is resolving properly. It doesn’t matter how confident your IT department/guy/gal is – things go wrong.
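If your list runs to more than a handful of URLs, you can at least semi-automate the spot-check. A minimal sketch in Python using the requests library (the URLs are placeholders for your own inventory):

```python
import requests

# Your page inventory from Step 1 (placeholder URLs).
OLD_URLS = [
    "http://www.example-old.com/",
    "http://www.example-old.com/blog/",
]

for url in OLD_URLS:
    try:
        # Follow redirects and report where each old URL actually lands.
        r = requests.get(url, allow_redirects=True, timeout=10)
        print(f"{url} -> {r.status_code} at {r.url}")
    except requests.RequestException as exc:
        print(f"{url} -> FAILED ({exc})")
```

A human still needs to eyeball the final destinations, but this catches hard failures quickly.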

(2) Manually Check Headers

Just because a page resolves, it doesn’t mean that your 301-redirects are working properly, or that you’re not firing some kind of 17-step redirect chain. Check your headers. There are tons of free tools, but lately I’m fond of URI Valet. Guess what – I screwed up my primary 301-redirects. One of my registrar transfers wasn’t working, so I had to have a setting changed by customer service, and I inadvertently ended up with 302s (Pro tip: Don’t change registrars and domains in one step):

Don’t think that because you’re an “expert”, your plan is foolproof. Mistakes happen, and because I caught this one I was able to correct it fairly quickly.
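If you’d rather script the header check than paste URLs into a tool one at a time, here’s a minimal sketch that walks the redirect chain and flags exactly the mistake described above: any hop that isn’t a 301, plus suspiciously long chains. The URL is a placeholder:

```python
import requests

def audit_redirects(url):
    r = requests.get(url, allow_redirects=True, timeout=10)
    for hop in r.history:  # each intermediate response in the chain
        note = "" if hop.status_code == 301 else "  <-- not a 301!"
        print(f"{hop.status_code}  {hop.url}{note}")
    print(f"{r.status_code}  {r.url} (final)")
    if len(r.history) > 2:
        print(f"Warning: {len(r.history)}-hop redirect chain")

audit_redirects("http://www.example-old.com/some-page/")
```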

(3) Submit Your New Site

You don’t need to submit your site to Google in 2015, but now that Google Webmaster Tools allows it, why not do it? The primary argument I hear is “well, it’s not necessary.” True, but direct submission has one advantage – it’s fast.

To be precise, Google Webmaster Tools separates the process into “Fetch” and “Submit to index” (you’ll find this under “Crawl” > “Fetch as Google”). Fetching will quickly tell you if Google can resolve a URL and retrieve the page contents, which alone is pretty useful. Once a page is fetched, you can submit it, and you should see something like this:

This isn’t really about getting indexed – it’s about getting nearly instantaneous feedback. If Google has any major problems with crawling your site, you’ll know quickly, at least at the macro level.

(4) Submit New XML Sitemaps

Finally, submit a new set of XML sitemaps in Google Webmaster Tools, and preferably tiered sitemaps. While it’s a few years old now, Rob Ousbey has a great post on the subject of XML sitemap structure. The basic idea is that, if you divide your sitemap into logical sections, it’s going to be much easier to diagnose what kinds of pages Google is indexing and where you’re running into trouble.
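As a rough illustration of the tiered approach (a sketch only, not Rob’s exact scheme; the sections and URLs are hypothetical), you can generate one sitemap per logical section plus an index file that references them:

```python
from xml.sax.saxutils import escape

BASE = "https://www.example.com"
SECTIONS = {  # section name -> URLs in that section (hypothetical)
    "blog": [f"{BASE}/blog/post-1", f"{BASE}/blog/post-2"],
    "products": [f"{BASE}/products/widget"],
}

index_entries = []
for name, urls in SECTIONS.items():
    # One sitemap file per section, following the sitemaps.org protocol.
    body = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    with open(f"sitemap-{name}.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{body}\n</urlset>\n")
    index_entries.append(f"  <sitemap><loc>{BASE}/sitemap-{name}.xml</loc></sitemap>")

# An index file pointing at the per-section sitemaps.
with open("sitemap-index.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(index_entries) + "\n</sitemapindex>\n")
```

If Google indexes the “blog” sitemap fully but ignores half of “products,” you know exactly where to dig.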

A couple of pro tips on sitemaps – first, keep your old sitemaps active temporarily. This is counterintuitive to some people, but unless Google can crawl your old URLs, they won’t see and process the 301-redirects and other signals. Let the old accounts stay open for a couple of months, and don’t cut off access to the domains you’re moving.

Second (I learned this one the hard way), make sure that your Google Webmaster Tools site verification still works. If you use file uploads or meta tags and don’t move those files/tags to the new site, GWT verification will fail and you won’t have access to your old accounts. I’d recommend using a more domain-independent solution, like verifying with Google Analytics. If you lose verification, don’t panic – your data won’t be instantly lost.

Step 3 – The Waiting Game

Once you’ve made the switch, the waiting begins, and this is where many people start to panic. Even executed perfectly, it can take Google weeks or even months to process all of your 301-redirects and reevaluate a new domain’s capacity to rank. You have to expect short term fluctuations in ranking and traffic.

During this period, you’ll want to watch a few things – your traffic, your rankings, your indexed pages (via GWT and the site: operator), and your errors (such as unexpected 404s). Traffic will recover the fastest, since direct traffic is immediately carried through redirects, but ranking and indexation will lag, and errors may take time to appear.

(1) Monitor Traffic

I’m hoping you know how to check your traffic, but actually trying to determine what your new levels should be and comparing any two days can be easier said than done. If you launch on a Friday, and then Saturday your traffic goes down on the new site, that’s hardly cause for panic – your traffic probably always goes down on Saturday.

In this case, I redirected the individual sites over about a week, but I’m going to focus on UserEffect.com, as that was the major traffic generator. That site was redirected in full on January 21st, and the Google Analytics data for January for the old site looked like this:

So far, so good – traffic bottomed out almost immediately. Of course, losing traffic is easy – the real question is what’s going on with the new domain. Here’s the graph for January for DrPete.co:

This one’s a bit trickier – the first spike, on January 16th, is when I redirected the first domain. The second spike, on January 22nd, is when I redirected UserEffect.com. Both spikes are meaningless – I announced these re-launches on social media and got a short-term traffic burst. What we really want to know is where traffic is leveling out.

Of course, there isn’t a lot of history here, but a typical day for UserEffect.com in January was about 1,000 pageviews. The traffic to DrPete.co after it leveled out was about half that (500 pageviews). It’s not a complete crisis, but we’re definitely looking at a short-term loss.

Obviously, I’m simplifying the process here – for a large, ecommerce site you’d want to track a wide range of metrics, including conversion metrics. Hopefully, though, this illustrates the core approach. So, what am I missing out on? In this day of [not provided], tracking down a loss can be tricky. Let’s look for clues in our other three areas…

(2) Monitor Indexation

You can get a broad sense of your indexed pages from Google Webmaster Tools, but this data often lags real-time and isn’t very granular. Despite its shortcomings, I still prefer the site: operator. Generally, I monitor a domain daily – any one measurement has a lot of noise, but what you’re looking for is the trend over time. Here’s the indexed page count for DrPete.co:

The first set of pages was indexed fairly quickly, and then the second set started being indexed soon after UserEffect.com was redirected. All in all, we’re seeing a fairly steady upward trend, and that’s what we’re hoping to see. The number is also in the ballpark of sanity (compared to the actual page count) and roughly matched GWT data once it started being reported.

So, what happened to UserEffect.com’s index after the switch?

The timeframe here is shorter, since UserEffect.com was redirected last, but we see a gradual decline in indexation, as expected. Note that the index size plateaus around 60 pages – about 1/4 of the original size. This isn’t abnormal – low-traffic and unlinked pages (or those with deep links) are going to take a while to clear out. This is a long-term process. Don’t panic over the absolute numbers – what you want here is a downward trend on the old domain accompanied by a roughly equal upward trend on the new domain.

The fact that UserEffect.com didn’t bottom out is definitely worth monitoring, but this timespan is too short for the plateau to be a major concern. The next step would be to dig into these specific pages and look for a pattern.

(3) Monitor Rankings

The old domain is dropping out of the index, and the new domain is taking its place, but we still don’t know why the new site is taking a traffic hit. It’s time to dig into our core keyword rankings.

Historically, UserEffect.com had ranked well for keywords related to “split test calculator” (near #1) and “usability checklist” (in the top 3). While [not provided] makes keyword-level traffic analysis tricky, we also know that the split-test calculator is one of the top trafficked pages on the site, so let’s dig into that one. Here’s the ranking data from Moz Analytics for “split test calculator”:

The new site took over the #1 position from the old site at first, but then quickly dropped down to the #3/#4 ranking. That may not sound like a lot, but given this general keyword category was one of the site’s top traffic drivers, the CTR drop from #1 to #3/#4 could definitely be causing problems.

When you have a specific keyword you can diagnose, it’s worth taking a look at the live SERP, just to get some context. The day after relaunch, I captured this result for “dr. pete”:

Here, the new domain is ranking, but it’s showing the old title tag. This may not be cause for alarm – weird things often happen in the very short term – but in this case we know that I accidentally set up a 302-redirect. There’s some reason to believe that Google didn’t pass full link equity during that period when 301s weren’t implemented.

Let’s look at a domain where the 301s behaved properly. Before the site was inactive, AreYouARealDoctor.com ranked #1 for “are you a real doctor”. Since there was an inactive period, and I dropped the exact-match domain, it wouldn’t be surprising to see a corresponding ranking drop.

In reality, the new site was ranking #1 for “are you a real doctor” within 2 weeks of 301-redirecting the old domain. The graph is just a horizontal line at #1, so I’m not going to bother you with it, but here’s a current screenshot (incognito):

Early on, I also spot-checked this result, and it wasn’t showing the strange title tag crossover that UserEffect.com pages exhibited. So, it’s very likely that the 302-redirects caused some problems.

Of course, these are just a couple of keywords, but I hope it provides a starting point for you to understand how to methodically approach this problem. There’s no use crying over spilled milk, and I’m not going to fire myself, so let’s move on to checking any other errors that I might have missed.

(4) Check Errors (404s, etc.)

A good first stop for unexpected errors is the “Crawl Errors” report in Google Webmaster Tools (Crawl > Crawl Errors). This is going to take some digging, especially if you’ve deliberately 404’ed some content. Over the couple of weeks after re-launch, I spotted the following problems:

The old site had a “/blog” directory, but the new site put the blog right on the home-page and had no corresponding directory. Doh. Hey, do as I say, not as I do, ok? Obviously, this was a big blunder, as the old blog home-page was well-trafficked.

The other two errors here are smaller but easy to correct. MinimalTalent.com had a “/free” directory that housed downloads (mostly PDFs). I missed it, since my other sites used a different format. Luckily, this was easy to remap.

The last error is a weird looking URL, and there are other similar URLs in the 404 list. This is where site knowledge is critical. I custom-designed a URL shortener for UserEffect.com and, in some cases, people linked to those URLs. Since those URLs didn’t exist in the site architecture, I missed them. This is where digging deep into historical traffic reports and your top-linked pages is critical. In this case, the fix isn’t easy, and I have to decide whether the loss is worth the time.

What About the New EMD?

My goal here wasn’t to rank better for “Dr. Pete,” and finally unseat Dr. Pete’s Marinades, Dr. Pete the Sodastream flavor (yes, it’s hilarious – you can stop sending me your grocery store photos), and 172 dentists. Ok, it mostly wasn’t my goal. Of course, you might be wondering how switching to an EMD worked out.

In the short term, I’m afraid the answer is “not very well.” I didn’t track ranking for “Dr. Pete” and related phrases very often before the switch, but it appears that ranking actually fell in the short-term. Current estimates have me sitting around page 4, even though my combined link profile suggests a much stronger position. Here’s a look at the ranking history for “dr pete” since relaunch (from Moz Analytics):

There was an initial drop, after which the site evened out a bit. This less-than-impressive plateau could be due to the bad 302s during transition. It could be Google evaluating a new EMD and multiple redirects to that EMD. It could be that the prevalence of natural anchor text with “Dr. Pete” pointing to my site suddenly looked unnatural when my domain name switched to DrPete.co. It could just be that this is going to take time to shake out.

If there’s a lesson here (and, admittedly, it’s too soon to tell), it’s that you shouldn’t rush to buy an EMD in 2015 in the wild hope of instantly ranking for that target phrase. There are so many factors involved in ranking for even a moderately competitive term, and your domain is just one small part of the mix.

So, What Did We Learn?

I hope you learned that I should’ve taken my own advice and planned a bit more carefully. I admit that this was a side project and it didn’t get the attention it deserved. The problem is that, even when real money is at stake, people rush these things and hope for the best. There’s a real cheerleading mentality when it comes to change – people want to take action and only see the upside.

Ultimately, in a corporate or agency environment, you can’t be the one sour note among the cheering. You’ll be ignored, and possibly even fired. That’s not fair, but it’s reality. What you need to do is make sure the work gets done right and people go into the process with eyes wide open. There’s no room for shortcuts when you’re moving to a new domain.

That said, a domain change isn’t a death sentence, either. Done right, and with sensible goals in mind – balancing not just SEO but broader marketing and business objectives – a domain migration can be successful, even across multiple sites.

To sum up: Plan, plan, plan, monitor, monitor, monitor, and try not to panic.


Reblogged 2 years ago from tracking.feedpress.it

Announcing the 2015 Online Marketing Industry Survey

Posted by Cyrus-Shepard

We’re very excited to announce the 2015 Online Marketing Industry Survey is ready. This is the fifth edition of the survey, which started in 2008 as the SEO Industry Survey, and has also been known as the Moz Industry Survey. Some of what we hope to learn and share:

  • Demographics: Who is practicing inbound marketing and SEO today? Where do we work and live?
  • Agencies vs. in-house vs. other: How are agencies growing? What’s the average size? Who is doing inbound marketing on their own?
  • Tactics and strategies: What’s working for people today? How have strategies and tactics evolved?
  • Tools and technology: What are marketers using to discover opportunities, promote themselves, and measure the results?
  • Budget and spending: What tools and platforms are marketers investing in?


This year’s survey was redesigned to be easier and to take less than 10 minutes. When the results are in, we’ll share the data freely with you and the rest of the world, along with the insights we’ve gleaned from it.

Survey importance

By comparing answers and predictions from one year to the next, we can spot trends and gain insight not easily reported through any other source. This is our best chance to understand exactly where the future of our industry is headed.

Every year the Industry Survey delivers new insights and surprises. For example, the chart below (from the 2014 survey) lists average reported salary by role.

One of the data points we hope to discover is if these numbers go up or down for 2015.

Prizes. Oh, fabulous prizes.

It wouldn’t be the Industry Survey without a few excellent prizes thrown in as an added incentive.

This year we’ve upped the game with prizes we feel are both exciting and perfect for the busy inbound marketer. To see the full sweepstakes terms and rules, go to our sweepstakes rules page. The winners will be announced by June 15th. Follow us on Twitter to stay up to date.

Grand Prize: Attend MozCon 2015 in Seattle

Once again, the Grand Prize includes one ticket to MozCon 2015 plus airfare and accommodations. This is your chance to see greats like Wil Reynolds, Cindy Krum, Rand Fishkin, and more over 3 days in Seattle. Plus experience lots of networking and social events. Moz is also covering the cost of the flight plus hotel room.

2 First Prizes: Apple Watch

Shhhhhh! Because we’re giving away two Apple Watches. These aren’t available to the general public yet, which makes them mysteriously awesome.

10 Second Prizes: $50 Amazon.com gift cards

Yep, 10 lucky people will win $50 Amazon.com gift cards. Why not buy yourself a nice book? Maybe this one?

Help with sharing!

The number of people who take the survey is very important! The more people who take the survey, the better and more accurate the data will be, and the more insight we can share with the industry.

So please share with your co-workers. Share on social media. Share with your email lists. You can use the buttons below this post to get you started, but remember to take the survey first!


Reblogged 2 years ago from tracking.feedpress.it

Announcing the New & Improved Link Intersect Tool

Posted by randfish

Y’all remember how last October, we launched a new section in Open Site Explorer called “Link Opportunities?” While I was proud of that work, there was one section that really disappointed me at the time (and I said as much in my comments on the post).

Well, today, that disappointment is over, because we’re stepping up the Link Intersect tool inside OSE big time:

Literally thousands of sweet, sweet link opportunities are now yours at the click of a button

In the initial launch, Link Intersect used Freshscape (which powers Fresh Web Explorer). Freshscape is great for certain kinds of data – links and mentions that come from newly published pages that are in news sources, blogs, and feeds. But it’s not great for non-news/blogs/feed sources because it’s intentionally avoiding those!

For example, in the screenshot above, I wanted to see all the pages that link to SeriousEats.com and SplendidTable.org but don’t link to SmittenKitchen.com.

That’s 671 more, juicy link opportunities thanks to the hard work of the Moz Big Data and Research Tools teams.

How does the new Link Intersect work?

The tool looks at the top 250,000 links our index has pointing to each of the intersecting targets you enter, and the top 1 million links in our index pointing to the excluded URL.

Link Intersect then runs a differential comparison to determine which of the 250K links to each of the intersecting targets come from the same URL or root domain, and removes any that also appear among the top 1 million links pointing to the excluded URL, subdomain, or root domain.

This means that, for sites and pages with massive quantities of links, we may not show every intersecting link we know about; but since the results are sorted in Page Authority order, you’ll get the highest-quality/most important ones at the top.
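Conceptually, the comparison boils down to set operations. Here’s a toy sketch of that logic (this is not Moz’s actual implementation, and the domains are purely illustrative):

```python
# Domains linking to both intersect targets, minus those that also
# link to the excluded site.
def link_intersect(target_a_links, target_b_links, excluded_links):
    return (target_a_links & target_b_links) - excluded_links

links_to_seriouseats = {"nytimes.com", "foodblog.example", "eater.com"}
links_to_splendidtable = {"foodblog.example", "eater.com", "saveur.com"}
links_to_smittenkitchen = {"eater.com"}

print(link_intersect(links_to_seriouseats,
                     links_to_splendidtable,
                     links_to_smittenkitchen))
# {'foodblog.example'} -- links to both competitors, but not to you
```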

You can use Link Intersect to see three unique views on the data:

  • Pages that link to subdomains (particularly useful if you’re interested in shared links to sites on hosted subdomains like Blogspot or WordPress, or to a specific subdomain section of a competitor’s site)
  • Pages that link to root domains (my personal favorite, as I find the results the most comprehensive)
  • Root domains that link to the root domains you enter (great if you’re trying to get a broad sense of domain-level outreach/marketing targets)

Note that it’s possible the root domains view will actually expose more links than the pages views, because the domain-level link graph is easier and faster to sort through, so the 250K limit is less of a barrier.

Like most of the reports in Open Site Explorer, Link Intersect comes with a handy CSV Export option:

When it finishes (my most recent one took just under 3 minutes to run and email me), you’ll get a nice email like this one:

Please ignore the grammatical errors. I’m sure our team will fix those up soon 🙂

Why are these such good link/outreach/marketing targets?

Generally speaking, this type of data is invaluable for link outreach because these sites and pages are ones that clearly care about the shared topics or content of the intersecting targets. If you enter two of your primary competitors, you’ll often get news media, blog posts, reference resources, events, trade publications, and more that produce content in your topical niche.

They’re also good targets because they actually link out! This means you can avoid sifting through sites whose policies or practices mean they’re unlikely to ever link to you – if they’ve linked to those other two chaps, why not you, too?!

Basically, you can check the trifecta of link opportunity goodness boxes (which I’ve helpfully illustrated above, because that’s just the kind of SEO dork I am).

Link Intersect is limited only by your own creativity – so long as you can keep finding sites and pages on the web whose links might also be a match for your own site, we can keep digging through trillions of links, finding the intersects, and giving them back to you.

3 examples of Link Intersect in action

Let’s look at some ways we might put this to use in the real world:

#1: I’m trying to figure out who links to my two big competitors in the world of book reviews

First off, remember that Link Intersect works on a root domain or subdomain level, so we wouldn’t want to use something like the NYTimes’ review of books, because we’d be finding all the intersections to NYTimes.com. Instead, we want to pick more topically-focused domains, like these two:

You’ll also note that I’ve used a fake website as my excluded URL – this is a great trick for when you’re simply interested in any sites/pages that link to two domains and don’t need to remove a particular target.

#2: I’ve got a locally-focused website doing plumbing and need a few link sources to help boost my potential to rank in local and organic SERPs

In this instance, I’ll certainly look at pages linking to combinations of the top ranking sites in the local results, e.g. the 15 results for this query:

This is a solid starting point, especially considering how few links local sites often need to perform well. But we can get creative by branching outside of plumbing and exploring related fields like construction:

Focusing on better-linked-to industries and websites will give more results, so we want to try to broaden rather than narrow our categories and look for the most-linked-to sites in given verticals for comparisons.

#3: I’m planning some new content around weather patterns for my air conditioning website and want to know what news and blog sites cover extreme weather content

First, I’m going to start by browsing some search results for content in this field that’s received some serious link activity. By turning on my Mozbar’s SERPs overlay, I can see the sites and pages that have generated loads of links:

Now I can run a few combinations of these through the Link Intersect Tool:

While those domain names make me fear for humanity’s intelligence and future survival, they also expose a great link opportunity tactic I hadn’t previously considered – climate science deniers and the more politically charged universe of climate science overall.


I hope you enjoy the new Link Intersect tool as much as I have – I think it’s one of the best things we’ve put in Open Site Explorer in the last few months, though what we’re releasing in March might beat even that, so stay tuned!

And, as always, please do give us feedback and feel free to ask questions in the comments below or through the Moz Community Q+A.

Reblogged 2 years ago from tracking.feedpress.it

Announcing our 2015 event schedule

Our 2015 events are now live in our blog section: Majestic 2015 events. We will be appearing worldwide throughout the year and hope to see some of you at the shows. A couple more dates may be added in the next few months, but for now, these are our confirmed events. We hope this…

The post Announcing our 2015 event schedule appeared first on Majestic Blog.

Reblogged 2 years ago from blog.majestic.com

Try Your Hand at A/B Testing for a Chance to Win the Email Subject Line Contest

Posted by danielburstein

This blog post ends with an opportunity for you to win a stay at the ARIA in Vegas and a ticket to Email Summit, but it begins with an essential question for marketers…

How can you improve already successful marketing, advertising, websites and copywriting?

Today’s Moz blog post is unique. Not only are we going to teach you how to address this challenge, we’re going to offer an example that you can dig into to help drive home the lesson.

Give the people what they want

Some copy and design is so bad, the fixes are obvious. Maybe you shouldn’t insult the customer in the headline. Maybe you should update the website that still uses a dot matrix font.

But when you’re already doing well, how can you continue to improve?

I don’t have the answer for you, but I’ll tell you who does – your customers.

There are many tricks, gimmicks, and technologies you can use in marketing, but when you strip away all the hype and rhetoric, successful marketing is pretty straightforward – clearly communicate the value your offer provides to people who will pay you for that value.

Easier said than done, of course.

So how do you determine what customers want? And the best way to deliver it to them?

Well, there are many ways to learn from customers, such as focus groups, surveys and social listening. While there is value in asking people what they want, there is also a major challenge in it. “People’s ability to understand the factors that affect their behavior is surprisingly poor,” according to research from Dr. Noah J. Goldstein, Associate Professor of Management and Organizations, UCLA Anderson School of Management.

Or, as Malcolm Gladwell more glibly puts it when referring to coffee choices, “The mind knows not what the tongue wants.”

Not to say that opinion-based customer preference research is bad. It can be helpful. However, it should be the beginning and not the end of your quest.

…by seeing what they actually do

You can use what you learn from opinion-based research to create a hypothesis about what customers want, and then run an experiment to see how they actually behave in real-world customer interactions with your product, marketing messages, and website.

The technique that powers this kind of research is often known as A/B testing, split testing, landing page optimization, and/or website optimization. If you are testing more than one variable at a time, it may also be referred to as multivariate testing.

To offer a simple example, you might assume that customers buy your product because it tastes great. Or because it’s less filling. So you could create two landing pages – one with a headline that promotes the taste (treatment A) and another that mentions the low carbs (treatment B). You then send half the traffic that visits that URL to each version and see which performs better.
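
If you’re curious about the mechanics, here’s a minimal sketch of that 50/50 split in Python. The function and visitor IDs are hypothetical – real testing platforms handle assignment for you – but the core idea is a stable coin flip per visitor:

```python
# A minimal sketch of stable 50/50 traffic splitting (hypothetical names).
# Hashing the visitor ID keeps each visitor in the same treatment on
# repeat visits, so their experience stays consistent.
import hashlib

def assign_treatment(visitor_id: str) -> str:
    """Assign a visitor to landing page 'A' or 'B' with a stable 50/50 split."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# e.g. route to the 'tastes great' page (A) or the 'low carbs' page (B)
print(assign_treatment("visitor-42"))  # same answer every time for this visitor
```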

Here is a simple visual that Joey Taravella, Content Writer, MECLABS, created to illustrate the concept…

That’s just one test. To really learn about your customers, you must continue the process and create a testing-optimization cycle in your organization – continue to run A/B tests, record the findings, learn from them, create more hypotheses, and test again based on these hypotheses.

This is true marketing experimentation, and helps you build your theory of the customer.

But you probably know all that already. So here’s your chance to practice while helping us shape an A/B test. You might even win a prize in the process.

The email subject line contest

The Moz Blog and MarketingExperiments Blog have joined forces to run a unique marketing experimentation contest. We’re presenting you with a real challenge from a real organization (VolunteerMatch) and asking you to write a subject line to test (it’s simple – just leave your subject line as a comment on this blog post).

We’re going to pick three subject lines suggested by readers of The Moz Blog and three from the MarketingExperiments Blog, and run a test with this organization’s customers. Whoever writes the best-performing subject line will win a stay at the ARIA Resort in Las Vegas as well as a two-day ticket to MarketingSherpa Email Summit 2015 to help them gain lessons to further improve their marketing.

Sound good? OK, let’s dive in and tell you more about your “client”…

Craft the best-performing subject line to win the prize

Every year at Email Summit, we run a live A/B test where the audience helps craft the experiment. We then run, validate, and close the experiment, and share the results during Summit as a way to teach about marketing experimentation. We have typically run the experiment using MarketingSherpa as the “client” website to test (MarketingExperiments and MarketingSherpa are sister publications, both owned by MECLABS Institute).

However, this year we wanted to try something different and interviewed three national non-profits to find a new “client” for our tests.

We chose VolunteerMatch – a nonprofit organization that uses the power of technology to make it easier for good people and good causes to connect. One of the key reasons we chose VolunteerMatch is because it is an already successful organization looking to further improve. (Here is a case study explaining one of its successful implementations – Lead Management: How a B2B SaaS nonprofit decreased its sales cycle 99%).

Another reason we chose VolunteerMatch for this opportunity is that it has three types of customers, so the lessons from the content we create can help marketers across a wide range of sales models. VolunteerMatch’s customers are:

  • People who want to volunteer (B2C)
  • Non-profit organizations looking for volunteers (non-profit)
  • Businesses looking for corporate volunteering solutions (B2B) to which it offers a Software-as-a-Service product through VolunteerMatch Solutions

Designing the experiment

After we took VolunteerMatch on as the Research Partner “client,” Jon Powell, Senior Executive Research and Development Manager, MECLABS, worked with Shari Tishman, Director of Engagement, and Lauren Wagner, Senior Manager of Engagement, both of VolunteerMatch, to understand their challenges, take a look at their current assets and performance, and craft a design of experiments to determine what further knowledge about its customers would help VolunteerMatch improve performance.

That design of experiments includes a series of split tests – including the live test we’re going to run at Email Summit, as well as the one you have an opportunity to take part in by writing a subject line in the comments section of this blog post. Let’s take a look at that experiment…

The challenge

VolunteerMatch wants to increase the response rate of its corporate email list (B2B) by discovering the best possible messaging to use. To find out, MarketingExperiments wants to run an A/B split test to determine that best messaging.

However, the B2B list is much smaller than the volunteer/cause list (B2C), which makes it harder to test in (and reach statistical significance) and determine which messaging is most effective.

So we’re going to run a messaging test to the B2C list. This isn’t without its challenges though, because most individuals on the B2C list are not likely to immediately connect with B2B corporate solutions messaging.

So the question is…

How do we create an email that is relevant to the B2C list, that doesn’t ask too much of its recipients, and that simultaneously helps us discover the most relevant aspect (if any) of the solutions (B2B) product?

The approach – Here’s where you come in

This is where the Moz and MarketingExperiments community comes in to help.

We would like you to craft subject lines relevant to the B2C list, which highlight various benefits of the corporate solutions tool.

We have broken down the corporate solutions tool into three main categories of benefit for the SaaS product. In the comments section below, include which category you are writing a subject line for, along with what you think is an effective subject line.

The crew at Moz and MarketingExperiments will then choose the top subject line in each category to test. Below you will find the emails that will be sent as part of the test. They are identical, except for the subject lines (which you will write) and the bolded line in the third paragraph (that ties into that category of value).

Category #1: Proof, recognition, credibility


Category #2: Better, more opportunities to choose from


Category #3: Ease-of-use

About VolunteerMatch’s brand

Since we’re asking you to try your hand at crafting messaging for this example “client,” here is some more information about the brand to inform your messaging…


VolunteerMatch’s brand identity


VolunteerMatch’s core values

Ten things VolunteerMatch believes:

  1. People want to do good
  2. Every great cause should be able to find the help it needs
  3. People want to improve their lives and communities through volunteering
  4. You can’t make a difference without making a connection
  5. In putting the power of technology to good use
  6. Businesses are serious about making a difference
  7. In building relationships based on trust and excellent service
  8. In partnering with like-minded organizations to create systems that result in even greater impact
  9. The passion of our employees drives the success of our products, services and mission
  10. In being great at what we do

And now, we test…

To participate, you must leave your comment with your idea for a subject line before midnight on Tuesday, January 13, 2015. The contest is open to all residents of the 50 US states, the District of Columbia, and Canada (excluding Quebec), 18 or older. If you want more info, here are the official rules.

When you enter your subject line in the comments section, also include which category you’re entering for (and if you have an idea outside these categories, let us know…we just might drop it in the test).

Next, the Moz marketing team will pick the subject lines they think will perform best in each category from all the comments on The Moz Blog, and the MarketingExperiments team will pick the subject lines we think will perform the best in each category from all the comments on the MarketingExperiments Blog.

We’ll give the VolunteerMatch team a chance to approve the subject lines based on their brand standards, then test all six to eight subject lines and report back to you through the Moz and MarketingExperiments blogs on which subject lines won – and why – to help you improve your already successful marketing.
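
And for the statistically curious, here’s roughly how a winning subject line could be validated once the results are in – a textbook two-proportion z-test with invented counts (MECLABS has its own validation methodology; this is just the classic version):

```python
# A textbook two-proportion z-test for comparing two subject lines'
# response rates. The counts below are invented for illustration.
from math import sqrt
from statistics import NormalDist

def z_test_two_proportions(conv_a, sent_a, conv_b, sent_b):
    """Return the z-score and two-sided p-value for rate A vs. rate B."""
    p_a, p_b = conv_a / sent_a, conv_b / sent_b
    p_pool = (conv_a + conv_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

z, p = z_test_two_proportions(520, 20_000, 430, 20_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference
```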

So, what have you got? Write your best subject lines in the comments section below. I look forward to seeing what you come up with.

Related resources

If you’re interested in learning more about marketing experimentation and A/B testing, you might find these links helpful…

And here’s a look at a previous subject line writing contest we’ve run to give you some ideas for your entry…


Reblogged 2 years ago from moz.com