How to Find Your True Local Competitors

Posted by MiriamEllis

Who are your clients’ true competitors?

It’s a question that’s become harder to answer. What felt like a fairly simple triangulation between Google, brand, and searcher in the early days of the local web has multiplied into a geodesic dome of localization, personalization, intent matching, and other facets.

This evolution from a simple shape to a far more complex one has the local SEO industry starting to understand the need to talk about trends and patterns vs. empirical rankings.

For instance, you might notice that you just can’t deliver client reports that say, “Congratulations, you’re #1” anymore. And that’s because the new reality is that there is no #1 for all searchers. A user on the north side of town may see a completely different local pack of results if they go south, or if they modify their search language. An SEO may get a whole different SERP if they search on one rank checking tool vs. another — or even on the same tool, just five minutes later.

Despite all this, you still need to analyze and report — it remains a core task to audit a client’s competitive landscape.

 Today, let’s talk about how we can distill this dynamic, complex environment down to the simplest shapes to understand who your client’s true competitors are. I’ll be sharing a spreadsheet to help you and your clients see the trends and patterns that can create the basis for competitive strategy.

Why are competitive audits necessary…and challenging?

Before we dive into a demo, let’s sync up on what the basic point is of auditing local competitors. Essentially, you’re seeking contrast — you stack up two brands side-by-side to discover the metrics that appear to be making one of them dominant in the local or localized organic SERPs.

From there, you can develop a strategy to emulate the successes of the current winner with the goal of meeting and then surpassing them with superior efforts.

But before you start comparing your brand A to their brand B, you’ve got to know who brand B actually is. What obstacles do you face?

1. SERPs are incredibly diversified

A recent STAT whitepaper that looked at 1.2 million keywords says it all: every SERP is a local SERP. And since local packs and organic results are both subject to the whims of geo-location and geo-modification, incorporating both into your tracking strategy is a must.

To explain, imagine two searchers sitting on the same couch. One searches for “Mexican restaurant” and the other searches for “Mexican restaurant near me”. Then, they divvy up searching “Mexican restaurant near me” vs. “Mexican restaurant in San Jose”, and so on. What they see are local packs that are only about 80 percent similar, based on Google recognizing different intents. That’s significant variability.

    The scenario gets even more interesting when one of the searchers gets up and travels across town to a different zip code. At that point, the two people making identical queries can see local packs that range from only about 26–65 percent similar. In other words, quite different.

Now, let’s say your client wants to rank for seven key phrases — like “Mexican restaurant,” “Mexican restaurant near me,” “Mexican restaurant San Jose,” “best Mexican restaurant,” “cheap Mexican restaurant,” etc. Your client doesn’t have just three businesses to compete against in one local pack; across seven packs, they could be facing as many as 21 competitors!
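To see how quickly the field widens, here’s a minimal Python sketch, using hypothetical pack results, that counts the distinct businesses appearing across several local packs:

```python
from itertools import chain

# Hypothetical top-3 local packs for three of the client's target phrases.
packs = {
    "Mexican restaurant": ["Mi Casa", "El Juan's", "Plaza Azul"],
    "Mexican restaurant near me": ["Mi Casa", "Don Pedro's", "Rubio's"],
    "Mexican restaurant San Jose": ["El Juan's", "Plaza Azul", "Taco Bell"],
}

# Three slots per pack, but the set of distinct competitors keeps growing.
unique_competitors = set(chain.from_iterable(packs.values()))
print(f"{len(packs) * 3} pack slots, {len(unique_competitors)} distinct competitors")
```

With all seven phrases filled in, the distinct-competitor count typically lands well above three — which is exactly the multiplication problem described above.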

2. Even good rank tracking tools can be inconsistent

    There are many useful local rank tracking tools out there, and one of the most popular comes to us from BrightLocal. I really like the super easy interface of this tool, but there is a consistency issue with this and other tools I’ve tried, which I’ve captured in a screenshot, below.

Here I’m performing the same search at 5-minute intervals, showing how the reported localized organic ranking of a single business varies widely across time.

    The business above appears to move from position 5 to position 12. This illustrates the difficulty of answering the question of who is actually the top competitor when using a tool. My understanding is that this type of variability may result from the use of proxies. If you know of a local rank checker that doesn’t do this, please let our community know in the comments.
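One hedge against this kind of tool noise is to sample the same query several times and report a central tendency instead of a single reading. Here’s a small illustration in Python, using made-up rank observations:

```python
from statistics import median

# Hypothetical localized-organic ranks reported for one business
# by the same tool, checked at 5-minute intervals.
observed_ranks = [5, 9, 12, 7, 5, 11]

# A single check could have reported anywhere in this range.
spread = max(observed_ranks) - min(observed_ranks)
print(f"Rank ranged from {min(observed_ranks)} to {max(observed_ranks)} (spread: {spread})")

# The median across checks is a steadier number to put in a report.
print(f"Median rank across checks: {median(observed_ranks)}")
```

A median over repeated checks won’t fix the underlying proxy problem, but it gives you a steadier figure to report than any single snapshot.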

    In the meantime, what I’ve discovered in my own work is that it’s really hard to find a strong and consistent substitute for manually checking which competitors rank where, on the ground. So, let’s try something out together.

    The simplest solution for finding true competitors

    Your client owns a Mexican restaurant and has seven main keyword phrases they want to compete for. Follow these five easy steps:

    Step 1: Give the client a local pack crash course

If the client doesn’t already know, teach them how to perform a search on Google and recognize what a local pack is. Show them how businesses in the pack rank 1, 2, and 3. If they have more questions about local packs, how they show up in results, and how Google ranks content, they can check out our updated Beginner’s Guide to SEO.

    Step 2: Give the client a spreadsheet and a tiny bit of homework

    Give the client a copy of this free spreadsheet, filled out with their most desired keyword phrases. Have them conduct seven searches from a computer located at their place of business* and then fill out the spreadsheet with the names of the three competitors they see for each of the seven phrases. Tell them not to pay attention to any of the other fields of the spreadsheet.

*Be sure the client does this task from their business’ physical location, as this is the best way to see what searchers in their area will see in the local results. Why are we doing this? Because Google weights the searcher’s proximity to the business so heavily, we have to pretend we’re a searcher at or near the business to emulate Google’s “thought process”.

    Step 3: Roll up your sleeves for your part of the work

    Now it’s your turn. Look up “directions Google” in Google.

Enter your client’s business address and the address of their first competitor, and write down the distance in the spreadsheet. Repeat for every entry in each of the seven local packs. Covering all 21 locations will take approximately 10–15 minutes, so be sure to budget that time into the project.
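If you happen to have latitude/longitude for each business (copied from the map, for instance), you can also approximate these distances yourself with the haversine formula instead of looking each pair up. A sketch, with hypothetical San Jose coordinates:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle ("as the crow flies") distance in miles between two points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

# Hypothetical coordinates for the client and one local-pack competitor.
client = (37.3382, -121.8863)
competitor = (37.3541, -121.9552)
print(f"{haversine_miles(*client, *competitor):.1f} miles")
```

Keep in mind this is straight-line distance; Google’s driving directions will usually report a somewhat longer figure, so treat the two as comparable but not identical.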

    Step 4: Get measuring

Now, in the second column of the spreadsheet, note down the greatest distance Google appears to be going to fill out the results for each pack.

    Step 5: Identify competitors by strength

    Finally, rate the competitors by the number of times each one appears across all seven local packs. Your spreadsheet should now look something like this:

    Looking at the example sheet above, we’ve learned that:

    • Mi Casa and El Juan’s are the dominant competitors in your client’s market, ranking in 4/7 packs. Plaza Azul is also a strong competitor, with a place in 3/7 packs.
    • Don Pedro’s and Rubio’s are noteworthy with 2/7 pack appearances.
    • All the others make just one pack appearance, making them basic competitors.
• The radius Google is willing to search to find relevant businesses varies significantly, depending on the search term. While Google needs to go just a couple of miles to find competitors for “Mexican restaurant”, it’s forced to go more than 15 miles for a long-tail term like “organic Mexican restaurant”.
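The Step 5 tally is easy to automate once the pack data is typed in. Here’s a minimal Python sketch, using hypothetical pack results along the lines of the example above:

```python
from collections import Counter

# Hypothetical top-3 results for the client's seven target phrases.
packs = [
    ["Mi Casa", "El Juan's", "Plaza Azul"],
    ["Mi Casa", "El Juan's", "Don Pedro's"],
    ["Mi Casa", "El Juan's", "Rubio's"],
    ["Mi Casa", "Plaza Azul", "Taco Bell"],
    ["El Juan's", "Plaza Azul", "Don Pedro's"],
    ["Rubio's", "La Victoria", "Chacho's"],
    ["Luna Mexican", "Zona Rosa", "Taqueria Lorena's"],
]

# Count how many of the seven packs each business appears in.
appearances = Counter(name for pack in packs for name in pack)
for business, count in appearances.most_common():
    print(f"{business}: {count}/{len(packs)} packs")
```

Sorting by appearance count surfaces the dominant, strong, and noteworthy tiers immediately, with the one-off businesses falling to the bottom of the list.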

    You now know who the client’s direct competitors are for their most desired searches, and how far Google is willing to go to make up a local pack for each term. You have discovered a pattern of most dominant competition across your client’s top phrases, signaling which players need to be audited to yield clues about which elements are making them so strong.

    The pros and cons of the simple search shape

    The old song says that it’s a gift to be simple, but there are some drawbacks to my methodology, namely:

• You’ll have to depend on the client to help you out for a few minutes, and some clients aren’t great at participation, so you’ll need to convince them of the value of doing the initial searches for you.
    • Manual work is sometimes tedious.
    • Scaling this for a multi-location enterprise would be time-consuming.
    • Some of your clients are going to be located in large cities and will want to know what competitors are showing up for users across town and in different zip codes. Sometimes, it will be possible to compete with these differently-located competitors, but not always. At any rate, our approach doesn’t cover this scenario and you will be stuck with either using tools (with their known inconsistencies), or sending the client across town to search from that locale. This could quickly become a large chore.

    Negatives aside, the positives of this very basic exercise are:

    • Instead of tying yourself to the limited vision of a single local pack and a single set of competitors, you are seeing a trend, a pattern of dominant market-wide competitors.
    • You will have swiftly arrived at a base set of dominant, strong, and noteworthy competitors to audit, with the above-stated goal of figuring out what’s helping them to win so that you can create a client strategy for emulating and surpassing them.
    • Your agency will have created a useful view of your client’s market, understanding the difference between businesses that seem very embedded (like Mi Casa) across multiple packs, vs. those (like Taco Bell) that are only one-offs and could possibly be easier to outpace.
    • You may discover some extremely valuable competitive intel for your client. For example, if Google is having to cast a 15-mile net to find an organic Mexican restaurant, what if your client started offering more organic items on their menu, writing more about this and getting more reviews that mention it? This will give Google a new option, right in town, to consider for local pack inclusion.
    • It’s really quite fast to do for a single-location business.
    • Client buy-in should be a snap for any research they’ve personally helped on, and the spreadsheet should be something they can intuitively and immediately understand.

    My questions for you

    I’d like to close by asking you some questions about your work doing competitive audits for local businesses. I’d be truly interested in your replies as we all work together to navigate the complex shape of Google’s SERPs:

    1. What percentage of your clients “get” that Google’s results have become so dynamic, with different competitors being shown for different queries and different packs being based on searcher location? What percentage of your clients are “there yet” with this concept vs. the old idea of just being #1, period?
    2. I’ve offered you a manual process for getting at trustworthy data on competitors, but as I’ve said, it does take some work. If something could automate this process for you, especially for multi-location clients, would you be interested in hearing more about it?
    3. How often do you do competitive audits for clients? Monthly? Every six months? Annually?

    Thanks for responding, and allow me to wish you and your clients a happy and empowering audit!


    Reblogged 7 months ago from tracking.feedpress.it

    How to find your brand voice

Hey bae! Ever received an email that sounded totally ‘sus’ from a company you thought was ‘fam’?

    Seriously. We’ve all seen them. Those emails that suddenly change their tone overnight to try and capitalise on a trend – the luxury brand using emoticons in its subject lines or the corporate brand jumping on the slang bandwagon.

    How did that make you feel? Did it come across as disingenuous? Did it make you think less of the company sending the email?

    When your business communicates with its customers, you need to use language that not only conveys your brand’s personality, but that also connects with those customers and makes them feel that as a brand, you share values and understand their needs. The words you use need to back up the story you are selling about your brand and the tone of those words has to appeal to your target audience.

    For example, if you are a recruitment business, should you really use text speak to address job candidates in your email campaigns? If you sell fast fashion, you certainly wouldn’t want your email campaigns to come across as stiff and corporate. Your customers simply wouldn’t believe that the communication in either of these instances was genuine.

    When brands fail to communicate with customers in an authentic way, they run the risk of alienating their target market. In a recent survey we conducted, 64% of respondents said that they either dislike or hate brands using slang terms like ‘bae’, ‘babes’ and ‘totes’ mainly because they believe that the brands using them are trying too hard to sound like their customers while actually coming across as though these communications are ‘written by a 37-year-old white man’.

    Just because your customers speak in a certain way, it does not mean your brand voice should imitate them; instead you need to determine what tone of voice you should use in order to get them to trust you and then use this voice consistently across all communications channels.

    How do you create a brand voice?

    Your brand voice should be informed both by who you are, and what message you want to convey, and by your audience and what they need from you. Start by answering four simple questions:

• Who are your current audience?
• Who are your desired audience?
• What do they need you to be in order to trust you?
• What do they need you to be in order to notice you in a crowded marketplace?

    Once you understand who you’re trying to reach and what they expect, you need to think about who you are as a business and what personality and message you want to convey through your tone and choice of words.

    Write down the personality traits you want to convey. For example, are you friendly and professional? Upbeat, youthful and laid-back? Authoritative and straight-talking but approachable?

    You should also write down the traits you do not want to convey. So you might be friendly and professional, but never corporate or prone to jargon. Or, you could be authoritative, straight-talking and approachable, but never flippant or comical.

    Once you’ve decided on your style, you need to give examples of how the brand voice should sound – as well as some examples of what not to do.

    Finally, you need to explain your reasoning. Tell your team exactly how the brand voice you want them to use relates to your overall brand, along with what you’re trying to achieve and why you think it will connect with your audience. This way everybody who is expected to communicate using your brand voice understands exactly how they are meant to speak.

    To help you with this process, we’ve created a handy template for you to fill in.

    Getting it right for email

    Having a strong brand voice is particularly important in your email campaigns because if you can gain the trust of your existing customers you can turn them into advocates. In order to do this, however, they need to develop a personal connection to your brand, which they won’t do if they don’t trust the communication they receive.

    Your email communications can establish a long-term relationship and make people feel like they’re having a two-way conversation, especially when you personalise that communication using automation. But if you get the brand voice wrong, they’ll disengage entirely and you will lose them as customers.

    Defining your brand voice may sound like a daunting task, but if you understand your brand story and you know your customers then you should be able to develop a meaningful way of communicating with them.

     

    The post How to find your brand voice appeared first on The Email Marketing Blog.

    Reblogged 2 years ago from blog.dotmailer.com

    Darryl, the man behind dotmailer’s Custom Technical Solutions team

    Why did you decide to come to dotmailer?

    I first got to know dotmailer when the company was just a bunch of young enthusiastic web developers called Ellipsis Media back in 1999. I was introduced by one of my suppliers and we decided to bring them on board to build a recruitment website for one of our clients. That client was Amnesty International and the job role was Secretary General. Not bad for a Croydon company whose biggest client before that was Scobles the plumber’s merchants. So, I was probably dotmailer’s first ever corporate client! After that, I used dotmailer at each company I worked for and then one day they approached a colleague and me and asked us if we wanted to work for them. That was 2013.  We grabbed the opportunity with both hands and haven’t looked back since.

    Tell us a bit about your role

    I’m the Global Head of Technical Solutions which actually gives me responsibility for 2 teams. First, Custom Technical Solutions (CTS), who build bespoke applications and tools for customers that allow them to integrate more closely with dotmailer and make life easier. Second, Technical Pre-sales, which spans our 3 territories (EMEA, US and APAC) and works with prospective and existing clients to figure out the best solution and fit within dotmailer.

    What accomplishments are you most proud of from your dotmailer time so far?

    I would say so far it has to be helping to turn the CTS team from just 2 people into a group of 7 highly skilled and dedicated men and women who have become an intrinsic and valued part of the dotmailer organization. Also I really enjoy being part of the Senior Technical Management team. Here we have the ability to influence the direction and structure of the platform on a daily basis.

    Meet Darryl Clark – the cheese and peanut butter sandwich lover

    Can you speak a bit about your background and that of your team? What experience and expertise is required to join this team?

    My background is quite diverse from a stint in the Army, through design college, web development, business analysis to heading up my current teams. I would say the most valuable skill that I have is being highly analytical. I love nothing more than listening to a client’s requirements and digging deep to work out how we can answer these if not exceed them.

    As a team, we love nothing more than brainstorming our ideas. Every member has a valid input and we listen. Everyone has the opportunity to influence what we do and our motto is “there is no such thing as a stupid question.”

    To work in my teams you have to be analytical but open minded to the fact that other people may have a better answer than you. Embrace other people’s input and use it to give our clients the best possible solution. We are hugely detail conscious, but have to be acutely aware that we need to tailor what we say to our audience so being able to talk to anyone at any level is hugely valuable.

    How much of the dotmailer platform is easily customizable and when does it cross over into something that requires your team’s expertise? How much time is spent on these custom solutions one-time or ongoing?

    I’ll let you in on a little secret here. We don’t actually do anything that our customers can’t do with dotmailer given the right knowledge and resources. This is because we build all of our solutions using the dotmailer public API. The API has hundreds of methods in both SOAP and REST versions, which allows you to do a huge amount with the dotmailer platform. We do have a vast amount of experience and knowledge in the team so we may well be able to build a solution quicker than our customers. We are more than happy to help them and their development teams build a solution using us on a consultancy basis to lessen the steepness of the learning curve.

    Our aim when building a solution for a customer is that it runs silently in the background and does what it should without any fuss.

    What are your plans for the Custom Tech Solutions team going forward?

    The great thing about Custom Technical Solutions is you never know what is around the corner as our customers have very diverse needs. What we are concentrating on at the moment is refining our processes to ensure that they are as streamlined as possible and allow us to give as much information to the customer as we can. We are also always looking at the technology and coding approaches that we use to make sure that we build the most innovative and robust solutions.

    We are also looking at our external marketing and sharing our knowledge through blogs so keep an eye on the website for our insights.

    What are the most common questions that you get when speaking to a prospective customer?

    Most questions seem to revolve around reassurance such as “Have you done this before?”, “How safe is my data?”, “What about security?”, “Can you talk to my developers?”, “Do I need to do anything?”.  In most instances, we are the ones asking the questions as we need to find out information as soon as possible so that we can analyse it to ensure that we have the right detail to provide the right solution.

    Can you tell us about the dotmailer differentiators you highlight when speaking to prospective customers that seem to really resonate?

    We talk a lot about working with best of breed so for example a customer can use our Channel Extensions in automation programs to fire out an SMS to a contact using their existing provider. We don’t force customers down one route, we like to let them decide for themselves.

    Also, I really like to emphasize the fact that there is always more than one way to do something within the dotmailer platform. This means we can usually find a way to do something that works for a client within the platform. If not, then we call in CTS to work out if there is a way that we can build something that will — whether this is automating uploads for a small client or mass sending from thousands of child accounts for an enterprise level one.

    What do you see as the future of marketing automation technology?  Will one size ever fit all? Or more customization going forward?

    The 64 million dollar question. One size will never fit all. Companies and their systems are too organic for that. There isn’t one car that suits every driver or one racquet that suits every sport. Working with a top drawer partner network and building our system to be as open as possible from an integration perspective means that our customers can make dotmailer mold to their business and not the other way round…and adding to that the fact that we are building lots of features in the platform that will blow your socks off.

    Tell us a bit about yourself – favorite sports team, favorite food, guilty pleasure, favorite band, favorite vacation spot?

    I’m a dyed in the wool Gooner (aka Arsenal Football Club fan) thanks to my Grandfather leading me down the right path as a child. If you are still reading this after that bombshell, then food-wise I pretty much like everything apart from coriander which as far as I’m concerned is the Devils own spawn. I don’t really have a favorite band, but am partial to a bit of Level 42 and Kings of Leon and you will also find me listening to 90s drum and bass and proper old school hip hop. My favorite holiday destination is any decent villa that I can relax in and spend time with my family and I went to Paris recently and loved that. Guilty pleasure – well that probably has to be confessing to liking Coldplay or the fact that my favorite sandwich is peanut butter, cheese and salad cream. Go on try it, you’ll love it.

    Want to meet more of the dotmailer team? Say hi to Darren Hockley, Global Head of Support, and Dan Morris, EVP for North America.

    Reblogged 3 years ago from blog.dotmailer.com

    Meet Dan Morris, Executive Vice President, North America

    1. Why did you decide to come to dotmailer?

    The top three reasons were People, Product and Opportunity. I met the people who make up our business and heard their stories from the past 18 years, learned about the platform and market leading status they had built in the UK, and saw that I could add value with my U.S. high growth business experience. I’ve been working with marketers, entrepreneurs and business owners for years across a series of different roles, and saw that I could apply what I’d learned from that and the start-up space to dotmailer’s U.S. operation. dotmailer has had clients in the U.S. for 12 years and we’re positioned to grow the user base of our powerful and easy-to-use platform significantly. I knew I could make a difference here, and what closed the deal for me was the people.  Every single person I’ve met is deeply committed to the business, to the success of our customers and to making our solution simple and efficient.  We’re a great group of passionate people and I’m proud to have joined the dotfamily.

    Dan Morris, dotmailer’s EVP for North America in the new NYC office

2. Tell us a bit about your new role

    dotmailer has been in business and in this space for more than 18 years. We were a web agency, then a Systems Integrator, and we got into the email business that way, ultimately building the dotmailer platform thousands of people use daily. This means we know this space better than anyone and we have the perfect solutions to align closely with our customers and the solutions flexible enough to grow with them.  My role is to take all that experience and the platform and grow our U.S. presence. My early focus has been on identifying the right team to execute our growth plans. We want to be the market leader in the U.S. in the next three years – just like we’ve done in the UK –  so getting the right people in the right spots was critical.  We quickly assessed the skills of the U.S. team and made changes that were necessary in order to provide the right focus on customer success. Next, we set out to completely rebuild dotmailer’s commercial approach in the U.S.  We simplified our offers to three bundles, so that pricing and what’s included in those bundles is transparent to our customers.  We’ve heard great things about this already from clients and partners. We’re also increasing our resources on customer success and support.  We’re intensely focused on ease of on-boarding, ease of use and speed of use.  We consistently hear how easy and smooth a process it is to use dotmailer’s tools.  That’s key for us – when you buy a dotmailer solution, we want to onboard you quickly and make sure you have all of your questions answered right away so that you can move right into using it.  Customers are raving about this, so we know it’s working well.

3. What early accomplishments are you most proud of from your dotmailer time so far?

    I’ve been at dotmailer for eight months now and I’m really proud of all we’ve accomplished together.  We spent a lot of time assessing where we needed to restructure and where we needed to invest.  We made the changes we needed, invested in our partner program, localized tech support, customer on-boarding and added customer success team members.  We have the right people in the right roles and it’s making a difference.  We have a commercial approach that is clear with the complete transparency that we wanted to provide our customers.  We’ve got a more customer-focused approach and we’re on-boarding customers quickly so they’re up and running faster.  We have happier customers than ever before and that’s the key to everything we do.

4. You’ve moved the U.S. team to a new office. Can you tell us why and a bit about the new space?

    I thought it was very important to create a NY office space that was tied to branding and other offices around the world, and also had its own NY energy and culture for our team here – to foster collaboration and to have some fun.  It was also important for us that we had a flexible space where we could welcome customers, partners and resellers, and also hold classes and dotUniversity training sessions. I’m really grateful to the team who worked on the space because it really reflects our team and what we care about.   At any given time, you’ll see a training session happening, the team collaborating, a customer dropping in to ask a few questions or a partner dropping in to work from here.  We love our new, NYC space.

    We had a spectacular reception this week to celebrate the opening of this office with customers, partners and the dotmailer leadership team in attendance. Please take a look at the photos from our event on Facebook.

Guests and the team at dotmailer’s new NYC office warming party

5. What did you learn from your days in the start-up space that you’re applying at dotmailer?

    The start-up space is a great place to learn. You have to know where every dollar is going and coming from, so every choice you make needs to be backed up with a business case for that investment.  You try lots of different things to see if they’ll work and you’re ready to turn those tactics up or down quickly based on an assessment of the results. You also learn things don’t have to stay the way they are, and can change if you make them change. You always listen and learn – to customers, partners, industry veterans, advisors, etc. to better understand what’s working and not working.  dotmailer has been in business for 18 years now, and so there are so many great contributors across the business who know how things have worked and yet are always keen to keep improving.  I am constantly in listening and learning mode so that I can understand all of the unique perspectives our team brings and what we need to act on.

6. What are your plans for the U.S. and the sales function there?

    On our path to being the market leader in the U.S., I’m focused on three things going forward: 1 – I want our customers to be truly happy.  It’s already a big focus in the dotmailer organization – and we’re working hard to understand their challenges and goals so we can take product and service to the next level. 2 – Creating an even more robust program around partners, resellers and further building out our channel partners to continuously improve sales and customer service programs. We recently launched a certification program to ensure partners have all the training and resources they need to support our mutual customers.  3 – We have an aggressive growth plan for the U.S. and I’m very focused on making sure our team is well trained, and that we remain thoughtful and measured as we take the steps to grow.  We want to always keep an eye on what we’re known for – tools that are powerful and simple to use – and make sure everything else we offer remains accessible and valuable as we execute our growth plans.

7. What are the most common questions that you get when speaking to a prospective customer?

    The questions we usually get are around price, service level and flexibility.  How much does dotmailer cost?  How well are you going to look after my business?  How will you integrate into my existing stack and then my plans for future growth? We now have three transparent bundle options with specifics around what’s included published right on our website.  We have introduced a customer success team that’s focused only on taking great care of our customers and we’re hearing stories every day that tells me this is working.  And we have all of the tools to support our customers as they grow and to also integrate into their existing stacks – often integrating so well that you can use dotmailer from within Magento, Salesforce or Dynamics, for example.

    1. Can you tell us about the dotmailer differentiators you highlight when speaking to prospective customers that seem to really resonate?

    In addition to the ones above – ease of use, speed of use and the ability to scale with you. With dotmailer’s tiered program, you can start with a lighter level of functionality and grow into more advanced functionality as you need it. The platform itself is so easy to use that most marketers are able to build campaigns in minutes that would have taken hours on other platforms. Our customer success team is also with you all the way if ever you want or need help.  We’ve built a very powerful platform and we have a fantastic team to help you with personalized service as an extended part of your team and we’re ready to grow with you.

    1. How much time is your team on the road vs. in the office? Any road warrior tips to share?

    I’ve spent a lot of time on the road, one year I attended 22 tradeshows! Top tip when flying is to be willing to give up your seat for families or groups once you’re at the airport gate, as you’ll often be rewarded with a better seat for helping the airline make the family or group happy. Win win! Since joining dotmailer, I’m focused on being in office and present for the team and customers as much as possible. I can usually be found in our new, NYC office where I spend a lot of time with our team, in customer meetings, in trainings and other hosted events, sales conversations or marketing meetings. I’m here to help the team, clients and partners to succeed, and will always do my best to say yes! Once our prospective customers see how quickly and efficiently they can execute tasks with dotmailer solutions vs. their existing solutions, it’s a no-brainer for them.  I love seeing and hearing their reactions.

    1. Tell us a bit about yourself – favorite sports team, favorite food, guilty pleasure, favorite band, favorite vacation spot?

    I’m originally from Yorkshire in England, and grew up just outside York. I moved to the U.S. about seven years ago to join a very fast-growing startup; we took it from 5 to well over 300 people, which was a fantastic experience. I moved to NYC almost two years ago, and I love exploring this great city.  There’s so much to see and do.  Outside of dotmailer, my passion is cars, and I also enjoy skeet shooting, almost all types of music, and I love to travel – my goal is to get to India, Thailand, Australia and Japan in the near future.

    Want to find out more about the dotfamily? Check out our recent post about Darren Hockley, Global Head of Support.

    Reblogged 3 years ago from blog.dotmailer.com

    Stop Ghost Spam in Google Analytics with One Filter

    Posted by CarloSeo

    The spam in Google Analytics (GA) is becoming a serious issue. Due to a deluge of referral spam from social buttons, adult sites, and many, many other sources, people are starting to become overwhelmed by all the filters they are setting up to manage the useless data they are receiving.

    The good news is, there is no need to panic. In this post, I’m going to focus on the most common mistakes people make when fighting spam in GA, and explain an efficient way to prevent it.

    But first, let’s make sure we understand how spam works. A couple of months ago, Jared Gardner wrote an excellent article explaining what referral spam is, including its intended purpose. He also pointed out some great examples of referral spam.

    Types of spam

    The spam in Google Analytics can be categorized by two types: ghosts and crawlers.

    Ghosts

    The vast majority of spam is this type. They are called ghosts because they never access your site. It is important to keep this in mind, as it’s key to creating a more efficient solution for managing spam.

    As unusual as it sounds, this type of spam doesn’t have any interaction with your site at all. You may wonder how that is possible since one of the main purposes of GA is to track visits to our sites.

    They do it by using the Measurement Protocol, which allows people to send data directly to Google Analytics’ servers. Using this method, and probably randomly generated tracking codes (UA-XXXXX-1) as well, the spammers leave a “visit” with fake data, without even knowing who they are hitting.
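To make the mechanism concrete, here is a rough sketch of how such a hit is assembled. The endpoint and parameter names (`v`, `tid`, `cid`, `t`, `dr`, `dp`) come from the public Measurement Protocol; the tracking ID and referrer values are invented examples. The script only builds the request URL, it doesn’t send anything:

```python
from urllib.parse import urlencode

# Google Analytics Measurement Protocol collection endpoint
GA_ENDPOINT = "https://www.google-analytics.com/collect"

def build_fake_hit(tracking_id, referrer):
    """Assemble the query string for a pageview hit. No visit to the
    target site is ever made; the data goes straight to GA's servers."""
    params = {
        "v": 1,                # protocol version
        "tid": tracking_id,    # guessed/randomized property ID, e.g. UA-XXXXX-1
        "cid": "555",          # arbitrary client ID
        "t": "pageview",       # hit type
        "dr": referrer,        # the spammy referrer the victim will see
        "dp": "/fake-page",    # fake page path
    }
    return GA_ENDPOINT + "?" + urlencode(params)

# A spammer would POST thousands of these; we only build one here.
print(build_fake_hit("UA-12345-1", "http://spam.example"))
```

Because the tracking IDs are guessed, the spammer never knows which property the fake “visit” lands in, which is exactly why the hostname in those hits is missing or fake.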

    Crawlers

    This type of spam, the opposite of ghost spam, does access your site. As the name implies, these spam bots crawl your pages, ignoring rules like those found in robots.txt that are supposed to stop them from reading your site. When they exit your site, they leave a record on your reports that appears similar to a legitimate visit.

    Crawlers are harder to identify because they know their targets and use real data. But it is also true that new ones seldom appear. So if you detect a referral in your analytics that looks suspicious, researching it on Google or checking it against this list might help you answer the question of whether or not it is spammy.

    Most common mistakes made when dealing with spam in GA

    I’ve been following this issue closely for the last few months. According to the comments people have made on my articles and conversations I’ve found in discussion forums, there are primarily three mistakes people make when dealing with spam in Google Analytics.

    Mistake #1. Blocking ghost spam from the .htaccess file

    One of the biggest mistakes people make is trying to block Ghost Spam from the .htaccess file.

    For those who are not familiar with this file, one of its main functions is to allow/block access to your site. Now we know that ghosts never reach your site, so adding them here won’t have any effect and will only add useless lines to your .htaccess file.

    Ghost spam usually shows up for a few days and then disappears. As a result, sometimes people think that they successfully blocked it from here when really it’s just a coincidence of timing.

    Then when the spammers later return, they get worried because the solution is not working anymore, and they think the spammer somehow bypassed the barriers they set up.

    The truth is, the .htaccess file can only effectively block crawlers such as buttons-for-website.com and a few others since these access your site. Most of the spam can’t be blocked using this method, so there is no other option than using filters to exclude them.

    Mistake #2. Using the referral exclusion list to stop spam

    Another error is trying to use the referral exclusion list to stop the spam. The name may confuse you, but this list is not intended to exclude referrals in the way we want for spam. It has other purposes.

    For example, when a customer buys something, sometimes they get redirected to a third-party page for payment. After making a payment, they’re redirected back to your website, and GA records that as a new referral. It is appropriate to use the referral exclusion list to prevent this from happening.

    If you try to use the referral exclusion list to manage spam, however, the referral part will be stripped since there is no preexisting record. As a result, a direct visit will be recorded, and you will have a bigger problem than the one you started with: you will still have spam, and direct visits are harder to track.

    Mistake #3. Worrying that bounce rate changes will affect rankings

    When people see that the bounce rate changes drastically because of the spam, they start worrying about the impact that it will have on their rankings in the SERPs.

    bounce.png

    This is another mistake commonly made. With or without spam, Google doesn’t take into consideration Google Analytics metrics as a ranking factor. Here is an explanation about this from Matt Cutts, the former head of Google’s web spam team.

    And if you think about it, Cutts’ explanation makes sense; because although many people have GA, not everyone uses it.

    Assuming your site has been hacked

    Another common concern when people see strange landing pages coming from spam on their reports is that they have been hacked.

    landing page

    The page that the spam shows on the reports doesn’t exist, and if you try to open it, you will get a 404 page. Your site hasn’t been compromised.

    But you have to make sure the page doesn’t exist. Because there are cases (not spam) where some sites have a security breach and get injected with pages full of bad keywords to defame the website.

    What should you worry about?

    Now that we’ve discarded security issues and their effects on rankings, the only thing left to worry about is your data. The fake trail that the spam leaves behind pollutes your reports.

    It might have greater or lesser impact depending on your site traffic, but everyone is susceptible to the spam.

    Small and midsize sites are the most easily impacted – not only because a big part of their traffic can be spam, but also because usually these sites are self-managed and sometimes don’t have the support of an analyst or a webmaster.

    Big sites with a lot of traffic can also be impacted by spam, and although the impact may be insignificant, invalid traffic means inaccurate reports no matter the size of the website. As an analyst, you should be able to explain what’s going on even in the most granular reports.

    You only need one filter to deal with ghost spam

    Usually it is recommended to add the referral to an exclusion filter after it is spotted. Although this is useful for a quick action against the spam, it has three big disadvantages.

    • Making filters every week for every new spam referral detected is tedious and time-consuming, especially if you manage many sites.
    • By the time you apply the filter and it starts working, you already have some affected data.
    • Some of the spammers use direct visits along with the referrals. These direct hits won’t be stopped by a referral filter, so even if you are excluding the referral you will still be receiving invalid traffic, which explains why some people have seen an unusual spike in direct traffic.

    Luckily, there is a good way to prevent all these problems. Most of the spam (the ghost type) works by hitting randomly generated GA tracking IDs, meaning the offender doesn’t really know who the target is; for that reason, either the hostname is not set or a fake one is used. (See the report below.)

    Ghost-Spam.png

    You can see that they use some weird names or don’t even bother to set one. Although there are some known names in the list, these can be easily added by the spammer.

    On the other hand, valid traffic will always use a real hostname. In most cases, this will be your domain, but it can also come from paid services, translation services, or any other place where you’ve inserted your GA tracking code.

    Valid-Referral.png

    Based on this, we can make a filter that will include only hits that use real hostnames. This will automatically exclude all hits from ghost spam, whether it shows up as a referral, keyword, or pageview; or even as a direct visit.

    To create this filter, you will need to find the report of hostnames. Here’s how:

    1. Go to the Reporting tab in GA
    2. Click on Audience in the lefthand panel
    3. Expand Technology and select Network
    4. At the top of the report, click on Hostname

    Valid-list

    You will see a list of all hostnames, including the ones that the spam uses. Make a list of all the valid hostnames you find, as follows:

    • yourmaindomain.com
    • blog.yourmaindomain.com
    • es.yourmaindomain.com
    • payingservice.com
    • translatetool.com
    • anotheruseddomain.com

    For small to medium sites, this list of hostnames will likely consist of the main domain and a couple of subdomains. After you are sure you got all of them, create a regular expression similar to this one:

    yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com

    You don’t need to put all of your subdomains in the regular expression. The main domain will match all of them. If you don’t have a view set up without filters, create one now.
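Before pasting the expression into GA, you can sanity-check it offline against the hostnames you found in the report. A quick sketch (the domains are the placeholder examples above; GA filter patterns are partial matches, which `re.search` mirrors):

```python
import re

# The same INCLUDE pattern you'd paste into the GA filter
# (domains are placeholders -- substitute your own).
pattern = re.compile(
    r"yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com"
)

hostnames = [
    "yourmaindomain.com",       # valid
    "blog.yourmaindomain.com",  # valid: subdomains match via the main domain
    "(not set)",                # typical ghost-spam hostname
    "free-share-buttons.com",   # spammy hostname
]

for host in hostnames:
    status = "keep" if pattern.search(host) else "filtered out"
    print(host, "->", status)
```

If any legitimate hostname from your report comes back as “filtered out,” add it to the expression before applying the filter.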

    Then create a Custom Filter.

    Make sure you select INCLUDE, then select “Hostname” on the filter field, and copy your expression into the Filter Pattern box.

    filter

    You might want to verify the filter before saving to check that everything is okay. Once you’re ready, set it to save, and apply the filter to all the views you want (except the view without filters).

    This single filter will get rid of future occurrences of ghost spam that use invalid hostnames, and it doesn’t require much maintenance. But it’s important that every time you add your tracking code to any service, you add it to the end of the filter.

    Now you should only need to take care of the crawler spam. Since crawlers access your site, you can block them by adding these lines to the .htaccess file:

    ## STOP REFERRER SPAM
    ## (requires mod_rewrite; add "RewriteEngine On" above if it is not already set)
    RewriteCond %{HTTP_REFERER} semalt\.com [NC,OR]
    RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]
    RewriteRule .* - [F]
    

    It is important to note that this file is very sensitive, and misplacing a single character in it can bring down your entire site. Therefore, make sure you create a backup copy of your .htaccess file prior to editing it.

    If you don’t feel comfortable messing around with your .htaccess file, you can alternatively build an expression matching all the crawlers and add it to an exclude filter on Campaign Source.
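If you go the filter route, here is a small sketch of assembling that exclude expression from a list of known crawler domains. The two domains are the ones named above; `re.escape` handles the dots so you don’t have to escape them by hand:

```python
import re

# Known crawler-spam referrer domains (examples from this post --
# maintain your own list as new ones appear).
crawlers = ["semalt.com", "buttons-for-website.com"]

# Join into one alternation, escaping regex metacharacters automatically.
exclude_pattern = "|".join(re.escape(domain) for domain in crawlers)
print(exclude_pattern)

# Quick check that the expression matches what it should.
assert re.search(exclude_pattern, "semalt.com")
assert not re.search(exclude_pattern, "yourmaindomain.com")
```

Paste the printed expression into the Filter Pattern box of an EXCLUDE filter on Campaign Source, and extend the list whenever you spot a new crawler.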

    Implement these combined solutions, and you will worry much less about spam contaminating your analytics data. This will have the added benefit of freeing up more time for you to spend actually analyzing your valid data.

    After stopping the spam, you can also get clean reports from your historical data by using the same expressions in an Advanced Segment to exclude all the spam.

    Bonus resources to help you manage spam

    If you still need more information to help you understand and deal with the spam on your GA reports, you can read my main article on the subject here: http://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/.

    Additional information on how to stop spam can be found at these URLs:

    In closing, I am eager to hear your ideas on this serious issue. Please share them in the comments below.

    (Editor’s Note: All images featured in this post were created by the author.)

    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

    Reblogged 4 years ago from tracking.feedpress.it

    Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

    Posted by randfish

    There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

    For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

    Video transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

    There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

    Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

    Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

    First off, some of the good points he did bring up.

    One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

    Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

    And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

    You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

    But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

    First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

    So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

    Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

    Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and the other is SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

    So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

    And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

    Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

    The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

    Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

    I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

    There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

    Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

    • Good on-site experience
    • Writing good content
    • Getting others to acknowledge you as an authority
    • Rising in social popularity
    • Earning local relevance
    • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

    The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

    I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

    But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

    I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

    We have to be able to understand things like:

    • Content rendering and indexability
    • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
    • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
    • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
    • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which is a permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

    Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

    Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.

    • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
    • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics connecting those up and trying to see: Why did we go up or down? Did we have less pages being indexed, more pages being indexed, more pages getting traffic less, more keywords less?
    • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

    Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. Inanchor: doesn’t work anymore, never mind about that one.

    I have to go faster, because we’re just going to run out of these. Like, come on. Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you, you have all sorts of errors, and you don’t know what they are.

    • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
    • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
    • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
    • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
    • Using APIs from services like AdWords or MozScape, or hrefs from Majestic, or SEM refs from SearchScape or Alchemy API. Those APIs can have powerful things that they can do for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
    • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
    • Diagnosing mobile friendliness issues
    • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
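To sketch the API point from the list above: a few lines of stdlib code really are enough to flatten an API response into rows you can paste into Excel or a Google Doc. The response shape and field names here are invented for illustration and don’t correspond to any particular service’s schema:

```python
import csv
import io
import json

# A made-up response shaped like what a link-metrics API might return.
sample_response = json.dumps([
    {"url": "http://example.com/a", "links": 120, "authority": 34},
    {"url": "http://example.com/b", "links": 45, "authority": 21},
])

def response_to_csv(raw_json):
    """Flatten a JSON list of records into CSV text ready for a spreadsheet."""
    rows = json.loads(raw_json)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["url", "links", "authority"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(response_to_csv(sample_response))
```

Swap `sample_response` for the body returned by whichever API you’re pulling from, and you have the kind of lightweight data pull described above.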

    Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

    So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

    All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com


    Reblogged 4 years ago from tracking.feedpress.it