dotDigital Group plc named in ‘1000 Companies to Inspire Britain 2018’ report by the LSE Group

We’re delighted to be recognized by the London Stock Exchange for the second year running as one of the fastest-growing and most dynamic SMEs in the UK. Over the course of the last year we’ve pushed ourselves to make sure we continue to provide our customers with the tools they need to be the best marketers they can be. The result was our acquisition of Comapi last November, and the launch of new omnichannel features to enhance our platform.

In the report, SMEs are said to “have the potential to power our economy into the future”, and that’s why we’re incredibly proud of the services we provide to small and medium-sized businesses like ours around the world. From SMS and product recommendations to automated re-targeting through Google AdWords and Facebook Audience nodes, we’re enabling brands to engage more effectively with their audiences across multiple channels – and all from one place.

To find out more about our omnichannel features talk to your account manager or request a demo today.

The full report can be found on the LSE Group website where you can download your own copy and find a searchable database of all the companies listed in the publication.


Reblogged 4 months ago from blog.dotmailer.com

Inspiring the serious marketer in you: Hitting the Mark email benchmark report 2018 is here

We’ve expanded our remit. Our sample now includes a mix of big and small companies across three continents with the inclusion of Asia-Pacific (APAC), as well as incorporating brands from the B2B sector. It’s our biggest, beefiest benchmark report – and now it’s truly relevant on a global scale.

Even more insights to dive into

Some of our findings echo last year’s report. There are still several brands out there failing to adopt simple automation programs, most notably a welcome program. Similarly, 56 of the 100 brands still aren’t utilizing cart recovery emails – crazy when you think about the massive opportunity for ROI presented by triggered campaigns. These are quick and easy wins that many companies continue to miss.

However, our wider scope offers marketers some new insights too. We’ve found that B2C businesses are outperforming B2B thanks to their wider adoption of basic automation, and they offer a better post-purchase experience. In the APAC region, brands aren’t making the most of data-driven tactics, causing them to lag behind their US and UK rivals when it comes to personalizing content and making it relevant to their customers.

In our 2018 benchmark report, we’ll show you how and why some retailers are winning big and reveal the faux pas that can make a massive difference to your profits.

Real results for winning practices

The overall winner, hitting the mark across all our criteria, was a young UK brand that’s rapidly expanding across Europe. This is in no small part thanks to its hyper-targeted email marketing strategy, which proved the perfect technique to win, serve and retain its customers.

This brand never missed an opportunity to send abandoned cart prompts, and it personalized subject lines and tailored content based on past activity and preferences. The company has made significant and commendable improvements for 2018, especially as it scored 0 for abandoned cart emails and segmentation in last year’s report, ranking in the mid-30s overall. What an achievement! Customers were made to feel valued and given a reason to keep coming back and remain loyal to the brand.

The brand has clearly implemented the winning practices outlined in Hitting the Mark 2017, allowing it to forge a powerful and compelling email marketing strategy. We’ve taken an in-depth look at the tactics that have inspired this epic turnaround, so you can get there too.

What do you need to do to top next year’s Hitting the Mark?

Read Hitting the Mark in full today to get the low-down on all our dos and don’ts that make up a fantastic email marketing campaign.

If you’re a dotmailer client, don’t forget to talk to your account manager for advice and tips on how to put these into action. Interested in how dotmailer can help your business hit the mark? Take a free tour of our platform at a time that suits you.


Reblogged 4 months ago from blog.dotmailer.com

The Moz 2016 Annual Report

Posted by SarahBird

I have a longstanding tradition of boring Moz readers with our exhaustive annual reports (2012, 2013, 2014, 2015).

If you’re avoiding sorting the recycling, going to the gym, or cleaning out your closet, I have got a *really* interesting post that needs your attention *right now*.

(Yeah. I know it’s March. But check this out, I had pneumonia in Jan/Feb so my life slid sideways for a while.)

Skip to your favorite parts:

Part 1: TL;DR

Part 2: Achievements unlocked

Part 3: Oh hai, elephant. Oh hai, room.

Part 4: More wood, fewer arrows

Part 5: Performance (metrics vomit)

Part 6: Inside Moz HQ

Part 7: Looking ahead


Part 1: TL;DR

We closed out 2016 with more customers and revenue than 2015. Our core SEO products are on a roll with frequent, impactful launches.

The year was not all butterflies and sunshine, though. Some of our initiatives failed to produce the results we needed. We made some tough calls (sunsetting some products and initiatives) and big changes (laying off a bunch of folks and reallocating resources). On a personal level, it was the most emotionally fraught time in my career.

Thank the gods, our hard work is paying off. Moz ended the year cashflow, EBITDA, and net income profitable (on a monthly basis), and with more can-do spirit than in years past. In fact, in the month of December we added a million dollars cash to the business.

We’re completely focused on our mission to simplify SEO for everyone through software, education, and community.


Part 2: Achievements unlocked

It blows my mind that we ended the year with over 36,000 customers from all over the world. We’ve got brands and agencies. We’ve got solopreneurs and Fortune 500s. We’ve got hundreds of thousands of people using the MozBar. A bunch of software companies integrate with our API. It’s humbling and awesome. We endeavor to be worthy of you!


We were very busy last year. The pace and quality of development has never been better. The achievements captured below don’t even come close to listing everything. How many of these initiatives did you know about?


Part 3: Oh hai, elephant. Oh hai, room.

When a few really awful things happen, it can overshadow the great stuff you experience. That makes this a particularly hard annual report to write. 2016 was undoubtedly the most emotionally challenging year I’ve experienced at Moz.

It became clear that some of our strategic hypotheses were wrong. Pulling the plug on those projects and asking people I care deeply about to leave the company was heartbreaking. That’s what happened in August 2016.


As Tolstoy wrote, “Happy products are all alike; every unhappy product is unhappy in its own way.” The hard stuff happened. Rehashing what went wrong deserves a couple chapters in a book, not a couple lines in a blog post. It shook us up hard.

And *yet*, I am determined not to let the hard stuff take away from the amazing, wonderful things we accomplished and experienced in 2016. There was a lot of good there, too.

Smarter people than me have said that progress doesn’t happen in a straight line; it zigs and zags. I’m proud of Mozzers; they rise to challenges. They lean into change and find the opportunity in it. They turn their compassion and determination up to 11. When the going gets tough, the tough get going.


I’ve learned a lot about Moz and myself over the last year. I’m taking all those learnings with me into the next phase of Moz’s growth. Onwards.


Part 4: More wood, fewer arrows

At the start of 2016, our hypothesis was that our customers and community would purchase several inbound marketing tools from Moz, including SEO, local SEO, social analytics, and content marketing. The upside was market expansion. The downside was fewer resources to go around, and a much more complex brand and acquisition funnel.

By trimming our product lines, we could reallocate resources to initiatives showing more growth potential. We also simplified our mission, brand, and acquisition funnel.

It feels really good to be focusing on what we love: search. We want to be the best place to learn and do SEO.

Whenever someone wonders how to get found in search, we want them to go to Moz first. We aspire to be the best in the world at the core pillars of SEO: rankings, keywords, site audit and optimization, links, and location data management.

SEO is dynamic and complex. By reducing our surface area, we can better achieve our goal of being the best. We’re putting more wood behind fewer arrows.



Part 5: Performance (metrics vomit)

Check out the infographic view of our data barf.

We ended the year at ~$42.6 million in gross revenue, amounting to ~12% annual growth. We had hoped for better at the start of the year. Moz Pro is still our economic engine, and Local drives new revenue and cashflow.


Gross profit margin increased a hair to 74%, despite Moz Local being a larger share of our overall business. Product-only gross profit margin is a smidge higher at 76%. Partner relationships generally drag down the profit margin on that product line.

Our Cost of Revenue (COR) went up in raw numbers from the previous year, but it didn’t increase as much as revenue.


Total Operating Expenses came to roughly $41 million. Excluding the cost of the restructure we initiated in August, the shape and scale of our major expenses have remained remarkably stable.


We landed at -$5.5 million in EBITDA, which was disappointingly below our plan. We were on target for our budgeted expenses. As we fell behind our revenue goals, it became clear we’d need to right-size our expenses to match the revenue reality. Hence, we made painful cuts.



I’m happy/relieved/overjoyed to report that we were EBITDA positive by September, cashflow positive by October, and net income positive by November. Words can’t express how completely terrible it would have been to go through what we all went through, and *not* have achieved our business goals.

My mind was blown when we actually added a million in cash in December. I couldn’t have dared to dream that… Ha ha! They won’t all be like that! It was the confluence of a bunch of stuff, but man, it felt good.



Part 6: Inside Moz HQ

Thanks to you, dear reader, we have a thriving and opinionated community of marketers. It’s a great privilege to host so many great exchanges of ideas. Education and community are integral to our mission. After all, we were a blog before we were a tech company. Traffic continues to climb and social keeps us busy. We love to hear from you!



We added a bunch of folks to the Moz Local, Moz.com, and Customer Success teams in the last half of the year. But our headcount is still lower than last year because we asked a lot of talented people to leave when we sunsetted a bunch of projects last August. We’re leaner, and gaining momentum.


Moz is deeply committed to making tech a more inclusive industry. My vision is for Moz to be a place where people are constantly learning and doing their best work. We took a slight step back on our gender diversity gains in 2016. Ugh. We’re not doing much hiring in 2017, so it’s going to be challenging to make substantial progress. We made a slight improvement in the ratio of underrepresented minorities working at Moz, which is a positive boost.


The tech industry has earned its reputation for being unwelcoming and myopic.

Mozzers work hard to make Moz a place where anyone could thrive. Moz isn’t perfect; we’re human and we screw up sometimes. But we pick ourselves up, dust off, and try again. We continue our partnership with Ada Academy, and we’ve deepened our relationship with Year Up. One of my particular passions is partnering with programs that expose girls and young women to STEM careers, such as Ignite Worldwide, Techbridge, and BigSisters.

I’m so proud of our charitable match program. We match Mozzer donations 150% up to $3k. Over the years, we’ve given over half a million dollars to charity; in 2016 alone, we gave $111,028 to charities. The ‘G’ in TAGFEE stands for ‘generous,’ and this is one of the ways we show it.


One of our most beloved employee benefits is paid, PAID vacation. We give every employee up to $3,000 to spend on his or her vacation. This year, we spent over half a million dollars exploring the world and sucking the marrow out of life.



Part 7: Looking ahead

Dear reader, I don’t have to tell you that search has been critical for a long time.

This juggernaut of a channel is becoming *even more* important with the proliferation of search interfaces and devices. Mobile liberated search from the desktop by bringing it into the physical world. Now, watches, home devices, and automobiles are making search ubiquitous. In a world of ambient search, SEO becomes even more important.

SEO is more complicated and dynamic than in years past because the number of human interfaces, response types, and ranking signals keeps increasing. We here at Moz are wild about the complexity. We sink our teeth into it. It drives our mission: Simplify SEO for everyone through software, education, and community.

We’re very excited about the feature and experience improvements coming ahead. Thank you, dear reader, for sharing your feedback, inspiring us, and cheering us on. We look forward to exploring the future of search together.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 1 year ago from tracking.feedpress.it

A first look @ the UK Direct Marketing Association (DMA) Email Tracking Report 2017

Sometimes a bit of that déjà vu feeling can be a giggle, but when repetition is costing you time and money, it’s a little less amusing.

We’ve just been given first access to The UK Direct Marketing Association (DMA) Email Tracking Report 2017, sponsored by dotmailer. Awesome statistics, actionable insights, and a little bit like something we read this time last year.

The good news is that email is still firmly seated atop the throne as consumers’ preferred marketing channel. In fact, according to the report, email use is still on the rise, with the DMA likening the act of checking your inbox to a routine as subconscious as brushing your teeth in the morning. That’s the kind of healthy recurrence we like to hear about.

And the bad news?

Email marketing is in danger of losing its crown. And the culprit? It’s not new laws, or poor technology, or pesky Millennials whining and moaning and taking everything for granted…

It’s relevancy.

Back in yesteryear, our Client Services Director Skip Fidura wrote a blog post to accompany the then newly published DMA Email Tracking Report 2015, detailing a call to action for the prioritization of relevant content in email marketing campaigns. 63% of consumers surveyed in 2016 had said: “Most of the marketing emails I receive include NO content or offers that are of interest to me.” Subject lines were generic, offers were universal, and content was characterless. We all took note; relevancy needed to be improved if email marketing was going to continue to top the charts.

So why are we now looking at a 5% rise in consumers who find our campaigns irrelevant? 68% now agree with the above statement, and 84% now find less than half of their emails ‘interesting or relevant’. And that’s if they even get to the campaign itself; the DMA Email Tracking Report 2017 reveals that only 6% of consumers opened and read all of their emails – 67% read fewer than half. This is hardly a surprising statistic when you consider that most feel there’s nothing of worth to them inside.

Email marketers are losing their customers’ trust because they’re not able to prove that they know how to engage with them. When you don’t feel like somebody knows you, you’re not going to open up, invite them in for a cup of tea, give them the nice biscuits with chocolate on. And if you feel like someone else is making a better attempt to get to know you, you’re going to turn your attention to them. It’s that simple.

Is it that we just don’t like “simple”?

Why is it that we haven’t yet cracked relevancy in email marketing campaigns, when we’ve got the tools at our fingertips? We can track consumer behavior; we can segment our contact lists by gender, location, and shoe size (if appropriate); we can test subject lines, copy, images and layout at the click of a button; and all of this data can be fed into a campaign that reaches Mrs. Smith when she gets home at 7pm on a Monday and starts surfing for size six shoes.

We need to start effectively using the tools that are available to us.

What can you do before the year is out?

Ask if you are relevant – When we released last year’s results, one of our clients took some of the key metrics, built them into a survey and asked her recipients the same questions. It would be inappropriate to share what she found but it was interesting to compare the responses from her recipients with those of the average consumer.

Pay your data some attention – If you have gaps in your database, create a campaign via email or on your website that seeks to gain a better picture of both your prospects and your customers. Alternatively, you can connect your email platform to your CRM or e-commerce software to get access to live customer data.

Strategize your segmentation – Dividing by two and hoping for the best hasn’t worked since school, so start thinking about what your brand can offer to different consumers, and create intelligent segments based on your results. Quizzes, competitions, and preference surveys are a great way to collect additional explicit data, on top of implicit data such as order history and behavioral data.

Think harder about context – You need to keep up to date with what’s happening with your different audiences. Got an internationally active brand? Make sure your content is going to be relevant to everyone – and don’t forget about delivering your emails at a time that the recipient is likely to be looking at email or at least be awake. dotmailer’s Send Time Optimization feature gets your message to your recipient’s inbox at the time that is best for them.

Split-test ‘til the cows come home – Performing a split-test is a brilliant way to find out what works for your brand in a time-effective manner. You can afford to be creative when your ideas are backed up by intelligent reporting.

What can you put in place for 2017?

Get better insight – dotmailer’s WebInsight tool lets you track prospects’ and customers’ online behaviors. You can then use the data you receive to send relevant, automated campaigns that “react” to your recipients’ most recent actions on your site.

Nurture your valued customers – With a tool like dotmailer’s OrderInsight, you can quickly and easily mine your customer data to identify your highest-scoring customers by frequency and value of purchase, as well as by product category. Then design a targeted appreciation campaign for your most valued segments in minutes using our drag-and-drop segmentation tool.

Get to know our friends – We’ve got the top pick of partners to boost campaign relevancy. Phrasee gives you insight into which subject lines will perform best for you, Sweet Tooth is the number one platform for points-based loyalty programs, and Movable Ink eats real-time relevancy for breakfast.

Want to get more of the most up-to-date data on the habits of email consumers? Book now for the 2017 DMA Email Tracking Report launch!

Reblogged 1 year ago from blog.dotmailer.com

Is Australia the land of opportunity for your retail brand?

Australia has a resident population of more than 24 million and, according to eMarketer, the country’s ecommerce sales are predicted to reach A$32.56 billion by 2017. The country’s remote location in the APAC region means that, unlike in European countries or the USA, there has traditionally been a lack of global brands sold locally.

Of course, we also know that many expatriates, particularly from inside the Commonwealth, have made Australia their home and are keen to buy products they know and love from their country of origin.

All of these factors present a huge and potentially lucrative opportunity for non-Australian brands wanting to open up their new and innovative products to a fresh market, or compete for market share.

But it’s not just non-Australian retailers who are at an advantage here: Australia was late to the ecommerce party because native, established brands were trading well without it. As a result, Australian retailers’ ecommerce technology stacks are much more recent and not burdened by legacy systems. This makes it much easier to extend, or get started with, best-of-breed technologies and cash in on a market that’s booming. To put some of this into perspective, Magento’s innovative ecommerce platform currently takes 42% of Australia’s market share, and the world’s first adopter of Magento 2.0 was an Australian brand.

The GST loophole

At the moment, local retailers are campaigning against a rule that exempts foreign websites from charging the 10% goods and services tax (GST) on purchases under A$1,000. And in 2013, Australian consumers made $3.11 billion worth of purchases under A$1,000.[1]

While the current GST break appears to put non-Australian retailers at an advantage, Australian-based brands such as Harvey Norman are using it to their advantage by setting up ecommerce operations in Asia to enjoy the GST benefit.

Australian consumers have also countered the argument by saying that price isn’t always the motivator when it comes to making purchasing decisions.

It’s not a place where no man has gone before

Often, concerns around meeting local compliance and lack of overseas business knowledge prevent outsiders from taking the leap into cross-border trade. However, this ecommerce passport, created by Ecommerce Worldwide and NORA, is designed to support those considering selling in Australia. The guide provides a comprehensive look into everything from the country’s economy and trade status, to logistics and dealing with international payments.

Global expansion success stories are also invaluable sources of information. For instance, it’s not just lower-end retailers that are fitting the bill, with brands like online luxury fashion retailer Net-a-Porter naming Australia as one of its biggest markets.

How tech-savvy are the Aussies?

One of the concerns you might have as a new entrant into the market is how you’ll reach and sell to your new audience, particularly without having a physical presence. The good news is that more than 80% of the country is digitally enabled and 60% of mobile phone users own a smartphone – so online is deeply rooted in the majority of Australians’ lives. [2]

Marketing your brand

Heard the saying “Fire bullets then fire cannonballs”? In any case, you’ll want to test the waters and gauge people’s reactions to your product or service.

It all starts with the website because, without it, you’re not discoverable or searchable, and you’ve nowhere to drive people to when running campaigns. SEO and SEM should definitely be a priority, and an online store that can handle multiple regions and storefronts, like Magento, will make your life easier. A mobile-first mentality and well thought-out UX will also place you in a good position.

Once your new web store is set up, you should be making every effort to collect visitors’ email addresses, perhaps via a popover. Why? Firstly, email is one of the top three priority areas for Australian retailers, because it’s a cost-effective, scalable marketing channel that enables true personalization.

Secondly, email marketing automation empowers you to deliver the customer experience today’s consumer expects, as well as enabling you to communicate with them throughout the lifecycle. Check out our ‘Do customer experience masters really exist?’ whitepaper for some real-life success stories.

Like the Magento platform, dotmailer is set up to handle multiple languages, regions and accounts, and is designed to grow with you.

In summary, there’s great scope for ecommerce success in Australia, whether you’re a native bricks-and-mortar retailer, a start-up or a non-Australian merchant. The barriers to cross-border trade are falling and Australia is one of APAC’s most developed regions in terms of purchasing power and tech savviness.

We recently worked with ecommerce expert Chloe Thomas to produce a whitepaper on cross-border trade, which goes into much more detail on how to market and sell successfully in new territories. You can download a free copy here.

[1] Australian Passport 2015: Cross-Border Trading Report

[2] Australian Passport 2015: Cross-Border Trading Report

Reblogged 2 years ago from blog.dotmailer.com

Google updates the AMP report in the Google Search Console

Five months after launching the AMP error report in the Google Search Console, Google has updated the report to make it easier to spot errors.


Please visit Search Engine Land for the full article.

Reblogged 2 years ago from feeds.searchengineland.com

Local Search Ranking Factors 2015

The 2015 Local Search Ranking Factors report is out, and it’s a must-read for anyone in the local SEO arena. As you may know, the survey polls roughly 40 leading local SEO practitioners on what they believe to be the variables most responsible for driving rankings in Google local search…

Please visit Search Engine Land for the full article.

Reblogged 3 years ago from feeds.searchengineland.com

Stop Ghost Spam in Google Analytics with One Filter

Posted by CarloSeo

The spam in Google Analytics (GA) is becoming a serious issue. Due to a deluge of referral spam from social buttons, adult sites, and many, many other sources, people are starting to become overwhelmed by all the filters they are setting up to manage the useless data they are receiving.

The good news is, there is no need to panic. In this post, I’m going to focus on the most common mistakes people make when fighting spam in GA, and explain an efficient way to prevent it.

But first, let’s make sure we understand how spam works. A couple of months ago, Jared Gardner wrote an excellent article explaining what referral spam is, including its intended purpose. He also pointed out some great examples of referral spam.

Types of spam

The spam in Google Analytics can be categorized into two types: ghosts and crawlers.

Ghosts

The vast majority of spam is this type. They are called ghosts because they never access your site. It is important to keep this in mind, as it’s key to creating a more efficient solution for managing spam.

As unusual as it sounds, this type of spam doesn’t have any interaction with your site at all. You may wonder how that is possible since one of the main purposes of GA is to track visits to our sites.

They do it by using the Measurement Protocol, which allows people to send data directly to Google Analytics’ servers. Using this method, and probably randomly generated tracking codes (UA-XXXXX-1) as well, the spammers leave a “visit” with fake data, without even knowing who they are hitting.
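To make that concrete, here is a minimal sketch (in Python, using the third-party requests library) of how a ghost hit is fabricated; the tracking ID, client ID, page path, and referrer are all placeholders. Notice that nothing in it ever touches the victim’s website – which is exactly why server-side defenses can’t see it coming:

import requests

# A ghost "pageview" is sent straight to Google Analytics' collection
# endpoint against a guessed or randomly generated tracking ID.
fake_hit = {
    "v": "1",             # Measurement Protocol version
    "tid": "UA-12345-1",  # placeholder: a tracking ID the spammer guessed
    "cid": "555",         # arbitrary client ID
    "t": "pageview",      # hit type
    "dp": "/free-traffic",                 # fabricated page path
    "dr": "http://spam-referrer.example",  # fabricated referrer
}
requests.post("https://www.google-analytics.com/collect", data=fake_hit)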

Crawlers

This type of spam, the opposite of ghost spam, does access your site. As the name implies, these spam bots crawl your pages, ignoring rules like those found in robots.txt that are supposed to stop them from reading your site. When they exit your site, they leave a record in your reports that appears similar to a legitimate visit.

Crawlers are harder to identify because they know their targets and use real data. But it is also true that new ones seldom appear. So if you detect a referral in your analytics that looks suspicious, researching it on Google or checking it against this list might help you answer the question of whether or not it is spammy.

Most common mistakes made when dealing with spam in GA

I’ve been following this issue closely for the last few months. According to the comments people have made on my articles and conversations I’ve found in discussion forums, there are primarily three mistakes people make when dealing with spam in Google Analytics.

Mistake #1. Blocking ghost spam from the .htaccess file

One of the biggest mistakes people make is trying to block Ghost Spam from the .htaccess file.

For those who are not familiar with this file, one of its main functions is to allow/block access to your site. Now we know that ghosts never reach your site, so adding them here won’t have any effect and will only add useless lines to your .htaccess file.

Ghost spam usually shows up for a few days and then disappears. As a result, sometimes people think that they successfully blocked it from here when really it’s just a coincidence of timing.

Then when the spammers later return, they get worried because the solution is not working anymore, and they think the spammer somehow bypassed the barriers they set up.

The truth is, the .htaccess file can only effectively block crawlers such as buttons-for-website.com and a few others since these access your site. Most of the spam can’t be blocked using this method, so there is no other option than using filters to exclude them.

Mistake #2. Using the referral exclusion list to stop spam

Another error is trying to use the referral exclusion list to stop the spam. The name may confuse you, but this list is not intended to exclude referrals in the way we want to for the spam. It has other purposes.

For example, when a customer buys something, sometimes they get redirected to a third-party page for payment. After making a payment, they’re redirected back to your website, and GA records that as a new referral. It is appropriate to use the referral exclusion list to prevent this from happening.

If you try to use the referral exclusion list to manage spam, however, the referral part will be stripped since there is no preexisting record. As a result, a direct visit will be recorded, and you will have a bigger problem than the one you started with: you will still have spam, and direct visits are harder to track.

Mistake #3. Worrying that bounce rate changes will affect rankings

When people see that the bounce rate changes drastically because of the spam, they start worrying about the impact that it will have on their rankings in the SERPs.


This is another mistake commonly made. With or without spam, Google doesn’t take into consideration Google Analytics metrics as a ranking factor. Here is an explanation about this from Matt Cutts, the former head of Google’s web spam team.

And if you think about it, Cutts’ explanation makes sense; because although many people have GA, not everyone uses it.

Assuming your site has been hacked

Another common concern when people see strange landing pages coming from spam on their reports is that they have been hacked.


The page that the spam shows on the reports doesn’t exist, and if you try to open it, you will get a 404 page. Your site hasn’t been compromised.

But you do have to make sure the page doesn’t exist, because there are cases (unrelated to spam) where sites suffer a security breach and get injected with pages full of bad keywords to defame the website.

What should you worry about?

Now that we’ve discarded security issues and their effects on rankings, the only thing left to worry about is your data. The fake trail that the spam leaves behind pollutes your reports.

It might have greater or lesser impact depending on your site traffic, but everyone is susceptible to the spam.

Small and midsize sites are the most easily impacted – not only because a big part of their traffic can be spam, but also because usually these sites are self-managed and sometimes don’t have the support of an analyst or a webmaster.

Big sites with a lot of traffic can also be impacted by spam, and although the impact can be insignificant, invalid traffic means inaccurate reports no matter the size of the website. As an analyst, you should be able to explain what’s going on even in the most granular reports.

You only need one filter to deal with ghost spam

Usually it is recommended to add the referral to an exclusion filter after it is spotted. Although this is useful for a quick action against the spam, it has three big disadvantages.

  • Making filters every week for every new spam referral detected is tedious and time-consuming, especially if you manage many sites. Plus, by the time you apply the filter and it starts working, you already have some affected data.
  • Some of the spammers use direct visits along with the referrals.
  • These direct hits won’t be stopped by the filter, so even if you are excluding the referral you will still be receiving invalid traffic – which explains why some people have seen an unusual spike in direct traffic.

Luckily, there is a good way to prevent all these problems. Most of the spam (the ghost type) works by hitting randomly generated GA tracking IDs, meaning the offender doesn’t really know who the target is; for that reason, either the hostname is not set or it uses a fake one. (See the report below.)

[Image: a hostname report polluted by ghost spam – hits with fake hostnames or no hostname set]

You can see that they use some weird names or don’t even bother to set one. Although there are some known names in the list, these can be easily added by the spammer.

On the other hand, valid traffic will always use a real hostname. In most cases, this will be the domain. But it can also result from paid services, translation services, or any other place where you’ve inserted your GA tracking code.


Based on this, we can make a filter that will include only hits that use real hostnames. This will automatically exclude all hits from ghost spam, whether it shows up as a referral, keyword, or pageview; or even as a direct visit.

To create this filter, you will need to find the report of hostnames. Here’s how:

  1. Go to the Reporting tab in GA
  2. Click on Audience in the lefthand panel
  3. Expand Technology and select Network
  4. At the top of the report, click on Hostname


You will see a list of all hostnames, including the ones that the spam uses. Make a list of all the valid hostnames you find, as follows:

  • yourmaindomain.com
  • blog.yourmaindomain.com
  • es.yourmaindomain.com
  • payingservice.com
  • translatetool.com
  • anotheruseddomain.com

For small to medium sites, this list of hostnames will likely consist of the main domain and a couple of subdomains. After you are sure you got all of them, create a regular expression similar to this one:

yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com

You don’t need to put all of your subdomains in the regular expression. The main domain will match all of them. If you don’t have a view set up without filters, create one now.
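Before pasting the expression into GA, you can sanity-check that it keeps the right hostnames – subdomains included – and drops the rest. Here’s a quick sketch in Python, whose re module treats a simple alternation like this the same way GA’s filter patterns do; the hostnames are the example ones from the list above:

import re

VALID_HOSTS = re.compile(
    r"yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com"
)

hostnames = [
    "blog.yourmaindomain.com",  # subdomain: already covered by the main domain
    "es.yourmaindomain.com",
    "translatetool.com",
    "darodar.com",              # a well-known ghost-spam hostname
    "(not set)",                # ghost hits often carry no hostname at all
]

for hostname in hostnames:
    verdict = "keep" if VALID_HOSTS.search(hostname) else "filter out"
    print(hostname, "->", verdict)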

Then create a Custom Filter.

Make sure you select INCLUDE, then select “Hostname” on the filter field, and copy your expression into the Filter Pattern box.


You might want to verify the filter before saving to check that everything is okay. Once you’re ready, set it to save, and apply the filter to all the views you want (except the view without filters).

This single filter will get rid of future occurrences of ghost spam that use invalid hostnames, and it doesn’t require much maintenance. But it’s important that every time you add your tracking code to any service, you add it to the end of the filter.
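If you manage many sites, you can also create the same filter programmatically instead of clicking through the UI for each account. Below is a rough sketch using the Google Analytics Management API (v3) via the google-api-python-client and google-auth libraries; the service account file, account ID, and hostname expression are placeholders you’d replace with your own:

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials: a service account with edit rights on the GA account.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.edit"],
)
analytics = build("analytics", "v3", credentials=creds)

# An INCLUDE filter on hostname, mirroring the one built in the UI above.
filter_body = {
    "name": "Include valid hostnames only",
    "type": "INCLUDE",
    "includeDetails": {
        "field": "PAGE_HOSTNAME",
        "matchType": "MATCHES",
        "expressionValue": r"yourmaindomain\.com|payingservice\.com|translatetool\.com",
        "caseSensitive": False,
    },
}
analytics.management().filters().insert(
    accountId="ACCOUNT_ID",  # placeholder: your numeric GA account ID
    body=filter_body,
).execute()
# To apply the filter to a view, you would then create a
# profileFilterLink for each view (omitted here).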

Now you should only need to take care of the crawler spam. Since crawlers access your site, you can block them by adding these lines to the .htaccess file:

## STOP REFERRER SPAM
# Requires mod_rewrite; make sure the rewrite engine is on.
RewriteEngine On
RewriteCond %{HTTP_REFERER} semalt\.com [NC,OR]
RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]
RewriteRule .* - [F]

It is important to note that this file is very sensitive, and misplacing a single character in it can bring down your entire site. Therefore, make sure you create a backup copy of your .htaccess file prior to editing it.

If you don’t feel comfortable messing around with your .htaccess file, you can alternatively make an expression with all the crawlers (for example, semalt\.com|buttons-for-website\.com) and add it to an exclude filter by Campaign Source.

Implement these combined solutions, and you will worry much less about spam contaminating your analytics data. This will have the added benefit of freeing up more time for you to spend actually analyzing your valid data.

After stopping spam, you can also get clean reports from the historical data by using the same expressions in an Advanced Segment to exclude all the spam.

Bonus resources to help you manage spam

If you still need more information to help you understand and deal with the spam on your GA reports, you can read my main article on the subject here: http://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/.


In closing, I am eager to hear your ideas on this serious issue. Please share them in the comments below.

(Editor’s Note: All images featured in this post were created by the author.)

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

Why Effective, Modern SEO Requires Technical, Creative, and Strategic Thinking – Whiteboard Friday

Posted by randfish

There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.

There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.

Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”

Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.

First off, some of the good points he did bring up.

One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”

Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.

And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.

You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.

But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.

First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.

So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.

Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.

Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and others are SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.

So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.

And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.

Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.

The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”

Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”

I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.

There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.

Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:

  • Good on-site experience
  • Writing good content
  • Getting others to acknowledge you as an authority
  • Rising in social popularity
  • Earning local relevance
  • Dealing with modern CMS systems (which he notes most modern CMS systems are SEO-friendly)

The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.

I feel like this list is super incomplete. Okay, I brainstormed just off the top of my head in the 10 minutes before we filmed this video a list. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and go try and be absolutely complete.

But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.

I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.

We have to be able to understand things like:

  • Content rendering and indexability
  • Crawl structure, internal links, JavaScript, Ajax. If something’s post-loading after the page and Google’s not able to index it, or there are links that are accessible via JavaScript or Ajax, maybe Google can’t necessarily see those or isn’t crawling them as effectively, or is crawling them, but isn’t assigning them as much link weight as they might be assigning other stuff, and you’ve made it tough to link to them externally, and so they can’t crawl it.
  • Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
  • Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
  • Proper error codes, downtime procedures, and not found pages. If your 404 pages turn out to all be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve made it a status code 200, or you’ve used a 404 code when you should have used a 410, which is a permanently removed, to be able to get it completely out of the indexes, as opposed to having Google revisit it and keep it in the index.

Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”

Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
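A few lines of code can catch these status-code mistakes before Google does, by checking what your error pages really return. Here’s a rough sketch in Python using the requests library; the URLs are placeholders for pages you know shouldn’t exist on your site:

import requests

# Pages that SHOULD return an error status. If they answer 200,
# Google may index them as real (thin, duplicate) pages -- a "soft 404."
should_be_errors = [
    "https://www.example.com/this-page-does-not-exist",
    "https://www.example.com/retired-product",
]

for url in should_be_errors:
    code = requests.get(url, allow_redirects=False).status_code
    if code == 200:
        print(url, "-> soft 404: returns 200 instead of an error")
    elif code in (404, 410):
        print(url, "-> OK, returns", code)
    elif code == 503:
        # 503 Service Unavailable is the "revisit later" code mentioned
        # above -- right for planned downtime, wrong as a permanent answer.
        print(url, "-> 503: fine for downtime only")
    else:
        print(url, "->", code)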

  • International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
  • Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics connecting those up and trying to see: Why did we go up or down? Did we have less pages being indexed, more pages being indexed, more pages getting traffic less, more keywords less?
  • Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.

Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. Inanchor doesn’t work anymore, never mind about that one.

I have to go faster, because we’re just going to run out of these. Like, come on. Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you, you have all sorts of errors, and you don’t know what they are.

  • Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
  • Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
  • Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
  • Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
  • Using APIs from services like AdWords or MozScape, Ahrefs or Majestic, SEMrush, or the Alchemy API. Those APIs can have powerful things that they can do for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there – there’s a rough sketch of the idea just after this list. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
  • Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
  • Diagnosing mobile friendliness issues
  • Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
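On the API point above, pulling metrics into a script or a spreadsheet really is a small job. Here’s a generic sketch; the endpoint, parameters, and key are hypothetical stand-ins, since every service (MozScape, Ahrefs, Majestic, and so on) has its own URL and auth scheme:

import csv
import requests

API_KEY = "YOUR_KEY_HERE"  # placeholder -- each service issues its own

# Hypothetical endpoint standing in for whichever link/keyword API you use.
resp = requests.get(
    "https://api.example-seo-tool.com/v1/url-metrics",
    params={"target": "moz.com", "key": API_KEY},
    timeout=30,
)
resp.raise_for_status()
data = resp.json()  # e.g. {"target": "moz.com", "linking_domains": 12345, ...}

# Dump the fields to a CSV you can open in Excel or pull into a Google Doc.
with open("metrics.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(data.keys()))
    writer.writeheader()
    writer.writerow(data)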

Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.

So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.

All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it