Meet Dan Morris, Executive Vice President, North America

  1. Why did you decide to come to dotmailer?

The top three reasons were People, Product and Opportunity. I met the people who make up our business and heard their stories from the past 18 years, learned about the platform and market-leading status they had built in the UK, and saw that I could add value with my U.S. high-growth business experience. I’ve been working with marketers, entrepreneurs and business owners for years across a series of different roles, and saw that I could apply what I’d learned from that and the start-up space to dotmailer’s U.S. operation. dotmailer has had clients in the U.S. for 12 years and we’re positioned to grow the user base of our powerful and easy-to-use platform significantly. I knew I could make a difference here, and what closed the deal for me was the people. Every single person I’ve met is deeply committed to the business, to the success of our customers and to making our solution simple and efficient. We’re a great group of passionate people and I’m proud to have joined the dotfamily.

Dan Morris, dotmailer’s EVP for North America in the new NYC office

  2. Tell us a bit about your new role

dotmailer has been in business and in this space for more than 18 years. We were a web agency, then a systems integrator, and we got into the email business that way, ultimately building the dotmailer platform thousands of people use daily. This means we know this space better than anyone, and we have solutions that align closely with our customers’ needs and are flexible enough to grow with them. My role is to take all that experience and the platform and grow our U.S. presence. My early focus has been on identifying the right team to execute our growth plans. We want to be the market leader in the U.S. in the next three years – just like we’ve done in the UK – so getting the right people in the right spots was critical. We quickly assessed the skills of the U.S. team and made the changes necessary to provide the right focus on customer success. Next, we set out to completely rebuild dotmailer’s commercial approach in the U.S. We simplified our offers to three bundles, so that pricing and what’s included in those bundles is transparent to our customers. We’ve heard great things about this already from clients and partners. We’re also increasing our resources on customer success and support. We’re intensely focused on ease of on-boarding, ease of use and speed of use. We consistently hear how easy and smooth a process it is to use dotmailer’s tools. That’s key for us – when you buy a dotmailer solution, we want to onboard you quickly and make sure you have all of your questions answered right away so that you can move right into using it. Customers are raving about this, so we know it’s working well.

  3. What early accomplishments are you most proud of from your dotmailer time so far?

I’ve been at dotmailer for eight months now and I’m really proud of all we’ve accomplished together. We spent a lot of time assessing where we needed to restructure and where we needed to invest. We made the changes we needed: we invested in our partner program, localized tech support and customer on-boarding, and added customer success team members. We have the right people in the right roles and it’s making a difference. We have a commercial approach that is clear, with the complete transparency that we wanted to provide our customers. We’ve got a more customer-focused approach and we’re on-boarding customers quickly so they’re up and running faster. We have happier customers than ever before, and that’s the key to everything we do.

  4. You’ve moved the U.S. team to a new office. Can you tell us why and a bit about the new space?

I thought it was very important to create a NY office space that was tied to our branding and other offices around the world, and that also had its own NY energy and culture for our team here – to foster collaboration and to have some fun. It was also important for us to have a flexible space where we could welcome customers, partners and resellers, and also hold classes and dotUniversity training sessions. I’m really grateful to the team who worked on the space because it really reflects our team and what we care about. At any given time, you’ll see a training session happening, the team collaborating, a customer dropping in to ask a few questions or a partner dropping in to work from here. We love our new NYC space.

We had a spectacular reception this week to celebrate the opening of this office with customers, partners and the dotmailer leadership team in attendance. Please take a look at the photos from our event on Facebook.

Guests and the team at dotmailer’s new NYC office warming party

  5. What did you learn from your days in the start-up space that you’re applying at dotmailer?

The start-up space is a great place to learn. You have to know where every dollar is going and coming from, so every choice you make needs to be backed up with a business case for that investment. You try lots of different things to see if they’ll work and you’re ready to turn those tactics up or down quickly based on an assessment of the results. You also learn that things don’t have to stay the way they are, and can change if you make them change. You always listen and learn – from customers, partners, industry veterans, advisors, etc. – to better understand what’s working and not working. dotmailer has been in business for 18 years now, so there are many great contributors across the business who know how things have worked and yet are always keen to keep improving. I am constantly in listening and learning mode so that I can understand all of the unique perspectives our team brings and what we need to act on.

  6. What are your plans for the U.S. and the sales function there?

On our path to being the market leader in the U.S., I’m focused on three things going forward. First, I want our customers to be truly happy. It’s already a big focus in the dotmailer organization, and we’re working hard to understand their challenges and goals so we can take product and service to the next level. Second, I want to create an even more robust program around partners and resellers, and to further build out our channel partnerships to continuously improve sales and customer service programs. We recently launched a certification program to ensure partners have all the training and resources they need to support our mutual customers. Third, we have an aggressive growth plan for the U.S. and I’m very focused on making sure our team is well trained, and that we remain thoughtful and measured as we take the steps to grow. We want to always keep an eye on what we’re known for – tools that are powerful and simple to use – and make sure everything else we offer remains accessible and valuable as we execute our growth plans.

  7. What are the most common questions that you get when speaking to a prospective customer?

The questions we usually get are around price, service level and flexibility. How much does dotmailer cost? How well are you going to look after my business? How will you integrate with my existing stack and with my plans for future growth? We now have three transparent bundle options, with specifics around what’s included published right on our website. We have introduced a customer success team that’s focused only on taking great care of our customers, and we’re hearing stories every day that tell me this is working. And we have all of the tools to support our customers as they grow and to integrate into their existing stacks – often integrating so well that you can use dotmailer from within Magento, Salesforce or Dynamics, for example.

  8. Can you tell us about the dotmailer differentiators you highlight when speaking to prospective customers that seem to really resonate?

In addition to the ones above – ease of use, speed of use and the ability to scale with you. With dotmailer’s tiered program, you can start with a lighter level of functionality and grow into more advanced functionality as you need it. The platform itself is so easy to use that most marketers are able to build campaigns in minutes that would have taken hours on other platforms. Our customer success team is also with you all the way if you ever want or need help. We’ve built a very powerful platform, we have a fantastic team offering personalized service as an extended part of your own team, and we’re ready to grow with you.

  9. How much time is your team on the road vs. in the office? Any road warrior tips to share?

I’ve spent a lot of time on the road; one year I attended 22 tradeshows! My top tip when flying is to be willing to give up your seat for families or groups once you’re at the airport gate, as you’ll often be rewarded with a better seat for helping the airline make the family or group happy. Win-win! Since joining dotmailer, I’m focused on being in the office and present for the team and customers as much as possible. I can usually be found in our new NYC office, where I spend a lot of time with our team in customer meetings, trainings and other hosted events, sales conversations or marketing meetings. I’m here to help the team, clients and partners succeed, and will always do my best to say yes! Once our prospective customers see how quickly and efficiently they can execute tasks with dotmailer solutions vs. their existing solutions, it’s a no-brainer for them. I love seeing and hearing their reactions.

  10. Tell us a bit about yourself – favorite sports team, favorite food, guilty pleasure, favorite band, favorite vacation spot?

I’m originally from Yorkshire in England, and grew up just outside York. I moved to the U.S. about seven years ago to join a very fast-growing startup; we took it from 5 to well over 300 people, which was a fantastic experience. I moved to NYC almost two years ago, and I love exploring this great city. There’s so much to see and do. Outside of dotmailer, my passion is cars, and I also enjoy skeet shooting, almost all types of music, and I love to travel – my goal is to get to India, Thailand, Australia and Japan in the near future.

Want to find out more about the dotfamily? Check out our recent post about Darren Hockley, Global Head of Support.

Reblogged 3 years ago from blog.dotmailer.com

How Much Has Link Building Changed in Recent Years?

Posted by Paddy_Moogan

I get asked this question a lot. It’s mainly asked by people who are considering buying my link building book and want to know whether it’s still up to date. This is understandable given that the first edition was published in February 2013 and our industry has a deserved reputation for always changing.

I find myself giving the same answer, even though I’ve been asked it probably dozens of times in the last two years—”not that much”. I don’t think this is solely due to the book itself standing the test of time, although I’ll happily take a bit of credit for that 🙂 I think it’s more a sign of our industry as a whole not changing as much as we’d like to think.

I started to question whether I was right, and honestly, it’s one of the reasons it has taken me over two years to release the second edition of the book.

So I posed this question to a group of friends not so long ago, some via email and some via a Facebook group. I was expecting to be called out by many of them because my position was that in reality, it hasn’t actually changed that much. The thing is, many of them agreed and the conversations ended with a pretty long thread with lots of insights. In this post, I’d like to share some of them, share what my position is and talk about what actually has changed.

My personal view

Link building hasn’t changed as much as we think it has.

The core principles of link building haven’t changed. The signals around link building have changed, but mainly around new machine learning developments that have indirectly affected what we do. One thing that has definitely changed is the mindset of SEOs (and now clients) towards link building.

I think the last big change to link building came in April 2012 when Penguin rolled out. This genuinely did change our industry and put to bed a few techniques that should never have worked so well in the first place.

Since then, we’ve seen some things change, but the core principles haven’t changed if you want to build a business that will be around for years to come and not run the risk of being hit by a link related Google update. For me, these principles are quite simple:

  • You need to deserve links – either an asset you create or your product
  • You need to put this asset in front of a relevant audience who have the ability to share it
  • You need consistency – one new asset every year is unlikely to cut it
  • Anything that scales is at risk

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years and it’s still going.

Let’s dive a bit deeper into all of this and I’ll talk about how this relates to link building.

The typical mindset for building links has changed

I think that most SEOs are coming round to the idea that you can’t get away with building low quality links any more, not if you want to build a sustainable, long-term business. Spammy link building still works in the short-term and I think it always will, but it’s much harder than it used to be to sustain websites that are built on spam. The approach is more “churn and burn” and spammers are happy to churn through lots of domains and just make a small profit on each one before moving onto another.

For everyone else, it’s all about the long-term and not putting client websites at risk.

This has led to many SEOs embracing different forms of link building and generally starting to use content as an asset when it comes to attracting links. A big part of me feels that it was actually Penguin in 2012 that drove the rise of content marketing amongst SEOs, but that’s a post for another day…! For today though, this goes some way towards explaining the trend we see below.

Slowly but surely, I’m seeing clients come to my company already knowing that low quality link building isn’t what they want. It’s taken a few years after Penguin for it to filter down to client / business owner level, but it’s definitely happening. This is a good thing but unfortunately, the main reason for this is that most of them have been burnt in the past by SEO companies who have built low quality links without giving thought to building good quality ones too.

I have no doubt that it’s this change in mindset which has led to trends like this:

The thing is, I don’t think this was by choice.

Let’s be honest. A lot of us used the kind of link building tactics that Google no longer like because they worked. I don’t think many SEOs were under the illusion that it was genuinely high quality stuff, but it worked and it was far less risky to do than it is today. Unless you were super-spammy, the low-quality links just worked.

Fast forward to a post-Penguin world, and things are far more risky. For me, it’s because of this that we see trends like the above. As an industry, we had the easiest link building methods taken away from us and we’re left with fewer options. One of the main options is content marketing which, if you do it right, can lead to good quality links and, importantly, the types of links you won’t be removing in the future. Get it wrong and you’ll lose budget and lose the trust of your boss or client in the power of content when it comes to link building.

There are still plenty of other methods to build links and sometimes we can forget this. Just look at this epic list from Jon Cooper. Even with this many tactics still available to us, it’s hard work. Way harder than it used to be.

My summary here is that as an industry, our mindset has shifted but it certainly wasn’t a voluntary shift. If the tactics that Penguin targeted still worked today, we’d still be using them.

A few other opinions…

“I definitely think too many people want the next easy win. As someone surfing the edge of what Google is bringing our way, here’s my general take—SEO, in broad strokes, is changing a lot, *but* any given change is more and more niche and impacts fewer people. What we’re seeing isn’t radical, sweeping changes that impact everyone, but a sort of modularization of SEO, where we each have to be aware of what impacts our given industries, verticals, etc.”

Dr. Pete

I don’t feel that techniques for acquiring links have changed that much. You can either earn them through content and outreach or you can just buy them. What has changed is the awareness of “link building” outside of the SEO community. This makes link building / content marketing much harder when pitching to journalists and even more difficult when pitching to bloggers.

Link building has to be more integrated with other channels and struggles to work in its own environment unless supported by brand, PR and social. Having other channels supporting your link development efforts also creates greater search signals and more opportunity to reach a bigger audience, which will drive a greater ROI.

Carl Hendy

SEO has grown up in terms of more mature staff and SEOs becoming more ingrained into businesses so there is a smarter (less pressure) approach. At the same time, SEO has become more integrated into marketing and has made marketing teams and decision makers more intelligent in strategies and not pushing for the quick win. I’m also seeing that companies who used to rely on SEO and building links have gone through IPOs and the need to build 1000s of links per quarter has rightly reduced.

Danny Denhard

Signals that surround link building have changed

There is no question about this one in my mind. I actually wrote about this last year in my previous blog post where I talked about signals such as anchor text and deep links changing over time.

Many of the people I asked felt the same; here are some quotes from them, split out by the type of signal.

Domain level link metrics

I think domain level links have become increasingly important compared with page level factors, i.e. you can get a whole site ranking well off the back of one insanely strong page, even with sub-optimal PageRank flow from that page to the rest of the site.

Phil Nottingham

I’d agree with Phil here and this is what I was getting at in my previous post on how I feel “deep links” will matter less over time. It’s not just about domain level links here, it’s just as much about the additional signals available for Google to use (more on that later).

Anchor text

I’ve never liked anchor text as a link signal. I mean, who actually uses exact match commercial keywords as anchor text on the web?

SEOs. 🙂

Sure, there will be natural links like this, but honestly, I struggle with the idea that it took Google so long to start turning down the dial on commercial anchor text as a ranking signal. They are starting to turn it down though, slowly but surely. Don’t get me wrong, it still matters and it still works. But like pure link spam, the bar is a lot lower now in terms of what constitutes too much.

Rand feels that they matter more than we’d expect and I’d mostly agree with this statement:

Exact match anchor text links still have more power than you’d expect—I think Google still hasn’t perfectly sorted what is “brand” or “branded query” from generics (i.e. they want to start ranking a new startup like meldhome.com for “Meld” if the site/brand gets popular, but they can’t quite tell the difference between that and https://moz.com/learn/seo/redirection getting a few manipulative links that say “redirect”)

Rand Fishkin

What I do struggle with, though, is that Google still haven’t figured this out and that short-term, commercial anchor text spam is still so effective – even for a short burst of time.

I don’t think link building as a concept has changed loads—but I think links as a signal have, mainly because of filters and penalties but I don’t see anywhere near the same level of impact from coverage anymore, even against 18 months ago.

Paul Rogers

New signals have been introduced

It isn’t just about established signals changing though, there are new signals too and I personally feel that this is where we’ve seen the most change in Google algorithms in recent years—going all the way back to Panda in 2011.

With Panda, we saw a new level of machine learning where it almost felt like Google had found a way of incorporating human reaction / feelings into their algorithms. They could then run this against a website and answer questions like the ones included in this post. Things such as:

  • “Would you be comfortable giving your credit card information to this site?”
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?”
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?”

It is a touch scary that Google was able to run machine learning against answers to questions like this and write an algorithm to predict the answers for any given page on the web. They have, though, and this was four years ago now.

Since then, they’ve made various moves to utilize machine learning and AI to build out new products and improve their search results. For me, this was one of the biggest changes, and it went pretty unnoticed by our industry – well, until Hummingbird came along. I feel pretty sure that we have Ray Kurzweil to thank for at least some of that.

There seems to be more weight on theme/topic related to sites, though it’s hard to tell if this is mostly link based or more user/usage data based. Google is doing a good job of ranking sites and pages that don’t earn the most links but do provide the most relevant/best answer. I have a feeling they use some combination of signals to say “people who perform searches like this seem to eventually wind up on this website—let’s rank it.” One of my favorite examples is the Audubon Society ranking for all sorts of birding-related searches with very poor keyword targeting, not great links, etc. I think user behavior patterns are stronger in the algo than they’ve ever been.

Rand Fishkin

Leading on from what Rand has said, it’s becoming more and more common to see search results that just don’t make sense if you look at the link metrics—but are a good result.

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years, and it’s still going.

Edit: since drafting this post, Tom Anthony released this excellent blog post on his views on the future of search and the shift to data-driven results. I’d recommend reading that as it approaches this whole area from a different perspective and I feel that an off-shoot of what Tom is talking about is the impact on link building.

You may be asking at this point, what does machine learning have to do with link building?

Everything. Because as strong as links are as a ranking signal, Google want more signals, and user signals are far, far harder to manipulate than established link signals. Yes, it can be done—I’ve seen it happen. There have even been a few public tests done. But it’s very hard to scale and I’d venture a guess that only the top 1% of spammers are capable of doing it, let alone maintaining it for a long period of time. When I think about the process for manipulation here, I actually think we go a step beyond spammers towards hackers and more clear-cut illegal activity.

For link building, this means that traditional methods of manipulating signals are going to become less and less effective as these user signals become stronger. For us as link builders, it means we can’t keep searching for that silver bullet or the next method of scaling link building just for an easy win. The fact is that scalable link building is always going to be at risk from penalization from Google—I don’t really want to live a life where I’m always worried about my clients being hit by the next update. Even if Google doesn’t catch up with a certain method, machine learning and user data mean that these methods may naturally become less effective and cost efficient over time.

There are of course other things, such as social signals, that have come into play. I certainly don’t feel like these are a strong ranking factor yet, but with deals like this one between Google and Twitter being signed, I wouldn’t be surprised if that ever-growing dataset is used at some point in organic results. The one advantage that Twitter has over Google is its breaking-news freshness. Twitter is still way quicker at breaking news than Google is—140 characters in a tweet are far quicker than Google News! Google know this, which is why I feel they’ve pulled this partnership back into existence after a couple of years apart.

There is another important point to remember here and it’s nicely summarised by Dr. Pete:

“At the same time, as new signals are introduced, these are layers, not replacements. People hear social signals or user signals or authorship and want it to be the link-killer, because they already fucked up link-building, but these are just layers on top of on-page and links and all of the other layers. As each layer is added, it can verify the layers that came before it, and what you need isn’t the magic signal but a combination of signals that generally matches what Google expects to see from real, strong entities. So, links still matter, but they matter in concert with other things, which basically means it’s getting more complicated and, frankly, a bit harder. Of course, no one wants to hear that.”

Dr. Pete

The core principles have not changed

This is the crux of everything for me. With all the changes listed above, the key is that the core principles around link building haven’t changed. I could even argue that Penguin didn’t change the core principles because the techniques that Penguin targeted should never have worked in the first place. I won’t argue this too much though because even Google advised website owners to build directory links at one time.

You need an asset

You need to give someone a reason to link to you. Many won’t do it out of the goodness of their heart! One of the most effective ways to do this is to develop a content asset and use this as your reason to make people care. Once you’ve made someone care, they’re more likely to share the content or link to it from somewhere.

You need to promote that asset to the right audience

I really dislike the stance that some marketers take when it comes to content promotion—build great content and links will come.

No. Sorry, but for the vast majority of us, that’s simply not true. The exceptions are people who skydive from space or have huge existing audiences to leverage.

You simply have to spend time promoting your content or your asset for it to get shares and links. It is hard work and sometimes you can spend a long time on it and get little return, but it’s important to keep working at it until you’re at a point where you have two things:

  • A big enough audience where you can almost guarantee at least some traffic to your new content along with some shares
  • Enough strong relationships with relevant websites that you can speak to when new content is published, with a good chance of them linking to it

Getting to this point is hard—but that’s kind of the point. There are various hacks you can use along the way but it will take time to get right.

You need consistency

Leading on from the previous point: it takes time and hard work to get links to your content—the types of links that stand the test of time and that you won’t be removing in 12 months’ time anyway! This means that you need to keep pushing content out and getting better each and every time. This isn’t to say you should just churn content out for the sake of it – far from it. I am saying that with each piece of content you create, you will learn to do at least one thing better the next time. Try to give yourself the leverage to do this.

Anything scalable is at risk

Scalable link building is exactly what Google has been trying to crack down on for the last few years. Penguin was the biggest move and hit some of the most scalable tactics we had at our disposal. When you scale something, you often lose some level of quality, which is exactly what Google doesn’t want when it comes to links. If you’re still relying on tactics that could fall into the scalable category, I think you need to be very careful and just look at the trend in the types of links Google has been penalizing to understand why.

The part Google plays in this

To finish up, I want to briefly talk about the part that Google plays in all of this and shaping the future they want for the web.

I’ve always tried to steer clear of arguments involving the idea that Google is actively pushing FUD into the community. I’ve preferred to concentrate more on things I can actually influence and change with my clients rather than what Google is telling us all to do.

However, for the purposes of this post, I want to talk about it.

General paranoia has increased. My bet is there are some companies out there carrying out zero specific linkbuilding activity through worry.

Dan Barker

Dan’s point is a very fair one and just a day or two after reading this in an email, I came across a page related to a client’s target audience that said:

“We are not publishing guest posts on SITE NAME any more. All previous guest posts are now deleted. For more information, see www.mattcutts.com/blog/guest-blogging/.”

I’ve reworded this as to not reveal the name of the site, but you get the point.

This is silly. Honestly, so silly. They are a good site, publish good content, and had good editorial standards. Yet they have ignored all of their own policies, hard work, and objectives to follow a blog post from Matt. I’m 100% confident that it wasn’t sites like this one that Matt was talking about in this blog post.

This is, of course, from the publishers’ angle rather than the link builders’ angle, but it does go to show the effect that statements from Google can have. Google know this so it does make sense for them to push out messages that make their jobs easier and suit their own objectives—why wouldn’t they? In a similar way, what did they do when they were struggling to classify at scale which links are bad vs. good and they didn’t have a big enough web spam team? They got us to do it for them 🙂

I’m mostly joking here, but you see the point.

The most recent infamous mobilegeddon update, discussed here by Dr. Pete, is another example of Google pushing out messages that ultimately scared a lot of people into action. Although, to be fair, I think that despite the apparent small impact so far, the broad message from Google is a very serious one.

Because of this, I think we need to remember that Google does have their own agenda and many shareholders to keep happy. I’m not in the camp of believing everything that Google puts out is FUD, but I’m much more sensitive and questioning of the messages now than I’ve ever been.

What do you think? I’d love to hear your feedback and thoughts in the comments.

Reblogged 4 years ago from tracking.feedpress.it

Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird

Posted by MarieHaynes

If you’re reading the Moz blog, then you probably have a decent understanding of Google and its algorithm changes. However, there is probably a good percentage of the Moz audience that is still confused about the effects that Panda, Penguin, and Hummingbird can have on your site. I did write a post last year about the main differences between Penguin and a manual unnatural links penalty, and if you haven’t read that, it’ll give you a good primer.

The point of this article is to explain very simply what each of these algorithms is meant to do. It is hopefully a good reference that you can point your clients to if you want to explain an algorithm change and not overwhelm them with technical details about 301s, canonicals, crawl errors, and other confusing SEO terminology.

What is an algorithm change?

First of all, let’s start by discussing the Google algorithm. It’s immensely complicated and continues to get more complicated as Google tries its best to provide searchers with the information that they need. When search engines were first created, early search marketers were able to easily find ways to make the search engine think that their client’s site was the one that should rank well. In some cases it was as simple as putting in some code on the website called a meta keywords tag. The meta keywords tag would tell search engines what the page was about.
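
For instance, a page hoping to rank for plumbing terms might have carried a tag like this in its HTML (a minimal, made-up illustration – the keywords and page are hypothetical):

    <meta name="keywords" content="plumber, orlando plumber, fix leaky faucet">

Because site owners could stuff anything they liked into it, this tag was abused so heavily that Google long ago stopped using it as a ranking signal.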

As Google evolved, its engineers, who were primarily focused on making the search engine results as relevant to users as possible, continued to work on ways to stop people from cheating, and looked at other ways to show the most relevant pages at the top of their searches. The algorithm now looks at hundreds of different factors. There are some that we know are significant, such as having a good descriptive title (between the <title></title> tags in the code). And there are many that are the subject of speculation, such as whether or not Google +1’s contribute to a site’s rankings.

In the past, the Google algorithm would change very infrequently. If your site was sitting at #1 for a certain keyword, it was guaranteed to stay there until the next update, which might not happen for weeks or months. Then, they would push out another update and things would change. They would stay that way until the next update happened. If you’re interested in reading about how Google used to push updates out of its index, you may find this Webmaster World forum thread from 2002 interesting. (Many thanks to Paul Macnamara for explaining to me how algo changes used to work on Google in the past and pointing me to the Webmaster World thread.)

This all changed with the launch of “Caffeine” in 2010. Since Caffeine launched, the search engine results have been changing several times a day rather than every few weeks. Google makes over 600 changes to its algorithm in a year, and the vast majority of these are not announced. But, when Google makes a really big change, they give it a name, usually make an announcement, and everyone in the SEO world goes crazy trying to figure out how to understand the changes and use them to their advantage.

Three of the biggest changes that have happened in the last few years are the Panda algorithm, the Penguin algorithm and Hummingbird.

What is the Panda algorithm?

Panda first launched on February 23, 2011. It was a big deal. The purpose of Panda was to try to show high-quality sites higher in search results and demote sites that may be of lower quality. This algorithm change was unnamed when it first came out, and many of us called it the “Farmer” update as it seemed to affect content farms. (Content farms are sites that aggregate information from many sources, often stealing that information from other sites, in order to create large numbers of pages with the sole purpose of ranking well in Google for many different keywords.) However, it affected a very large number of sites. The algorithm change was eventually officially named after one of its creators, Navneet Panda.

When Panda first happened, a lot of SEOs in forums thought that this algorithm was targeting sites with unnatural backlink patterns. However, it turns out that links are most likely not a part of the Panda algorithm. It is all about on-site quality.

In most cases, sites that were affected by Panda were hit quite hard. But, I have also seen sites take just a slight loss on the date of a Panda update. Panda tends to be a site-wide issue, which means that it doesn’t just demote certain pages of your site in the search engine results; instead, Google considers the entire site to be of lower quality. In some cases, though, Panda can affect just a section of a site, such as a news blog or one particular subdomain.

Whenever a Google employee is asked about what needs to be done to recover from Panda, they refer to a blog post by Google employee Amit Singhal that gives a checklist that you can use on your site to determine if your site really is high quality or not. Here is the list:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

Phew! That list is pretty overwhelming! These questions do not necessarily mean that Google tries to algorithmically figure out whether your articles are interesting or whether you have told both sides of a story. Rather, the questions are there because all of these factors can contribute to how real-life users would rate the quality of your site. No one really knows all of the factors that Google uses in determining the quality of your site through the eyes of Panda. Ultimately though, the focus is on creating the best site possible for your users.  It is also important that only your best stuff is given to Google to have in its index. There are a few factors that are widely accepted as important things to look at in regards to Panda:

Thin content

A “thin” page is a page that adds little or no value to someone who is reading it. It doesn’t necessarily mean that a page has to be a certain number of words, but quite often, pages with very few words are not super-helpful. If you have a large number of pages on your site that contain just one or two sentences and those pages are all included in the Google index, then the Panda algorithm may determine that the majority of your indexed pages are of low quality.

Having the odd thin page is not going to cause you to run into Panda problems. But, if a big enough portion of your site contains pages that are not helpful to users, then that is not good.
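
If you have thin pages that serve a purpose for visitors but add nothing to the index, one common remedy – my suggestion here, not something prescribed in this post – is to keep them out of Google’s index with a robots meta tag in the page’s <head>:

    <!-- Ask search engines not to index this page, but still follow its links -->
    <meta name="robots" content="noindex, follow">

Improving or consolidating the thin content is the more thorough fix, but noindexing stops those pages from dragging down the overall picture of your indexed site.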

Duplicate content

There are several ways that duplicate content can cause your site to be viewed as a low-quality site by the Panda algorithm. The first is when a site has a large amount of content that is copied from other sources on the web. Let’s say that you have a blog on your site and you populate that blog with articles that are taken from other sources. Google is pretty good at figuring out that you are not the creator of this content. If the algorithm can see that a large portion of your site is made up of content that exists on other sites then this can cause Panda to look at you unfavorably.

You can also run into problems with duplicated content on your own site. One example would be a site that has a large number of products for sale. Perhaps each product has a separate page for each color variation and size. But, all of these pages are essentially the same. If one product comes in 20 different colors and each of those comes in 6 different sizes, then that means you have 120 pages for the same product, all of which are almost identical. Now, imagine that you sell 4,000 products. This means that you’ve got almost half a million pages in the Google index when really 4,000 pages would suffice. In this type of situation, the fix for this problem is to use something called a canonical tag. Moz has got a really good guide on using canonical tags here, and Dr. Pete has also written this great article on canonical tag use.
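
As a quick sketch of how that works (the URLs below are invented for illustration), each color and size variation would point back to the main product page from its <head>:

    <!-- On http://www.example.com/product/widget?color=red&size=large -->
    <link rel="canonical" href="http://www.example.com/product/widget">

Google then treats all 120 variations as one page and consolidates their signals onto the canonical URL, so only that URL competes in the index.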

Low-quality content

When I write an article and publish it on one of my websites, the only type of information that I want to present to Google is information that is the absolute best of its kind. In the past, many SEOs have given advice to site owners saying that it was important to blog every day and make sure that you are always adding content for Google to index. But, if what you are producing is not high quality content, then you could be doing more harm than good. A lot of Amit Singhal’s questions listed above are asking whether the content on your site is valuable to readers. Let’s say that I have an SEO blog and every day I take a short blurb from each of the interesting SEO articles that I have read online and publish it as a blog post on my site. Is Google going to want to show searchers my summary of these articles, or would they rather show them the actual articles? Of course my summary is not going to be as valuable as the real thing! Now, let’s say that I have done this every day for 4 years. Now my site has over 4,000 pages that contain information that is not unique and not as valuable as other sites on the same topics.

Here is another example. Let’s say that I am a plumber. I’ve been told that I should blog regularly, so several times a week I write a 2-3 paragraph article on things like, “How to fix a leaky faucet” or “How to unclog a toilet.” But, I’m busy and don’t have much time to put into my website so each article I’ve written contains keywords in the title and a few times in the content, but the content is not in depth and is not that helpful to readers. If the majority of the pages on my site contain information that no one is engaging with, then this can be a sign of low quality in the eyes of the Panda algorithm.

There are other factors that probably play a role in the Panda algorithm. Glenn Gabe recently wrote an excellent article on his evaluation of sites affected by the most recent Panda update. His bullet-point list of things to improve upon when affected by Panda is extremely thorough.

How to recover from a Panda hit

Google refreshes the Panda algorithm approximately monthly. They used to announce whenever they were refreshing the algorithm, but now they only do this if there is a really big change to the Panda algorithm. What happens when the Panda algorithm refreshes is that Google takes a new look at each site on the web and determines whether or not it looks like a quality site in regards to the criteria that the Panda algorithm looks at. If your site was adversely affected by Panda and you have made changes such as removing thin and duplicate content then, when Panda refreshes, you should see that things improve. However, for some sites it can take a couple of Panda refreshes to see the full extent of the improvements. This is because it can sometimes take several months for Google to revisit all of your pages and recognize the changes that you have made.

Every now and then, instead of just refreshing the algorithm, Google does what they call an update. When an update happens, this means that Google has changed the criteria that they use to determine what is and isn’t considered high quality. On May 20, 2014, Google did a major update which they called Panda 4.0. This caused a lot of sites to see significant changes in regards to Panda:

Not all Panda recoveries are as dramatic as this one. But, if you have been affected by Panda and you work hard to make changes to your site, you really should see some improvement.

What is the Penguin algorithm?

The Penguin algorithm initially rolled out on April 24, 2012. The goal of Penguin is to reduce the trust that Google has in sites that have cheated by creating unnatural backlinks in order to gain an advantage in the Google results. While the primary focus of Penguin is on unnatural links, there can be other factors that can affect a site in the eyes of Penguin as well. Links, though, are known to be by far the most important thing to look at.

Why are links important?

A link is like a vote for your site. If a well respected site links to your site, then this is a recommendation for your site. If a small, unknown site links to you then this vote is not going to count for as much as a vote from an authoritative site. Still, if you can get a large number of these small votes, they really can make a difference. This is why, in the past, SEOs would try to get as many links as they could from any possible source.

Another thing that is important in the Google algorithms is anchor text. Anchor text is the text that is underlined in a link. So, in this link to a great SEO blog, the anchor text would be “SEO blog.” If Moz.com gets a number of sites linking to them using the anchor text “SEO blog,” that is a hint to Google that people searching for “SEO blog” probably want to see sites like Moz in their search results.
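
In HTML terms, the anchor text is simply the visible text between a link’s opening and closing tags:

    <!-- The anchor text of this link is "SEO blog" -->
    <a href="https://moz.com/blog">SEO blog</a>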

It’s not hard to see how people could manipulate this part of the algorithm. Let’s say that I am doing SEO for a landscaping company in Orlando. In the past, one of the ways that I could cheat the algorithm into thinking that my company should be ranked highly would be to create a bunch of self-made links and use anchor text in these links that contains phrases like Orlando Landscaping Company, Landscapers in Orlando and Orlando Landscaping. While an authoritative link from a well-respected site is good, what people discovered is that creating a large number of links from low-quality sites was quite effective. As such, what SEOs would do is create links from easy-to-get places like directory listings, self-made articles, and links in comments and forum posts.

While we don’t know exactly what factors the Penguin algorithm looks at, what we do know is that this type of low-quality, self-made link is what the algorithm is trying to detect. In my mind, the Penguin algorithm is sort of like Google putting a “trust factor” on your links. I used to tell people that Penguin could affect a site on a page or even a keyword level, but Google employee John Mueller has said several times now that Penguin is a sitewide algorithm. This means that if the Penguin algorithm determines that a large number of the links to your site are untrustworthy, then this reduces Google’s trust in your entire site. As such, the whole site will see a reduction in rankings.

While Penguin affected a lot of sites drastically, I have also seen many sites that saw only a small reduction in rankings. The difference, of course, depends on the amount of link manipulation that has been done.

How to recover from a Penguin hit

Penguin is a filter just like Panda. What that means is that the algorithm is re-run periodically and sites are re-evaluated with each re-run. At this point it is not run very often at all. The last update was October 4, 2013, which means that we have currently been waiting eight months for a new Penguin update. In order to recover from Penguin, you need to identify the unnatural links pointing to your site and either remove them or, if you can’t remove them, ask Google to no longer count them by using the disavow tool. Then, the next time that Penguin refreshes or updates, if you have done a good enough job at cleaning up your unnatural links, you will once again regain trust in Google’s eyes. In some cases, it can take a couple of refreshes for a site to completely escape Penguin, because it can take up to 6 months for a site’s disavow file to be completely processed.
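
For reference, a disavow file is just a plain text file uploaded through the disavow tool, with one URL or one domain: directive per line (the sites below are invented examples):

    # Spammy directory links the webmaster would not remove
    domain:cheap-seo-directory.example.com
    # A single paid link we could not get taken down
    http://blog.example.org/2012/05/orlando-landscaping-post/

Google will then ignore the listed links when assessing your site.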

If you are not certain how to identify which links to your site are unnatural, here are some good resources for you:

The disavow tool is something that you probably should only be using if you really understand how it works. It is potentially possible for you to do more harm than good to your site if you disavow the wrong links. Here is some information on using the disavow tool:

It’s important to note that when sites “recover” from Penguin, they often don’t skyrocket up to top rankings once again, as those previously high rankings were probably based on the power of links that are now considered unnatural. Here is some information on what to expect when you have recovered from a link-based penalty or algorithmic issue.

Also, the Penguin algorithm is not the same thing as a manual unnatural links penalty. You do not need to file a reconsideration request to recover from Penguin. You also do not need to document the work that you have done in order to get links removed, as no Google employee will be manually reviewing your work. As mentioned previously, here is more information on the difference between the Penguin algorithm and a manual unnatural links penalty.

What is Hummingbird?

Hummingbird is a completely different animal than Penguin or Panda. (Yeah, I know…that was a bad pun.) I will commonly get people emailing me telling me that Hummingbird destroyed their rankings. I would say that in almost every case that I have evaluated, this was not true. Google made their announcement about Hummingbird on September 26, 2013. However, at that time, they announced that Hummingbird had already been live for about a month. If the Hummingbird algorithm was truly responsible for catastrophic ranking fluctuations, then we really should have seen an outcry from the SEO world of something drastic happening in August of 2013, and this did not happen. There did seem to be some type of fluctuation around August 21, as reported here on Search Engine Roundtable, but there were not many sites that reported huge ranking changes on that day.

If you think that Hummingbird affected you, it’s not a bad idea to look at your traffic to see if you noticed a drop on October 4, 2013, which was actually a refresh of the Penguin algorithm. I believe that a lot of people who thought they were affected by Hummingbird were actually affected by Penguin, which happened just a week after Google made their announcement about Hummingbird.

There are some excellent articles on Hummingbird here and here. Hummingbird was a complete overhaul of the entire Google algorithm. As Danny Sullivan put it, if you consider the Google algorithm as an engine, Panda and Penguin are algorithm changes that were like putting a new part in the engine such as a filter or a fuel pump. But, Hummingbird wasn’t just a new part; it was a completely new engine. That new engine still makes use of many of the old parts (such as Panda and Penguin) but a good amount of the engine is completely original.

The goal of the Hummingbird algorithm is for Google to better understand a user’s query. Bill Slawski who writes about Google patents has a great example of this in his post here. He explains that when someone searches for “What is the best place to find and eat Chicago deep dish style pizza?”, Hummingbird is able to discern that by “place” the user likely would be interested in results that show “restaurants”. There is speculation that these changes were necessary in order for Google’s voice search to be more effective. When we’re typing a search query, we might type, “best Seattle SEO company” but when we’re speaking a query (i.e. via Google Glass or via Google Now) we’re more likely to say something like, “Which firm in Seattle offers the best SEO services?” The point of Hummingbird is to better understand what users mean when they have queries like this.

So how do I recover or improve in the eyes of Hummingbird?

If you read the posts referenced above, the answer to this question is essentially to create content that answers users’ queries rather than just trying to rank for a particular keyword. But really, this is what you should already be doing!

It appears that Google’s goal with all of these algorithm changes (Panda, Penguin and Hummingbird) is to encourage webmasters to publish content that is the best of its kind. Google’s goal is to deliver answers to people who are searching. If you can produce content that answers people’s questions, then you’re on the right track.

I know that that is a really vague answer when it comes to “recovering” from Hummingbird. Hummingbird really is different than Panda and Penguin. When a site has been demoted by the Panda or Penguin algorithm, it’s because Google has lost some trust in the site’s quality, whether it is on-site quality or the legitimacy of its backlinks. If you fix those quality issues you can regain the algorithm’s trust and subsequently see improvements. But, if your site seems to be doing poorly since the launch of Hummingbird, then there really isn’t a way to recover those keyword rankings that you once held. You can, however, get new traffic by finding ways to be more thorough and complete in what your website offers.

Do you have more questions?

My goal in writing this article was to have a resource to point people to when they had basic questions about Panda, Penguin and Hummingbird. Recently, when I published my penalty newsletter, I had a small business owner comment that it was very interesting but that most of it went over their head. I realized that many people outside of the SEO world are greatly affected by these algorithm changes, but don’t have much information on why they have affected their website.

Do you have more questions about Panda, Penguin or Hummingbird? If so, I’d be happy to address them in the comments. I also would love for those of you who are experienced with dealing with websites affected by these issues to comment as well.

Reblogged 5 years ago from feedproxy.google.com