Stop Ghost Spam in Google Analytics with One Filter

Posted by CarloSeo

The spam in Google Analytics (GA) is becoming a serious issue. Due to a deluge of referral spam from social buttons, adult sites, and many, many other sources, people are starting to become overwhelmed by all the filters they are setting up to manage the useless data they are receiving.

The good news is, there is no need to panic. In this post, I’m going to focus on the most common mistakes people make when fighting spam in GA, and explain an efficient way to prevent it.

But first, let’s make sure we understand how spam works. A couple of months ago, Jared Gardner wrote an excellent article explaining what referral spam is, including its intended purpose. He also pointed out some great examples of referral spam.

Types of spam

The spam in Google Analytics can be categorized into two types: ghosts and crawlers.

Ghosts

The vast majority of spam is this type. They are called ghosts because they never access your site. It is important to keep this in mind, as it’s key to creating a more efficient solution for managing spam.

As unusual as it sounds, this type of spam doesn’t have any interaction with your site at all. You may wonder how that is possible since one of the main purposes of GA is to track visits to our sites.

They do it by using the Measurement Protocol, which allows people to send data directly to Google Analytics’ servers. Using this method, and probably randomly generated tracking codes (UA-XXXXX-1) as well, the spammers leave a “visit” with fake data, without even knowing who they are hitting.
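
To make that concrete, here is a minimal sketch of the kind of hit the Measurement Protocol accepts, written in Python with the requests library; the tracking ID, client ID, page path, and referrer below are made up for illustration only:

import requests

# A single fake "pageview" sent straight to Google Analytics' collection
# endpoint -- the target website itself is never visited.
payload = {
    "v": "1",                      # Measurement Protocol version
    "tid": "UA-12345-1",           # tracking ID (spammers guess or generate these)
    "cid": "555",                  # arbitrary client ID
    "t": "pageview",               # hit type
    "dp": "/fake-page",            # fake page path
    "dr": "http://spam-referrer.example",  # fake referrer that shows up in reports
}
requests.post("https://www.google-analytics.com/collect", data=payload)

Nothing in that request ever touches the target site, which is why server-side defenses can’t see ghost spam.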

Crawlers

This type of spam, the opposite of ghost spam, does access your site. As the name implies, these spam bots crawl your pages, ignoring rules like those found in robots.txt that are supposed to stop them from reading your site. When they exit your site, they leave a record in your reports that looks similar to a legitimate visit.

Crawlers are harder to identify because they know their targets and use real data. But it is also true that new ones seldom appear. So if you detect a referral in your analytics that looks suspicious, researching it on Google or checking it against this list might help you answer the question of whether or not it is spammy.
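
If you keep your own list of confirmed crawler-spam domains, a small script can check new referrals against it. This is just a sketch, and the two domains below are only a sample (both show up again in the .htaccess section later), not the full list:

# Known crawler-spam domains -- extend this with the full list you maintain.
KNOWN_SPAM = {"semalt.com", "buttons-for-website.com"}

def looks_spammy(referrer_host):
    # Match the domain itself and any of its subdomains (e.g. semalt.semalt.com).
    return any(referrer_host == d or referrer_host.endswith("." + d) for d in KNOWN_SPAM)

print(looks_spammy("semalt.semalt.com"))   # True
print(looks_spammy("moz.com"))             # False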

Most common mistakes made when dealing with spam in GA

I’ve been following this issue closely for the last few months. According to the comments people have made on my articles and conversations I’ve found in discussion forums, there are primarily three mistakes people make when dealing with spam in Google Analytics.

Mistake #1. Blocking ghost spam from the .htaccess file

One of the biggest mistakes people make is trying to block Ghost Spam from the .htaccess file.

For those who are not familiar with this file, one of its main functions is to allow/block access to your site. Now we know that ghosts never reach your site, so adding them here won’t have any effect and will only add useless lines to your .htaccess file.

Ghost spam usually shows up for a few days and then disappears. As a result, sometimes people think that they successfully blocked it from here when really it’s just a coincidence of timing.

Then when the spam later returns, they get worried because the solution is no longer working, and they assume the spammer somehow bypassed the barriers they set up.

The truth is, the .htaccess file can only effectively block crawlers such as buttons-for-website.com and a few others since these access your site. Most of the spam can’t be blocked using this method, so there is no other option than using filters to exclude them.

Mistake #2. Using the referral exclusion list to stop spam

Another error is trying to use the referral exclusion list to stop the spam. The name may be confusing, but this list is not intended to exclude referrals in the way we need for spam; it has other purposes.

For example, when a customer buys something, sometimes they get redirected to a third-party page for payment. After making a payment, they’re redirected back to your website, and GA records that as a new referral. It is appropriate to use the referral exclusion list to prevent this from happening.

If you try to use the referral exclusion list to manage spam, however, the referral part will be stripped since there is no preexisting record. As a result, a direct visit will be recorded, and you will have a bigger problem than the one you started with: you will still have spam, and direct visits are harder to track.

Mistake #3. Worrying that bounce rate changes will affect rankings

When people see that the bounce rate changes drastically because of the spam, they start worrying about the impact that it will have on their rankings in the SERPs.


This is another mistake commonly made. With or without spam, Google doesn’t take into consideration Google Analytics metrics as a ranking factor. Here is an explanation about this from Matt Cutts, the former head of Google’s web spam team.

And if you think about it, Cutts’ explanation makes sense: although many people have GA installed, not everyone uses it.

Assuming your site has been hacked

Another common concern when people see strange landing pages coming from spam on their reports is that they have been hacked.


The page that the spam shows on the reports doesn’t exist, and if you try to open it, you will get a 404 page. Your site hasn’t been compromised.

But you do have to make sure the page doesn’t exist, because there are cases (unrelated to spam) where sites suffer a security breach and get injected with pages full of bad keywords to defame the website.

What should you worry about?

Now that we’ve discarded security issues and their effects on rankings, the only thing left to worry about is your data. The fake trail that the spam leaves behind pollutes your reports.

It might have greater or lesser impact depending on your site traffic, but everyone is susceptible to the spam.

Small and midsize sites are the most easily impacted – not only because a big part of their traffic can be spam, but also because usually these sites are self-managed and sometimes don’t have the support of an analyst or a webmaster.

Big sites with a lot of traffic can also be impacted by spam, and although the impact can be insignificant, invalid traffic means inaccurate reports no matter the size of the website. As an analyst, you should be able to explain what’s going on even in the most granular reports.

You only need one filter to deal with ghost spam

Usually it is recommended to add the referral to an exclusion filter after it is spotted. Although this is useful as a quick response to the spam, it has three big disadvantages.

  • Making filters every week for every new spam source is tedious and time-consuming, especially if you manage many sites.
  • By the time you apply the filter and it starts working, some of your data has already been affected.
  • Some spammers send direct visits along with the referrals. These direct hits won’t be stopped by a referral filter, so even if you exclude the referral you will still be receiving invalid traffic, which explains why some people have seen an unusual spike in direct traffic.

Luckily, there is a good way to prevent all these problems. Most of the spam (the ghost type) works by hitting randomly generated GA tracking IDs, meaning the offender doesn’t really know who the target is; for that reason, either the hostname is not set or a fake one is used. (See the report below.)

[Hostname report showing ghost spam]

You can see that they use some weird names or don’t even bother to set one. Although there are some known names in the list, these can be easily added by the spammer.

On the other hand, valid traffic will always use a real hostname. In most cases, this will be your domain. But it can also come from paid services, translation services, or any other place where you’ve inserted your GA tracking code.

[Hostname report showing valid hostnames]

Based on this, we can make a filter that will include only hits that use real hostnames. This will automatically exclude all hits from ghost spam, whether it shows up as a referral, keyword, or pageview; or even as a direct visit.

To create this filter, you will need to find the hostname report. Here’s how:

  1. Go to the Reporting tab in GA
  2. Click on Audience in the lefthand panel
  3. Expand Technology and select Network
  4. At the top of the report, click on Hostname


You will see a list of all hostnames, including the ones that the spam uses. Make a list of all the valid hostnames you find, as follows:

  • yourmaindomain.com
  • blog.yourmaindomain.com
  • es.yourmaindomain.com
  • payingservice.com
  • translatetool.com
  • anotheruseddomain.com

For small to medium sites, this list of hostnames will likely consist of the main domain and a couple of subdomains. After you are sure you got all of them, create a regular expression similar to this one:

yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com

You don’t need to put all of your subdomains in the regular expression. The main domain will match all of them. If you don’t have a view set up without filters, create one now.
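
Before pasting the expression into GA, it’s worth sanity-checking it against the hostnames from your report. Here is a minimal sketch in Python, reusing the placeholder hostnames from the list above:

import re

# The same expression you will paste into the GA filter.
valid_hosts = re.compile(
    r"yourmaindomain\.com|anotheruseddomain\.com|payingservice\.com|translatetool\.com"
)

for hostname in ["blog.yourmaindomain.com", "es.yourmaindomain.com",
                 "translatetool.com", "(not set)", "fake-ghost-host.com"]:
    # An INCLUDE filter keeps a hit when the pattern matches anywhere in the hostname.
    status = "keep" if valid_hosts.search(hostname) else "filtered out"
    print(f"{hostname}: {status}")

Because the pattern is unanchored, subdomains of yourmaindomain.com match without being listed separately, while unset or fake hostnames do not.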

Then create a Custom Filter.

Make sure you select INCLUDE, choose “Hostname” as the Filter Field, and copy your expression into the Filter Pattern box.


You might want to verify the filter before saving to check that everything is okay. Once you’re ready, save it, and apply the filter to all the views you want (except the view without filters).

This single filter will get rid of future occurrences of ghost spam that use invalid hostnames, and it doesn’t require much maintenance. But it’s important that every time you add your tracking code to a new service, you add that service’s hostname to the end of the filter expression.

Now you should only need to take care of the crawler spam. Since crawlers access your site, you can block them by adding these lines to the .htaccess file:

## STOP REFERRER SPAM
RewriteEngine On
RewriteCond %{HTTP_REFERER} semalt\.com [NC,OR]
RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]
RewriteRule .* - [F]

It is important to note that this file is very sensitive, and misplacing a single character in it can bring down your entire site. Therefore, make sure you create a backup copy of your .htaccess file prior to editing it.

If you don’t feel comfortable messing around with your .htaccess file, you can alternatively make an expression with all the crawlers and add it to an exclude filter by Campaign Source.
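
For example, based on the two crawlers blocked in the .htaccess snippet above, that exclude filter’s expression could look like this (extend it with any other crawler referrers you have confirmed):

semalt\.com|buttons-for-website\.com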

Implement these combined solutions, and you will worry much less about spam contaminating your analytics data. This will have the added benefit of freeing up more time for you to spend actually analyzing your valid data.

After stopping spam, you can also get clean reports from the historical data by using the same expressions in an Advanced Segment to exclude all the spam.

Bonus resources to help you manage spam

If you still need more information to help you understand and deal with the spam on your GA reports, you can read my main article on the subject here: http://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/.

Additional information on how to stop spam can be found at these URLs:

In closing, I am eager to hear your ideas on this serious issue. Please share them in the comments below.

(Editor’s Note: All images featured in this post were created by the author.)



The Linkbait Bump: How Viral Content Creates Long-Term Lift in Organic Traffic – Whiteboard Friday

Posted by randfish

A single fantastic (or “10x”) piece of content can lift a site’s traffic curves long beyond the popularity of that one piece. In today’s Whiteboard Friday, Rand talks about why those curves settle into a “new normal,” and how you can go about creating the content that drives that change.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about the linkbait bump, a classic phrase in the SEO world and almost a little dated. I think today we’re talking a little bit more about viral content and how high-quality content, content that really is the cornerstone of a brand or a website, can be an incredible and powerful driver of traffic, not just when it initially launches but over time.

So let’s take a look.

This is a classic linkbait bump, viral content bump analytics chart. I’m seeing over here my traffic and over here the different months of the year. You know, January, February, March, like I’m under a thousand. Maybe I’m at 500 visits or something, and then I have this big piece of viral content. It performs outstandingly well from a relative standpoint for my site. It gets 10,000 or more visits, drives a ton more people to my site, and then what happens is that that traffic falls back down. But the new normal down here, new normal is higher than the old normal was. So the new normal might be at 1,000, 1,500 or 2,000 visits whereas before I was at 500.

Why does this happen?

A lot of folks see an analytics chart like this, see examples of content that’s done this for websites, and they want to know: Why does this happen and how can I replicate that effect? The reasons why are it sort of feeds back into that viral loop or the flywheel, which we’ve talked about in previous Whiteboard Fridays, where essentially you start with a piece of content. That content does well, and then you have things like more social followers on your brand’s accounts. So now next time you go to amplify content or share content socially, you’re reaching more potential people. You have a bigger audience. You have more people who share your content because they’ve seen that that content performs well for them in social. So they want to find other content from you that might help their social accounts perform well.

You see more RSS and email subscribers because people see your interesting content and go, “Hey, I want to see when these guys produce something else.” You see more branded search traffic because people are looking specifically for content from you, not necessarily just around this viral piece, although that’s often a big part of it, but around other pieces as well, especially if you do a good job of exposing them to that additional content. You get more bookmark and type in traffic, more searchers biased by personalization because they’ve already visited your site. So now when they search and they’re logged into their accounts, they’re going to see your site ranking higher than they normally would otherwise, and you get an organic SEO lift from all the links and shares and engagement.

So there’s a ton of different factors that feed into this, and you kind of want to hit all of these things. If you have a piece of content that gets a lot of shares, a lot of links, but then doesn’t promote engagement, doesn’t get more people signing up, doesn’t get more people searching for your brand or searching for that content specifically, then it’s not going to have the same impact. Your traffic might fall further and more quickly.

How do you achieve this?

How do we get content that’s going to do this? Well, we’re going to talk through a number of things that we’ve talked about previously on Whiteboard Friday. But there are some additional ones as well. This isn’t just creating good content or creating high quality content, it’s creating a particular kind of content. So for this what you want is a deep understanding, not necessarily of what your standard users or standard customers are interested in, but a deep understanding of what influencers in your niche will share and promote and why they do that.

This often means that you follow a lot of sharers and influencers in your field, and you understand, hey, they’re all sharing X piece of content. Why? Oh, because it does this, because it makes them look good, because it helps their authority in the field, because it provides a lot of value to their followers, because they know it’s going to get a lot of retweets and shares and traffic. Whatever that because is, you have to have a deep understanding of it in order to have success with viral kinds of content.

Next, you want to have empathy for users and what will give them the best possible experience. So if you know, for example, that a lot of people are coming on mobile and are going to be sharing on mobile, which is true of almost all viral content today, FYI, you need to be providing a great mobile and desktop experience. Oftentimes that mobile experience has to be different, not just responsive design, but actually a different format, a different way of being able to scroll through or watch or see or experience that content.

There are some good examples out there of content that does that. It makes a very different user experience based on the browser or the device you’re using.

You also need to be aware of what will turn them off. So promotional messages, pop-ups, trying to sell to them, oftentimes that diminishes user experience. It means that content that could have been more viral, that could have gotten more shares won’t.

Unique value and attributes that separate your content from everything else in the field. So if there’s like ABCD and whoa, what’s that? That’s very unique. That stands out from the crowd. That provides a different form of value in a different way than what everyone else is doing. That uniqueness is often a big reason why content spreads virally, why it gets more shared than just the normal stuff.

I’ve talked about this a number of times, but content that’s 10X better than what the competition provides. So unique value from the competition, but also quality that is not just a step up, but 10X better, massively, massively better than what else you can get out there. That makes it unique enough. That makes it stand out from the crowd, and that’s a very hard thing to do, but that’s why this is so rare and so valuable.

This is a critical one, and I think one that, I’ll just say, many organizations fail at. That is the freedom and support to fail many times, to try to create these types of effects, to have this impact many times before you hit on a success. A lot of managers and clients and teams and execs just don’t give marketing teams and content teams the freedom to say, “Yeah, you know what? You spent a month and developer resources and designer resources and spent some money to go do some research and contracted with this third party, and it wasn’t a hit. It didn’t work. We didn’t get the viral content bump. It just kind of did okay. You know what? We believe in you. You’ve got a lot of chances. You should try this another 9 or 10 times before we throw it out. We really want to have a success here.”

That is something that very few teams invest in. The powerful thing is because so few people are willing to invest that way, the ones that do, the ones that believe in this, the ones that invest long term, the ones that are willing to take those failures are going to have a much better shot at success, and they can stand out from the crowd. They can get these bumps. It’s powerful.

Not a requirement, but it really, really helps to have a strong engaged community, either on your site and around your brand, or at least in your niche and your topic area that will help, that wants to see you, your brand, your content succeed. If you’re in a space that has no community, I would work on building one, even if it’s very small. We’re not talking about building a community of thousands or tens of thousands. A community of 100 people, a community of 50 people even can be powerful enough to help content get that catalyst, that first bump that’ll boost it into viral potential.

Then finally, for this type of content, you need to have a logical and not overly promotional match between your brand and the content itself. You can see many sites in what I call sketchy niches. So like a criminal law site or a casino site or a pharmaceutical site that’s offering like an interactive musical experience widget, and you’re like, “Why in the world is this brand promoting this content? Why did they even make it? How does that match up with what they do? Oh, it’s clearly just intentionally promotional.”

Look, many of these brands go out there and they say, “Hey, the average web user doesn’t know and doesn’t care.” I agree. But the average web user is not an influencer. Influencers know. Well, they’re very, very suspicious of why content is being produced and promoted, and they’re very skeptical of promoting content that they don’t think is altruistic. So this kills a lot of content for brands that try and invest in it when there’s no match. So I think you really need that.

Now, when you do these linkbait bump kinds of things, I would strongly recommend that you follow up, that you consider the quality of the content that you’re producing. Thereafter, that you invest in reproducing these resources, keeping those resources updated, and that you don’t simply give up on content production after this. However, if you’re a small business site, a small or medium business, you might think about only doing one or two of these a year. If you are a heavy content player, you’re doing a lot of content marketing, content marketing is how you’re investing in web traffic, I’d probably be considering these weekly or monthly at the least.

All right, everyone. Look forward to your experiences with the linkbait bump, and I will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



The Inbound Marketing Economy

Posted by KelseyLibert

When it comes to job availability and security, the future looks bright for inbound marketers.

The Bureau of Labor Statistics (BLS) projects that employment for marketing managers will grow by 13% between 2012 and 2022. Job security for marketing managers also looks positive according to the BLS, which cites that marketing employees are less likely to be laid off since marketing drives revenue for most businesses.

While the BLS provides growth estimates for managerial-level marketing roles, these projections don’t give much insight into the growth of digital marketing, specifically the disciplines within digital marketing. As we know, “marketing” can refer to a variety of different specializations and methodologies. Since digital marketing is still relatively new compared to other fields, there is not much comprehensive research on job growth and trends in our industry.

To gain a better understanding of the current state of digital marketing careers, Fractl teamed up with Moz to identify which skills and roles are the most in demand and which states have the greatest concentration of jobs.

Methodology

We analyzed 75,315 job listings posted on Indeed.com during June 2015 based on data gathered from job ads containing the following terms:

  • “content marketing” or “content strategy”
  • “SEO” or “search engine marketing”
  • “social media marketing” or “social media management”
  • “inbound marketing” or “digital marketing”
  • “PPC” (pay-per-click)
  • “Google Analytics”

We chose the above keywords based on their likelihood to return results that were marketing-focused roles (for example, just searching for “social media” may return a lot of jobs that are not primarily marketing focused, such as customer service). The occurrence of each of these terms in job listings was quantified and segmented by state. We then combined the job listing data with U.S. Census Bureau population estimates to calculate the jobs per capita for each keyword, giving us the states with the greatest concentration of jobs for a given search query.
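
To illustrate that last step, jobs per capita is simply the listing count scaled to a common population base (per 100,000 residents, as used in the rankings below). The numbers in this sketch are hypothetical, not taken from the study:

def jobs_per_capita(listing_count, state_population, per=100_000):
    # Listings per 100,000 residents, so states of different sizes are comparable.
    return listing_count / state_population * per

# Hypothetical example: 800 listings in a state of 6.7 million people.
print(round(jobs_per_capita(800, 6_700_000), 1))  # ~11.9 jobs per 100,000 residents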

Using the same data, we identified which job titles appeared most frequently. We used existing data from Indeed to determine job trends and average salaries. LinkedIn search results were also used to identify keyword growth in user profiles.

Marketing skills are in high demand, but talent is hard to find

As the marketing industry continues to evolve due to emerging technology and marketing platforms, marketers are expected to pick up new skills and broaden their knowledge more quickly than ever before. Many believe this rapid rate of change has caused a marketing skills gap, making it difficult to find candidates with the technical, creative, and business proficiencies needed to succeed in digital marketing.

The ability to combine analytical thinking with creative execution is highly desirable and necessary in today’s marketing landscape. According to an article in The Guardian, “Companies will increasingly look for rounded individuals who can combine analytical rigor with the ability to apply this knowledge in a practical and creative context.” Being both detail-oriented and a big picture thinker is also a sought-after combination of attributes. A report by The Economist and Marketo found that “CMOs want people with the ability to grasp and manage the details (in data, technology, and marketing operations) combined with a view of the strategic big picture.”

But well-rounded marketers are hard to come by. In a study conducted by Bullhorn, 64% of recruiters reported a shortage of skilled candidates for available marketing roles. Wanted Analytics recently found that one of the biggest national talent shortages is for marketing manager roles, with only two available candidates per job opening.

Increase in marketers listing skills in content marketing, inbound marketing, and social media on LinkedIn profiles

While recruiter frustrations may indicate a shallow talent pool, LinkedIn tells a different story—the number of U.S.-based marketers who identify themselves as having digital marketing skills is on the rise. Using data tracked by Rand and LinkedIn, we found the following increases of marketing keywords within user profiles.

growth of marketing keywords in linkedin profiles

The number of profiles containing “content marketing” has seen the largest growth, with a 168% increase since 2013. “Social media” has also seen significant growth with a 137% increase. “Social media” appears on a significantly higher volume of profiles than the other keywords, with more than 2.2 million profiles containing some mention of social media. Although “SEO” has not seen as much growth as the other keywords, it still has the second-highest volume with it appearing in 630,717 profiles.

Why is there a growing number of people self-identifying as having the marketing skills recruiters want, yet recruiters think there is a lack of talent?

While there may be a lot of specialists out there, perhaps recruiters are struggling to fill marketing roles due to a lack of generalists or even a lack of specialists with surface-level knowledge of other areas of digital marketing (also known as a T-shaped marketer).

Popular job listings show a need for marketers to diversify their skill set

The job listings we analyzed confirm this, as the 20 most common digital marketing-related job titles being advertised call for a broad mix of skills.

20 most common marketing job titles

It’s no wonder that marketing manager roles are hard to fill, considering the job ads are looking for proficiency in a wide range of marketing disciplines including social media marketing, SEO, PPC, content marketing, Google Analytics, and digital marketing. Even job descriptions for specialist roles tend to call for skills in other disciplines. A particular role such as SEO Specialist may call for several skills other than SEO, such as PPC, content marketing, and Google Analytics.

Taking a more granular look at job titles, the chart below shows the five most common titles for each search query. One might expect mostly specialist roles to appear here, but there is a high occurrence of generalist positions, such as Digital Marketing Manager and Marketing Manager.

5 most common job titles by search query

Only one job title containing “SEO” cracked the top five. This indicates that SEO knowledge is a desirable skill within other roles, such as general digital marketing and development.

Recruiter was the third most common job title among job listings containing social media keywords, which suggests a need for social media skills in non-marketing roles.

Similar to what we saw with SEO job titles, only one job title specific to PPC (Paid Search Specialist) made it into the top job titles. PPC skills are becoming necessary for more general marketing roles, such as Marketing Manager and Digital Marketing Specialist.

Across all search queries, the most common jobs advertised call for a broad mix of skills. This tells us hiring managers are on the hunt for well-rounded candidates with a diverse range of marketing skills, as opposed to candidates with expertise in one area.

Marketers who cultivate diverse skill sets are better poised to gain an advantage over other job seekers, excel in their job role, and accelerate career growth. Jason Miller says it best in his piece about the new breed hybrid marketer:

future of marketing quote linkedin

Inbound job demand and growth: Most-wanted skills and fastest-growing jobs

Using data from Indeed, we identified which inbound skills have the highest demand and which jobs are seeing the most growth. Social media keywords claim the largest volume of results out of the terms we searched for during June 2015.

number of marketing job listings by keyword

“Social media marketing” or “social media management” appeared the most frequently in the job postings we analyzed, with 46.7% containing these keywords. “PPC” returned the smallest number of results, with only 3.8% of listings containing this term.

Perhaps this is due to social media becoming a more necessary skill across many industries and not only a necessity for marketers (for example, social media’s role in customer service and recruitment). On the other hand, job roles calling for PPC or SEO skills are most likely marketing-focused. The prevalence of social media jobs also may indicate that social media has gained wide acceptance as a necessary part of a marketing strategy. Additionally, social media skills are less valuable compared to other marketing skills, making it cheaper to hire for these positions (we will explore this further in the average salaries section below).

Our search results also included a high volume of jobs containing “digital marketing” and “SEO” keywords, which made up 19.5% and 15.5% respectively. At 5.8%, “content marketing” had the lowest search volume after “PPC.”

Digital marketing, social media, and content marketing experienced the most job growth

While the number of job listings tells us which skills are most in demand today, looking at which jobs are seeing the most growth can give insight into shifting demands.

digital marketing growth on  indeed.com

Digital marketing job listings have seen substantial growth since 2009, when it accounted for less than 0.1% of Indeed.com search results. In January 2015, this number had climbed to nearly 0.3%.

social media job growth on indeed.com

While social media marketing jobs have seen some uneven growth, as of January 2015 more than 0.1% of all job listings on Indeed.com contained the term “social media marketing” or “social media management.” This shows a significant upward trend considering this number was around 0.05% for most of 2014. It’s also worth noting that “social media” is currently ranked No. 10 on Indeed’s list of top job trends.

content marketing job growth on indeed.com

Despite its growth from 0.02% to nearly 0.09% of search volume in the last four years, “content marketing” does not make up a large volume of job postings compared to “digital marketing” or “social media.” In fact, “SEO” has seen a decrease in growth but still constitutes a higher percentage of job listings than content marketing.

SEO, PPC, and Google Analytics job growth has slowed down

On the other hand, search volume on Indeed has either decreased or plateaued for “SEO,” “PPC,” and “Google Analytics.”

seo job growth on indeed.com

As we see in the graph, the volume of “SEO” job listings peaked between 2011 and 2012. This is also around the time content marketing began gaining popularity, thanks to the Panda and Penguin updates. The decrease may be explained by companies moving their marketing budgets away from SEO and toward content or social media positions. However, “SEO” still accounts for a significant number of job listings, appearing in more than 0.2% of listings on Indeed as of 2015.

ppc job growth on indeed.com

“PPC” has seen the most staggered growth among all the search terms we analyzed, with its peak of nearly 0.1% happening between 2012 and 2013. As of January of this year, search volume was below 0.05% for “PPC.”

google analytics job growth on indeed.com

Despite a lack of growth, the need for this skill remains steady. Between 2008 and 2009, “Google Analytics” job ads saw a huge spike on Indeed. Since then, the search volume has tapered off and plateaued through January 2015.

Most valuable skills are SEO, digital marketing, and Google Analytics

So we know the number of social media, digital marketing, and content marketing jobs are on the rise. But which skills are worth the most? We looked at the average salaries based on keywords and estimates from Indeed and salaries listed in job ads.

national average marketing salaries

Job titles containing “SEO” had an average salary of $102,000. Meanwhile, job titles containing “social media marketing” had an average salary of $51,000. Considering such a large percentage of the job listings we analyzed contained “social media” keywords, there is a much larger pool of jobs; therefore, a lot of entry level social media jobs or internships are probably bringing down the average salary.

Job titles containing “Google Analytics” had the second-highest average salary at $82,000, but this should be taken with a grain of salt considering “Google Analytics” will rarely appear as part of a job title. The chart below, which shows average salaries for jobs containing keywords anywhere in the listing as opposed to only in the title, gives a more accurate idea of how much “Google Analytics” job roles earn on average.

national salary averages marketing keywords

Looking at the average salaries based on keywords that appeared anywhere within the job listing (job title, job description, etc.) shows a slightly different picture. Based on this, jobs containing “digital marketing” or “inbound marketing” had the highest average salary of $84,000. “SEO” and “Google Analytics” are tied for second with $76,000 as the average salary.

“Social media marketing” takes the bottom spot with an average salary of $57,000. However, notice that there is a higher average salary for jobs that contain “social media” within the job listing as opposed to jobs that contain “social media” within the title. This suggests that social media skills may be more valuable when combined with other responsibilities and skills, whereas a strictly social media job, such as Social Media Manager or Social Media Specialist, does not earn as much.

Massachusetts, New York, and California have the most career opportunities for inbound marketers

Looking for a new job? Maybe it’s time to pack your bags for Boston.

Massachusetts led the U.S. with the most jobs per capita for digital marketing, content marketing, SEO, and Google Analytics. New York took the top spot for social media jobs per capita, while Utah had the highest concentration of PPC jobs. California ranked in the top three for digital marketing, content marketing, social media, and Google Analytics. Illinois appeared in the top 10 for every term and usually ranked within the top five. Most of the states with the highest job concentrations are in the Northeast, West, and East Coast, with a few exceptions such as Illinois and Minnesota.

But you don’t necessarily have to move to a new state to increase the odds of landing an inbound marketing job. Some unexpected states also made the cut, with Connecticut and Vermont ranking within the top 10 for several keywords.

concentration of digital marketing jobs

marketing jobs per capita

Job listings containing “digital marketing” or “inbound marketing” were most prevalent in Massachusetts, New York, Illinois, and California, which is most likely due to these states being home to major cities where marketing agencies and large brands are headquartered or have a presence. You will notice these four states make an appearance in the top 10 for every other search query and usually rank close to the top of the list.

More surprising to find in the top 10 were smaller states such as Connecticut and Vermont. Many major organizations are headquartered in Connecticut, which may be driving the state’s need for digital marketing talent. Vermont’s high-tech industry growth may explain its high concentration of digital marketing jobs.

content marketing job concentration

per capita content marketing jobs

Although content marketing jobs are growing, there is still a low overall volume of available jobs, as shown by the low jobs per capita compared to most of the other search queries. With more than three jobs per capita, Massachusetts and New York topped the list for the highest concentration of job listings containing “content marketing” or “content strategy.” California and Illinois rank third and fourth with 2.8 and 2.1 jobs per capita respectively.

seo job concentration

seo jobs per capita

Again, Massachusetts and New York took the top spots, each with more than eight SEO jobs per capita. Utah took third place for the highest concentration of SEO jobs. Surprised to see Utah rank in the top 10? Its inclusion on this list and others may be due to its booming tech startup scene, which has earned the metropolitan areas of Salt Lake City, Provo, and Park City the nickname Silicon Slopes.

social media job concentration

social media jobs per capita

Compared to the other keywords, “social media” sees a much higher concentration of jobs. New York dominates the rankings with nearly 24 social media jobs per capita. The other top contenders of California, Massachusetts, and Illinois all have more than 15 social media jobs per capita.

The numbers at the bottom of this list can give you an idea of how prevalent social media jobs were compared to any other keyword we analyzed. Minnesota’s 12.1 jobs per capita, the lowest ranking state in the top 10 for social media, trumps even the highest ranking state for any other keyword (11.5 digital marketing jobs per capita in Massachusetts).

ppc job concentration

ppc jobs per capita

Due to its low overall number of available jobs, “PPC” sees the lowest jobs per capita out of all the search queries. Utah has the highest concentration of jobs with just two PPC jobs per 100,000 residents. It is also the only state in the top 10 to crack two jobs per capita.

google analytics job concentration

google analytics jobs per capita

Regionally, the Northeast and West dominate the rankings, with the exception of Illinois. Massachusetts and New York are tied for the most Google Analytics job postings, each with nearly five jobs per capita. At more than three jobs per 100,000 residents, California, Illinois, and Colorado round out the top five.

Overall, our findings indicate that none of the marketing disciplines we analyzed are dying career choices, but there is a need to become more than a one-trick pony—or else you’ll risk getting passed up for job opportunities. As the marketing industry evolves, there is a greater need for marketers who “wear many hats” and have competencies across different marketing disciplines. Marketers who develop diverse skill sets can gain a competitive advantage in the job market and achieve greater career growth.



Even more How-To videos!

Since January we have been updating our How-To videos section to help our users get a much better understanding of the tools offered by Majestic. Today, we would like to introduce two new tutorials: “How to use the Neighbour Checker Tool” and “How to use the Link Profile Fight Tool”. The Neighbour Checker video –…



Moving 5 Domains to 1: An SEO Case Study

Posted by Dr-Pete

People often ask me if they should change domain names, and I always shudder just a little. Changing domains is a huge, risky undertaking, and too many people rush into it seeing only the imaginary upside. The success of the change also depends wildly on the details, and it’s not the kind of question anyone should be asking casually on social media.

Recently, I decided that it was time to find a new permanent home for my personal and professional blogs, which had gradually spread out over 5 domains. I also felt my main domain was no longer relevant to my current situation, and it was time for a change. So, ultimately I ended up with a scenario that looked like this:

The top three sites were active, with UserEffect.com being my former consulting site and blog (and relatively well-trafficked). The bottom two sites were both inactive and were both essentially gag sites. My one-pager, AreYouARealDoctor.com, did previously rank well for “are you a real doctor”, so I wanted to try to recapture that.

I started migrating the 5 sites in mid-January, and I’ve been tracking the results. I thought it would be useful to see how this kind of change plays out, in all of the gory details. As it turns out, nothing is ever quite “textbook” when it comes to technical SEO.

Why Change Domains at All?

The rationale for picking a new domain could fill a month’s worth of posts, but I want to make one critical point – changing domains should be about your business goals first, and SEO second. I did not change domains to try to rank better for “Dr. Pete” – that’s a crap shoot at best. I changed domains because my old consulting brand (“User Effect”) no longer represented the kind of work I do and I’m much more known by my personal brand.

That business case was strong enough that I was willing to accept some losses. We went through a similar transition here from SEOmoz.org to Moz.com. That was a difficult transition that cost us some SEO ground, especially short-term, but our core rationale was grounded in the business and where it’s headed. Don’t let an SEO pipe dream lead you into a risky decision.

Why did I pick a .co domain? I did it for the usual reason – the .com was taken. For a project of this type, where revenue wasn’t on the line, I didn’t have any particular concerns about .co. The evidence on how top-level domains (TLDs) impact ranking is tough to tease apart (so many other factors correlate with .com’s), and Google’s attitude tends to change over time, especially if new TLDs are abused. Anecdotally, though, I’ve seen plenty of .co’s rank, and I wasn’t concerned.

Step 1 – The Boring Stuff

It is absolutely shocking how many people build a new site, slap up some 301s, pull the switch, and hope for the best. It’s less shocking how many of those people end up in Q&A a week later, desperate and bleeding money.


Planning is hard work, and it’s boring – get over it.

You need to be intimately familiar with every page on your existing site(s), and, ideally, you should make a list. Not only do you have to plan for what will happen to each of these pages, but you’ll need that list to make sure everything works smoothly later.

In my case, I decided it might be time to do some housekeeping – the User Effect blog had hundreds of posts, many outdated and quite a few just not very good. So, I started with the easy data – recent traffic. I’m sure you’ve seen this Google Analytics report (Behavior > Site Content > All Pages):

Since I wanted to focus on recent activity, and none of the sites had much new content, I restricted myself to a 3-month window (Q4 of 2014). Of course, I looked much deeper than the top 10, but the principle was simple – I wanted to make sure the data matched my intuition and that I wasn’t cutting off anything important. This helped me prioritize the list.

Of course, from an SEO standpoint, I also didn’t want to lose content that had limited traffic but solid inbound links. So, I checked my “Top Pages” report in Open Site Explorer:

Since the bulk of my main site was a blog, the top trafficked and top linked-to pages fortunately correlated pretty well. Again, this is only a way to prioritize. If you’re dealing with sites with thousands of pages, you need to work methodically through the site architecture.

I’m going to say something that makes some SEOs itchy – it’s ok not to move some pages to the new site. It’s even ok to let some pages 404. In Q4, UserEffect.com had traffic to 237 URLs. The top 10 pages accounted for 91.9% of that traffic. I strongly believe that moving domains is a good time to refocus a site and concentrate your visitors and link equity on your best content. More is not better in 2015.

Letting go of some pages also means that you’re not 301-redirecting a massive number of old URLs to a new home-page. This can look like a low-quality attempt to consolidate link-equity, and at large scale it can raise red flags with Google. Content worth keeping should exist on the new site, and your 301s should have well-matched targets.
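
One way to keep those targets well matched is to maintain the page list as an explicit old-to-new mapping and generate the redirect rules from it. Here’s a minimal sketch in Python; the paths and domain are hypothetical, and it emits standard Apache Redirect directives:

# Old path on the retired domain -> full URL of the best-matching page on the new site.
redirect_map = {
    "/blog/old-post": "https://newdomain.example/old-post",
    "/services": "https://newdomain.example/about",
}

# One permanent (301) redirect per mapping; paste the output into the old site's .htaccess.
for old_path, new_url in redirect_map.items():
    print(f"Redirect 301 {old_path} {new_url}")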

In one case, I had a blog post that had a decent trickle of traffic due to ranking for “50,000 push-ups,” but the post itself was weak and the bounce rate was very high:

The post was basically just a placeholder announcing that I’d be attempting this challenge, but I never recapped anything after finishing it. So, in this case, I rewrote the post.

Of course, this process was repeated across the 3 active sites. The 2 inactive sites only constituted a handful of total pages. In the case of AreYouARealDoctor.com, I decided to turn the previous one-pager into a new page on the new site. That way, I had a very well-matched target for the 301-redirect, instead of simply mapping the old site to my new home-page.

I’m trying to prove a point – this is the amount of work I did for a handful of sites that were mostly inactive and producing no current business value. I don’t need consulting gigs and these sites produce no direct revenue, and yet I still considered this process worth the effort.

Step 2 – The Big Day

Eventually, you’re going to have to make the move, and in most cases, I prefer ripping off the bandage. Of course, doing something all at once doesn’t mean you shouldn’t be careful.

The biggest problem I see with domain switches (even if they’re 1-to-1) is that people rely on data that can take weeks to evaluate, like rankings and traffic, or directly checking Google’s index. By then, a lot of damage is already done. Here are some ways to find out quickly if you’ve got problems…

(1) Manually Check Pages

Remember that list you were supposed to make? It’s time to check it, or at least spot-check it. Someone needs to physically go to a browser and make sure that each major section of the site and each important individual page is resolving properly. It doesn’t matter how confident your IT department/guy/gal is – things go wrong.

(2) Manually Check Headers

Just because a page resolves, it doesn’t mean that your 301-redirects are working properly, or that you’re not firing some kind of 17-step redirect chain. Check your headers. There are tons of free tools, but lately I’m fond of URI Valet. Guess what – I screwed up my primary 301-redirects. One of my registrar transfers wasn’t working, so I had to have a setting changed by customer service, and I inadvertently ended up with 302s (Pro tip: Don’t change registrars and domains in one step):

Don’t think that because you’re an “expert”, your plan is foolproof. Mistakes happen, and because I caught this one I was able to correct it fairly quickly.
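
If you’d rather script that spot-check than paste URLs into a tool one at a time, a short Python sketch using the requests library will surface both the status codes and any redirect chain; the URL below is a placeholder:

import requests

# Follow redirects and print each hop, so 302s and long chains stand out.
response = requests.get("http://olddomain.example/some-page", allow_redirects=True)
for hop in response.history:
    print(hop.status_code, hop.url)        # expect 301s here, not 302s
print(response.status_code, response.url)  # the final URL should be the new destination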

(3) Submit Your New Site

You don’t need to submit your site to Google in 2015, but now that Google Webmaster Tools allows it, why not do it? The primary argument I hear is “well, it’s not necessary.” True, but direct submission has one advantage – it’s fast.

To be precise, Google Webmaster Tools separates the process into “Fetch” and “Submit to index” (you’ll find this under “Crawl” > “Fetch as Google”). Fetching will quickly tell you if Google can resolve a URL and retrieve the page contents, which alone is pretty useful. Once a page is fetched, you can submit it, and you should see something like this:

This isn’t really about getting indexed – it’s about getting nearly instantaneous feedback. If Google has any major problems with crawling your site, you’ll know quickly, at least at the macro level.

(4) Submit New XML Sitemaps

Finally, submit a new set of XML sitemaps in Google Webmaster Tools, and preferably tiered sitemaps. While it’s a few years old now, Rob Ousbey has a great post on the subject of XML sitemap structure. The basic idea is that, if you divide your sitemap into logical sections, it’s going to be much easier to diagnose what kinds of pages Google is indexing and where you’re running into trouble.
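
As a rough sketch of that tiering, you can generate a sitemap index that points at one child sitemap per logical section; the section file names and domain below are hypothetical:

# One child sitemap per logical section makes it easier to see which
# section Google is struggling to index.
sections = ["sitemap-blog.xml", "sitemap-tools.xml", "sitemap-pages.xml"]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for filename in sections:
    lines.append(f"  <sitemap><loc>https://newdomain.example/{filename}</loc></sitemap>")
lines.append("</sitemapindex>")

with open("sitemap_index.xml", "w") as f:
    f.write("\n".join(lines))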

A couple of pro tips on sitemaps – first, keep your old sitemaps active temporarily. This is counterintuitive to some people, but unless Google can crawl your old URLs, they won’t see and process the 301-redirects and other signals. Let the old accounts stay open for a couple of months, and don’t cut off access to the domains you’re moving.

Second (I learned this one the hard way), make sure that your Google Webmaster Tools site verification still works. If you use file uploads or meta tags and don’t move those files/tags to the new site, GWT verification will fail and you won’t have access to your old accounts. I’d recommend using a more domain-independent solution, like verifying with Google Analytics. If you lose verification, don’t panic – your data won’t be instantly lost.

Step 3 – The Waiting Game

Once you’ve made the switch, the waiting begins, and this is where many people start to panic. Even executed perfectly, it can take Google weeks or even months to process all of your 301-redirects and reevaluate a new domain’s capacity to rank. You have to expect short term fluctuations in ranking and traffic.

During this period, you’ll want to watch a few things – your traffic, your rankings, your indexed pages (via GWT and the site: operator), and your errors (such as unexpected 404s). Traffic will recover the fastest, since direct traffic is immediately carried through redirects, but ranking and indexation will lag, and errors may take time to appear.

(1) Monitor Traffic

I’m hoping you know how to check your traffic, but actually trying to determine what your new levels should be and comparing any two days can be easier said than done. If you launch on a Friday, and then Saturday your traffic goes down on the new site, that’s hardly cause for panic – your traffic probably always goes down on Saturday.

In this case, I redirected the individual sites over about a week, but I’m going to focus on UserEffect.com, as that was the major traffic generator. That site was redirected in full on January 21st, and the Google Analytics data for January for the old site looked like this:

So far, so good – traffic bottomed out almost immediately. Of course, losing traffic is easy – the real question is what’s going on with the new domain. Here’s the graph for January for DrPete.co:

This one’s a bit trickier – the first spike, on January 16th, is when I redirected the first domain. The second spike, on January 22nd, is when I redirected UserEffect.com. Both spikes are meaningless – I announced these re-launches on social media and got a short-term traffic burst. What we really want to know is where traffic is leveling out.

Of course, there isn’t a lot of history here, but a typical day for UserEffect.com in January was about 1,000 pageviews. The traffic to DrPete.co after it leveled out was about half that (500 pageviews). It’s not a complete crisis, but we’re definitely looking at a short-term loss.

Obviously, I’m simplifying the process here – for a large, ecommerce site you’d want to track a wide range of metrics, including conversion metrics. Hopefully, though, this illustrates the core approach. So, what am I missing out on? In this day of [not provided], tracking down a loss can be tricky. Let’s look for clues in our other three areas…

(2) Monitor Indexation

You can get a broad sense of your indexed pages from Google Webmaster Tools, but this data often lags real-time and isn’t very granular. Despite its shortcomings, I still prefer the site: operator. Generally, I monitor a domain daily – any one measurement has a lot of noise, but what you’re looking for is the trend over time. Here’s the indexed page count for DrPete.co:

The first set of pages was indexed fairly quickly, and then the second set started being indexed soon after UserEffect.com was redirected. All in all, we’re seeing a fairly steady upward trend, and that’s what we’re hoping to see. The number is also in the ballpark of sanity (compared to the actual page count) and roughly matched GWT data once it started being reported.

So, what happened to UserEffect.com’s index after the switch?

The timeframe here is shorter, since UserEffect.com was redirected last, but we see a gradual decline in indexation, as expected. Note that the index size plateaus around 60 pages – about 1/4 of the original size. This isn’t abnormal – low-traffic and unlinked pages (or those with deep links) are going to take a while to clear out. This is a long-term process. Don’t panic over the absolute numbers – what you want here is a downward trend on the old domain accompanied by a roughly equal upward trend on the new domain.

The fact that UserEffect.com didn’t bottom out is definitely worth monitoring, but this timespan is too short for the plateau to be a major concern. The next step would be to dig into these specific pages and look for a pattern.

(3) Monitor Rankings

The old domain is dropping out of the index, and the new domain is taking its place, but we still don’t know why the new site is taking a traffic hit. It’s time to dig into our core keyword rankings.

Historically, UserEffect.com had ranked well for keywords related to “split test calculator” (near #1) and “usability checklist” (in the top 3). While [not provided] makes keyword-level traffic analysis tricky, we also know that the split-test calculator is one of the top trafficked pages on the site, so let’s dig into that one. Here’s the ranking data from Moz Analytics for “split test calculator”:

The new site took over the #1 position from the old site at first, but then quickly dropped down to the #3/#4 ranking. That may not sound like a lot, but given this general keyword category was one of the site’s top traffic drivers, the CTR drop from #1 to #3/#4 could definitely be causing problems.

When you have a specific keyword you can diagnose, it’s worth taking a look at the live SERP, just to get some context. The day after relaunch, I captured this result for “dr. pete”:

Here, the new domain is ranking, but it’s showing the old title tag. This may not be cause for alarm – weird things often happen in the very short term – but in this case we know that I accidentally set up a 302-redirect. There’s some reason to believe that Google didn’t pass full link equity during that period when 301s weren’t implemented.

Let’s look at a domain where the 301s behaved properly. Before the site was inactive, AreYouARealDoctor.com ranked #1 for “are you a real doctor”. Since there was an inactive period, and I dropped the exact-match domain, it wouldn’t be surprising to see a corresponding ranking drop.

In reality, the new site was ranking #1 for “are you a real doctor” within 2 weeks of 301-redirecting the old domain. The graph is just a horizontal line at #1, so I’m not going to bother you with it, but here’s a current screenshot (incognito):

Early on, I also spot-checked this result, and it wasn’t showing the strange title tag crossover that UserEffect.com pages exhibited. So, it’s very likely that the 302-redirects caused some problems.

Of course, these are just a couple of keywords, but I hope it provides a starting point for you to understand how to methodically approach this problem. There’s no use crying over spilled milk, and I’m not going to fire myself, so let’s move on to checking any other errors that I might have missed.

(4) Check Errors (404s, etc.)

A good first stop for unexpected errors is the “Crawl Errors” report in Google Webmaster Tools (Crawl > Crawl Errors). This is going to take some digging, especially if you’ve deliberately 404’ed some content. Over the couple of weeks after re-launch, I spotted the following problems:

The old site had a “/blog” directory, but the new site put the blog right on the home-page and had no corresponding directory. Doh. Hey, do as I say, not as I do, ok? Obviously, this was a big blunder, as the old blog home-page was well-trafficked.

The other two errors here are smaller but easy to correct. MinimalTalent.com had a “/free” directory that housed downloads (mostly PDFs). I missed it, since my other sites used a different format. Luckily, this was easy to remap.

The last error is a weird looking URL, and there are other similar URLs in the 404 list. This is where site knowledge is critical. I custom-designed a URL shortener for UserEffect.com and, in some cases, people linked to those URLs. Since those URLs didn’t exist in the site architecture, I missed them. This is where digging deep into historical traffic reports and your top-linked pages is critical. In this case, the fix isn’t easy, and I have to decide whether the loss is worth the time.
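Much of this digging can be scripted. One approach, sketched below with placeholder URLs and the third-party requests library, is to pull your historically top-trafficked and top-linked URLs and flag any that dead-end in a 404 once the redirects fire:

    import requests

    # Placeholder URLs; in practice, pull these from your historical
    # analytics top-pages report and your top-linked-pages data.
    legacy_urls = [
        "http://www.old-domain.com/blog",
        "http://www.old-domain.com/free/some-download.pdf",
        "http://www.old-domain.com/s/abc123",  # e.g. an old URL-shortener link
    ]

    for url in legacy_urls:
        try:
            resp = requests.get(url, allow_redirects=True, timeout=10)
        except requests.RequestException as exc:
            print("%s -> request failed (%s)" % (url, exc))
            continue
        if resp.status_code == 404:
            print("%s -> 404, needs a redirect or a conscious decision to drop it" % url)
        else:
            print("%s -> %s, resolves to %s" % (url, resp.status_code, resp.url))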

What About the New EMD?

My goal here wasn’t to rank better for “Dr. Pete,” and finally unseat Dr. Pete’s Marinades, Dr. Pete the Sodastream flavor (yes, it’s hilarious – you can stop sending me your grocery store photos), and 172 dentists. Ok, it mostly wasn’t my goal. Of course, you might be wondering how switching to an EMD worked out.

In the short term, I’m afraid the answer is “not very well.” I didn’t track ranking for “Dr. Pete” and related phrases very often before the switch, but it appears that my ranking actually fell in the short term. Current estimates have me sitting around page 4, even though my combined link profile suggests a much stronger position. Here’s a look at the ranking history for “dr pete” since relaunch (from Moz Analytics):

There was an initial drop, after which the site evened out a bit. This less-than-impressive plateau could be due to the bad 302s during transition. It could be Google evaluating a new EMD and multiple redirects to that EMD. It could be that the prevalence of natural anchor text with “Dr. Pete” pointing to my site suddenly looked unnatural when my domain name switched to DrPete.co. It could just be that this is going to take time to shake out.

If there’s a lesson here (and, admittedly, it’s too soon to tell), it’s that you shouldn’t rush to buy an EMD in 2015 in the wild hope of instantly ranking for that target phrase. There are so many factors involved in ranking for even a moderately competitive term, and your domain is just one small part of the mix.

So, What Did We Learn?

I hope you learned that I should’ve taken my own advice and planned a bit more carefully. I admit that this was a side project and it didn’t get the attention it deserved. The problem is that, even when real money is at stake, people rush these things and hope for the best. There’s a real cheerleading mentality when it comes to change – people want to take action and only see the upside.

Ultimately, in a corporate or agency environment, you can’t be the one sour note among the cheering. You’ll be ignored, and possibly even fired. That’s not fair, but it’s reality. What you need to do is make sure the work gets done right and people go into the process with eyes wide open. There’s no room for shortcuts when you’re moving to a new domain.

That said, a domain change isn’t a death sentence, either. Done right, and with sensible goals in mind – balancing not just SEO but broader marketing and business objectives – a domain migration can be successful, even across multiple sites.

To sum up: Plan, plan, plan, monitor, monitor, monitor, and try not to panic.


Reblogged 4 years ago from tracking.feedpress.it

Videos: New How-To videos to guide you through Majestic

The videos just keep on coming! In January, we announced the release of some new “how-to training videos” to guide you through Majestic. Each “how-to” video explains a particular feature in the Majestic range – whether it be a tool, a piece of terminology, or an overview of a Majestic development. We have now published another three…

The post Videos: New How-To videos to guide you through Majestic appeared first on Majestic Blog.

Reblogged 4 years ago from blog.majestic.com

What Deep Learning and Machine Learning Mean For the Future of SEO – Whiteboard Friday

Posted by randfish

Imagine a world where even the high-up Google engineers don’t know what’s in the ranking algorithm. We may be moving in that direction. In today’s Whiteboard Friday, Rand explores and explains the concepts of deep learning and machine learning, drawing us a picture of how they could impact our work as SEOs.

For reference, here’s a still of this week’s whiteboard!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we are going to take a peek into Google’s future and look at what it could mean as Google advances their machine learning and deep learning capabilities. I know these sound like big, fancy, important words. They’re not actually that tough of topics to understand. In fact, they’re simplistic enough that even a lot of technology firms like Moz do some level of machine learning. We don’t do anything with deep learning and a lot of neural networks. We might be going that direction.

But I found an article that was published in January, absolutely fascinating and I think really worth reading, and I wanted to extract some of the contents here for Whiteboard Friday because I do think this is tactically and strategically important to understand for SEOs and really important for us to understand so that we can explain to our bosses, our teams, our clients how SEO works and will work in the future.

The article is called “Google Search Will Be Your Next Brain.” It’s by Steve Levy. It’s over on Medium. I do encourage you to read it. It’s a relatively lengthy read, but just a fascinating one if you’re interested in search. It starts with a profile of Geoff Hinton, who was a professor in Canada and worked on neural networks for a long time and then came over to Google and is now a distinguished engineer there. As the article says, a quote from the article: “He is versed in the black art of organizing several layers of artificial neurons so that the entire system, the system of neurons, could be trained or even train itself to divine coherence from random inputs.”

This sounds complex, but basically what we’re saying is we’re trying to get machines to come up with outcomes on their own rather than us having to tell them all the inputs to consider, how to process those inputs, and what outcome to spit out. So this is essentially machine learning. Google has used this, for example, to figure out when you give it a bunch of photos and it can say, “Oh, this is a landscape photo. Oh, this is an outdoor photo. Oh, this is a photo of a person.” Have you ever had that creepy experience where you upload a photo to Facebook or to Google+ and they say, “Is this your friend so and so?” And you’re like, “God, that’s a terrible shot of my friend. You can barely see most of his face, and he’s wearing glasses which he usually never wears. How in the world could Google+ or Facebook figure out that this is this person?”

That’s what they use, these neural networks, these deep machine learning processes for. So I’ll give you a simple example. Here at Moz, we do machine learning very simplistically for page authority and domain authority. We take all the inputs — number of links, number of linking root domains, every single metric that you could get from Moz on the page level, on the sub-domain level, on the root-domain level, all these metrics — and then we combine them together and we say, “Hey machine, we want you to build us the algorithm that best correlates with how Google ranks pages, and here’s a bunch of pages that Google has ranked.” I think we use a base set of 10,000, and we do it about quarterly or every 6 months, feed that back into the system and the system pumps out the little algorithm that says, “Here you go. This will give you the best correlating metric with how Google ranks pages.” That’s how you get page authority and domain authority.
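If it helps to see that recipe in miniature, here’s a toy sketch: invented link metrics, a pretend set of observed rankings, and a plain linear model. This is emphatically not Moz’s actual PA/DA code or data, just the shape of the idea:

    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.linear_model import LinearRegression

    # Invented metrics for five pages: total links, linking root domains,
    # and a made-up on-page score.
    X = np.array([
        [1200, 340, 0.8],
        [ 300,  90, 0.6],
        [  50,  20, 0.9],
        [5000, 800, 0.4],
        [  10,   5, 0.7],
    ])
    # Pretend we observed these Google rank positions (1 = best).
    google_rank = np.array([2, 3, 4, 1, 5])

    # Fit a score so that higher predicted values mean better positions.
    model = LinearRegression().fit(X, -google_rank)
    scores = model.predict(X)
    rho, _ = spearmanr(scores, -google_rank)
    print("Spearman correlation with the observed ranking: %.2f" % rho)

The real system obviously uses far more metrics, a much larger training set (that base set of 10,000 pages), and a more sophisticated model; the point is only that the weighting is learned rather than hand-tuned.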

Cool, really useful, helpful for us to say like, “Okay, this page is probably considered a little more important than this page by Google, and this one a lot more important.” Very cool. But it’s not a particularly advanced system. The more advanced system is to have these kinds of neural nets in layers. So you have a set of networks, and these neural networks, by the way, they’re designed to replicate nodes in the human brain, which is in my opinion a little creepy, but don’t worry. The article does talk about how there’s a board of scientists who make sure Terminator 2 doesn’t happen, or Terminator 1 for that matter. Apparently, no one’s stopping Terminator 4 from happening? That’s the new one that’s coming out.

So one layer of the neural net will identify features. Another layer of the neural net might classify the types of features that are coming in. Imagine this for search results. Search results are coming in, and Google’s looking at the features of all the websites and web pages, your websites and pages, to try and consider like, “What are the elements I could pull out from there?”

Well, there’s the link data about it, and there are things that happen on the page. There are user interactions and all sorts of stuff. Then we’re going to classify types of pages, types of searches, and then we’re going to extract the features or metrics that predict the desired result, that a user gets a search result they really like. We have an algorithm that can consistently produce those, and then neural networks are hopefully designed — that’s what Geoff Hinton has been working on — to train themselves to get better. So it’s not like with PA and DA, our data scientist Matt Peters and his team looking at it and going, “I bet we could make this better by doing this.”

This is standing back and the guys at Google just going, “All right machine, you learn.” They figure it out. It’s kind of creepy, right?
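For the “layers” idea in miniature, here’s another toy sketch: fabricated page features, a fabricated “searcher satisfied” label, and a small two-hidden-layer network from scikit-learn. It has nothing to do with Google’s actual systems; it just shows the hidden layers building their own intermediate features instead of a human picking them:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    # Fabricated features: load time (seconds), inbound links, query-to-title match.
    X = rng.random((200, 3)) * [5.0, 1000.0, 1.0]
    # Fabricated label: "searcher satisfied" depends on a messy combination
    # of features that nobody spelled out for the model.
    y = ((X[:, 2] > 0.5) & (X[:, 0] < 3.0)).astype(int)

    # Two hidden layers: the network builds its own intermediate features.
    net = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
    net.fit(X, y)
    print("training accuracy: %.2f" % net.score(X, y))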

In the original system, you needed those people, these individuals here to feed the inputs, to say like, “This is what you can consider, system, and the features that we want you to extract from it.”

Then unsupervised learning, which is kind of this next step, the system figures it out. So this takes us to some interesting places. Imagine the Google algorithm, circa 2005. You had basically a bunch of things in here. Maybe you’d have anchor text, PageRank and you’d have some measure of authority on a domain level. Maybe there are people who are tossing new stuff in there like, “Hey algorithm, let’s consider the location of the searcher. Hey algorithm, let’s consider some user and usage data.” They’re tossing new things into the bucket that the algorithm might consider, and then they’re measuring it, seeing if it improves.

But you get to the algorithm today, and gosh there are going to be a lot of things in there that are driven by machine learning, if not deep learning yet. So there are derivatives of all of these metrics. There are conglomerations of them. There are extracted pieces like, “Hey, we only want to look at and measure anchor text on these types of results when we also see that the anchor text matches up to the search queries that have previously been performed by people who also search for this.” What does that even mean? But that’s what the algorithm is designed to do. The machine learning system figures out things that humans would never extract, metrics that we would never even create from the inputs that they can see.

Then, over time, the idea is that in the future even the inputs aren’t given by human beings. The machine is getting to figure this stuff out itself. That’s weird. That means that if you were to ask a Google engineer in a world where deep learning controls the ranking algorithm, if you were to ask the people who designed the ranking system, “Hey, does it matter if I get more links,” they might be like, “Well, maybe.” But they don’t know, because they don’t know what’s in this algorithm. Only the machine knows, and the machine can’t even really explain it. You could go take a snapshot and look at it, but (a) it’s constantly evolving, and (b) a lot of these metrics are going to be weird conglomerations and derivatives of a bunch of metrics mashed together and torn apart and considered only when certain criteria are fulfilled. Yikes.

So what does that mean for SEOs? What do we have to care about from all of these systems, this evolution, and this move towards deep learning? That move, by the way, is where Jeff Dean comes in. Jeff Dean is, I think, a senior fellow over at Google, the dude that everyone jokes is the world’s smartest computer scientist over there, and he has basically said, “Hey, we want to put this into search. It’s not there yet, but we want to take these models, these things that Hinton has built, and we want to put them into search.” For SEOs, that future is going to mean far fewer distinct, universal ranking inputs, ranking factors. We won’t really have ranking factors in the way that we know them today. It won’t be like, “Well, they have more anchor text and so they rank higher.” That might be something we’d still look at and we’d say, “Hey, they have this anchor text. Maybe that’s correlated with what the machine is finding, the system is finding to be useful, and that’s still something I want to care about to a certain extent.”

But we’re going to have to consider those things a lot more seriously. We’re going to have to take another look at them and decide and determine whether the things that we thought were ranking factors still are when the neural network system takes over. It also is going to mean something that I think many, many SEOs have been predicting for a long time and have been working towards, which is more success for websites that satisfy searchers. If the output is successful searches, and that’s what the system is looking for, and that’s what it’s trying to correlate all its metrics to, if you produce something that means more successful searches for Google searchers when they get to your site, and your ranking in the top means Google searchers are happier, well you know what? The algorithm will catch up to you. That’s kind of a nice thing. It does mean a lot less info from Google about how they rank results.

So today you might hear from someone at Google, “Well, page speed is a very small ranking factor.” In the future they might say, “Well, page speed is, like all ranking factors, totally unknown to us.” Because the machine might say, “Well yeah, page speed as a distinct metric, one that a Google engineer could actually look at, looks very small.” But derivatives of things that are connected to page speed may be huge inputs. Maybe page speed is something that, across all of these, is very well connected with happier searchers and successful search results. Weird things that we never thought of before might be connected with them as the machine learning system tries to build all those correlations, and that means potentially many more inputs into the ranking algorithm, things that we would never consider today, things we might consider wholly illogical, like, “What servers do you run on?” Well, that seems ridiculous. Why would Google ever grade you on that?

If human beings are putting factors into the algorithm, they never would. But the neural network doesn’t care. It doesn’t care. It’s a honey badger. It doesn’t care what inputs it collects. It only cares about successful searches, and so if it turns out that Ubuntu is poorly correlated with successful search results, too bad.

This world is not here yet today, but certainly there are elements of it. Google has talked about how Panda and Penguin are based off of machine learning systems like this. I think, given what Geoff Hinton and Jeff Dean are working on at Google, it sounds like this will be making its way more seriously into search and therefore it’s something that we’re really going to have to consider as search marketers.

All right everyone, I hope you’ll join me again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Reblogged 4 years ago from tracking.feedpress.it