dotmailer receives ‘Great User Experience’ title for email marketing software – from reputable business software directory

Leading business software directory FinancesOnline believes businesses and organizations can benefit greatly from an email marketing automation platform that is both feature-rich and easy to use. FinancesOnline’s experts found this in dotmailer, awarding us a positive 8.8 score and bestowing on us their prestigious Great User Experience and Rising Star awards.

 

The Great User Experience and Rising Star recognition for online email marketing software is given to systems that have satisfied clients with well-designed functionalities alongside a user-friendly and intuitive interface. This can be attributed to dotmailer’s unique drag-and-drop template builder that allows users to effortlessly create impressive email templates within a few minutes. It was also one of the reasons why our solution was recommended in the platform’s ‘what is email marketing software’ guide.

 

FinancesOnline believes dotmailer’s wide range of features enables users to remain “on top of every single phase of their email marketing campaigns and other related activities.” Aside from easily creating emails, FinancesOnline said our software can help users “fully optimize their email marketing strategies and get the best results” through various services including, but not limited to, campaign management, creative studio and strategic services. With these, users can significantly boost click-through rates and grow their business.

 

Businesses are also safeguarded with dotmailer’s scalability and custom-built integrations. “As your business needs develop and become more demanding and diverse, dotmailer is more than capable of growing with your enterprise,” wrote FinancesOnline’s experts.

The post dotmailer receives ‘Great User Experience’ title for email marketing software – from reputable business software directory appeared first on The Marketing Automation Blog.

Reblogged 1 month ago from blog.dotmailer.com

Controlling Search Engine Crawlers for Better Indexation and Rankings – Whiteboard Friday

Posted by randfish

When should you disallow search engines in your robots.txt file, and when should you use meta robots tags in a page header? What about nofollowing links? In today’s Whiteboard Friday, Rand covers these tools and their appropriate use in four situations that SEOs commonly find themselves facing.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about controlling search engine crawlers, blocking bots, sending bots where we want, restricting them from where we don’t want them to go. We’re going to talk a little bit about crawl budget and what you should and shouldn’t have indexed.

As a start, what I want to do is discuss the ways in which we can control robots. There are three primary ones: robots.txt, meta robots, and the nofollow tag (though nofollow is a little bit less about controlling bots than the other two).

There are a few others that we’re going to discuss as well, including Webmaster Tools (Search Console) and URL status codes. But let’s dive into those first few first.

Robots.txt lives at yoursite.com/robots.txt. It tells crawlers what they should and shouldn’t access, but it doesn’t always get respected by Google and Bing. So a lot of folks say, “hey, disallow this,” and then suddenly see those URLs popping up in search results and wonder what’s going on. Look, Google and Bing oftentimes think they just know better. They think maybe you’ve made a mistake; they think, “hey, there’s a lot of links pointing to this content, there’s a lot of people visiting and caring about this content, maybe you didn’t intend for us to block it.” The more specific you get about an individual URL, the better they usually are about respecting it. The less specific, meaning the more you use wildcards or say “everything behind this entire big directory,” the worse they are about necessarily believing you.
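As a rough illustration (the paths here are invented for the example), a very specific disallow versus a broad, wildcarded one might look like this in robots.txt:

User-agent: *
# Specific: one exact URL (usually respected)
Disallow: /old-press-release.html
# Broad: a whole directory plus a wildcard pattern (more likely to be second-guessed)
Disallow: /archive/
Disallow: /*?sessionid=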

Meta robots—a little different—that lives in the <head> of individual pages, so you can only control a single page with a meta robots tag. That tells the engines whether or not they should keep a page in the index, and whether they should follow the links on that page, and it’s usually a lot more respected, because it’s at an individual-page level; Google and Bing tend to believe you about the meta robots tag.

And then the nofollow tag, that lives on an individual link on a page. It doesn’t tell engines where to crawl or not to crawl. All it’s saying is whether you editorially vouch for a page that is being linked to, and whether you want to pass the PageRank and link equity metrics to that page.

Interesting point about meta robots and robots.txt working together (or not working together so well)—many, many folks in the SEO world do this and then get frustrated.

What if, for example, we take a page like “blogtest.html” on our domain and we say “all user agents, you are not allowed to crawl blogtest.html.” Okay—that’s a good way to keep that page away from being crawled, but just because something is not crawled doesn’t necessarily mean it won’t be in the search results.
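For reference, that disallow would look something like this in robots.txt (a minimal sketch, with the file at the root of our domain):

User-agent: *
Disallow: /blogtest.html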

So then we have our SEO folks go, “you know what, let’s make doubly sure that doesn’t show up in search results; we’ll put in the meta robots tag:”

<meta name="robots" content="noindex, follow">

So, “noindex, follow” tells the search engine crawler they can follow the links on the page, but they shouldn’t index this particular one.

Then, you go and run a search for “blog test” in this case, and everybody on the team’s like “What the heck!? WTF? Why am I seeing this page show up in search results?”

The answer is, you told the engines that they couldn’t crawl the page, so they didn’t. But they are still putting it in the results. They’re actually probably not going to include a meta description; they might have something like “we can’t include a meta description because of this site’s robots.txt file.” The reason it’s showing up is because they can’t see the noindex; all they see is the disallow.

So, if you want something truly removed, unable to be seen in search results, you can’t just disallow a crawler. You have to say meta “noindex” and you have to let them crawl it.

So this creates some complications. Robots.txt can be great if we’re trying to save crawl bandwidth, but it isn’t necessarily ideal for preventing a page from being shown in the search results. I would not recommend, by the way, that you do what we think Twitter recently tried to do, where they tried to canonicalize www and non-www by saying “Google, don’t crawl the www version of twitter.com.” What you should be doing is rel canonical-ing or using a 301.

Meta robots—that can allow crawling and link-following while disallowing indexation, which is great, but it does cost crawl budget (the page must still be crawled for the tag to be seen), even though you still conserve your index.

The nofollow tag, generally speaking, is not particularly useful for controlling bots or conserving indexation.

Webmaster Tools (now Google Search Console) has some special things that allow you to restrict access or remove a result from the search results. For example, if you have 404’d something or if you’ve told them not to crawl something but it’s still showing up in there, you can manually say “don’t do that.” There are a few other crawl protocol things that you can do.

And then URL status codes—these are a valid way to do things, but they’re going to obviously change what’s going on on your pages, too.

If you’re not having a lot of luck using a 404 to remove something, you can use a 410 to permanently remove something from the index. Just be aware that once you use a 410, it can take a long time if you want to get that page re-crawled or re-indexed, and you want to tell the search engines “it’s back!” 410 is permanent removal.
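How you return a 410 depends on your server. As one hedged example, on Apache the mod_alias Redirect directive can serve a 410 for a retired URL (the path here is hypothetical):

Redirect gone /old-page.html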

301—permanent redirect, we’ve talked about those here—and 302, temporary redirect.

Now let’s jump into a few specific use cases of “what kinds of content should and shouldn’t I allow engines to crawl and index” in this next version…

[Rand moves at superhuman speed to erase the board and draw part two of this Whiteboard Friday. Seriously, we showed Roger how fast it was, and even he was impressed.]

Four crawling/indexing problems to solve

So we’ve got these four big problems that I want to talk about as they relate to crawling and indexing.

1. Content that isn’t ready yet

The first one here is around, “If I have content whose quality I’m still trying to improve—it’s not yet ready for primetime, it’s not ready for Google, maybe I have a bunch of products and I only have the descriptions from the manufacturer and I need people to be able to access them, so I’m rewriting the content and creating unique value on those pages… they’re just not ready yet—what should I do with those?”

My options around crawling and indexing? If I have a large quantity of those—maybe thousands, tens of thousands, hundreds of thousands—I would probably go the robots.txt route. I’d disallow those pages from being crawled, and then eventually as I get (folder by folder) those sets of URLs ready, I can then allow crawling and maybe even submit them to Google via an XML sitemap.

If I’m talking about a small quantity—a few dozen, a few hundred pages—well, I’d probably just use the meta robots noindex, and then I’d pull that noindex off of those pages as they are made ready for Google’s consumption. And then again, I would probably use the XML sitemap and start submitting those once they’re ready.

2. Dealing with duplicate or thin content

What about, “Should I noindex, nofollow, or potentially disallow crawling on largely duplicate URLs or thin content?” I’ve got an example. Let’s say I’m an ecommerce shop, I’m selling this nice Star Wars t-shirt which I think is kind of hilarious, so I’ve got starwarsshirt.html, and it links out to a larger version of an image, and that’s an individual HTML page. It links out to different colors, which change the URL of the page, so I have a gray, blue, and black version. Well, these four pages are really all part of this same one, so I wouldn’t recommend disallowing crawling on these, and I wouldn’t recommend noindexing them. What I would do there is a rel canonical.

Remember, rel canonical is one of those things that can be precluded by disallowing. So, if I were to disallow these from being crawled, Google couldn’t see the rel canonical back, so if someone linked to the blue version instead of the default version, now I potentially don’t get link credit for that. So what I really want to do is use the rel canonical, allow the indexing, and allow it to be crawled. If you really feel like it, you could also put a meta “noindex, follow” on these pages, but I don’t really think that’s necessary, and again that might interfere with the rel canonical.
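As a sketch of what that looks like (the variant URLs are invented for this example), each color-variant page would carry a canonical tag in its <head> pointing back at the default product page:

<link rel="canonical" href="http://www.example.com/starwarsshirt.html" />

That way the gray, blue, and black versions all consolidate their link signals onto the one page you actually want ranking.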

3. Passing link equity without appearing in search results

Number three: “If I want to pass link equity (or at least crawling) through a set of pages without those pages actually appearing in search results—so maybe I have navigational stuff, ways that humans are going to navigate through my pages, but I don’t need those appearing in search results—what should I use then?”

What I would say here is, you can use the meta robots to say “don’t index the page, but do follow the links that are on that page.” That’s a pretty nice, handy use case for that.

Do NOT, however, disallow those in robots.txt—many, many folks make this mistake. If you disallow crawling on those pages, Google can’t see the noindex, and they don’t know that they can follow the links. Granted, as we talked about before, sometimes Google doesn’t obey the robots.txt, but you can’t rely on that behavior; assume the disallow will prevent them from crawling (and from ever seeing the noindex). So I would say the meta robots “noindex, follow” is the way to do this.

4. Search results-type pages

Finally, fourth, “What should I do with search results-type pages?” Google has said many times that they don’t like your search results from your own internal engine appearing in their search results, and so this can be a tricky use case.

Sometimes a search result page—a page that lists many types of results that might come from a database of types of content that you’ve got on your site—could actually be a very good result for a searcher who is looking for a wide variety of content, or who wants to see what you have on offer. Yelp does this: When you say, “I’m looking for restaurants in Seattle, WA,” they’ll give you what is essentially a list of search results, and Google does want those to appear because that page provides a great result. But you should be doing what Yelp does there, and make the most common or popular individual sets of those search results into category-style pages. A page that provides real, unique value, that’s not just a list of search results, that is more of a landing page than a search results page.

However, that being said, if you’ve got a long tail of these, or if you’d say, “hey, our internal search engine is really for internal visitors only—it’s not useful to have those pages show up in search results, and we don’t think we need to make the effort to turn them into category landing pages,” then you can use the disallow in robots.txt to prevent those pages from being crawled.

Just be cautious here, because I have sometimes seen an over-swinging of the pendulum toward blocking all types of search results, and sometimes that can actually hurt your SEO and your traffic. Sometimes those pages can be really useful to people. So check your analytics, and make sure those aren’t valuable pages that should be served up and turned into landing pages. If you’re sure, then go ahead and disallow all your search results-style pages. You’ll see a lot of sites doing this in their robots.txt file.

That being said, I hope you have some great questions about crawling and indexing, controlling robots, blocking robots, allowing robots, and I’ll try and tackle those in the comments below.

We’ll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 2 years ago from tracking.feedpress.it

Your Daily SEO Fix: Week 5

Posted by Trevor-Klein

We’ve arrived, folks! This is the last installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz’s tools. If you haven’t been following along, these are each designed to solve a use case that we regularly hear about from Moz community members.

Here’s a quick recap of the previous round-ups in case you missed them:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.
  • Week 3: Compare link metrics in Open Site Explorer, find tweet topics with Followerwonk, create custom reports in Moz Analytics, use Spam Score to identify high-risk links, and get link building opportunities delivered to your inbox.
  • Week 4: Use Fresh Web Explorer to build links, analyze rank progress for a given keyword, use the MozBar to analyze your competitors’ site markup, use the Top Pages report to find content ideas, and find on-site errors with Crawl Test.

We’ve got five new fixes for you in this edition:

  • How to Use the Full SERP Report
  • How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer
  • How to Build Your Link Profile with Link Intersect
  • How to Find Local Citations Using the MozBar
  • Bloopers: How to Screw Up While Filming a Daily SEO Fix

Hope you enjoy them!


Fix 1: How to Use the Full SERP Report

Moz’s Full SERP Report is a detailed report that shows the top ten ranking URLs for a specific keyword and presents the potential ranking signals in an easy-to-view format. In this Daily SEO Fix, Meredith breaks down the report so you can see all the sections and how each is used.



Fix 2: How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer

The Just-Discovered Links report in Open Site Explorer helps you discover recently created links within an hour of them being published. In this fix, Nick shows you how to use the report to view who is linking to you, how they’re doing it, and what they are saying, so you can capitalize on link opportunities while they’re still fresh and join the conversation about your brand.


Fix 3: How to Build Your Link Profile with Link Intersect

The quantity and (more importantly) quality of backlinks to your website make up your link profile, one of the most important elements in SEO and an incredibly important factor in search engine rankings. In this Daily SEO Fix, Tori shows you how to use Moz’s Link Intersect tool to analyze the competition’s backlinks. Plus, learn how to find opportunities to build links and strengthen your own link profile.


Fix 4: How to Find Local Citations Using the MozBar

Citations are mentions of your business name and address on webpages other than your own, such as an online yellow pages directory or a local business association page. They are a key component in search engine ranking algorithms, so building consistent and accurate citations for your local business(es) is a key local SEO tactic. In today’s Daily SEO Fix, Tori shows you how to use the MozBar to find local citations around the web.


Bloopers: How to Screw Up While Filming a Daily SEO Fix

We had a lot of fun filming this series, and there were plenty of laughs along the way. Like these ones. =)


Looking for more?

We’ve got more videos in the previous four weeks’ round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2

Your Daily SEO Fix: Week 3

Your Daily SEO Fix: Week 4


Don’t have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.


Reblogged 2 years ago from tracking.feedpress.it

5 Spreadsheet Tips for Manual Link Audits

Posted by MarieHaynes

Link auditing is the part of my job that I love the most. I have audited a LOT of links over the last few years. While there are some programs out there that can be quite helpful to the avid link auditor, I still prefer to create a spreadsheet of my links in Excel and then to audit those links one-by-one from within Google Spreadsheets. Over the years I have learned a few tricks and formulas that have helped me in this process. In this article, I will share several of these with you.

Please know that while I am quite comfortable being labelled a link auditing expert, I am not an Excel wizard. I am betting that some of the things that I am doing could be improved upon if you’re an advanced user. As such, if you have any suggestions or tips of your own I’d love to hear them in the comments section!

1. Extract the domain or subdomain from a URL

OK. You’ve downloaded links from as many sources as possible and now you want to manually visit and evaluate one link from every domain. But, holy moly, some of these domains can have THOUSANDS of links pointing to the site. So, let’s break these down so that you are just seeing one link from each domain. The first step is to extract the domain or subdomain from each url.

I am going to show you examples from a Google spreadsheet as I find that these display nicer for demonstration purposes. However, if you’ve got a fairly large site, you’ll find that the spreadsheets are easier to create in Excel. If you’re confused about any of these steps, check out the animated gif at the end of each step to see the process in action.

Here is how you extract a domain or subdomain from a url:

  • Create a new column to the left of your url column.
  • Use this formula:

    =LEFT(B1,FIND("/",B1,9)-1)

    What this will do is remove everything after the trailing slash following the domain name. The 9 in FIND("/",B1,9) tells the formula to start looking for a slash at the ninth character, just past the double slash in “http://” or “https://”. http://www.example.com/article.html will now become http://www.example.com and http://www.subdomain.example.com/article.html will now become http://www.subdomain.example.com.

  • Copy our new column A and paste it right back where it was using the “paste as values” function. If you don’t do this, you won’t be able to use the Find and Replace feature.
  • Use Find and Replace to replace each of the following with a blank (i.e. nothing):
    http://
    https://
    www.

And BOOM! We are left with a column that contains just domain names and subdomain names. This animated gif shows each of the steps we just outlined:

2. Just show one link from each domain

The next step is to filter this list so that we are just seeing one link from each domain. If you are manually reviewing links, there’s usually no point in reviewing every single link from every domain. I will throw in a word of caution here though. Sometimes a domain can have both a good link and a bad link pointing to you. Or in some cases, you may find that links from one page are followed and from another page on the same site they are nofollowed. You can miss some of these by just looking at one link from each domain. Personally, I have some checks built in to my process where I use Scrapebox and some internal tools that I have created to make sure that I’m not missing the odd link by just looking at one link from each domain. For most link audits, however, you are not going to miss very much by assessing one link from each domain.

Here’s how we do it:

  • Highlight our domains column and sort the column in alphabetical order.
  • Create a column to the left of our domains, so that the domains are in column B.
  • Use this formula:

    =IF(B1=B2,"duplicate","unique")

  • Copy that formula down the column.
  • Use the filter function so that you are just seeing the duplicates.
  • Delete those rows. Note: If you have tens of thousands of rows to delete, the spreadsheet may crash. A workaround here is to use “Clear Rows” instead of “Delete Rows” and then sort your domains column from A-Z once you are finished.

We’ve now got a list of one link from every domain linking to us.

Here’s the gif that shows each of these steps:

You may wonder why I didn’t use Excel’s dedupe function to simply deduplicate these entries. I have found that it doesn’t take much deduplication to crash Excel, which is why I do this step manually.

3. Finding patterns FTW!

Sometimes when you are auditing links, you’ll find that unnatural links have patterns. I LOVE when I see these, because sometimes I can quickly go through hundreds of links without having to check each one manually. Here is an example. Let’s say that your website has a bunch of spammy directory links. As you’re auditing you notice patterns such as one of these:

  • All of these directory links come from a url that contains …/computers/internet/item40682/
  • A whole bunch of spammy links that all come from a particular free subdomain like blogspot, wordpress, weebly, etc.
  • A lot of links that all contain a particular keyword for anchor text (this is assuming you’ve included anchor text in your spreadsheet when making it.)

You can quickly find all of these links and mark them as “disavow” or “keep” by doing the following:

  • Create a new column. In my example, I am going to create a new column in Column C and look for patterns in urls that are in Column B.
  • Use this formula:

    =FIND("/item40682",B1)
    (You would replace “item40682” with the phrase that you are looking for.)

  • Copy this formula down the column.
  • Filter your new column so that you are seeing only the rows that have a number in this column. If the phrase doesn’t exist in that url, you’ll see an error value (such as #VALUE!) rather than a number, and we can ignore those.
  • Now you can mark these all as disavow.

4. Check your disavow file

This next tip is one that you can use to check your disavow file across your list of domains that you want to audit. The goal here is to see which links you have disavowed so that you don’t waste time reassessing them. This particular tip only works for checking links that you have disavowed on the domain level.

The first thing you’ll want to do is download your current disavow file from Google. For some strange reason, Google gives you the disavow file in CSV format. I have never understood this because they want you to upload the file in .txt. Still, I guess this is what works best for Google. All of your entries will be in column A of the CSV:

What we are going to do now is add these to a new sheet on our current spreadsheet and use a VLOOKUP function to mark which of our domains we have disavowed.

Here are the steps:

  • Create a new sheet on your current spreadsheet workbook.
  • Copy and paste column A from your disavow spreadsheet onto this new sheet. Or, alternatively, use the import function to import the entire CSV onto this sheet.
  • In B1, write “previously disavowed” and copy this down the entire column.
  • Remove the “domain:” from each of the entries by doing a Find and Replace to replace domain: with a blank.
  • Now go back to your link audit spreadsheet. If your domains are in column A and if you had, say, 1500 domains in your disavow file, your formula would look like this:

    =VLOOKUP(A1,Sheet2!$A$1:$B$1500,2,FALSE)

When you copy this formula down the spreadsheet, it will check each of your domains, and if it finds the domain in Sheet 2, it will write “previously disavowed” on our link audit spreadsheet.
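One optional tweak, assuming Excel or Google Sheets: wrap the VLOOKUP in IFERROR so that domains which aren’t in your disavow file show a friendly label instead of an #N/A error (the label text is just an example):

    =IFERROR(VLOOKUP(A1,Sheet2!$A$1:$B$1500,2,FALSE),"not disavowed")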

Here is a gif that shows the process:

5. Make monthly or quarterly disavow work easier

That same formula described above is a great one to use if you are doing regular repeated link audits. In this case, your second sheet on your spreadsheet would contain domains that you have previously audited, and column B of this spreadsheet would say “previously audited” rather than “previously disavowed”.

Your tips?

These are just a few of the formulas that you can use to help make link auditing work easier. But there are lots of other things you can do with Excel or Google Sheets to help speed up the process as well. If you have some tips to add, leave a comment below. Also, if you need clarification on any of these tips, I’m happy to answer questions in the comments section.


Reblogged 2 years ago from tracking.feedpress.it

How to Use Server Log Analysis for Technical SEO

Posted by SamuelScott

It’s ten o’clock. Do you know where your logs are?

I’m introducing this guide with a pun on a common public-service announcement that has run on late-night TV news broadcasts in the United States because log analysis is something that is extremely newsworthy and important.

If your technical and on-page SEO is poor, then nothing else that you do will matter. Technical SEO is the key to helping search engines to crawl, parse, and index websites, and thereby rank them appropriately long before any marketing work begins.

The important thing to remember: Your log files contain the only data that is 100% accurate in terms of how search engines are crawling your website. By helping Google to do its job, you will set the stage for your future SEO work and make your job easier. Log analysis is one facet of technical SEO, and correcting the problems found in your logs will help to lead to higher rankings, more traffic, and more conversions and sales.

Here are just a few reasons why:

  • Too many response code errors may cause Google to reduce its crawling of your website and perhaps even your rankings.
  • You want to make sure that search engines are crawling everything, new and old, that you want to appear and rank in the SERPs (and nothing else).
  • It’s crucial to ensure that all URL redirections will pass along any incoming “link juice.”

However, log analysis is something that is unfortunately discussed all too rarely in SEO circles. So, here, I wanted to give the Moz community an introductory guide to log analytics that I hope will help. If you have any questions, feel free to ask in the comments!

What is a log file?

Computer servers, operating systems, network devices, and computer applications automatically generate something called a log entry whenever they perform an action. In an SEO and digital marketing context, one type of action is whenever a page is requested by a visiting bot or human.

Server log entries are typically written in the Common Log Format (or its extended variant, the Combined Log Format, which adds referrer and user-agent fields). Here is one example from Wikipedia with my accompanying explanations:

127.0.0.1 user-identifier frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326
  • 127.0.0.1 — The remote hostname. An IP address is shown, like in this example, whenever the DNS hostname is not available or DNSLookup is turned off.
  • user-identifier — The remote logname / RFC 1413 identity of the user. (It’s not that important.)
  • frank — The user ID of the person requesting the page. Based on what I see in my Moz profile, Moz’s log entries would probably show either “SamuelScott” or “392388” whenever I visit a page after having logged in.
  • [10/Oct/2000:13:55:36 -0700] — The date, time, and timezone of the action in question in strftime format.
  • GET /apache_pb.gif HTTP/1.0 — “GET” is one of the two most common HTTP methods (the other is “POST”). “GET” fetches a URL while “POST” submits something (such as a forum comment). The second part is the URL that is being accessed, and the last part is the version of HTTP that is being used.
  • 200 — The status code of the document that was returned.
  • 2326 — The size, in bytes, of the document that was returned.

Note: A hyphen is shown in a field when that information is unavailable.

Every single time that you — or the Googlebot — visit a page on a website, a line with this information is output, recorded, and stored by the server.
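If you would rather pull those fields apart with a script than read them by eye, here is a minimal Python sketch (my own illustration, not part of the format definition; the pattern is a rough approximation, not a battle-tested parser) that splits the Wikipedia example above into named fields:

import re

# A rough pattern for the Common Log Format fields described above.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) '
    r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\S+)'
)

line = '127.0.0.1 user-identifier frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326'
match = LOG_PATTERN.match(line)
if match:
    entry = match.groupdict()
    # Prints: 127.0.0.1 200 GET /apache_pb.gif HTTP/1.0
    print(entry["host"], entry["status"], entry["request"])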

Log entries are generated continuously and anywhere from several to thousands can be created every second — depending on the level of a given server, network, or application’s activity. A collection of log entries is called a log file (or often in slang, “the log” or “the logs”), and it is displayed with the most-recent log entry at the bottom. Individual log files often contain a calendar day’s worth of log entries.

Accessing your log files

Different types of servers store and manage their log files differently. Here are the general guides to finding and managing log data on three of the most-popular types of servers:

What is log analysis?

Log analysis (or log analytics) is the process of going through log files to learn something from the data. Some common reasons include:

  • Development and quality assurance (QA) — Creating a program or application and checking for problematic bugs to make sure that it functions properly
  • Network troubleshooting — Responding to and fixing system errors in a network
  • Customer service — Determining what happened when a customer had a problem with a technical product
  • Security issues — Investigating incidents of hacking and other intrusions
  • Compliance matters — Gathering information in response to corporate or government policies
  • Technical SEO — This is my favorite! More on that in a bit.

Log analysis is rarely performed regularly. Usually, people go into log files only in response to something — a bug, a hack, a subpoena, an error, or a malfunction. It’s not something that anyone wants to do on an ongoing basis.

Why? This is a screenshot of ours of just a very small part of an original (unstructured) log file:

Ouch. If a website gets 10,000 visitors who each go to ten pages per day, then the server will create a log file every day that will consist of 100,000 log entries. No one has the time to go through all of that manually.

How to do log analysis

There are three general ways to make log analysis easier in SEO or any other context:

  • Do-it-yourself in Excel
  • Proprietary software such as Splunk or Sumo Logic
  • The ELK Stack open-source software

Tim Resnik’s Moz essay from a few years ago walks you through the process of exporting a batch of log files into Excel. This is a (relatively) quick and easy way to do simple log analysis, but the downside is that one will see only a snapshot in time and not any overall trends. To obtain the best data, it’s crucial to use either proprietary tools or the ELK Stack.

Splunk and Sumo Logic are proprietary log analysis tools that are primarily used by enterprise companies. The ELK Stack is a free and open-source batch of three platforms (Elasticsearch, Logstash, and Kibana) that is owned by Elastic and used more often by smaller businesses. (Disclosure: We at Logz.io use the ELK Stack to monitor our own internal systems as well as for the basis of our own log management software.)

For those who are interested in using this process to do technical SEO analysis, monitor system or application performance, or for any other reason, our CEO, Tomer Levy, has written a guide to deploying the ELK Stack.

Technical SEO insights in log data

However you choose to access and understand your log data, there are many important technical SEO issues to address as needed. I’ve included screenshots of our technical SEO dashboard with our own website’s data to demonstrate what to examine in your logs.

Bot crawl volume

It’s important to know the number of requests made by Baidu, BingBot, GoogleBot, Yahoo, Yandex, and others over a given period time. If, for example, you want to get found in search in Russia but Yandex is not crawling your website, that is a problem. (You’d want to consult Yandex Webmaster and see this article on Search Engine Land.)
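As a rough sketch of how you might tally that yourself (assuming your server writes the Combined Log Format, which includes the user-agent string on each line; the access.log path and the bot list are placeholders for this example):

from collections import Counter

# Count requests per search engine bot by matching bot-name substrings in each line.
BOTS = ["Googlebot", "bingbot", "Baiduspider", "YandexBot", "Slurp"]

counts = Counter()
with open("access.log") as log_file:
    for line in log_file:
        for bot in BOTS:
            if bot in line:
                counts[bot] += 1
                break

for bot, total in counts.most_common():
    print(f"{bot}: {total} requests")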

Response code errors

Moz has a great primer on the meanings of the different status codes. I have an alert system set up that tells me about 4XX and 5XX errors immediately because those are very significant.

Temporary redirects

Temporary 302 redirects do not pass along the “link juice” of external links from the old URL to the new one. Almost all of the time, they should be changed to permanent 301 redirects.
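How you make that change depends on where the redirect is defined. As one hedged example, in an Apache .htaccess file the permanent version could be as simple as this (URLs hypothetical):

Redirect 301 /old-page.html http://www.example.com/new-page.html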

Crawl budget waste

Google assigns a crawl budget to each website based on numerous factors. If your crawl budget is, say, 100 pages per day (or the equivalent amount of data), then you want to be sure that all 100 are things that you want to appear in the SERPs. No matter what you write in your robots.txt file and meta-robots tags, you might still be wasting your crawl budget on advertising landing pages, internal scripts, and more. The logs will tell you — I’ve outlined two script-based examples in red above.

If you hit your crawl limit but still have new content that should be indexed to appear in search results, Google may abandon your site before finding it.

Duplicate URL crawling

The addition of URL parameters — typically used in tracking for marketing purposes — often results in search engines wasting crawl budgets by crawling different URLs with the same content. To learn how to address this issue, I recommend reading the resources on Google and Search Engine Land here, here, here, and here.

Crawl priority

Google might be ignoring (and not crawling or indexing) a crucial page or section of your website. The logs will reveal what URLs and/or directories are getting the most and least attention. If, for example, you have published an e-book that attempts to rank for targeted search queries but it sits in a directory that Google only visits once every six months, then you won’t get any organic search traffic from the e-book for up to six months.

If a part of your website is not being crawled very often — and it is updated often enough that it should be — then you might need to check your internal-linking structure and the crawl-priority settings in your XML sitemap.
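For reference, those hints live on each <url> entry inside the sitemap’s usual <urlset> wrapper; a single hypothetical entry might look like this:

<url>
  <loc>http://www.example.com/ebook/</loc>
  <changefreq>weekly</changefreq>
  <priority>0.8</priority>
</url>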

Last crawl date

Have you uploaded something that you hope will be indexed quickly? The log files will tell you when Google has crawled it.

Crawl budget

One thing I personally like to check and see is Googlebot’s real-time activity on our site because the crawl budget that the search engine assigns to a website is a rough indicator — a very rough one — of how much it “likes” your site. Google ideally does not want to waste valuable crawling time on a bad website. Here, I had seen that Googlebot had made 154 requests of our new startup’s website over the prior twenty-four hours. Hopefully, that number will go up!

As I hope you can see, log analysis is critically important in technical SEO. It’s eleven o’clock — do you know where your logs are now?



Reblogged 2 years ago from tracking.feedpress.it

How Much Has Link Building Changed in Recent Years?

Posted by Paddy_Moogan

I get asked this question a lot. It’s mainly asked by people who are considering buying my link building book and want to know whether it’s still up to date. This is understandable given that the first edition was published in February 2013 and our industry has a deserved reputation for always changing.

I find myself giving the same answer, even though I’ve been asked it probably dozens of times in the last two years—”not that much”. I don’t think this is solely due to the book itself standing the test of time, although I’ll happily take a bit of credit for that 🙂 I think it’s more a sign of our industry as a whole not changing as much as we’d like to think.

I started to question whether I was right, and honestly, that’s one of the reasons it has taken me over two years to release the second edition of the book.

So I posed this question to a group of friends not so long ago, some via email and some via a Facebook group. I was expecting to be called out by many of them because my position was that in reality, it hasn’t actually changed that much. The thing is, many of them agreed and the conversations ended with a pretty long thread with lots of insights. In this post, I’d like to share some of them, share what my position is and talk about what actually has changed.

My personal view

Link building hasn’t changed as much as we think it has.

The core principles of link building haven’t changed. The signals around link building have changed, but mainly around new machine learning developments that have indirectly affected what we do. One thing that has definitely changed is the mindset of SEOs (and now clients) towards link building.

I think the last big change to link building came in April 2012 when Penguin rolled out. This genuinely did change our industry and put to bed a few techniques that should never have worked so well in the first place.

Since then, we’ve seen some things change, but the core principles haven’t changed if you want to build a business that will be around for years to come and not run the risk of being hit by a link related Google update. For me, these principles are quite simple:

  • You need to deserve links – either an asset you create or your product
  • You need to put this asset in front of a relevant audience who have the ability to share it
  • You need consistency – one new asset every year is unlikely to cut it
  • Anything that scales is at risk

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years and it’s still going.

Let’s dive a bit deeper into all of this and I’ll talk about how this relates to link building.

The typical mindset for building links has changed

I think that most SEOs are coming round to the idea that you can’t get away with building low quality links any more, not if you want to build a sustainable, long-term business. Spammy link building still works in the short-term and I think it always will, but it’s much harder than it used to be to sustain websites that are built on spam. The approach is more “churn and burn” and spammers are happy to churn through lots of domains and just make a small profit on each one before moving onto another.

For everyone else, it’s all about the long-term and not putting client websites at risk.

This has led to many SEOs embracing different forms of link building and generally starting to use content as an asset when it comes to attracting links. A big part of me feels that it was actually Penguin in 2012 that drove the rise of content marketing amongst SEOs, but that’s a post for another day…! For today though, this goes some way towards explaining the trend we see below.

Slowly but surely, I’m seeing clients come to my company already knowing that low quality link building isn’t what they want. It’s taken a few years after Penguin for it to filter down to client / business owner level, but it’s definitely happening. This is a good thing but unfortunately, the main reason for this is that most of them have been burnt in the past by SEO companies who have built low quality links without giving thought to building good quality ones too.

I have no doubt that it’s this change in mindset which has led to trends like this:

The thing is, I don’t think this was by choice.

Let’s be honest. A lot of us used the kind of link building tactics that Google no longer like because they worked. I don’t think many SEOs were under the illusion that it was genuinely high quality stuff, but it worked and it was far less risky to do than it is today. Unless you were super-spammy, the low-quality links just worked.

Fast forward to a post-Penguin world, and things are far more risky. For me, it’s because of this that we see the trends like the above. As an industry, we had the easiest link building methods taken away from us and we’re left with fewer options. One of the main options is content marketing which, if you do it right, can lead to good quality links and importantly, the types of links you won’t be removing in the future. Get it wrong and you’ll lose budget and lose the trust of your boss or client in the power of content when it comes to link building.

There are still plenty of other methods to build links and sometimes we can forget this. Just look at this epic list from Jon Cooper. Even with this many tactics still available to us, it’s hard work. Way harder than it used to be.

My summary here is that as an industry, our mindset has shifted but it certainly wasn’t a voluntary shift. If the tactics that Penguin targeted still worked today, we’d still be using them.

A few other opinions…

I definitely think too many people want the next easy win. As someone surfing the edge of what Google is bringing our way, here’s my general take—SEO, in broad strokes, is changing a lot, *but* any given change is more and more niche and impacts fewer people. What we’re seeing isn’t radical, sweeping changes that impact everyone, but a sort of modularization of SEO, where we each have to be aware of what impacts our given industries, verticals, etc.

Dr. Pete

 

I don’t feel that techniques for acquiring links have changed that much. You can either earn them through content and outreach or you can just buy them. What has changed is the awareness of “link building” outside of the SEO community. This makes link building / content marketing much harder when pitching to journalists and even more difficult when pitching to bloggers.

Link building has to be more integrated with other channels and struggles to work in its own environment unless supported by brand, PR and social. Having other channels supporting your link development efforts also creates greater search signals and more opportunity to reach a bigger audience which will drive a greater ROI.

Carl Hendy

 

SEO has grown up in terms of more mature staff and SEOs becoming more ingrained into businesses so there is a smarter (less pressure) approach. At the same time, SEO has become more integrated into marketing and has made marketing teams and decision makers more intelligent in strategies and not pushing for the quick win. I’m also seeing that companies who used to rely on SEO and building links have gone through IPOs and the need to build 1000s of links per quarter has rightly reduced.

Danny Denhard

Signals that surround link building have changed

There is no question about this one in my mind. I actually wrote about this last year in my previous blog post where I talked about signals such as anchor text and deep links changing over time.

Many of the people I asked felt the same, here are some quotes from them, split out by the types of signal.

Domain level link metrics

I think domain level links have become increasingly important compared with page level factors, i.e. you can get a whole site ranking well off the back of one insanely strong page, even with sub-optimal PageRank flow from that page to the rest of the site.

Phil Nottingham

I’d agree with Phil here and this is what I was getting at in my previous post on how I feel “deep links” will matter less over time. It’s not just about domain level links here, it’s just as much about the additional signals available for Google to use (more on that later).

Anchor text

I’ve never liked anchor text as a link signal. I mean, who actually uses exact match commercial keywords as anchor text on the web?

SEOs. 🙂

Sure there will be natural links like this, but honestly, I struggle with the idea that it took Google so long to start turning down the dial on commercial anchor text as a ranking signal. They are starting to turn it down though, slowly but surely. Don’t get me wrong, it still matters and it still works. But like pure link spam, the bar is a lot lower now in terms of what constitutes too much.

Rand feels that they matter more than we’d expect and I’d mostly agree with this statement:

Exact match anchor text links still have more power than you’d expect—I think Google still hasn’t perfectly sorted what is “brand” or “branded query” from generics (i.e. they want to start ranking a new startup like meldhome.com for “Meld” if the site/brand gets popular, but they can’t quite tell the difference between that and https://moz.com/learn/seo/redirection getting a few manipulative links that say “redirect”)

Rand Fishkin

What I do struggle with though, is that Google still haven’t figured this out and that short-term, commercial anchor text spam is still so effective. Even for a short burst of time.

I don’t think link building as a concept has changed loads—but I think links as a signal have, mainly because of filters and penalties but I don’t see anywhere near the same level of impact from coverage anymore, even against 18 months ago.

Paul Rogers

New signals have been introduced

It isn’t just about established signals changing though, there are new signals too and I personally feel that this is where we’ve seen the most change in Google algorithms in recent years—going all the way back to Panda in 2011.

With Panda, we saw a new level of machine learning where it almost felt like Google had found a way of incorporating human reaction / feelings into their algorithms. They could then run this against a website and answer questions like the ones included in this post. Things such as:

  • “Would you be comfortable giving your credit card information to this site?”
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?”
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?”

It is a touch scary that Google was able to run machine learning against answers to questions like this and write an algorithm to predict the answers for any given page on the web. They have though and this was four years ago now.

Since then, they’ve made various moves to utilize machine learning and AI to build out new products and improve their search results. For me, this was one of the biggest changes, and it went pretty unnoticed by our industry. Well, until Hummingbird came along. I feel pretty sure that we have Ray Kurzweil to thank for at least some of that.

There seems to be more weight on theme/topic related to sites, though it’s hard to tell if this is mostly link based or more user/usage data based. Google is doing a good job of ranking sites and pages that don’t earn the most links but do provide the most relevant/best answer. I have a feeling they use some combination of signals to say “people who perform searches like this seem to eventually wind up on this website—let’s rank it.” One of my favorite examples is the Audubon Society ranking for all sorts of birding-related searches with very poor keyword targeting, not great links, etc. I think user behavior patterns are stronger in the algo than they’ve ever been.

– Rand Fishkin

Leading on from what Rand has said, it’s becoming more and more common to see search results that just don’t make sense if you look at the link metrics—but are a good result.

For me, the move towards user data driving search results, plus advances in machine learning, has been the biggest change we’ve seen in recent years, and it’s still going.

Edit: since drafting this post, Tom Anthony released this excellent blog post on his views on the future of search and the shift to data-driven results. I’d recommend reading that as it approaches this whole area from a different perspective and I feel that an off-shoot of what Tom is talking about is the impact on link building.

You may be asking at this point, what does machine learning have to do with link building?

Everything. Because as strong as links are as a ranking signal, Google want more signals and user signals are far, far harder to manipulate than established link signals. Yes it can be done—I’ve seen it happen. There have even been a few public tests done. But it’s very hard to scale and I’d venture a guess that only the top 1% of spammers are capable of doing it, let alone maintaining it for a long period of time. When I think about the process for manipulation here, I actually think we go a step beyond spammers towards hackers and more cut and dry illegal activity.

For link building, this means that traditional methods of manipulating signals are going to become less and less effective as these user signals become stronger. For us as link builders, it means we can’t keep searching for that silver bullet or the next method of scaling link building just for an easy win. The fact is that scalable link building is always going to be at risk from penalization from Google—I don’t really want to live a life where I’m always worried about my clients being hit by the next update. Even if Google doesn’t catch up with a certain method, machine learning and user data mean that these methods may naturally become less effective and cost efficient over time.

There are of course other things such as social signals that have come into play. I certainly don’t feel like these are a strong ranking factor yet, but with deals like this one between Google and Twitter being signed, I wouldn’t be surprised if that ever-growing dataset is used at some point in organic results. The one advantage that Twitter has over Google is its breaking-news freshness. Twitter is still way quicker at breaking news than Google is—140 characters in a tweet is far quicker than Google News! Google know this, which is why I feel they’ve pulled this partnership back into existence after a couple of years apart.

There is another important point to remember here and it’s nicely summarised by Dr. Pete:

At the same time, as new signals are introduced, these are layers, not replacements. People hear social signals or user signals or authorship and want it to be the link-killer, because they already fucked up link-building, but these are just layers on top of on-page and links and all of the other layers. As each layer is added, it can verify the layers that came before it, and what you need isn’t the magic signal but a combination of signals that generally matches what Google expects to see from real, strong entities. So, links still matter, but they matter in concert with other things, which basically means it’s getting more complicated and, frankly, a bit harder. Of course, no one wants to hear that.

– Dr. Pete

The core principles have not changed

This is the crux of everything for me. With all the changes listed above, the key is that the core principles around link building haven’t changed. I could even argue that Penguin didn’t change the core principles because the techniques that Penguin targeted should never have worked in the first place. I won’t argue this too much though because even Google advised website owners to build directory links at one time.

You need an asset

You need to give someone a reason to link to you. Many won’t do it out of the goodness of their heart! One of the most effective ways to do this is to develop a content asset and use this as your reason to make people care. Once you’ve made someone care, they’re more likely to share the content or link to it from somewhere.

You need to promote that asset to the right audience

I really dislike the stance that some marketers take when it comes to content promotion—build great content and links will come.

No. Sorry but for the vast majority of us, that’s simply not true. The exceptions are people that sky dive from space or have huge existing audiences to leverage.

You simply have to spend time promoting your content or your asset for it to get shares and links. It is hard work and sometimes you can spend a long time on it and get little return, but it’s important to keep working at it until you’re at a point where you have two things:

  • A big enough audience where you can almost guarantee at least some traffic to your new content along with some shares
  • Enough strong relationships with relevant websites who you can speak to when new content is published and stand a good chance of them linking to it

Getting to this point is hard—but that’s kind of the point. There are various hacks you can use along the way but it will take time to get right.

You need consistency

Leading on from the previous point. It takes time and hard work to get links to your content—the types of links that stand the test of time and you’re not going to be removing in 12 months time anyway! This means that you need to keep pushing content out and getting better each and every time. This isn’t to say you should just churn content out for the sake of it, far from it. I am saying that with each piece of content you create, you will learn to do at least one thing better the next time. Try to give yourself the leverage to do this.

Anything scalable is at risk

Scalable link building is exactly what Google has been trying to crack down on for the last few years. Penguin was the biggest move and hit some of the most scalable tactics we had at our disposal. When you scale something, you often lose some level of quality, which is exactly what Google doesn’t want when it comes to links. If you’re still relying on tactics that could fall into the scalable category, I think you need to be very careful and just look at the trend in the types of links Google has been penalizing to understand why.

The part Google plays in this

To finish up, I want to briefly talk about the part that Google plays in all of this and shaping the future they want for the web.

I’ve always tried to steer clear of arguments involving the idea that Google is actively pushing FUD into the community. I’ve preferred to concentrate more on things I can actually influence and change with my clients rather than what Google is telling us all to do.

However, for the purposes of this post, I want to talk about it.

General paranoia has increased. My bet is there are some companies out there carrying out zero specific linkbuilding activity through worry.

Dan Barker

Dan’s point is a very fair one, and just a day or two after reading this in an email, I came across a page related to a client’s target audience that said:

“We are not publishing guest posts on SITE NAME any more. All previous guest posts are now deleted. For more information, see www.mattcutts.com/blog/guest-blogging/“.

I’ve reworded this as to not reveal the name of the site, but you get the point.

This is silly. Honestly, so silly. They are a good site, publish good content, and had good editorial standards. Yet they have ignored all of their own policies, hard work, and objectives to follow a blog post from Matt. I’m 100% confident that it wasn’t sites like this one that Matt was talking about in this blog post.

This is, of course, from the publishers’ angle rather than the link builders’ angle, but it does go to show the effect that statements from Google can have. Google knows this, so it makes sense for them to push out messages that make their jobs easier and suit their own objectives—why wouldn’t they? In a similar way, what did they do when they were struggling to classify at scale which links are bad vs. good and they didn’t have a big enough web spam team? They got us to do it for them 🙂

I’m mostly joking here, but you see the point.

The most recent infamous mobilegeddon update, discussed here by Dr. Pete, is another example of Google pushing out messages that ultimately scared a lot of people into action. Although, to be fair, I think that despite the apparently small impact so far, the broad message from Google is a very serious one.

Because of this, I think we need to remember that Google does have their own agenda and many shareholders to keep happy. I’m not in the camp of believing everything that Google puts out is FUD, but I’m much more sensitive and questioning of the messages now than I’ve ever been.

What do you think? I’d love to hear your feedback and thoughts in the comments.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

The Nifty Guide to Local Content Strategy and Marketing

Posted by NiftyMarketing

This is my Grandma.

She helped raise me and I love her dearly. That chunky baby with the Gerber cheeks is me. The scarlet letter “A” means nothing… I hope.

This is a rolled-up newspaper.

rolled up newspaper

When I was growing up, I was the king of mischief and had a hard time following parental guidelines. To ensure the lessons she wanted me to learn “sunk in,” my grandma would give me a soft whack with a rolled-up newspaper and say,

“Mike, you like to learn the hard way.”

She was right. I have spent my life and career learning things the hard way.

Local content has been no different. I started out my career creating duplicate local doorway pages using “find and replace” with city names. After getting whacked by the figurative newspaper a few times, I decided there had to be a better way. To spare others the struggles I experienced, I hope the hard lessons I have learned about local content strategy and marketing save you from fearing a rolled-up newspaper the way I do.

Lesson one: Local content doesn’t just mean the written word

local content ecosystem

Content is everything around you. It all tells a story. If you don’t have a plan for how that story is being told, then you might not like how it turns out. In the local world, even your brick and mortar building is a piece of content. It speaks about your brand, your values, your appreciation of customers and employees, and can be used to attract organic visitors if it is positioned well and provides a good user experience. If you just try to make the front of a building look good, but don’t back up the inside inch by inch with the same quality, people will literally say, “Hey man, this place sucks… let’s bounce.”

I had this experience proven to me recently while conducting an interview at Nifty for our law division. Our office is a beautifully designed brick, mustache, animal-on-the-wall, leg-lamp-in-the-center-of-the-room piece of work you would expect for a creative company.

nifty offices idaho

Anywho, for our little town of Burley, Idaho, it is a unique space and helps set our business apart in our community. But the conference room has a fluorescent ballast light system that can buzz so loudly you can’t carry on a proper conversation at times, and I literally had to conduct the recent interviews in the dark because it was so bad.

I’m cheap and slow to spend money, so I haven’t gotten it fixed yet. The problem is I have two more interviews this week, and I am so embarrassed by the experience in that room that I am thinking of holding them offsite to ensure we don’t produce a bad content experience. What I need to do is just fix the light, but I will end up spending weeks going back and forth with the landlord over whose responsibility it is.

Meanwhile, the content experience suffers. Like I said, I like to learn the hard way.

Start thinking about everything in the frame of content and you will find that you make better decisions and fewer costly mistakes.

Lesson two: Scalable does not mean fast and easy growth

In every sales conversation I have had about local content, the question of scalability comes up. Usually, people want two things:

  1. Extremely Fast Production 
  2. Extremely Low Cost

While these two things would be great for every project, I have come to find that quality is rarely achieved when you optimize for fast production and low cost. A better way to look at scale is as follows:

The rate of growth in revenue/traffic is greater than the cost of continued content creation.
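One way to read that rule is as a simple period-over-period check: is the revenue/traffic attributable to your local content growing faster than your spend on producing more of it? Here is a minimal, hedged sketch of that test in Python; all of the numbers are hypothetical.

```python
# Minimal sketch of the scale test described above: growth in
# revenue/traffic from content should outpace growth in the cost of
# producing more of it. The figures below are purely illustrative.

def is_scaling(revenue_growth: float, content_cost_growth: float) -> bool:
    """True when revenue/traffic growth exceeds content cost growth."""
    return revenue_growth > content_cost_growth

# Example: content-driven revenue grew 18% this quarter while content
# production spend grew 7%, so the program is scaling.
print(is_scaling(revenue_growth=0.18, content_cost_growth=0.07))  # True
```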

A good local content strategy at scale will create a model that looks like this:

scaling content graph

Lesson three: You need a continuous local content strategy

This is where the difference between local content marketing and content strategy kicks in. Creating a single piece of content that does well is fairly easy to achieve. Building a true scalable machine that continually puts out great local content and consistently tells your story is not. This is a graph I created outlining the process behind creating and maintaining a local content strategy:

local content strategy

This process is not a one-time thing. It is not a box to be checked off. It is a structure that should become the foundation of your marketing program and will need to be revisited, re-tweaked, and replicated over and over again.

1. Identify your local audience

Most of you reading this will already have a service or product and hopefully local customers. Do you have personas developed for attracting and retaining more of them? Here are some helpful tools available to give you an idea of how many people fit your personas in any given market.

Facebook Insights

Pretend for a minute that you live in the unique market of Utah and have a custom wedding dress line. You focus on selling modest wedding dresses. It is a definite niche product, but one that shows the idea of personas very well.

You have interviewed your customer base and found a few interests that your customer base share. Taking that information and putting it into Facebook insights will give you a plethora of data to help you build out your understanding of a local persona.

facebook insights data

From the interests of our customers, we can see that there are roughly 6k-7k currently engaged women in Utah who have interests similar to those of our customer base.

The location tab gives us a breakdown of the specific cities and, understandably, Salt Lake City has the highest percentage, with Provo (home of BYU) in second place. You can also see pages this group would like, activity levels on Facebook, and household income with spending habits. If you wanted to find more potential locations for future growth, you can open up the search to a region or country.

localized facebook insights data

From this data it’s apparent that Arizona would be a great expansion opportunity after Utah.

Nielsen PRIZM

Nielsen offers a free and extremely useful tool for local persona research called Zip Code Lookup that allows you to identify pre-determined personas in a given market.

Here is a look at my hometown, and the personas they have developed are dead on.

Neilson Prizm data

Each persona can be expanded to learn more about the traits, income level, and areas across the country with other high concentrations of the same persona group.

You can also use the segment explorer to get a better idea of pre-determined persona lists and can work backwards to determine the locations with the highest density of a given persona.

Google Keyword Planner Tool

The keyword tool is fantastic for local research. Using our same Facebook Insights data above, we can match keyword search volume against the audience size to determine how active our persona is in product research and purchasing. In the case of engaged women looking for dresses, it is a very active group, with potentially 20-30% actively searching online for a dress.

google keyword planner tool
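If it helps to see that comparison spelled out, here is a hedged back-of-the-envelope sketch in Python. The keyword names and volumes are invented for illustration rather than real Keyword Planner exports; only the audience size comes from the Facebook Insights example above.

```python
# Rough sketch: compare monthly search volume for persona-relevant
# keywords against the estimated local audience size to gauge how
# actively the persona is researching. All keyword figures are made up.

audience_size = 6500  # midpoint of the 6k-7k engaged women in Utah

monthly_searches = {            # hypothetical Keyword Planner volumes
    "modest wedding dresses": 900,
    "modest wedding dress utah": 400,
    "lds wedding dresses": 325,
}

active_searchers = sum(monthly_searches.values())
activity_rate = active_searchers / audience_size
print(f"~{activity_rate:.0%} of the audience appears to be actively searching")
# -> ~25%, in line with the 20-30% estimate above
```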

2. Create goals and rules

I think the most important idea for creating the goals and rules around your local content is the following from the must read book Content Strategy for the Web.

You also need to ensure that everyone who will be working on things even remotely related to content has access to style and brand guides and, ultimately, understands the core purpose for what, why, and how everything is happening.

3. Audit and analyze your current local content

The point of this step is to determine how the current content you have stacks up against the goals and rules you established, and determine the value of current pages on your site. With tools like Siteliner (for finding duplicate content) and ScreamingFrog (identifying page titles, word count, error codes and many other things) you can grab a lot of information very fast. Beyond that, there are a few tools that deserve a more in-depth look.

BuzzSumo

With BuzzSumo you can see social data and incoming links behind important pages on your site. This can give you a good idea of which locations or areas are getting more promotion than others, and help you identify what some of the causes could be.

BuzzSumo can also give you access to competitors’ information, where you might find some new ideas. In the following example, you can see that one of Airbnb.com’s most-shared pages was a motion graphic of its impact on Berlin.

Buzzsumo

URL Profiler

This is another great tool for scraping URLs on large sites, and it can return about every type of measurement you could want. For sites with thousands of pages, this tool could save hours of data gathering and can spit out a nicely formatted CSV document that will allow you to sort by things like word count, page authority, link numbers, social shares, or about anything else you could imagine.

url profiler
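As a rough illustration of what you might do with such an export, here is a hedged sketch using pandas. The file name and column headers are assumptions; map them to whatever your URL Profiler or Screaming Frog export actually contains.

```python
# Minimal sketch of auditing a crawl/URL Profiler-style CSV export to
# surface thin or under-performing local pages. File name and column
# names are hypothetical -- adjust to your actual export headers.

import pandas as pd

audit = pd.read_csv("local_page_audit.csv")  # hypothetical export

# Flag pages that fall short of the goals and rules set in step 2,
# e.g. fewer than 300 words and no social shares at all.
thin_pages = audit[(audit["word_count"] < 300) & (audit["total_shares"] == 0)]

# Rank the rest by whichever signals matter most to you.
ranked = audit.sort_values(["page_authority", "total_shares"], ascending=False)

print(f"{len(thin_pages)} pages look thin and unshared")
print(ranked[["url", "word_count", "page_authority", "total_shares"]].head(10))
```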

4. Develop local content marketing tactics

This is how most of you look when marketing tactics are brought up.

monkey

Let me remind you of something with a picture. 

rolled up newspaper

Do not start with tactics. Do the other things first. It will ensure your marketing tactics fall in line with a much bigger organizational movement and process. With the warning out of the way, here are a few tactics that could work for you.

Local landing page content

Our initial concept of local landing pages has stood the test of time. If you are scared to even think about local pages with the upcoming doorway page update then please read this analysis and don’t be too afraid. Here are local landing pages that are done right.

Marriott local content

Marriott’s Burley local page is great. They didn’t think about just ensuring they had 500 unique words. They have custom local imagery of the exterior/interior, detailed information about the area’s activities, and even their own review platform that showcases both positive and negative reviews with responses from local management.

If you can’t build your own platform handling reviews like that, might I recommend looking at Get Five Stars as a platform that could help you integrate reviews as part of your continuous content strategy.

Airbnb Neighborhood Guides

I not-so-secretly have a big crush on Airbnb’s approach to local. These neighborhood guides started it. They only have roughly 21 guides thus far and handle one at a time, with Seoul being the most recent addition. The idea is simple: they looked at their hottest markets and built out guides not just for the city, but down to the specific neighborhood.

air bnb neighborhood guides

Here is a look at Hell’s Kitchen in New York by imagery. They hire a local photographer to shoot the area, then they take some of their current popular listing data and reviews and integrate them into the page. This idea would have never flown if they only cared about creating content that could be fast and easy for every market they serve.

Reverse infographicing

Every decently sized city has had a plethora of infographics made about it. People spent the time curating information and coming up with the concept, but a majority just made the image and didn’t think about the crawlability or page title from an SEO standpoint.

Here is an example of an image search for Portland infographics.

image search results portland infographics

Take an infographic and repurpose it into crawlable content with a new twist or timely additions. Usually infographics share their data sources in the footer so you can easily find similar, new, or more information and create some seriously compelling data based content. You can even link to or share the infographic as part of it if you would like.

Become an Upworthy of local content

No one I know does this better than Movoto. Read the link for their own spin on how they did it and then look at these examples and share numbers from their local content.

60k shares in Boise by appealing to that hometown knowledge.

movoto boise content

65k shares in Salt Lake following the same formula.

movoto salt lake city content

It seems to work with video as well.

movoto video results

Think like a local directory

Directories understand where content should be housed. Not every local piece should be on the blog. Look at where Trip Advisor’s famous “Things to Do” page is listed. Right on the main city page.

trip advisor things to do in salt lake city

Or look at how many timely, fresh, quality pieces of content Yelp is showcasing from their main city page.

yelp main city page

The key point to understand is that local content isn’t just about being unique on a landing page. It is about BEING local and useful.

Ideas of things that are local:

  • Sports teams
  • Local celebrities or heroes 
  • Groups and events
  • Local pride points
  • Local pain points

Ideas of things that are useful:

  • Directions
  • Favorite local spots
  • Granular details only “locals” know

The other point to realize is that, looking at our definition of scale, you don’t need to take shortcuts that un-localize the experience for users. Figure out and test one location at a time until you have a winning formula, and then move forward at a speed that ensures a quality local experience.

5. Create a content calendar

I am not going to get into telling you exactly how or what your content calendar needs to include. That will largely be based on the size and organization of your team and every situation might call for a unique approach. What I will do is explain how we do things at Nifty.

  1. We follow the steps above.
  2. We schedule the big projects and timelines first. These could be months out or weeks out. 
  3. We determine the weekly deliverables, checkpoints, and publish times.
  4. We put all of the information as tasks assigned to individuals or teams in Asana.

asana content calendar

The information then can be viewed by individual, team, groups of teams, due dates, or any other way you would wish to sort. Repeatable tasks can be scheduled, and we can run our entire operation visible to as many people as need access to the information through desktop or mobile devices. That is what works for us.

6. Launch and promote content

My personal favorite way to promote local content (other than the obvious ideas of sharing with your current followers or outreaching to local influencers) is to use Facebook ads to target the specific local personas you are trying to reach. Here is an example:

I just wrapped up playing Harold Hill in our community’s production of The Music Man. When you live in a small town like Burley, Idaho, you get the opportunity to play a lead role without having too much talent or a glee-based upbringing. You also get the opportunity to do all of the advertising, set design, and costuming yourself, and sometimes you even get to pay for it.

For my advertising responsibilities, I decided to write a few blog posts and drive traffic to them. As any good Harold Hill would do, I used fear tactics.

music man blog post

I then created Facebook ads with the following stats: a cost of $0.06 per click, a 12.7% click-through rate, and natural organic sharing that led to thousands of visits in a small Idaho farming community where people still think a phone book is the only way to find local businesses.

facebook ads setup

Then we did it again.

There was a protestor in Burley who, for over a year, parked a red pickup with signs saying things like, “I wud not trust Da Mayor” or “Don’t Bank wid Zions”. Basically, you weren’t working hard enough if your name didn’t get on the truck during the year.

Everyone knew that ol’ red pickup, as it was parked on the corner of Main and Overland, which is one of the few stoplights in town. Then one day it was gone. We came up with the idea to bring the red truck back, put signs on it that said “I wud Not Trust Pool Tables,” “Resist Sins n’ Corruption,” and other things that were part of The Music Man, and wrote another blog post complete with pictures.

facebook ads red truck

Then I created another Facebook Ad.

facebook ads set up

A little under $200 in ad spend resulted in thousands more visits to the site, which promoted the play and sold tickets to a generation that might not have been very familiar with the show otherwise.
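A quick sanity check on that claim, assuming the second campaign’s cost per click was in the same ballpark as the $0.06 quoted for the first one (the exact CPC isn’t given here):

```python
# Back-of-the-envelope check, assuming a CPC similar to the first
# campaign's $0.06; the second campaign's exact CPC is not stated.

ad_spend = 200.00      # "a little under $200"
cost_per_click = 0.06  # CPC reported for the first Music Man ad

estimated_clicks = ad_spend / cost_per_click
print(f"~{estimated_clicks:,.0f} paid clicks before any organic sharing")
# -> roughly 3,300 visits, consistent with "thousands more visits"
```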

All of it was local targeting, and there was no other way we could have driven that much traffic in a community like Burley without paying Facebook and creating click-bait ads in the hope that the promotion would lead to organic sharing.

7. Measure and report

This is another very personal step where everyone will have different needs. At Nifty we put together very custom weekly or monthly reports that cover all of the plan, execution, and relevant stats such as traffic to specific content or location, share data, revenue or lead data if available, analysis of what worked and what didn’t, and the plan for the following period.

There is no exact data that needs to be shared. Everyone will want something slightly different, which is why we moved away from automated reporting years ago (when we moved away from auto link building… hehe) and built our reports around our clients, even if it took added time.

I always said that the product of an SEO or content shop is the report. That is what people buy, because it is likely all they will see or understand.
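For teams that want a starting point before layering on the custom narrative, here is a hedged sketch of pulling period-over-period numbers for each piece of local content with pandas. The file names and columns are hypothetical placeholders for whatever your analytics exports actually look like.

```python
# Minimal sketch of assembling the raw numbers behind a periodic
# content report: traffic and shares per URL, compared to the prior
# period. File names and column headers are hypothetical.

import pandas as pd

current = pd.read_csv("content_metrics_this_month.csv")   # url, location, visits, shares
previous = pd.read_csv("content_metrics_last_month.csv")  # same columns, prior period

report = current.merge(previous, on="url", suffixes=("_now", "_prev"))
report["visit_change"] = report["visits_now"] - report["visits_prev"]
report["share_change"] = report["shares_now"] - report["shares_prev"]

# Surface the biggest movers so the written analysis (what worked,
# what didn't, the plan for next period) has a starting point.
top_movers = report.sort_values("visit_change", ascending=False)
print(top_movers[["url", "location_now", "visit_change", "share_change"]].head(10))
```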

8. In conclusion, you must refine and repeat the process

local content strategy - refine and repeat

From my point of view, this is by far the most important step and sums everything up nicely. This process model isn’t perfect. There will be things that are missed, things that need tweaking, and ways that you will be able to improve your local content strategy and marketing all the time. The idea of the cycle is that it is never done. It never sleeps. It never quits. It never surrenders. You just keep perfecting the process until you reach the point that few locally-focused companies ever achieve… where your local content reaches and grows your target audience every time you click the publish button.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

​Inbound Lead Generation: eCommerce Marketing’s Missing Link

Posted by Everett

If eCommerce businesses hope to remain competitive with Amazon, eBay, big box brands, and other online retail juggernauts, they’ll need to learn how to conduct content marketing, lead generation, and contact nurturing as part of a comprehensive inbound marketing strategy.

First, I will discuss some of the ways most online retailers are approaching email from the bottom of the funnel upward, and why this needs to be turned around. Then we can explore how to go about doing this within the framework of “Inbound Marketing” for eCommerce businesses. Lastly, popular marketing automation and email marketing solutions are discussed in the context of inbound marketing for eCommerce.

Key differences between eCommerce and lead generation approaches to email

Different list growth strategies

Email acquisition sources differ greatly between lead gen. sites and online stores. The biggest driver of email acquisition for most eCommerce businesses is their shoppers, especially when the business doesn’t collect an email address for their contact database until the shopper provides it during the check-out process—possibly not until the very end.

With most B2B/B2C lead gen. websites, the entire purpose of every landing page is to get visitors to submit a contact form or pick up the phone. Often, the price tag for their products or services is much higher than those of an eCommerce site or involves recurring payments. In other words, what they’re selling is more difficult to sell. People take longer to make those purchasing decisions. For this reason, leads—in the form of contact names and email addresses—are typically acquired and nurtured without having first become a customer.

Contacts vs. leads

Whether it is a B2B or B2C website, lead gen. contacts (called leads) are thought of as potential customers (clients, subscribers, patients) who need to be nurtured to the point of becoming “sales qualified,” meaning they’ll eventually get a sales call or email that attempts to convert them into a customer.

On the other hand, eCommerce contacts are often thought of primarily as existing customers to whom the marketing team can blast coupons and other offers by email.

Retail sites typically don’t capture leads at the top or middle of the funnel. Only once a shopper has checked out do they get added to the list. Historically, the buying cycle has been short enough that eCommerce sites could move many first-time visitors directly to customers in a single visit.
But this has changed.

Unless your brand is very strong—possibly a luxury brand or one with an offline retail presence—it is probably getting more difficult (i.e. expensive) to acquire new customers. At the same time, attrition rates are rising. Conversion optimization helps by converting more bottom-of-funnel visitors. SEO helps drive more traffic into the site, but mostly for middle-of-funnel (category page) and bottom-of-funnel (product page) visitors, who may also be price/feature comparison shopping, or may be unable to convert right away because of device or time limitations.

Even savvy retailers publishing content for shoppers higher up in the funnel, such as buyer guides and reviews, aren’t getting an email address and are missing a lot of opportunities because of it.

attract-convert-grow-funnel-inflow-2.jpg

Here’s a thought. If your eCommerce site has a 10 percent conversion rate, you’re doing pretty well by most standards. But what happened to the other 90 percent of those visitors? Will you have the opportunity to connect with them again? Even if you bump that up a few percentage points with retargeting, a lot of potential revenue has seeped out of your funnel without a trace.

I don’t mean to bash the eCommerce marketing community with generalizations. Most lead gen. sites aren’t doing anything spectacular either, and a lot of opportunity is missed all around.

There are many eCommerce brands doing great things marketing-wise. I’m a big fan of Crutchfield for their educational resources targeting early-funnel traffic, and Neman Tools, Saddleback Leather and Feltraiger for the stories they tell. Amazon is hard to beat when it comes to scalability, product suggestions and user-generated reviews.

Sadly, most eCommerce sites (including many of the major household brands) still approach marketing in this way…

The ol’ bait n’ switch: promising value and delivering spam

Established eCommerce brands have gigantic mailing lists (compared with lead gen. counterparts), to whom they typically send out at least one email each week with “offers” like free shipping, $ off, buy-one-get-one, or % off their next purchase. The lists are minimally segmented, if at all. For example, there might be lists for repeat customers, best customers, unresponsive contacts, recent purchasers, shoppers with abandoned carts, purchases by category, etc.

The missing points of segmentation include which campaign resulted in the initial contact (sometimes referred to as a cohort) and—most importantly—the persona and buying cycle stage that best applies to each contact.
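To make those missing segments concrete, here is a minimal, hedged sketch of what tagging and slicing a contact list by cohort, persona, and buying-cycle stage could look like. The contact records, labels, and field names are all invented for illustration.

```python
# Minimal sketch of the segmentation described above: every contact
# carries a cohort (originating campaign), a persona, and a
# buying-cycle stage, and lists are pulled from those tags.
# All records and labels below are made up for illustration.

contacts = [
    {"email": "a@example.com", "cohort": "guide-download", "persona": "AT Hiker",   "stage": "middle"},
    {"email": "b@example.com", "cohort": "checkout",       "persona": "AT Hiker",   "stage": "customer"},
    {"email": "c@example.com", "cohort": "guide-download", "persona": "Car Camper", "stage": "top"},
]

def segment(contacts, **criteria):
    """Return contacts matching every criterion, e.g. persona='AT Hiker'."""
    return [c for c in contacts if all(c.get(k) == v for k, v in criteria.items())]

# Middle-of-funnel "AT Hiker" contacts acquired via the guide download:
nurture_list = segment(contacts, cohort="guide-download", persona="AT Hiker", stage="middle")
print([c["email"] for c in nurture_list])  # ['a@example.com']
```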

Online retailers often send frequent “blasts” to their entire list or to a few of the large segments mentioned above. Lack of segmentation means contacts aren’t receiving emails based on their interests, problems, or buying cycle stage, but instead, are receiving what they perceive as “generic” emails.

The result of these missing segments and the lack of overarching strategy looks something like this:

My, What a Big LIST You Have!

iStock_000017047747Medium.jpg

TIME reported in 2012 on stats from Responsys that the average online retailer sent out between five and six emails the week after Thanksgiving. Around the same time, the Wall Street Journal reported that the top 100 online retailers sent an average of 177 emails apiece to each of their contacts in 2011. Averaged out, that’s somewhere between three and four emails each week that the contact is receiving from these retailers.

The better to SPAM you with!

iStock_000016088853Medium.jpg

A 2014 whitepaper from SimpleRelevance titled Email Fail: An In-Depth Evaluation of Top 20 Internet Retailer’s Email Personalization Capabilities (PDF) found that, while 70 percent of marketing executives believed personalization was of “utmost importance” to their business…

“Only 17 percent of marketing leaders are going beyond basic transactional data to deliver personalized messages to consumers.”

Speaking of email overload, the same report found that some major online retailers sent ten or more emails per week!

simplerelevance-email-report-frequency.png

The result?

All too often, the eCommerce business will carry around big, dead lists of contacts who don’t even bother reading their emails anymore. They end up scrambling toward other channels to “drive more demand,” but because the real problems were never addressed, this ends up increasing new customer acquisition costs.

The cycle looks something like this:

  1. Spend a fortune driving in unqualified traffic from top-of-the-funnel channels
  2. Ignore the majority of those visitors who aren’t ready to purchase
  3. Capture email addresses only for the few visitors who made a purchase
  4. Spam the hell out of those people until they unsubscribe
  5. Spend a bunch more money trying to fill the top of the funnel with even more traffic

It’s like trying to fill your funnel with a bucket full of holes, some of them patched with band-aids.

The real problems

  1. Lack of a cohesive strategy across marketing channels
  2. Lack of a cohesive content strategy throughout all stages of the buying cycle
  3. Lack of persona, buying cycle stage, and cohort-based list segmentation to nurture contacts
  4. Lack of tracking across customer touchpoints and devices
  5. Lack of gated content that provides enough value to early-funnel visitors to get them to provide their email address

So, what’s the answer?

Inbound marketing allows online retailers to stop competing with Amazon and other “price focused” competitors with leaky funnels, and to instead focus on:

  1. Persona-based content marketing campaigns designed to acquire email addresses from high-quality leads (potential customers) by offering them the right content for each stage in their buyer’s journey
  2. A robust marketing automation system that makes true personalization scalable
  3. Automated contact nurturing emails triggered by certain events, such as viewing specific content, abandoning their shopping cart, adding items to their wish list or performing micro-conversions like downloading a look book
  4. Intelligent SMM campaigns that match visitors and customers with social accounts by email addresses, interests and demographics—as well as social monitoring
  5. Hyper-segmented email contact lists to support the marketing automation described above, as well as to provide highly-customized email and shopping experiences
  6. Cross-channel, closed loop reporting to provide a complete “omnichannel” view of online marketing efforts and how they assist offline conversions, if applicable

Each of these areas will be covered in more detail below. First, let’s take a quick step back and define what it is we’re talking about here.

Inbound marketing: a primer

A lot of people think “inbound marketing” is just a way some SEO agencies are re-cloaking themselves to avoid negative associations with search engine optimization. Others think it’s synonymous with “internet marketing.” I think it goes more like this:

Inbound marketing is to Internet marketing as SEO is to inbound marketing: One piece of a larger whole.

There are many ways to define inbound marketing. A cursory review of definitions from several trusted sources reveals some fundamental similarities:

Rand Fishkin

randfishkin.jpeg

“Inbound Marketing is the practice of earning traffic and attention for your business on the web rather than buying it or interrupting people to get it. Inbound channels include organic search, social media, community-building content, opt-in email, word of mouth, and many others. Inbound marketing is particularly powerful because it appeals to what people are looking for and what they want, rather than trying to get between them and what they’re trying to do with advertising. Inbound’s also powerful due to the flywheel-effect it creates. The more you invest in Inbound and the more success you have, the less effort required to earn additional benefit.”


Mike King

mikeking.jpeg

“Inbound Marketing is a collection of marketing activities that leverage remarkable content to penetrate earned media channels such as Organic Search, Social Media, Email, News and the Blogosphere with the goal of engaging prospects when they are specifically interested in what the brand has to offer.”

This quote is from 2012, and is still just as accurate today. It’s from an Inbound.org comment thread where you can also see many other takes on it from the likes of Ian Lurie, Jonathon Colman, and Larry Kim.


Inflow

inflow-logo.jpeg

“Inbound Marketing is a multi-channel, buyer-centric approach to online marketing that involves attracting, engaging, nurturing and converting potential customers from wherever they are in the buying cycle.”

From Inflow’s Inbound Services page.


Wikipedia

wikipedia.jpeg

“Inbound marketing refers to marketing activities that bring visitors in, rather than marketers having to go out to get prospects’ attention. Inbound marketing earns the attention of customers, makes the company easy to be found, and draws customers to the website by producing interesting content.”

From Inbound Marketing – Wikipedia.


Larry-Kim.jpeg

Larry Kim

“‘Inbound marketing’ refers to marketing activities that bring leads and customers in when they’re ready, rather than you having to go out and wave your arms to try to get people’s attention.”

Via Marketing Land in 2013. You can also read more of Larry Kim’s interpretation, along with many others, on Inbound.org.


Hubspot

“Instead of the old outbound marketing methods of buying ads, buying email lists, and praying for leads, inbound marketing focuses on creating quality content that pulls people toward your company and product, where they naturally want to be.”

Via Hubspot, a marketing automation platform for inbound marketing.

When everyone has their own definition of something, it helps to think about what they have in common, as opposed to how they differ. In the case of inbound, this includes concepts such as:

  • Pull (inbound) vs. push (interruption) marketing
  • “Earning” media coverage, search engine rankings, visitors and customers with outstanding content
  • Marketing across channels
  • Meeting potential customers where they are in their buyer’s journey

Running your first eCommerce inbound marketing campaign

Audience personas—priority no. 1

The magic happens when retailers begin to hyper-segment their list based on buyer personas and other relevant information (i.e. what they’ve downloaded, what they’ve purchased, if they abandoned their cart…). This all starts with audience research to develop personas. If you need more information on persona development, try these resources:

Once personas are developed, retailers should choose one on which to focus. A complete campaign strategy should be developed around this persona, with the aim of providing the “right value” to them at the “right time” in their buyer’s journey.

Ready to get started?

We’ve developed a quick-start guide in the form of a checklist for eCommerce marketers who want to get started with inbound marketing, which you can access below.

inbound ecommerce checklist

Hands-on experience running one campaign will teach you more about inbound marketing than a dozen articles. My advice: Just do one. You will make mistakes. Learn from them and get better each time.

Example inbound marketing campaign

Below is an example of how a hypothetical inbound marketing campaign might play out, assuming you have completed all of the steps in the checklist above. Imagine you handle marketing for an online retailer of high-end sporting goods.

AT Hiker Tommy campaign: From awareness to purchase

When segmenting visitors and customers for a “high-end sporting goods / camping retailer” based on the East Coast, you identified a segment of “Trail Hikers.” These are people with disposable income who care about high-quality gear, and will pay top dollar if they know it is tested and reliable. The top trail on their list of destinations is the Appalachian Trail (AT).

Top of the Funnel: SEO & Strategic Content Marketing

at-tommy.jpg

Tommy’s first action is to do “top of the funnel” research from search engines (one reason why SEO is still so important to a complete inbound marketing strategy).

A search for “Hiking the Appalachian Trail” turns up your article titled “What NOT to Pack When Hiking the Appalachian Trail,” which lists common items that are bulky/heavy, and highlights slimmer, lighter alternatives from your online catalog.

It also highlights the difference between cheap gear and the kind that won’t let you down on your 2,181-mile journey through the wilderness of Appalachia, something you learned was important to Tommy when developing his persona. This allows you to get the company’s value proposition of “tested, high-end, quality gear only” in front of readers very early in their buyer’s journey—important if you want to differentiate your site from all of the retailers racing Amazon to the bottom of their profit margins.

So far you have yet to make “contact” with AT Hiker Tommy. The key to “acquiring” a contact before the potential customer is ready to make a purchase is to provide something of value to that specific type of person (i.e. their persona) at that specific point in time (i.e. their buying cycle stage).

In this case, we need to provide value to AT Hiker Tommy while he is getting started on his research about hiking the Appalachian Trail. He has an idea of what gear not to bring, as well as some lighter, higher-end options sold on your site. At this point, however, he is not ready to buy anything without researching the trail more. This is where retailers lose most of their potential customers. But not you. Not this time…

Middle of the funnel: Content offers, personalization, social & email nurturing

at-hiker-ebook.png

On the “What NOT to Pack When Hiking the Appalachian Trail” article (and probably several others), you have placed a call-to-action (CTA) in the form of a button that offers something like:

Download our Free 122-page Guide to Hiking the Appalachian Trail

This takes Tommy to a landing page showcasing some of the quotes from the book, and highlighting things like:

“We interviewed over 50 ‘thru-hikers’ who completed the AT and have curated and organized the best first-hand tips, along with our own significant research to develop a free eBook that should answer most of your questions about the trail.”

By entering their email address, potential customers agree to let you send them the free downloadable PDF guide to hiking the AT, as well as other relevant information about hiking.

An automated email is sent with a link to the downloadable PDF guide, and several other useful content links, such as “The AT Hiker’s Guide to Gear for the Appalachian Trail”—content designed to move Tommy further toward the purchase of hiking gear.

If Tommy still has not made a purchase within the next two weeks, another automated email is sent asking for feedback about the PDF guide (providing the link again), and to again provide the link to the “AT Hiker’s Guide to Gear…” along with a compelling offer just for him, perhaps “Get 20% off your first hiking gear purchase, and a free wall map of the AT!”
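A minimal sketch of that two-step nurture logic, independent of any particular platform: deliver the guide immediately, then follow up two weeks later only if no purchase has happened. The send_email helper, the template names, and the contact fields are hypothetical stand-ins for whatever your marketing automation system provides.

```python
# Hedged sketch of the nurture sequence described above. send_email,
# the contact fields, and the template names are hypothetical.

from datetime import datetime, timedelta

def send_email(to, template, **context):
    # Placeholder for a real ESP / marketing automation call.
    print(f"Sending '{template}' to {to} with {context}")

def run_nurture(contact, now=None):
    now = now or datetime.utcnow()

    # Step 1: triggered immediately by the guide download.
    if not contact.get("guide_email_sent"):
        send_email(contact["email"], "at-guide-delivery", pdf_link="/downloads/at-guide.pdf")
        contact["guide_email_sent"] = now
        return

    # Step 2: two weeks later, if there is still no purchase, ask for
    # feedback and resend the gear guide with a first-purchase offer.
    two_weeks_idle = now - contact["guide_email_sent"] >= timedelta(days=14)
    if two_weeks_idle and not contact.get("has_purchased") and not contact.get("followup_sent"):
        send_email(contact["email"], "at-gear-followup", offer="20% off first hiking gear purchase")
        contact["followup_sent"] = now

contact = {"email": "tommy@example.com", "has_purchased": False}
run_nurture(contact)                                              # sends the guide
run_nurture(contact, now=datetime.utcnow() + timedelta(days=15))  # sends the follow-up
```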

Having Tommy’s email address also allows you to hyper-target him on social channels, while also leveraging his initial visit to initiate retargeting efforts.

Bottom of the funnel: Email nurturing & strategic, segmented offers

Eventually Tommy makes a purchase, and he may or may not receive further emails related to this campaign, such as post-purchase emails for reviews, up-sells and cross-sells.

Upon checkout, Tommy checked the box to opt in to weekly promotional emails. He is now on multiple lists. Your marketing automation system will automatically update Tommy’s status from “Contact” or lead to “Customer,” and potentially remove or deactivate him from the marketing automation system database. This is accomplished either by default integration features or with the help of integration tools like Zapier and IFTTT.

You have now nurtured Tommy from his initial research on Google all the way to his first purchase without ever having sent a spammy newsletter email full of irrelevant coupons and other offers. However, now that he is a loyal customer, Tommy finds value in these bottom-of-funnel email offers.

And this is just the start

Every inbound marketing campaign will have its own mix of appropriate channels. This post has focused mostly on email because acquiring the initial permission to contact the person is what fuels most of the other features offered by marketing automation systems, including:

  • Personalization of offers and other content on the site.
  • Knowing exactly which visitors are interacting on social media
  • Knowing where visitors and social followers are in the buying cycle and which persona best represents them, among other things.
  • Smart forms that don’t require visitors to put in the same information twice and allow you to build out more detailed profiles of them over time.
  • Blogging platforms that tie into email and marketing automation systems
  • Analytics data that isn’t blocked by Google and is tied directly to real people.
  • Closed-loop reporting that integrates with call-tracking and Google’s Data Import tool
  • Up-sell, cross-sell, and abandoned cart reclamation features

Three more things…
  1. If you can figure out a way to get Tommy to “log in” when he comes to your site, the personalization possibilities are nearly limitless.
  2. The persona above is based on a real customer segment. I named it after my friend Tommy Bailey, who actually did write the eBook Guide to Hiking the Appalachian Trail, featured in the image above.
  3. This Moz post is part of an inbound marketing campaign targeting eCommerce marketers, a segment Inflow identified while building out our own personas. Our hope, and the whole point of inbound marketing, is that it provides value to you.

Current state of the inbound marketing industry

Inbound has, for the most part, been applied to businesses in which the website objective is to generate leads for a sales team to follow up with and close the deal. An examination of various marketing automation platforms—a key component of scalable inbound marketing programs—highlights this issue.

Popular marketing automation systems

Most of the major marketing automation systems can be used very effectively as the backbone of an inbound marketing program for eCommerce businesses. However, only one of them (Silverpop) has made significant efforts to court the eCommerce market with content and out-of-the-box features. The next closest thing is Hubspot, so let’s start with those two:

Silverpop – an IBMⓇ Company

silver-pop.jpeg

Unlike the other platforms below, right out of the box Silverpop allows marketers to tap into very specific behaviors, including the items purchased or left in the cart.

You can easily segment based on metrics like the Recency, Frequency and Monetary Value (RFM) of purchases:

silverpop triggered campaigns
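If you want to see what RFM segmentation looks like independent of Silverpop’s built-in tooling, here is a hedged, platform-agnostic sketch using pandas. The order data, score thresholds, and column names are illustrative only.

```python
# Rough, platform-agnostic sketch of Recency/Frequency/Monetary (RFM)
# segmentation. Order data and thresholds are purely illustrative.

import pandas as pd

orders = pd.DataFrame({
    "email": ["a@x.com", "a@x.com", "b@x.com", "c@x.com"],
    "order_date": pd.to_datetime(["2015-03-01", "2015-04-20", "2014-11-05", "2015-04-28"]),
    "total": [80.0, 45.0, 300.0, 20.0],
})

snapshot = pd.Timestamp("2015-05-01")
rfm = orders.groupby("email").agg(
    recency=("order_date", lambda d: (snapshot - d.max()).days),
    frequency=("order_date", "count"),
    monetary=("total", "sum"),
)

# Score each dimension 1-3 (3 = best) and sum into a simple segment score.
rfm["r"] = pd.cut(rfm["recency"], bins=[-1, 30, 90, 10_000], labels=[3, 2, 1]).astype(int)
rfm["f"] = rfm["frequency"].clip(upper=3)
rfm["m"] = pd.cut(rfm["monetary"], bins=[0, 50, 150, 1e9], labels=[1, 2, 3]).astype(int)
rfm["score"] = rfm[["r", "f", "m"]].sum(axis=1)

print(rfm.sort_values("score", ascending=False))
```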

You can automate personalized shopping cart abandonment recovery emails:

silverpop cart abandonment recovery
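And a similarly hedged sketch of the abandonment trigger itself, outside of any specific platform: if a cart has sat idle past a set window with no order, queue a personalized recovery email. The cart records and the send_recovery_email helper are hypothetical.

```python
# Minimal sketch of a cart-abandonment trigger. The cart records and
# the send_recovery_email helper are hypothetical placeholders.

from datetime import datetime, timedelta

ABANDON_WINDOW = timedelta(hours=4)

def send_recovery_email(email, items):
    print(f"Recovery email to {email}: you left {', '.join(items)} in your cart")

def check_abandoned_carts(carts, now=None):
    now = now or datetime.utcnow()
    for cart in carts:
        idle_too_long = now - cart["last_updated"] >= ABANDON_WINDOW
        if idle_too_long and not cart["ordered"] and not cart.get("recovery_sent"):
            send_recovery_email(cart["email"], cart["items"])
            cart["recovery_sent"] = True

carts = [
    {"email": "tommy@example.com", "items": ["ultralight tent"], "ordered": False,
     "last_updated": datetime.utcnow() - timedelta(hours=6)},
]
check_abandoned_carts(carts)  # sends one recovery email
```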

You can integrate with many leading brands offering complementary services, including: couponing, CRM, analytics, email deliverability enhancement, social and most major eCommerce platforms.

What you can’t do with Silverpop is blog, find pricing info on their website, get a free trial on their website or have a modern-looking user experience. Sounds like an IBMⓇ company, doesn’t it?

HubSpot

Out of all the marketing automation platforms on this list, HubSpot is the most capable of handling “inbound marketing” campaigns from start to finish. This should come as no surprise, given the phrase is credited to Brian Halligan, HubSpot’s co-founder and CEO.

While they don’t specifically cater to eCommerce marketing needs with the same gusto they give to lead gen. marketing, HubSpot does have an eCommerce landing page and a demo landing page for eCommerce leads, which suggests that their own personas include eCommerce marketers. Additionally, there is some good content on their blog written specifically for eCommerce.

HubSpot has allowed some key partners to develop plug-ins that integrate with leading eCommerce platforms. This approach works well with curation, and is not dissimilar to how Google handles Android or Apple handles their approved apps.

magento and hubspot

The Magento Connector for HubSpot, which costs $80 per month, was developed by EYEMAGiNE, a creative design firm for eCommerce websites. A similar HubSpot-approved third-party integration is on the way for Bigcommerce.

Another eCommerce integration for Hubspot is a Shopify plug-in called HubShoply, which was developed by Groove Commerce and costs $100 per month.

You can also use HubSpot’s native integration capabilities with Zapier to sync data between HubSpot and most major eCommerce SaaS vendors, including the ones above, as well as WooCommerce, Shopify, PayPal, Infusionsoft and more. However, the same could be said of some of the other marketing automation platforms, and using these third-party solutions can sometimes feel like fitting a square peg into a round hole.

HubSpot can and does handle inbound marketing for eCommerce websites. All of the features are there, or easy enough to integrate. But let’s put some pressure on them to up their eCommerce game even more. The least they can do is put an eCommerce link in the footer:

hubspot menus

Despite the lack of clear navigation to their eCommerce content, HubSpot seems to be paying more attention to the needs of eCommerce businesses than the rest of the platforms below.

Marketo

Nothing about Marketo’s in-house marketing strategy suggests “Ecommerce Director Bob” might be one of their personas. The description for each of their marketing automation packages (from Spark to Enterprise) mentions that it is “for B2B” websites.

marketo screenshot

“Driving Sales” could apply to a retail business, so I clicked on the link. Nope. Clearly, this is for lead generation.

marketo marketing automation

Passing “purchase-ready leads” over to your “sales reps” is a good example of the type of language used throughout the site.

Make no mistake, Marketo is a top-notch marketing automation platform, powerful and clean; it’s a shame they don’t launch a full-scale eCommerce version of their core product. In the meantime, there’s the Magento Integration for Marketo Plug-in, developed by an agency out of Australia called Hoosh Marketing.

magento marketo integration

I’ve never used this integration, but it’s part of Marketo’s LaunchPoint directory, which I imagine is vetted, and Hoosh seems like a reputable agency.

Their pricing page is blurred and gated, which is annoying, but perhaps they’ll come on here and tell everyone how much they charge.

marketo pricing page

As with all others except Silverpop, the Marketo navigation provides no easy paths to landing pages that would appeal to “Ecommerce Director Bob.”

Pardot

This option is a SalesForce product, so—though I’ve never had the opportunity to use it—I can imagine Pardot is heavy on B2B/Sales and very light on B2C marketing for retail sites.

The hero image on their homepage says as much.

pardot tagline

pardot marketing automation

Again, no mention of eCommerce or retail, but clear navigation to lead gen and sales.

Eloqua / OMC

eloqua-logo.jpeg

Eloqua, now part of the Oracle Marketing Cloud (OMC), has a landing page for the retail industry, on which they proclaim:

“Retail marketers know that the path to lifelong loyalty and increased revenue goes through building and growing deep client relationships.”

Since when did retail marketers start calling customers clients?

eloqua integration

The Integration tab on OMC’s “…Retail.html” page helpfully informs eCommerce marketers that their sales teams can continue using CRM systems like SalesForce and Microsoft Dynamics but doesn’t mention anything about eCommerce platforms and other SaaS solutions for eCommerce businesses.

Others

There are many other players in this arena. Though I haven’t used them yet, three I would love to try out are SharpSpring, Hatchbuck and Act-On. But none of them appear to be any better suited to handle the concerns of eCommerce websites.

Where there’s a gap, there’s opportunity

The purpose of the section above wasn’t to highlight deficiencies in the tools themselves, but to illustrate a gap in who they are being marketed to and developed for.

So far, most of your eCommerce competitors probably aren’t using tools like these because they are not marketed to by the platforms, and don’t know how to apply the technology to online retail in a way that would justify the expense.

The thing is, a tool is just a tool

The key concepts behind inbound marketing apply just as much to online retail as they do to lead generation.

In order to “do inbound marketing,” a marketing automation system isn’t even strictly necessary (in theory). They just help make the activities scalable for most businesses.

They also bring a lot of different marketing activities under one roof, which saves time and allows data to be moved and utilized between channels and systems. For example, what a customer is doing on social could influence the emails they receive, or content they see on your site. Here are some potential uses for most of the platforms above:

Automated marketing uses

  • Personalized abandoned cart emails
  • Post-purchase nurturing/reorder marketing
  • Welcome campaigns for newsletter (or other free offer) signups
  • Winback campaigns
  • Lead-nurturing email campaigns for cohorts and persona-based segments

Content marketing uses

  • Optimized, strategic blogging platforms, and frameworks
  • Landing pages for pre-transactional/educational offers or contests
  • Social media reporting, monitoring, and publishing
  • Personalization of content and user experience

Reporting uses

  • Revenue reporting (by segment or marketing action)
  • Attribution reporting (by campaign or content)

Assuming you don’t have the budget for a marketing automation system, but already have a good email marketing platform, you can still get started with inbound marketing. Eventually, however, you may want to graduate to a dedicated marketing automation solution to reap the full benefits.

Email marketing platforms

Most of the marketing automation systems claim to replace your email marketing platform, while many email marketing platforms claim to be marketing automation systems. Neither statement is completely accurate.

Marketing automation systems, especially those created specifically for the type of “inbound” campaigns described above, provide a powerful suite of tools all in one place. On the other hand, dedicated email platforms tend to offer “email marketing” features that are better, and more robust, than those offered by marketing automation systems. Some of them are also considerably cheaper—such as MailChimp—but those are often light on even the email-specific features for eCommerce.

A different type of campaign

Email “blasts” in the form of B.O.G.O., $10 off or free shipping offers can still be very successful in generating incremental revenue boosts — especially for existing customers and seasonal campaigns.

The conversion rate on a 20% off coupon sent to existing customers, for instance, would likely pulverize the conversion rate of an email going out to middle-of-funnel contacts with a link to content (at least with how CR is currently being calculated by email platforms).

Inbound marketing campaigns can also offer quick wins, but they tend to focus mostly on non-customers after the first segmentation campaign (a campaign for the purpose of segmenting your list, such as an incentivized survey). This means lower initial conversion rates, but long-term success with the growth of new customers.

Here’s a good bet if it works with your budget: rely on a marketing automation system for inbound marketing to drive new customer acquisition from initial visit to first purchase, while using a good email marketing platform to run your “promotional email” campaigns to existing customers.

If you have to choose one or the other, I’d go with a robust marketing automation system.

Some of the most popular email platforms used by eCommerce businesses, with a focus on how they handle various Inbound Marketing activities, include:

Bronto

bronto.jpeg

This platform builds in features like abandoned cart recovery, advanced email list segmentation and automated email workflows that nurture contacts over time.

They also offer a host of eCommerce-related features that you just don’t get with marketing automation systems like Hubspot and Marketo. This includes easy integration with a variety of eCommerce platforms like ATG, Demandware, Magento, Miva Merchant, Mozu and MarketLive, not to mention apps for coupons, product recommendations, social shopping and more. Integration with enterprise eCommerce platforms is one reason why Bronto is seen over and over again when browsing the Internet Retailer Top 500 reports.

On the other hand, Bronto—like the rest of these email platforms—doesn’t have many of the features that assist with content marketing outside of emails. As an “inbound” marketing automation system, it is incomplete because it focuses almost solely on one channel: email.

Vertical Response

verticalresponse.jpeg

Another juggernaut in eCommerce email marketing platforms, Vertical Response, has even fewer inbound-related features than Bronto, though it is a good email platform with a free version that includes up to 1,000 contacts and 4,000 emails per month (i.e. 4 emails to a full list of 1,000).

Oracle Marketing Cloud (OMC)

Responsys (the email platform), like Eloqua (the marketing automation system) was gobbled up by Oracle and is now part of their “Marketing Cloud.”

It has been my experience that when a big technology firm like IBM or Oracle buys a great product, it isn’t “great” for the users. Time will tell.

Listrak

listrak.jpeg

Out of the established email platforms for eCommerce, Listrak may do the best job at positioning themselves as a full inbound marketing platform.

Listrak’s value proposition is that they’re an “Omnichannel” solution. Everything is all in one “Single, Integrated Digital Marketing Platform for Retailers.” The homepage image promises solutions for Email, Mobile, Social, Web and In-Store channels.

I haven’t had the opportunity to work with Listrak yet, but would love to hear feedback in the comments on whether they could handle the kind of persona-based content marketing and automated email nurturing campaigns described in the example campaign above.

Key takeaways

Congratulations for making it this far! Here are a few things I hope you’ll take away from this post:

  • There is a lot of opportunity right now for eCommerce sites to take advantage of marketing automation systems and robust email marketing platforms as the infrastructure to run comprehensive inbound marketing campaigns.
  • There is a lot of opportunity right now for marketing automation systems to develop content and build in eCommerce-specific features to lure eCommerce marketers.
  • Inbound marketing isn’t email marketing, although email is an important piece to inbound because it allows you to begin forming lasting relationships with potential customers much earlier in the buying cycle.
  • To see the full benefits of inbound marketing, you should focus on getting the right content to the right person at the right time in their shopping journey. This necessarily involves several different channels, including search, social and email. One of the many benefits of marketing automation systems is their ability to track your efforts here across marketing channels, devices and touch-points.

Tools, resources, and further reading

There is a lot of great content on the topic of Inbound marketing, some of which has greatly informed my own understanding and approach. Here are a few resources you may find useful as well.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it

Get Unbeatable Insights into Local SEO: Buy the LocalUp Advanced Video Bundle

Posted by EricaMcGillivray

Missed LocalUp Advanced 2015? Forgot to take notes, or just want to relive the action? You can now purchase the bundle of 13 videos to get all the knowledge our speakers shared about local SEO. Dive deep and learn how to wrangle content, get reviews, overcome technical mobile challenges, and so much more from top industry leaders.

Moz and Local U are offering a super-special $99 deal for everyone—an unbeatable value for anyone looking to level up their local SEO skills.

Buy the LocalUp Advanced Video Bundle


Get a preview of what you’ll learn and hear what attendees had to say about how much they enjoyed it:

LocalUp Advanced 2015 Video Sales Promo


In addition to the videos, you also get the slide decks. Follow along and go back as you start implementing these tips into your strategy and work. You can watch these videos and download them to any device you use: desktop, laptop, tablet, and mobile.


Watch the following great talks and more:

Getting Local Keyword Research and On-page Optimization Right with Mary Bowling
Local keyword data is often difficult to find, analyze, and prioritize. Get tips, tools, and processes for zeroing in on the best terms to target when optimizing your website and directory listings, and learn how and why to structure your website around them.

Exposing the Non-Obvious Elements of Local Businesses That Dominate on the Web with Rand Fishkin
In some categories and geographies, a local small business wholly dominates the rankings and visibility across channels. What are the secrets to this success, and how can small businesses with remarkable products/services showcase their traits best online? In this presentation, Rand digs deep into examples and highlights the recurring elements that help the best of the best stand out.

Rand Fishkin


Local Content + Scale + Creativity = Awesome with Mike Ramsey
If you are wondering who is crushing it with local content and how you can scale such efforts, then tune in as Mike Ramsey walks through ideas, examples, and lessons he has learned along the way.

Mike Ramsey


Don’t Just Show Up, Stand Out with Dana DiTomaso
Learn how to destroy your competitors by bringing personality to your marketing. Confront the challenges of making HIPPOs comfortable with unique voice, keep brand standards while injecting some fun, and stay in the forefront of your audience’s mind.

Dana DiTomaso


Playing to Your Local Strengths with David Mihm
Historically, local search has been one of the most level playing fields on the web with smaller, nimbler businesses having an advantage as larger enterprises struggled to adapt and keep up. Today, companies of both sizes can benefit from tactics that the other simply can’t leverage. David shares some of the most valuable tactics that scale—and don’t scale—in a presentation packed with actionable takeaways, no matter what size business you work with.

David Mihm

Wondering if it’s truly “advanced?”

79 percent of attendees found the information perfectly advanced

Seventy-nine percent of attendees found the LocalUp Advanced presentations to be at just the right level.

Buy the LocalUp Advanced Video Bundle

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 3 years ago from tracking.feedpress.it