What’s an API? The marketer’s definitive guide is here!

I have a problem. At weddings, parties, or family gatherings, someone will generally ask what I do. “I’m a technical writer”, I reply. They look confused and then ask what that means. I find myself saying something about ‘documenting APIs’, at which point any audience I may have gathered has sidled away, keenly looking for their next canape.

 

Why is that? It’s probably because most people – certainly people who don’t work in any kind of IT-based business – don’t know what an API is. Moreover, most people don’t know what ‘API’ stands for.

API stands for ‘application programming interface’. Knowing the expansion doesn’t really help matters, though! Let’s try a very simple definition: an API is a set of functions and procedures that allows different computer systems to communicate with each other. That’s a bit better, but it still doesn’t really help you visualize how one works. In which case, let’s go back to that canape – because the idea of food and restaurants serves well as an analogy for an API.

The API – an analogy

It’s best to think of an API as a menu that you’re given in a restaurant. A menu lists all the dishes and drinks on offer, and you request something off it if you want it returned to you from the kitchen or bar. If you order something that’s not on the menu, then the kitchen won’t be able to make it and can’t return it.

Now, think of two separate computer systems. How is data exchanged between them?

Answer: via an API. An API lists operations that can be used by one system to request data from the other system’s database. As with the menu though, if you request something that the API doesn’t list then the other system won’t be able to respond with it.

However, unlike a restaurant menu, you can do more with APIs than just ‘return ordered food’ (data) from ‘the kitchen’ (the other system’s database). An API can also let you send the other system new data, update existing data, and delete data. (With a restaurant menu, you’ll be extremely hard-pressed to send any food you’ve brought along yourself into the kitchen, or to force them to throw their food away!)
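For the technically curious, the four menu operations above map neatly onto the HTTP methods most web APIs use. Here’s a minimal sketch of that request/response idea — the “kitchen” is just an in-memory dictionary standing in for another system’s database, not any real API:

```python
# The restaurant analogy in code: each request names a method and a
# "dish", and the "kitchen" (another system's data store) responds
# with a status code and a body -- just like an HTTP API.

kitchen = {"pizza": "margherita"}  # the other system's data

def handle(method, dish, payload=None):
    """Tiny request/response cycle: returns (status, body)."""
    if method == "GET":                 # read data
        if dish in kitchen:
            return 200, kitchen[dish]
        return 404, "not on the menu"   # asked for something unlisted
    if method == "POST":                # create new data
        kitchen[dish] = payload
        return 201, payload
    if method == "PUT":                 # update existing data
        kitchen[dish] = payload
        return 200, payload
    if method == "DELETE":              # delete data
        kitchen.pop(dish, None)
        return 204, None
    return 405, "unsupported method"

print(handle("GET", "pizza"))   # (200, 'margherita')
print(handle("GET", "sushi"))   # (404, 'not on the menu')
```

Requesting something that isn’t “on the menu” gets you a 404 — exactly the behavior the analogy describes.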


So, there you have it. APIs work on a request/response cycle, and they’re essentially the engine running under the internet’s hood, powering all the online data connectivity we’re constantly making use of.

For instance, APIs enable you to do all sorts of things, from ordering a pizza from your mobile phone using a food delivery app, to checking insurance deals on a price comparison site, to receiving a calendar notification that you’re due to check in for a flight. In each case, two or more systems make this possible through the exchange of data via APIs.

The difference between private and open APIs

APIs can be used in different ways to facilitate different things. Some APIs are private ones, used solely within a company by that company’s software engineers to communicate between many different services and systems that make up the company’s overall infrastructure.

An open API (often referred to as a public API) is one that has been made available by a company for external users to consume. A company with a public API will have purposely designed it to expose only a certain subset of the services its product offers (exposing everything would be detrimental and a security risk), and those services will be documented online for software developers to make use of.

The better an API is designed and documented, the quicker a visiting developer can get up and running and start communicating with another system to build effective integrations.

APIs expand businesses

Public APIs are mutually beneficial. External developers get to extend their system or product by consuming the services of another company’s API, whilst the company offering the API benefits from lots of developers writing code and integrations that can be made public and shared – which in turn expands its product and business.


Businesses are fast harnessing the money-making potential of APIs, exposing services and making data available to external audiences. This enables integration and the creation of new revenue streams. For some companies, the API is the product, such as omnichannel communications platform service Comapi (a dotdigital company).

Why the dotmailer API benefits you – the marketer

dotmailer offers a powerful, flexible open API. As such, internal and external users consume our API for various reasons. It powers the premium eCommerce and CRM integrations that we offer, like Magento, MS Dynamics and Shopify Plus. It also allows partners and customers to develop and build their own custom integrations and technical solutions for the platform.

How does this benefit you? It means you can point your developers to our API documentation so they can start making some of your keenest marketing automation wishes come true! It enables them to quickly get to grips with our API and create code that not only gets data out of dotmailer, but gets your data in too – as well as automate various actions crucial to smarter marketing.

You’re no longer bound by the user interface of the app itself.

Find out more about how to use the dotmailer API by visiting our dedicated API support page.


What can I do to benefit from the API?

Let me provide you with a few common scenarios in which the API helps with custom marketing automation:

  • Import new site signups in real time: our API has several calls that ensure new signups are not only added to your CRM, but also added to dotmailer as contacts, so they can then be sent an automated welcome.
  • Import order data from your store so it can be used to send better-targeted, personalized content: our API has a number of transactional data calls that let you not only import historical order data but also keep that data up to date as new purchases are made and orders are updated. Once this data is in, you can go on to create contact segments and, if you have it enabled on your account, use advanced personalization in content.
  • Export contacts’ email engagement data: our API features numerous calls that can export contacts’ key engagement data with your campaigns into your CRM, allowing you to create marketing lists and other actions to improve relevant targeting.
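To give developers a feel for what “importing a signup via an API” looks like, here’s a hedged sketch that builds (but doesn’t send) an HTTP POST request with Python’s standard library. The URL and field names are invented placeholders, not the real dotmailer API — your developers should consult the actual API documentation for endpoints and payloads:

```python
import json
import urllib.request

# Hypothetical payload for a new signup. The field names here are
# illustrative only -- check the real API docs for the actual schema.
signup = {"email": "jane@example.com", "optInType": "Single"}

# Hypothetical endpoint (placeholder domain, not a real API URL).
req = urllib.request.Request(
    "https://api.example.com/v2/contacts",
    data=json.dumps(signup).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",  # creating new data = POST, per the request/response cycle
)

print(req.get_method(), req.full_url)  # POST https://api.example.com/v2/contacts
```

Sending the request (with `urllib.request.urlopen(req)`) would complete the cycle: the other system receives the data and responds with a status code and body.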

Get more automation tips and tricks from our free resources.

 

Hopefully the concept of an API is a lot clearer now, and you understand the benefits. In which case, back to that analogy before I leave you…

Feeling hungry for integration? Take a seat at the restaurant, bring along your developers, hand them that menu and put in your request.

In the meantime, I’ll wish you all ‘Bon API!’

See the dotmailer API in action: watch a super-quick demo.

The post What’s an API? The marketer’s definitive guide is here! appeared first on The Marketing Automation Blog.

Reblogged 1 month ago from blog.dotmailer.com

Follow the Local SEO Leaders: A Guide to Our Industry’s Best Publications

Posted by MiriamEllis

Change is the only constant in local SEO. As your local brand or local search marketing agency grows, you’ll be onboarding new hires. Whether they’re novices or adepts, they’ll need to keep up with continuous industry developments in order to make agile contributions to team strategy. Particularly if local SEO is new to someone, it saves training time if you can fast-track them on who to follow for the best news and analysis. This guide serves as a blueprint for that very purpose.

And even if you’re an old hand in the local SEM industry, you may find some sources here you’ve been overlooking that could add richness and depth to your ongoing education.

Two quick notes on what and how I’ve chosen:

  1. As the author of both of Moz’s newsletters (the Moz Top 10 and the Moz Local Top 7), I read an inordinate amount of SEO and local SEO content, but I could have missed your work. The list that follows represents my own, personal slate of the resources that have taught me the most. If you publish great local SEO information but you’re not on this list, my apologies, and if you write something truly awesome in future, you’re welcome to tweet at me. I’m always on the lookout for fresh and enlightening voices. My personal criteria for the publications I trust are that they’re typically groundbreaking, thoughtful, investigative, and respectful of readers and subjects.
  2. Following the leaders is a useful practice, but not a stopping point. Even experts aren’t infallible. Rather than take industry advice at face value, do your own testing. Some of the most interesting local SEO discussions I’ve ever participated in have stemmed from people questioning standard best practices. So, while it’s smart to absorb the wisdom of experts, it’s even smarter to do your own experiments.

The best of local SEO news

Who reports fastest on Google updates, Knowledge Panel tweaks, and industry business?

Sterling Sky’s Timeline of Local SEO Changes is the industry’s premiere log of developments that impact local businesses and is continuously updated by Joy Hawkins + team.

Search Engine Roundtable has a proven track record of being among the first to report news that affects both local and digital businesses, thanks to the ongoing dedication of Barry Schwartz.

Street Fight is the best place on the web to read about mergers, acquisitions, the release of new technology, and other major happenings on the business side of local. I’m categorizing Street Fight under news, but they also offer good commentary, particularly the joint contributions of David Mihm and Mike Blumenthal.

LocalU’s Last Week in Local video and podcast series highlights Mike Blumenthal and Mary Bowling’s top picks of industry coverage most worthy of your attention. Comes with the bonus of expert commentary as they share their list.

TechCrunch also keeps a finger on the pulse of technology and business dealings that point to the future of local.

Search Engine Land’s local category is consistently swift in getting the word out about breaking industry news, with the help of multiple authors.

Adweek is a good source for reportage on retail and brand news, but there’s a limit to the number of articles you can read without a subscription. I often find them covering quirky stories that are absent from other publications I read.

The SEMPost’s local tab is another good place to check for local developments, chiefly covered by Jennifer Slegg.

Search Engine Journal’s local column also gets my vote for speedy delivery of breaking local stories.

Google’s main blog and the ThinkWithGoogle blog are musts to keep tabs on the search engine’s own developments, bearing in mind, of course, that these publications can be highly promotional of their products and worldview.

The best of local search marketing analysis

Who can you trust most to analyze the present and predict the future?

LocalU’s Deep Dive video series features what I consider to be our industry’s most consistently insightful analysis of a variety of local marketing topics, discussed by learned faculty and guests.

The Moz Blog’s local category hosts a slate of gifted bloggers and professional editorial standards that result in truly in-depth treatment of local topics, presented with care and attention. As a veteran contributor to this publication, I can attest to how Moz inspires authors to aim high, and one of the nicest things that happened to our team in 2018 was being voted the #2 local SEO blog by BrightLocal’s survey respondents.

The Local Search Association’s Insider blog is one I turn to again and again, particularly for their excellent studies and quotable statistics.

Mike Blumenthal’s blog has earned a place of honor over many years as a key destination for breaking local developments and one-of-a-kind analysis. When Blumenthal talks, local people listen. One of the things I’ve prized for well over a decade in Mike’s writing is his ability to see things from a small business perspective, as opposed to simply standing in awe of big business and technology.

BrightLocal’s surveys and studies are some of the industry’s most cited and I look eagerly forward to their annual publication.

Whitespark’s blog doesn’t publish as frequently as I wish it did, but their posts by Darren Shaw and crew are always on extremely relevant topics and of high quality.

Sterling Sky’s blog is a relative newcomer, but the expertise Joy Hawkins and Colan Nielsen bring to their agency’s publication is making it a go-to resource for advice on some of the toughest aspects of local SEO.

Local Visibility System’s blog continues to please, with the thoughtful voice of Phil Rozek exploring themes you likely encounter in your day-to-day work as a local SEO.

The Local Search Forum is, hands down, the best free forum on the web to take your local mysteries and musings to. Founded by Linda Buquet, the ethos of the platform is approachable, friendly, and often fun, and high-level local SEOs frequently weigh in on hot topics.

Pro tip: In addition to the above tried-and-true resources, I frequently scan the online versions of city newspapers across the country for interesting local stories that add perspective to my vision of the challenges and successes of local businesses. Sometimes, too, publications like The Atlantic, Forbes, or Business Insider will publish pieces of a high journalistic quality with relevance to our industry. Check them out!

The best for specific local marketing disciplines

Here, I’ll break this down by subject or industry for easy scanning:

Reviews

  • GetFiveStars can’t be beat for insight into online reputation management, with Aaron Weiche and team delivering amazing case studies and memorable statistics. I literally have a document of quotes from their work that I refer to on a regular basis in my own writing.
  • Grade.us is my other ORM favorite for bright and lively coverage from authors like Garrett Sussman and Andrew McDermott.

Email marketing

  • Tidings’ vault contains a tiny but growing treasure trove of email marketing wisdom from David Mihm, whose former glory days spent in the trenches of local SEO make him especially attuned to our industry.

SABs

  • Tom Waddington’s blog is the must-read publication for service area businesses whose livelihoods are being impacted by Google’s Local Service Ads program in an increasing number of categories and cities.

Automotive marketing

  • DealerOn’s blog is the real deal when it comes to automotive local SEO, with Greg Gifford teaching memorable lessons in an enjoyable way.

Legal marketing

  • JurisDigital brings the educated voices of Casey Meraz and team to the highly specialized field of attorney marketing.

Hospitality marketing

Independent businesses

Link building

  • Nifty Marketing’s blog has earned my trust for its nifty local link building ideas and case studies.
  • ZipSprout belongs here, too, because of their focus on local sponsorships, which are a favorite local link building methodology. Check them out for blog posts and podcasts.

Schema + other markup

  • Touchpoint Digital Marketing doesn’t publish much on their own website, but look anywhere you can for David Deering’s writings on markup. LocalU and Moz are good places to search for his expertise.

Patents

  • SEO by the Sea has offered years of matchless analysis of Google patents that frequently impact local businesses or point to possible future developments.

Best local search industry newsletters

Get the latest news and tips delivered right to your inbox by signing up for these fine free newsletters:

Follow the local SEO leaders on Twitter

What an easy way to track what industry adepts are thinking and sharing, up-to-the-minute! Following this list of professionals (alphabetized by first name) will fill up your social calendar with juicy local tidbits. Keep in mind that many of these folks either own or work for agencies or publishers you can follow, too.

Aaron Weiche
Adam Dorfman
Andrew Shotland
Ben Fisher
Bernadette Coleman
Bill Slawski
Brian Barwig
Carrie Hill
Casey Meraz
Cindy Krum
Colan Nielsen
DJ Baxter
Dan Leibson
Dana DiTomaso
Dani Owens
Darren Shaw
Dave DiGreggorio
David Mihm
Don Campbell
Garrett Sussman
Glenn Gabe
Greg Gifford
Greg Sterling
Jennifer Slegg
Joel Headley
Joy Hawkins
Mary Bowling
Mike Blumenthal
Mike Ramsey
Miriam Ellis
Phil Rozek
Sherry Bonelli
Thibault Adda
Tim Capper
Tom Waddington

Share what you learn

How about your voice? How do you get it heard in the local SEO industry? The answer is simple: share what you learn with others. Each of the people and publications on my list has earned a place there because, at one time or another, they have taught me something they learned from their own work. Some tips:

  • Our industry has become a sizeable niche, but there is always room for new, interesting voices
  • Experiment and publish — consistent publication of your findings is the best way I know of to become a trusted source of information
  • Don’t be afraid of making mistakes, so long as you are willing to own them
  • Socialize — attend events, amplify the work of colleagues you admire, reach out in real ways to others to share your common work interest while also respecting busy schedules

Local SEO is a little bit like jazz, in which we’re all riffing off the same chord progressions created by Google, Facebook, Yelp, other major platforms, and the needs of clients. Mike Blumenthal plays a note about a jeweler whose WOMM is driving the majority of her customers. You take that note and turn it around for someone in the auto industry, yielding an unexpected insight. Someone else takes your insight and creates a print handout to bolster a loyalty program.

Everyone ends up learning in this virtuous, democratic cycle, so go ahead — start sharing! A zest for contribution is a step towards leadership and your observations could be music to the industry’s ears.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 2 months ago from tracking.feedpress.it

Rewriting the Beginner’s Guide to SEO, Chapter 2: Crawling, Indexing, and Ranking

Posted by BritneyMuller

It’s been a few months since our last share of our work-in-progress rewrite of the Beginner’s Guide to SEO, but after a brief hiatus, we’re back to share our draft of Chapter Two with you! This wouldn’t have been possible without the help of Kameron Jenkins, who has thoughtfully contributed her great talent for wordsmithing throughout this piece.

This is your resource, the guide that likely kicked off your interest in and knowledge of SEO, and we want to do right by you. You left amazingly helpful commentary on our outline and draft of Chapter One, and we’d be honored if you would take the time to let us know what you think of Chapter Two in the comments below.


Chapter 2: How Search Engines Work – Crawling, Indexing, and Ranking

First, show up.

As we mentioned in Chapter 1, search engines are answer machines. They exist to discover, understand, and organize the internet’s content in order to offer the most relevant results to the questions searchers are asking.

In order to show up in search results, your content needs to first be visible to search engines. It’s arguably the most important piece of the SEO puzzle: If your site can’t be found, there’s no way you’ll ever show up in the SERPs (Search Engine Results Page).

How do search engines work?

Search engines have three primary functions:

  1. Crawl: Scour the Internet for content, looking over the code/content for each URL they find.
  2. Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
  3. Rank: Provide the pieces of content that will best answer a searcher’s query. Order the search results by the most helpful to a particular query.

What is search engine crawling?

Crawling is the discovery process in which search engines send out a team of robots (known as crawlers or spiders) to find new and updated content. Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered via links.

The bot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, crawlers are able to find new content and add it to their index — a massive database of discovered URLs — to later be retrieved when a searcher is seeking information that the content on that URL is a good match for.
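To make the “hopping along links” idea concrete, here’s a minimal sketch of link discovery in Python. This illustrates the principle only — it is not how Googlebot is actually implemented:

```python
# Link discovery: parse one page's HTML, collect every <a href="...">,
# and resolve relative links against the page's URL. A real crawler
# would fetch each discovered URL and repeat.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def discover(base_url, html):
    """Return the absolute URLs linked from one page."""
    parser = LinkParser()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

page = '<a href="/about">About</a> <a href="https://other.example/">Other</a>'
print(discover("https://example.com/", page))
# ['https://example.com/about', 'https://other.example/']
```

Notice that a page with no inbound links would never be passed to `discover()` in the first place — which is exactly why unlinked pages are “as good as invisible” to crawlers.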

What is a search engine index?

Search engines process and store information they find in an index, a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.

Search engine ranking

When someone performs a search, search engines scour their index for highly relevant content and then order that content in the hopes of solving the searcher’s query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.

It’s possible to block search engine crawlers from part or all of your site, or instruct search engines to avoid storing certain pages in their index. While there can be reasons for doing this, if you want your content found by searchers, you have to first make sure it’s accessible to crawlers and is indexable. Otherwise, it’s as good as invisible.

By the end of this chapter, you’ll have the context you need to work with the search engine, rather than against it!

Note: In SEO, not all search engines are equal

Many beginners wonder about the relative importance of particular search engines. Most people know that Google has the largest market share, but how important is it to optimize for Bing, Yahoo, and others? The truth is that despite the existence of more than 30 major web search engines, the SEO community really only pays attention to Google. Why? The short answer is that Google is where the vast majority of people search the web. If we include Google Images, Google Maps, and YouTube (a Google property), more than 90% of web searches happen on Google — that’s nearly 20 times Bing and Yahoo combined.

Crawling: Can search engines find your site?

As you’ve just learned, making sure your site gets crawled and indexed is a prerequisite for showing up in the SERPs. First things first: you can check how many (and which) pages of your website have been indexed by Google using “site:yourdomain.com”, an advanced search operator.

Head to Google and type “site:yourdomain.com” into the search bar. This will return results Google has in its index for the site specified:

The number of results Google displays (see “About __ results” above) isn’t exact, but it does give you a solid idea of which pages are indexed on your site and how they are currently showing up in search results.

For more accurate results, monitor and use the Index Coverage report in Google Search Console. You can sign up for a free Google Search Console account if you don’t currently have one. With this tool, you can submit sitemaps for your site and monitor how many submitted pages have actually been added to Google’s index, among other things.

If you’re not showing up anywhere in the search results, there are a few possible reasons why:

  • Your site is brand new and hasn’t been crawled yet.
  • Your site isn’t linked to from any external websites.
  • Your site’s navigation makes it hard for a robot to crawl it effectively.
  • Your site contains some basic code called crawler directives that is blocking search engines.
  • Your site has been penalized by Google for spammy tactics.

If your site doesn’t have any other sites linking to it, you still might be able to get it indexed by submitting your XML sitemap in Google Search Console or manually submitting individual URLs to Google. There’s no guarantee they’ll include a submitted URL in their index, but it’s worth a try!

Can search engines see your whole site?

Sometimes a search engine will be able to find parts of your site by crawling, but other pages or sections might be obscured for one reason or another. It’s important to make sure that search engines are able to discover all the content you want indexed, and not just your homepage.

Ask yourself this: Can the bot crawl through your website, and not just to it?

Is your content hidden behind login forms?

If you require users to log in, fill out forms, or answer surveys before accessing certain content, search engines won’t see those protected pages. A crawler is definitely not going to log in.

Are you relying on search forms?

Robots cannot use search forms. Some site owners believe that placing a search box on their site lets search engines find everything their visitors search for; it doesn’t.

Is text hidden within non-text content?

Non-text media forms (images, video, GIFs, etc.) should not be used to display text that you wish to be indexed. While search engines are getting better at recognizing images, there’s no guarantee they can read and understand text inside them just yet. It’s always best to add text within the HTML markup of your webpage.

Can search engines follow your site navigation?

Just as a crawler needs to discover your site via links from other sites, it needs a path of links on your own site to guide it from page to page. If you’ve got a page you want search engines to find but it isn’t linked to from any other pages, it’s as good as invisible. Many sites make the critical mistake of structuring their navigation in ways that are inaccessible to search engines, hindering their ability to get listed in search results.

Common navigation mistakes that can keep crawlers from seeing all of your site:

  • Having a mobile navigation that shows different results than your desktop navigation
  • Any type of navigation where the menu items are not in the HTML, such as JavaScript-powered navigations. Google has gotten much better at crawling and understanding JavaScript, but it’s still not a perfect process. The most reliable way to ensure something gets found, understood, and indexed by Google is to put it in the HTML.
  • Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler
  • Forgetting to link to a primary page on your website through your navigation — remember, links are the paths crawlers follow to new pages!

This is why it’s essential that your website has a clear navigation and helpful URL folder structures.

Information architecture

Information architecture is the practice of organizing and labeling content on a website to improve efficiency and findability for users. The best information architecture is intuitive, meaning that users shouldn’t have to think very hard to flow through your website or to find something.

Your site should also have a useful 404 (page not found) page for when a visitor clicks on a dead link or mistypes a URL. The best 404 pages allow users to click back into your site so they don’t bounce off just because they tried to access a nonexistent link.

Tell search engines how to crawl your site

In addition to making sure crawlers can reach your most important pages, it’s also pertinent to note that you’ll have pages on your site you don’t want them to find. These might include things like old URLs that have thin content, duplicate URLs (such as sort-and-filter parameters for e-commerce), special promo code pages, staging or test pages, and so on.

Blocking pages from search engines can also help crawlers prioritize your most important pages and maximize your crawl budget (the average number of pages a search engine bot will crawl on your site).

Crawler directives allow you to control what you want Googlebot to crawl and index using a robots.txt file, meta tag, sitemap.xml file, or Google Search Console.

Robots.txt

Robots.txt files are located in the root directory of websites (ex. yourdomain.com/robots.txt) and suggest which parts of your site search engines should and shouldn’t crawl via specific robots.txt directives. This is a great solution when trying to block search engines from non-private pages on your site.

You wouldn’t want to block private/sensitive pages from being crawled here because the file is easily accessible by users and bots.
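As a rough sketch, a robots.txt file is just a short plain-text file of directives. The paths below are invented for illustration — the right rules depend entirely on your site:

```
# Applies to every crawler
User-agent: *
Disallow: /staging/      # keep test pages out of search
Disallow: /promo-codes/  # hypothetical special promo pages

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Remember that these are suggestions about crawling, not security: anyone can read this file at yourdomain.com/robots.txt, which is why sensitive pages don’t belong here.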

Pro tip:

  • If Googlebot can’t find a robots.txt file for a site (40X HTTP status code), it proceeds to crawl the site.
  • If Googlebot finds a robots.txt file for a site (20X HTTP status code), it will usually abide by the suggestions and proceed to crawl the site.
  • If Googlebot finds neither a 20X nor a 40X HTTP status code (ex. a 501 server error), it can’t determine whether you have a robots.txt file and won’t crawl your site.

Meta directives

The two types of meta directives are the meta robots tag (more commonly used) and the x-robots-tag. Each provides crawlers with stronger instructions on how to crawl and index a URL’s content.

The x-robots-tag provides more flexibility and functionality if you want to block search engines at scale, because you can use regular expressions, block non-HTML files, and apply sitewide noindex tags.

These are the best options for blocking more sensitive*/private URLs from search engines.

*For very sensitive URLs, it is best practice to remove them from the site entirely or to require a secure login to view the pages.
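For example, a page-level noindex can be expressed either as a meta robots tag in the page’s HTML or as an x-robots-tag HTTP header (the values shown are illustrative):

```html
<!-- Option 1: meta robots tag, placed in the page's <head> -->
<meta name="robots" content="noindex, nofollow">

<!-- Option 2: the x-robots-tag is sent as an HTTP response header
     instead, which also works for non-HTML files such as PDFs:

     X-Robots-Tag: noindex
-->
```

Note that for either directive to be seen, the crawler must be able to fetch the page — a URL blocked in robots.txt won’t have its noindex read.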

WordPress Tip: In Dashboard > Settings > Reading, make sure the “Search Engine Visibility” box is not checked. When checked, it blocks search engines from your site via your robots.txt file!

Avoid these common pitfalls, and you’ll have clean, crawlable content that will allow bots easy access to your pages.


Sitemaps

A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. One of the easiest ways to ensure Google is finding your highest priority pages is to create a file that meets Google’s standards and submit it through Google Search Console. While submitting a sitemap doesn’t replace the need for good site navigation, it can certainly help crawlers follow a path to all of your important pages.
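A minimal sitemap follows the sitemaps.org XML protocol. This sketch lists a single URL — the domain and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-01-15</lastmod>
  </url>
</urlset>
```

Each page you want discovered gets its own `<url>` entry; once the file is live, you can submit its location through Google Search Console (and reference it in robots.txt).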

Google Search Console

Some sites (most common with e-commerce) make the same content available on multiple different URLs by appending certain parameters to URLs. If you’ve ever shopped online, you’ve likely narrowed down your search via filters. For example, you may search for “shoes” on Amazon, and then refine your search by size, color, and style. Each time you refine, the URL changes slightly. How does Google know which version of the URL to serve to searchers? Google does a pretty good job at figuring out the representative URL on its own, but you can use the URL Parameters feature in Google Search Console to tell Google exactly how you want them to treat your pages.

Indexing: How do search engines understand and remember your site?

Once you’ve ensured your site has been crawled, the next order of business is to make sure it can be indexed. That’s right — just because your site can be discovered and crawled by a search engine doesn’t necessarily mean that it will be stored in their index. In the previous section on crawling, we discussed how search engines discover your web pages. The index is where your discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page’s contents. All of that information is stored in its index.

Read on to learn about how indexing works and how you can make sure your site makes it into this all-important database.

Can I see how a Googlebot crawler sees my pages?

Yes. The cached version of your page reflects a snapshot of the last time Googlebot crawled it.

Google crawls and caches web pages at different frequencies. More established, well-known sites that post frequently like https://www.nytimes.com will be crawled more frequently than the much-less-famous website for Roger the Mozbot’s side hustle, http://www.rogerlovescupcakes.com (if only it were real…)

You can view what your cached version of a page looks like by clicking the drop-down arrow next to the URL in the SERP and choosing “Cached”:

You can also view the text-only version of your site to determine if your important content is being crawled and cached effectively.

Are pages ever removed from the index?

Yes, pages can be removed from the index! Some of the main reasons why a URL might be removed include:

  • The URL is returning a “not found” error (4XX) or server error (5XX) – This could be accidental (the page was moved and a 301 redirect was not set up) or intentional (the page was deleted and 404ed in order to get it removed from the index)
  • The URL had a noindex meta tag added – This tag can be added by site owners to instruct the search engine to omit the page from its index.
  • The URL has been manually penalized for violating the search engine’s Webmaster Guidelines and, as a result, was removed from the index.
  • The URL has been blocked from crawling with the addition of a password required before visitors can access the page.
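As a quick diagnostic sketch (not a Google tool, just an illustration of the first three causes above), you could inspect a page's HTTP status code and markup like this:

```python
import re

def index_removal_reason(status_code, html):
    """Map a URL's HTTP status and markup to a likely reason it left the index."""
    if 400 <= status_code < 500:
        return "4XX error (page missing; set up a 301 if it moved)"
    if 500 <= status_code < 600:
        return "5XX server error"
    # Look for a robots meta tag containing a noindex directive.
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I):
        return "noindex meta tag"
    return "no on-page cause found (check manual penalties or password gating)"

reason = index_removal_reason(200, '<meta name="robots" content="noindex,follow">')
# -> "noindex meta tag"
```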

If you believe that a page on your website that was previously in Google’s index is no longer showing up, you can manually submit the URL to Google by navigating to the “Submit URL” tool in Search Console.

Ranking: How do search engines rank URLs?

How do search engines ensure that when someone types a query into the search bar, they get relevant results in return? That process is known as ranking, or the ordering of search results by most relevant to least relevant to a particular query.

To determine relevance, search engines use algorithms: processes or formulas by which stored information is retrieved and ordered in meaningful ways. These algorithms have gone through many changes over the years in order to improve the quality of search results. Google, for example, makes algorithm adjustments every day — some of these updates are minor quality tweaks, whereas others are core/broad algorithm updates deployed to tackle a specific issue, like Penguin to tackle link spam. Check out our Google Algorithm Change History for a list of both confirmed and unconfirmed Google updates going back to the year 2000.

Why does the algorithm change so often? Is Google just trying to keep us on our toes? While Google doesn’t always reveal specifics as to why they do what they do, we do know that Google’s aim when making algorithm adjustments is to improve overall search quality. That’s why, in response to algorithm update questions, Google will answer with something along the lines of: “We’re making quality updates all the time.” This means that if your site suffered after an algorithm adjustment, you should compare it against Google’s Quality Guidelines or Search Quality Rater Guidelines, both of which are very telling in terms of what search engines want.

What do search engines want?

Search engines have always wanted the same thing: to provide useful answers to searchers’ questions in the most helpful formats. If that’s true, then why does it appear that SEO is different now than in years past?

Think about it in terms of someone learning a new language.

At first, their understanding of the language is very rudimentary — “See Spot Run.” Over time, their understanding starts to deepen, and they learn semantics — the meaning behind language and the relationships between words and phrases. Eventually, with enough practice, the student knows the language well enough to understand nuance, and is able to provide answers to even vague or incomplete questions.

When search engines were just beginning to learn our language, it was much easier to game the system by using tricks and tactics that actually go against quality guidelines. Take keyword stuffing, for example. If you wanted to rank for a particular keyword like “funny jokes,” you might add the words “funny jokes” a bunch of times onto your page, and make it bold, in hopes of boosting your ranking for that term:

Welcome to funny jokes! We tell the funniest jokes in the world. Funny jokes are fun and crazy. Your funny joke awaits. Sit back and read funny jokes because funny jokes can make you happy and funnier. Some funny favorite funny jokes.

This tactic made for terrible user experiences, and instead of laughing at funny jokes, people were bombarded by annoying, hard-to-read text. It may have worked in the past, but this is never what search engines wanted.

The role links play in SEO

When we talk about links, we could mean two things. Backlinks or “inbound links” are links from other websites that point to your website, while internal links are links on your own site that point to your other pages (on the same site).

Links have historically played a big role in SEO. Very early on, search engines needed help figuring out which URLs were more trustworthy than others to help them determine how to rank search results. Calculating the number of links pointing to any given site helped them do this.

Backlinks work very similarly to real life WOM (Word-Of-Mouth) referrals. Let’s take a hypothetical coffee shop, Jenny’s Coffee, as an example:

  • Referrals from others = good sign of authority
    Example: Many different people have all told you that Jenny’s Coffee is the best in town
  • Referrals from yourself = biased, so not a good sign of authority
    Example: Jenny claims that Jenny’s Coffee is the best in town
  • Referrals from irrelevant or low-quality sources = not a good sign of authority and could even get you flagged for spam
    Example: Jenny paid to have people who have never visited her coffee shop tell others how good it is.
  • No referrals = unclear authority
    Example: Jenny’s Coffee might be good, but you’ve been unable to find anyone who has an opinion so you can’t be sure.

This is why PageRank was created. PageRank (part of Google’s core algorithm) is a link analysis algorithm named after one of Google’s founders, Larry Page. PageRank estimates the importance of a web page by measuring the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will have earned.
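The core idea can be sketched in a few lines. This toy version (the real algorithm has many refinements Google doesn't disclose) spreads each page's score across its outbound links and iterates until the scores settle:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    `links` maps each page to the pages it links to; every page must
    have at least one outbound link in this simplified version.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Everyone starts with the baseline (1 - damping) share...
        new = {p: (1 - damping) / len(pages) for p in pages}
        # ...then each page passes its current score to the pages it links to.
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

# A earns links from both B and C; C earns none.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["A"]})
```

In this three-page example, A ends up with the highest score because two pages vouch for it — the "referrals from others" signal in miniature.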

The more natural backlinks you have from high-authority (trusted) websites, the better your odds are to rank higher within search results.

The role content plays in SEO

There would be no point to links if they didn’t direct searchers to something. That something is content! Content is more than just words; it’s anything meant to be consumed by searchers — there’s video content, image content, and of course, text. If search engines are answer machines, content is the means by which the engines deliver those answers.

Any time someone performs a search, there are thousands of possible results, so how do search engines decide which pages the searcher is going to find valuable? A big part of determining where your page will rank for a given query is how well the content on your page matches the query’s intent. In other words, does this page match the words that were searched and help fulfill the task the searcher was trying to accomplish?

Because of this focus on user satisfaction and task accomplishment, there are no strict benchmarks on how long your content should be, how many times it should contain a keyword, or what you put in your header tags. All of those can play a role in how well a page performs in search, but the focus should be on the users who will be reading the content.

Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher’s intent), and RankBrain.

What is RankBrain?

RankBrain is the machine learning component of Google’s core algorithm. Machine learning is a computer program that continues to improve its predictions over time through new observations and training data. In other words, it’s always learning, and because it’s always learning, search results should be constantly improving.

For example, if RankBrain notices a lower ranking URL providing a better result to users than the higher ranking URLs, you can bet that RankBrain will adjust those results, moving the more relevant result higher and demoting the less relevant pages as a byproduct.

Like most things with the search engine, we don’t know exactly what comprises RankBrain, but apparently, neither do the folks at Google.

What does this mean for SEOs?

Because Google will continue leveraging RankBrain to promote the most relevant, helpful content, we need to focus on fulfilling searcher intent more than ever before. Provide the best possible information and experience for searchers who might land on your page, and you’ve taken a big first step to performing well in a RankBrain world.

Engagement metrics: correlation, causation, or both?

With Google rankings, engagement metrics are most likely part correlation and part causation.

When we say engagement metrics, we mean data that represents how searchers interact with your site from search results. This includes things like:

  • Clicks (visits from search)
  • Time on page (amount of time the visitor spent on a page before leaving it)
  • Bounce rate (the percentage of all website sessions where users viewed only one page)
  • Pogo-sticking (clicking on an organic result and then quickly returning to the SERP to choose another result)
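For instance, bounce rate is simple to compute from session data. A minimal sketch, assuming each session is summarized by the number of pages viewed:

```python
def bounce_rate(sessions):
    """Fraction of sessions where the visitor viewed exactly one page."""
    bounced = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return bounced / len(sessions)

# Each number is how many pages one session viewed (illustrative data):
rate = bounce_rate([1, 3, 1, 5, 2])  # 2 of 5 sessions bounced -> 0.4
```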

Many tests, including Moz’s own ranking factor survey, have indicated that engagement metrics correlate with higher ranking, but causation has been hotly debated. Are good engagement metrics just indicative of highly ranked sites? Or are sites ranked highly because they possess good engagement metrics?

What Google has said

While they’ve never used the term “direct ranking signal,” Google has been clear that they absolutely use click data to modify the SERP for particular queries.

According to Google’s former Chief of Search Quality, Udi Manber:

“The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we’ll switch it.”

Another comment from former Google engineer Edmond Lau corroborates this:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. The actual mechanics of how click data is used is often proprietary, but Google makes it obvious that it uses click data with its patents on systems like rank-adjusted content items.”

Because Google needs to maintain and improve search quality, it seems inevitable that engagement metrics are more than correlation, but it would appear that Google falls short of calling engagement metrics a “ranking signal” because those metrics are used to improve search quality, and the rank of individual URLs is just a byproduct of that.

What tests have confirmed

Various tests have confirmed that Google will adjust SERP order in response to searcher engagement:

  • Rand Fishkin’s 2014 test resulted in a #7 result moving up to the #1 spot after getting around 200 people to click on the URL from the SERP. Interestingly, ranking improvement seemed to be isolated to the location of the people who visited the link. The rank position spiked in the US, where many participants were located, whereas it remained lower on the page in Google Canada, Google Australia, etc.
  • Larry Kim’s comparison of top pages and their average dwell time pre- and post-RankBrain seemed to indicate that the machine-learning component of Google’s algorithm demotes the rank position of pages that people don’t spend as much time on.
  • Darren Shaw’s testing has shown user behavior’s impact on local search and map pack results as well.

Since user engagement metrics are clearly used to adjust the SERPs for quality, and rank position changes as a byproduct, it’s safe to say that SEOs should optimize for engagement. Engagement doesn’t change the objective quality of your web page, but rather your value to searchers relative to other results for that query. That’s why, after no changes to your page or its backlinks, it could decline in rankings if searchers’ behavior indicates they like other pages better.

In terms of ranking web pages, engagement metrics act like a fact-checker. Objective factors such as links and content first rank the page, then engagement metrics help Google adjust if they didn’t get it right.

The evolution of search results

Back when search engines lacked a lot of the sophistication they have today, the term “10 blue links” was coined to describe the flat structure of the SERP. Any time a search was performed, Google would return a page with 10 organic results, each in the same format.

In this search landscape, holding the #1 spot was the holy grail of SEO. But then something happened. Google began adding results in new formats on their search result pages, called SERP features. Some of these SERP features include:

  • Paid advertisements
  • Featured snippets
  • People Also Ask boxes
  • Local (map) pack
  • Knowledge panel
  • Sitelinks

And Google is adding new ones all the time. It even experimented with “zero-result SERPs,” a phenomenon where only one result from the Knowledge Graph was displayed on the SERP with no results below it except for an option to “view more results.”

The addition of these features caused some initial panic for two main reasons. For one, many of these features caused organic results to be pushed down further on the SERP. Another byproduct is that fewer searchers are clicking on the organic results since more queries are being answered on the SERP itself.

So why would Google do this? It all goes back to the search experience. User behavior indicates that some queries are better satisfied by different content formats. Notice how the different types of SERP features match the different types of query intents.

Query intent → Possible SERP feature triggered

  • Informational → Featured Snippet
  • Informational with one answer → Knowledge Graph / Instant Answer
  • Local → Map Pack
  • Transactional → Shopping

We’ll talk more about intent in Chapter 3, but for now, it’s important to know that answers can be delivered to searchers in a wide array of formats, and how you structure your content can impact the format in which it appears in search.

Localized search

A search engine like Google has its own proprietary index of local business listings, from which it creates local search results.

If you are performing local SEO work for a business that has a physical location customers can visit (ex: dentist) or for a business that travels to visit their customers (ex: plumber), make sure that you claim, verify, and optimize a free Google My Business Listing.

When it comes to localized search results, Google uses three main factors to determine ranking:

  1. Relevance
  2. Distance
  3. Prominence

Relevance

Relevance is how well a local business matches what the searcher is looking for. To ensure that the business is doing everything it can to be relevant to searchers, make sure the business’ information is thoroughly and accurately filled out.

Distance

Google uses your geo-location to better serve you local results. Local search results are extremely sensitive to proximity, which refers to the location of the searcher and/or the location specified in the query (if the searcher included one).

Organic search results are sensitive to a searcher’s location, though seldom as pronounced as in local pack results.

Prominence

With prominence as a factor, Google is looking to reward businesses that are well-known in the real world. In addition to a business’ offline prominence, Google also looks to some online factors to determine local ranking, such as:

Reviews

The number of Google reviews a local business receives, and the sentiment of those reviews, have a notable impact on its ability to rank in local results.

Citations

A “business citation” or “business listing” is a web-based reference to a local business’ “NAP” (name, address, phone number) on a localized platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).

Local rankings are influenced by the number and consistency of local business citations. Google continuously pulls data from a wide variety of sources to build up its local business index. When Google finds multiple consistent references to a business’s name, location, and phone number, it strengthens Google’s “trust” in the validity of that data, which in turn lets Google show the business with a higher degree of confidence. Google also uses information from other sources on the web, such as links and articles.
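To see why consistency matters, here's a hypothetical sketch that normalizes NAP data from several listings and scores how well they agree (the business name and numbers are invented):

```python
from collections import Counter

def nap_consistency(listings):
    """Score how consistently listings agree on (name, address, phone) after normalization."""
    def normalize(name, address, phone):
        # Ignore formatting noise: case, stray spaces, phone punctuation.
        digits = "".join(ch for ch in phone if ch.isdigit())
        return (name.lower().strip(), address.lower().strip(), digits)
    counts = Counter(normalize(*nap) for nap in listings)
    agreeing = counts.most_common(1)[0][1]  # size of the largest agreeing group
    return agreeing / len(listings)

score = nap_consistency([
    ("Jenny's Coffee", "12 Main St", "(555) 010-2000"),
    ("jenny's coffee", "12 main st ", "555-010-2000"),
    ("Jennys Coffee", "12 Main St", "555 010 2000"),
])
# The first two normalize identically; the third name differs -> score = 2/3
```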

Check a local business’ citation accuracy here.

Organic ranking

SEO best practices also apply to local SEO, since Google also considers a website’s position in organic search results when determining local ranking.

In the next chapter, you’ll learn on-page best practices that will help Google and users better understand your content.

[Bonus!] Local engagement

Although engagement isn’t listed by Google as a local ranking determiner, its role is only going to increase as time goes on. Google continues to enrich local results by incorporating real-world data like popular times to visit and average length of visits…

Screenshot of Google SERP result for a local business showing busy times of day

…and even provides searchers with the ability to ask the business questions!

Screenshot of the Questions & Answers portion of a local Google SERP result

Now more than ever, local results are being influenced by real-world data: how searchers interact with and respond to local businesses, rather than purely static (and game-able) information like links and citations.

Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real time engagement metrics to determine quality and relevance.


You don’t have to know the ins and outs of Google’s algorithm (that remains a mystery!), but by now you should have a great baseline knowledge of how the search engine finds, interprets, stores, and ranks content. Armed with that knowledge, let’s learn about choosing the keywords your content will target!


The Local SEO&rsquo;s Guide to the Buy Local Phenomenon: A Competitive Advantage for Clients

Posted by MiriamEllis

Photo credit: Michelle Shirley

What if a single conversation with one of your small local business clients could spark activity that would lead to an increase in their YOY sales of more than 7%, as opposed to only 4% if you don’t have the conversation? What if this chat could triple the amount of spending that stays in their town, reduce pollution in their community, improve their neighbors’ health, and strengthen democracy?

What if the brass ring of content dev, link opportunities, consumer sentiment and realtime local inventory is just waiting for you to grab it, on a ride we just haven’t taken yet, in a setting we’re just not talking about?

Let’s travel a different road today, one that parallels our industry’s typical conversation about citations, reviews, markup, and Google My Business. As a 15-year sailor on the Local SEO ship, I love all this stuff, but, like you, I’m experiencing a merging of online goals with offline realities, a heightened awareness of how in-store is where local business successes are born and bred, before they become mirrored on the web.

At Moz, our SaaS tools serve businesses of every kind: Digital, bricks-and-mortar, SABs, enterprises, mid-market agencies, big brands, and bootstrappers. But today, I’m going to go as small and as local as possible, speaking directly to independently-owned local businesses and their marketers about the buy local/shop local/go local movement and what I’ve learned about its potential to deliver meaningful and far-reaching successes. Frankly, I think you’ll be as amazed as I’ve been.

At the very least, I hope reading this article will inspire you to have a conversation with your local business clients about what this growing phenomenon could do for them and for their communities. Successful clients, after all, are the very best kind to have.

What is the Buy Local movement all about?

What’s the big idea?

You’re familiar with the concept of there being power in numbers. A single independent business lacks the resources and clout to determine the local decisions and policies that affect it. Should Walmart or Target be invited to set up shop in town? Should the crumbling building on Main St. be renovated or demolished? Which safety and cultural services should be supported with funding? The family running the small grocery store has little say, but if they join together with the folks running the bakery, the community credit union, the animal shelter, and the bookstore … then they begin to have a stronger voice.

Who does this?

Buy Local programs formalize the process of independently-owned businesses joining together to educate their communities about the considerable benefits to nearly everyone of living in a thriving local economy. These efforts can be initiated by merchants, Chambers of Commerce, grassroots citizen groups, or others. They can be assisted and supported by non-profit organizations like the American Independent Business Alliance (AMIBA) and the Institute for Local Self-Reliance (ILSR).

What are the goals?

Through signage, educational events, media promotions, and other forms of marketing, most Buy Local campaigns share some or all of these goals:

  • Increase local wealth that recirculates within the community
  • Preserve local character
  • Build community
  • Create good jobs
  • Have a say in policy-making
  • Decrease environmental impacts
  • Support entrepreneurship
  • Improve diversity/variety
  • Compete with big businesses

Do Buy Local campaigns actually work?

Yes – research indicates that, if managed correctly, these programs yield a variety of benefits to both merchants and residents. Consider these findings:

1) Healthy YOY sales advantages

ILSR conducted a national survey of independent businesses to gauge YOY sales patterns. Respondents to the 2016 survey reported a good increase in sales across the board, but with a significant difference, which AMIBA sums up:

“Businesses in communities with a sustained grassroots “buy independent/buy local” campaign reported a strong 7.4% sales increase, nearly doubling the 4.2% gain for those in areas without such an alliance.”

2) Keeping spending local

The analysts at Civic Economics conducted surveys of 10 cities to gauge the local financial impacts of independents vs. chain retailers, yielding a series of graphics like this one:

While statistics vary from community to community, the overall pattern is one of significantly greater local recirculation of wealth in the independent vs. chain environment. These patterns can be put to good use by Buy Local campaigns with the goal of increasing community-sustaining wealth.

3) Keeping communities employed and safe

Few communities can safely afford the loss of jobs and tax revenue documented in a second Civic Economics study which details the impacts of Americans’ Amazon habit, state by state and across the nation:

While the recent Supreme Court ruling allowing states to tax e-commerce models could improve some of these dire numbers, towns and cities with Buy Local alliances can speak plainly: lack of tax revenue that leads to lack of funding for emergency services like fire departments is simply unsafe and unsustainable. A study done a few years back found that ⅔ of volunteer firefighters in the US report that their departments are underfunded, with 86% of these heroic workers having to dip into their own pockets to buy supplies to keep their stations going. As I jot these statistics down, there is a runaway 10,000-acre wildfire burning a couple of hours north of me…

Meanwhile, Inc.com is pointing out,

“According to the Bureau of Labor Statistics, since the end of the Great Recession, small businesses have created 62 percent of all net new private-sector jobs. Among those jobs, 66 percent were created by existing businesses, while 34 percent were generated through new establishments (adjusted for establishment closings and job losses)”.

When communities have Go Local-style business alliances, they are capitalizing on the ability to create jobs, increase sales, and build up tax revenue that could make a serious difference not just to local unemployment rates, but to local safety.

4) Shaping policy

In terms of empowering communities to shape policy, there are many anecdotes to choose from, but one of the most celebrated surrounds a landmark study conducted by the Austin Independent Business Alliance which documented community impacts of spending at the local book and music stores vs. a proposed Borders. Their findings were compelling enough to convince the city not to give a $2.1 million subsidy to the now-defunct corporation.

5) Improving the local environment

A single statistic here is incredibly eye-opening: according to the US Department of Transportation, shopping-related driving per household more than tripled between 1969 and 2009.

All you have to do is picture the centralized location of main street businesses vs. big boxes on the outskirts of town to imagine how city planning has contributed to this stunning rise in time spent on the road. When residents can walk or bike to make daily purchases, the positive environmental impacts are obvious.

6) Improving residents’ health and well-being

A recent Cigna survey of 20,000 Americans found that nearly half of them always or sometimes feel lonely, lacking significant face-to-face interactions with others. Why does this matter? Because the American Psychological Association finds that you have a 50% lower chance of dying prematurely if you have quality social interactions.

There’s a reason author Jan Karon’s “Mitford” series about life in a small town in North Carolina has been a string of NY Times Best Sellers; readers and reviewers continuously state that they yearn to live someplace like this fictitious community with the slogan “Mitford takes care of its own”. In the novels, the lives of residents, independent merchants, and “outsiders” interweave, in good times and bad, creating a support network many Americans envy.

This societal setup must be a winner, as well as a bestseller, because the Cambridge Journal of Regions published a paper proposing that the concentration of small businesses in a given community can be equated with levels of public health.

Beyond the theory that eating fresh and local is good for you, it turns out that knowing your farmer, your banker, your grocer could help you live longer.

7) Realizing big-picture goals

Speaking of memorable stories, this video from ILSR does a good job of detailing one view of the ultimate impacts independent business alliances can have on shaping community futures:

https://www.youtube.com/watch?v=kDw4dZLSDXg

I interviewed author and AMIBA co-founder, Jeff Milchen, about the good things that can happen when independents join hands. He summed it up,

“The results really speak for themselves when you look at what the impact of public education for local alliances has been in terms of shifting culture. It’s a great investment for independent businesses to partner with other independents, to do things they can’t do individually. Forming these partnerships can help them compete with the online giants.”

Getting going with a Go Local campaign, the right way

If sharing some of the above with clients has made them receptive to further exploration of what involvement in an independent business alliance might do for them, here are the next steps to take:

  1. First, find out if a Go Local/Shop Local/Buy Local/Stay Local campaign already exists in the business’ community. If so, the client can join up.
  2. If not, contact AMIBA. The good folks there will know if other local business owners in the client’s community have already expressed interest in creating an alliance. They can help connect the interested parties up.
  3. I highly, highly recommend reading through AMIBA’s nice, free primer covering just about everything you need to know about Go Local campaigns.
  4. Encourage the client to publicize their intent to create an alliance if none exists in their community. Do an op ed in the local print news, put it on social media sites, talk to neighbors. This can prompt outreach from potential allies in the effort.
  5. A given group can determine to go it alone, but it may be better to rely on the past experience of others who have already created successful campaigns. AMIBA offers a variety of paid community training modules, including expert speakers, workshops, and on-site consultations. Each community can write in to request a quote for a training plan that will work best for them. The organization also offers a wealth of free educational materials on their website.
  6. According to AMIBA’s Jeff Milchen, a typical Buy Local campaign takes about 3-4 months to get going.

It’s important to know that Go Local campaigns can fail, due to poor execution. Here is a roundup of practices all alliances should focus on to avoid the most common pitfalls:

  1. Codify the definition of a “local” business as being independently-owned-and-run, or else big chain inclusion will anger some members and cause them to leave.
  2. Emphasize all forms of local patronage; campaigns that stick too closely to words like “buy” or “shop” overlook the small banks, service area businesses, and other models that are an integral part of the independent local economy.
  3. Ensure diversity in leadership; an alliance that fails to reflect the resources of age, race, gender/identity, political views, economics and other factors may wind up perishing from narrow viewpoints. On a related note, AMIBA has been particularly active in advocating for business communities to rid themselves of bigotry. Strong communities welcome everyone.
  4. Do the math of what success looks like; education is a major contributing factor to forging a strong alliance, based on projected numbers of what campaigns can yield in concrete benefits for both merchants and residents.
  5. Differentiate inventory and offerings so that independently-owned businesses offer something of added value which patrons can’t easily replicate online; this could be specialty local products, face-to-face time with expert staff, or other benefits.
  6. Take the high road in inspiring the community to increase local spending; campaigns should not rely on vilifying big and online businesses or asking for patronage out of pity. In other words, guilt-tripping locals because they do some of their shopping at Walmart or Amazon isn’t a good strategy. Even a 10% shift towards local spending can have positive impacts for a community!
  7. Clearly assess community resources; not every town, city, or district hosts the necessary mix of independent businesses to create a strong campaign. For example, approximately 2.2% of the US population live in “food deserts”, many miles from a grocery store. These areas may lack other local businesses, as well, and their communities may need to create grassroots campaigns surrounding neighborhood gardens, mobile markets, private investors and other creative solutions.

In sum, success significantly depends on having clear definitions, clear goals, diverse participants and a proud identity as independents, devoid of shaming tactics.

Circling back to the Web — our native heath!

So, let’s say that your incoming client is now participating in a Buy Local program. Awesome! Now, where do we go from here?

In speaking with Jeff Milchen, I asked what he has seen in terms of digital marketing being used to promote the businesses involved in Buy Local campaigns. He said that, while some alliances have workshops, it’s a work in progress and something he hopes to see grow in the future.

As a Local SEO, that future is now for you and your fortunate clients. Here are some ways I see this working out beautifully:

Basic data distribution and consistency

Small local businesses can sometimes be unaware of inconsistent or absent local business listings, because the owners are just so busy. The quickest way I know to demo this scenario is to plug the company name and zip into the free Moz Check Listing tool to show them how they’re doing on the majors. Correct data errors and fill in the blanks, either manually or with affordable software like Moz Local. You’ll also want to be sure the client has a presence on any geo- or industry-specific directories and platforms. It’s something your agency can really help with!

A hyperlocalized content powerhouse

Build proud content around the company’s involvement in the Buy Local program.

  • Write about all of the economic, environmental, and societal benefits residents can support by patronizing the business.
  • Motivated independents take time to know their customers. There are stories in this. Write about the customers and their needs. I’ve even seen independent restaurants naming menu items after beloved patrons. Get personal. Build community.
  • Don’t forget that even small towns can be powerful points of interest for tourists. Create a warm welcome for travelers, and for new neighbors, too!

Link building opportunities of a lifetime

Local business alliances form strong B2B bonds.

  • Find relationships with related businesses that can sprout links. For example, the caterer knows the wedding cake baker, who knows the professional seamstress, who knows the minister, who knows the DJ, who knows the florist.
  • Dive deep into opportunities for sponsoring local organizations, teams and events, hosting and participating in workshops and conferences, offering scholarships and special deals.
  • Make fast friends with local media. Be newsworthy.

A wellspring of sentiment

Independents form strong business-to-community bonds.

  • When a business really knows its customers, asking for online reviews is so much easier. In some communities, it may be necessary to teach customers how to leave reviews, but once you get a strategy going for this, the rest is gravy.
  • It’s also a natural fit for asking for written and video testimonials to be published on the company website.
  • Don’t forget the power of Word of Mouth Marketing, while you’re at it. Loyal patrons are an incredible asset.
  • The one drawback could be if your business model is of a sensitive nature. Tight-knit communities can be ones in which residents are more desirous of protecting their privacy.

Digitize inventory easily

30% of consumers say they’d buy from a local store instead of online if they knew the store was nearby (Google). Over half of consumers prefer to shop in-store to interact with products (Local Search Association). Over 63% of consumers would rather buy from a company they consider to be authentic over the competition (Bright Local).

It all adds up to the need for highly-authentic independently-owned businesses to have an online presence that signals to Internet users that they stock desired products. For many small, local brands, going full e-commerce on their website is simply too big of an implementation and management task. It’s a problem that’s dogged this particular business sector for years. And it’s why I got excited when the folks at AMIBA told me to check out Pointy.

Pointy offers a physical device that small business owners can attach to their barcode scanner to have their products ported to a Pointy-controlled webpage. But, that’s not all. Pointy integrates with the “See What’s In Store” inventory function of Google My Business Knowledge Panels. Check out Talbot’s Toyland in San Mateo, CA for a live example.

Pointy is a startup, but one that is exciting enough to have received angel investing from the founder of WordPress and the co-founder of Google Maps. Looks like a real winner to me, and it could provide a genuine answer for brick-and-mortar independents who have found their sales staggering in the wake of Amazon and other big digital brands.

Local SEOs have an important part to play

Satisfaction in work is a thing to be cherished. If the independent business movement speaks to you, bringing your local search marketing skills to these alliances and small brands could make more of your work days really good days.

The scenario could be an especially good fit for agencies that have specialized in city or state marketing. For example, one of our Moz Community members confines his projects to South Carolina. Imagine him taking it on the road a bit, hosting and attending workshops for towns across the state that are ready to revitalize main street. An energetic client roster could certainly result if someone like him could show local banks, grocery stores, retail shops and restaurants how to use the power of the local web!

Reading America

Our industry is living and working in complex times.

The bad news is, a current Bush-Biden poll finds that 8 in 10 US residents are “somewhat” or “very” concerned about the state of democracy in our nation.

The not-so-bad news is that citizen ingenuity for discovering solutions and opportunities is still going strong. We need only look as far as the runaway success of the TV show “Fixer Upper”, which drew 5.21 million viewers in its fourth season as the second-largest telecast of Q2 of that year. The show centered on the revitalization of dilapidated homes and businesses in and around Waco, Texas, and has turned the entire town into a major tourist destination, pulling in millions of annual visitors and landing book deals, a magazine, and the Magnolia Home furnishing line for its entrepreneurial hosts.

While not every town can (or would want to) experience what is being called the “Magnolia effect”, channels like HGTV and the DIY network are heavily capitalizing on the rebirth of American communities, and private citizens are taking matters into their own hands.

There’s the family who moved from Washington D.C. to Water Valley, Mississippi, bought part of the decaying main street and began to refurbish it. I found the video story of this completely riveting, and look at the Yelp reviews of the amazing grocery store and lunch counter these folks are operating now. The market carries local products, including hoop cheese and milk from the first dairy to open in the state in 50 years.

There are the half-dozen millennials who are helping turn New Providence, Iowa into a place young families can live and work again. There’s Corning, NY, Greensburg, KS, Colorado Springs, CO, and so many more places where people are eagerly looking to strengthen community sufficiency and sustainability.

Some marketing firms are visionary forerunners in this phenomenon, like Deluxe, which has sponsored the Small Business Revolution show, doing main street makeovers that are bringing towns back to life. There could be a place out there somewhere on the map of the country, just waiting for your agency to fill it.

The best news is that change is possible. A recent study in Science magazine states that the tipping point for a minority group to change a majority viewpoint is 25% of the population. This is welcome news at a time when 80% of citizens are feeling doubtful about the state of our democracy. There are 28 million small businesses in the United States – an astonishing potential educational force – if communities can be taught what a vote with their dollar can do in terms of giving them a voice. As Jeff Milchen told me:

“One of the most inspiring things is when we see local organizations helping residents to be more engaged in the future of their community. Most communities feel somewhat powerless. When you see towns realize they have the ability to shift public policy to support their own community, that’s empowering.”

Sometimes, the extremes of our industry can make our society and our democracy hard to read. On the one hand, the largest brands developing AI, checkout-less shopping, driverless cars, same-day delivery via robotics, and the gig economy win applause at conferences.

On the other hand, the public is increasingly hearing the stories of employees at these same companies who are protesting Microsoft developing face recognition for ICE, Google’s development of AI drone footage analysis for the Pentagon, working conditions at Amazon warehouses that allegedly preclude bathroom breaks and have put people in the hospital, and the various outcomes of the “Walmart Effect”.

The Buy Local movement is poised in time at this interesting moment, in which our democracy gets to choose. Gigs or unions? Know your robot or know your farmer? Convenience or compassion? Is it either/or? Can it be both?

Both big and small brands have a major role to play in answering these timely questions and shaping the ethics of our economy. Big brands, after all, have tremendous resources for raising the bar for ethical business practices. Your agency likely wants to serve both types of clients, but it’s all to the good if all business sectors remember that the real choosers are the “consumers”, the everyday folks voting with their dollars.

I know that it can be hard to find good news sometimes. But I’m hoping what you’ve read today gifts you with a feeling of optimism that you can take to the office, take to your independently-owned local business clients, and maybe even help take to their communities. Spark a conversation today and you may stumble upon a meaningful competitive advantage for your agency and its most local customers.

Every year, local SEOs are delving deeper and deeper into the offline realities of the brands they serve, large and small. We’re learning so much, together. It’s sometimes a heartbreaker, but always an honor, being part of this local journey.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

The Guide to Local Sponsorship Marketing – The 2018 Edition

Posted by Claudia0428

For most Moz readers, local marketing means content, reviews, AdWords, local listings, and of course citations. If you’re a larger brand, you might be doing outdoor, radio, print, and television advertising as well. Today we’re here to humbly submit that local sponsorships remain the most-overlooked and opportunity-rich channel, and they build real local connections for both large brands and small business alike.

This article is the second edition of the ZipSprout team’s guide to local sponsorships. We wrote the first edition in 2016, after a few months of securing local sponsorship campaigns for a handful of clients. Since then, we’ve tripled our client roster, worked with more than 8,000 local organizations, donated nearly $1,000,000 in local sponsorships across 1,300+ opportunities, and learned how to build campaigns for local presence.

So we knew the guide was due for a reboot.

One of our most significant learnings of the past two years is that local sponsorships are a channel in their own right. They can be directed toward local SEO or local marketing campaigns, but sponsorships are their own breed of local connection — and just like content campaigns, local PR campaigns, or review management, local sponsorships have their own set of conventions and best practices.

This article is meant for anyone with an eye toward local sponsorships as a marketing channel. Agencies and enterprise organizations may find it particularly helpful, but we’re big believers in encouraging smaller local businesses to engage in sponsorships too. Get out there and meet your neighbors!


The what & why of local sponsorships

Local events, nonprofits, and associations constitute a disjointed but very real network of opportunities. Unlike other channels, local sponsorships aren’t accessible from a single platform, but we’ve found that many sponsorships share similarities. This makes it possible to develop processes that work for campaigns in any metro area.

Local sponsorships are also a unique channel in that the benefits can range from the digital to the analog: from local links to a booth, from social posts to signage on a soccer field. The common thread is joining the community by partnering with local organizations, but the benefits themselves vary widely.

We’ve identified, and now track, 24 unique benefits of sponsorships related to local marketing:

  1. Ad (full or partial)
  2. Advertising on event app
  3. Blog post featuring sponsor
  4. Booth, tent, or table at event
  5. Event named for sponsor
  6. Guest post on organization blog
  7. Inclusion in press release
  8. Link in email newsletter
  9. Link on website
  10. Logo on event t-shirt or other swag
  11. Logo on signage
  12. Logo or name on website
  13. Media spots (television/radio/newspaper)
  14. Mention in email newsletter
  15. Mention in publicity materials, such as programs & other printed materials
  16. Networking opportunity
  17. Physical thing (building, etc.) named for sponsor
  18. Social media mention
  19. Speaking opportunity at event
  20. Sponsor & sponsor’s employees receive discounts on services/products/events
  21. Sponsor can donate merchandise for goodie bags
  22. Sponsored post (on blog or online magazine)
  23. Tickets to event
  24. Verbal recognition

There are probably more, but in our experience most benefits fall into these core categories. That said, these benefits aren’t necessarily for everyone…

Who shouldn’t do local sponsorships?

1. Don’t do local sponsorships if you need fast turnaround.

Campaigns can take 1–3 months from launch until fulfillment. If you’re in a hurry to see a return, just increase your search ad budget.

2. Don’t do local sponsorships if you’re not okay with the branding component.

Local link building can certainly be measured, as can coupon usage, email addresses gathered for a drawing, etc. But measuring local brand lift still isn’t an exact science. Leave pure attribution to digital ads.

3. Don’t do local sponsorships with a “one size fits all” expectation.

The great thing about local events and opportunities is their diversity. While some components can be scaled, others require high-touch outreach, more akin to a PR campaign.

Considerations for agencies vs brands in local sponsorship campaigns

Agencies, especially if they’re creating sponsorship campaigns for multiple clients, can cast a wide net and select from the best opportunities that return. Even if a potential partnership isn’t a good fit for a current client, it may work for a client down the road. Brands, on the other hand, need to be a little more goal- and mission-focused during prospecting and outreach. If they’re reaching out to organizations that are clearly a bad fit, they’re wasting everyone’s time.

Brands also need to be more careful because they have a consumer-facing image to protect. As with any outreach campaign, there are dos and don’ts and best practices that all should follow (DO be respectful; DON’T over-email), but brands especially have more to lose from an outreach faux pas.


Our process

Outreach

Once we’ve identified local organizations in a given metro area, we recommend reaching out with an email to introduce ourselves and learn more about sponsorship opportunities. In two years, the ZipSprout team has A/B tested 100 different email templates.

With these initial emails, we’re trying to inform without confusing or scaring away potential new partners. Some templates have resulted in local organizations thinking we’re asking them for sponsorship money or that we want to charge them for a service. Oops! A/B tests have helped to find the best wording for clarity and, in turn, response rate.

Here are some of our learnings:

1. Mentioning location matters.

We reached out to almost 1,000 Chicago organizations in the spring of 2017. When we mentioned Chicago in the email, the response rate increased by 20%.

2. Emails sent to organizations who already had sponsorship info on their websites were most successful if the email acknowledged the onsite sponsorship info and asked for confirmation.

These are also our most successful outreach attempts, likely because these organizations are actively looking for sponsors (as signified by having sponsorship info on their site). Further, by demonstrating that we’ve been on their site, we’re signaling a higher level of intent.

3. Whether or not we included the outreacher’s phone number in email signatures had no effect on response rate.

If anything, response rates were slightly higher for emails with no phone number in the signature, at 41% compared with 40.2%.

4. Shorter is better when it comes to outreach emails.

Consider the following two emails:

EMAIL A


Hi [NAME],

I sent an email last week, but in case you missed it, I figured I’d follow up. 🙂

I work to help corporate clients find local sponsorships. We’re an agency that helps our business clients identify and sponsor local organizations like [ORG NAME]. We’re paid by businesses who are looking for local sponsorships.

Often, local organizations are overlooked, so my company, ZipSprout, works for businesses who want to sponsor locally, but aren’t sure who to partner with. To that end, I’d love to learn more about [ORG NAME] and see what sponsorship opportunities you have available. Is there a PDF or list of cost and benefits you can share over email or a phone call?


Thanks,

___

EMAIL B

Hi [NAME],

I sent an email last week, but in case you missed it, I figured I’d follow up. 🙂

I’d love to learn more about [ORG NAME] and see what sponsorships you have available. Is there a PDF or list of cost and benefits you can share over email or a phone call?


Thanks,

___

In an 800-email test, Email B performed 30% better than Email A.
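For teams running similar tests, it’s worth checking that a lift of this size isn’t just noise. Here’s a minimal sketch of a two-proportion z-test; the response counts are hypothetical, chosen only to illustrate an 800-email split with roughly a 30% relative lift, and are not ZipSprout’s actual numbers.

```python
# Quick significance check for an email A/B test, using only the standard
# library. The response counts below are hypothetical, not ZipSprout's data.
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Suppose Email A drew 120/400 responses (30%) and Email B 156/400 (39%):
z, p = two_proportion_z(120, 400, 156, 400)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

With these illustrative counts, the lift clears the conventional p &lt; 0.05 bar comfortably; a much smaller sample would need a larger lift to support the same conclusion.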

Matchmaking: How can I choose a sponsorship opportunity that fits my brand?

There are many ways to evaluate potential sponsorships.

These are the questions that help us match organizations with clients:

  • Who is your brand targeting (women, senior citizens, family-friendly, dog owners, new parents)?
  • Do you want to tie your brand with a particular cause (eco-friendly, professional associations, awareness foundations, advocacy groups)?
  • Is your campaign based on location? Are you launching your brand in a particular city? A particular zip code?
  • What is your total budget and per-sponsorship range? A top max price or a price range is a useful parameter — and perhaps the most important.

Once the campaign goals are determined, we filter through opportunities based partially on their online presence. We look at Domain Authority, location, website aesthetics, and other sponsors (competitors and non-competitors) in addition to Reach Score (details below).

Further, we review backlinks, organic traffic, and referring domains. We make sure the nonprofit partner isn’t spammy or funky from an SEO perspective and that its website is frequently visited. A small organization may not have all the juicy digital metrics, but by gauging event attendance or measuring organic traffic, we can identify solid prospects that might otherwise have been missed.

We also look at social media presence, event attendance, event dates, and how responsive these organizations or event organizers are. Responsiveness, we have learned, is a CRITICAL variable. It can be the difference between your link going live in 48 hours or less and waiting 6+ months from payment.

Reach Score

From a numbers perspective, Domain Authority is a good way to appreciate the value of a website, but it doesn’t tell the whole story when it comes to local marketing. To help fill in the gaps we created Reach Score, which combines virtual measures (like Domain Authority) with social measures (friends/followers) and physical measures (event attendance). The score ranks entities based on their metro area, so we’re not comparing the reach of an organization in Louisville, KY to one in NYC.
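The blend described above might be sketched as a weighted sum of normalized metrics. Everything here — the weights, the metric names, and the normalization — is an illustrative assumption; ZipSprout’s actual Reach Score formula isn’t public.

```python
# Hypothetical sketch of a blended "reach" score. The weights and
# normalization below are illustrative, not ZipSprout's formula.

def reach_score(domain_authority, social_followers, event_attendance,
                metro_max_followers, metro_max_attendance,
                weights=(0.4, 0.3, 0.3)):
    """Combine digital, social, and physical reach into a 0-100 score.

    Social and physical metrics are normalized against the largest values
    seen in the same metro area, so organizations are only compared with
    their local peers.
    """
    w_da, w_social, w_physical = weights
    da_norm = domain_authority / 100  # Domain Authority is already 0-100
    social_norm = min(social_followers / metro_max_followers, 1.0)
    physical_norm = min(event_attendance / metro_max_attendance, 1.0)
    return round(100 * (w_da * da_norm
                        + w_social * social_norm
                        + w_physical * physical_norm), 1)

# A mid-sized nonprofit, in a metro whose largest organization has
# 50,000 followers and 20,000 annual event attendees:
print(reach_score(35, 4000, 1200, 50_000, 20_000))
```

Normalizing social and physical metrics against metro-wide maximums is what keeps a Louisville organization from being measured against NYC-scale numbers.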

As of March 2018, we have about 8,000 organizations with valid Reach Scores across four metro areas — Raleigh/Durham, Boston, Houston, and Chicago. The average Reach Score is 37 out of 100. Of the 34 types of organizations that we track, the most common is Event Venue/Company (average Reach Score of 38), followed by Advocacy Groups (43) and Sports Teams/Clubs/Leagues (22). The types of organizations with the highest Reach Scores are Local Government (64), Museums (63), and Parks and Recreation (55).

Thanks to Reach Score, we’ve found differences between organizations from city to city as well. In Raleigh-Durham, the entities with the highest reach tend to be government-related organizations, such as Chambers of Commerce and Parks & Rec Departments.

In Boston, the highest reach tends to fall to arts organizations, such as music ensembles, as well as professional associations. This score serves as a good reminder that each metro area has a unique community of local organizations. (Read more about our Reach Score findings here.)

Fulfillment

Our campaigns used to take several months to complete, from contract to final sponsorship. Now our average fulfillment time is 18.7 days, regardless of project size! Staying (politely) on top of communication with the nonprofit organizations was the main driver of this improvement.

We’ve also found that the first 48 hours after sending a notification of sponsorship on behalf of your brand are crucial to a speedy campaign. Be ready to award the sponsorship funds in a timely manner, then follow up with a phone call or an email to check that the funds have been received.

It’s okay to ask when you can expect the digital sponsorship benefits to go live, and how to streamline the process for any other deliverables needed to complete the sponsorship.

Applying these simple best practices, our team has been able to run a campaign in a week or less.

Two important concepts to remember about the sponsorship channel from the fulfillment perspective:

  1. It’s difficult to fulfill. If your city project involves more than two or three sponsorships, you’re in for multiple hours of follow-ups, reminders, phone calls, etc. Most local organizations genuinely want to honor their sponsors and keep them happy; even so, we’ve learned that keeping the momentum going serves as an important reminder for the nonprofit. This can involve phone call reminders and emails for links to go live and other benefits to come through. Again, be polite and respectful.
  2. It’s SO worth all the effort though! It shows that your brand cares. A sponsorship campaign is a fantastic way to get in front of your target audience around causes that carry personal meaning: not in a broad, general scope, but locally. Sponsoring a beach cleanup in Santa Monica lets you reach a highly localized audience around a cause that directly affects their everyday lives, as opposed to partnering with a huge foundation advocating for clean oceans.

Enhancing a local campaign

Some prefer to use local sponsorships as a link building effort, but there are ways — and ample benefit — to go far beyond the link.

Local event attendance

So, so many local sponsorship campaigns come with the opportunity for event attendance. We currently have 11,345 opportunities in our database (62.2% of our total inventory) that feature events: 5Ks, galas, performances, parades, and even a rubber ducky derby or two! If you’re able to send local team members, find opportunities that match your target audience and test them out — and bring your camera so your social and brand team will have material for publication. If local team members aren’t an option, consider working with a notable and ambitious startup such as Field Day, which can send locals out on behalf of your brand. We’ve spoken with them on several occasions and found them adaptable and wonderful to work with.

Coupons/invitations

One client, FunBrands, used local sponsorships as a way to reach out to locals ahead of stores’ grand re-openings (read the full case study here).

For another client, we created unique coupons for each local organization, using print and social media posts for distribution.

An example coupon — use codes to track attribution back to an event.
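One lightweight way to implement that kind of attribution is to derive a stable, unique code per organization. This is a hypothetical sketch, not the method used in the campaign above; the brand and organization names are placeholders.

```python
# Hypothetical sketch: generate a short, stable coupon code per local
# organization so redemptions can be attributed back to each sponsorship.
import hashlib

def coupon_code(brand, org_name, length=8):
    """Derive a deterministic, human-typeable code from brand + organization."""
    digest = hashlib.sha256(f"{brand}:{org_name}".encode()).hexdigest()
    return digest[:length].upper()

# Each organization gets its own code; reprinting materials later
# reproduces the same code, since the derivation is deterministic.
for org in ["Raleigh Rescue Mission", "Durham Bulls Youth League"]:
    print(org, "->", coupon_code("FunBrands", org))
```

Because the code is derived rather than randomly generated, there’s no code database to keep in sync between the print shop and the analytics team.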


Conclusion: Local sponsorships are a channel

Sponsorships are an actionable strategy that contributes to your local rankings, while providing unprecedented opportunities for community engagement and neighborly branding. We hope that this updated guide will provide a strong operational overview along with realistic expectations — and even inspiration — for a local sponsorship campaign in your target cities.

Last but not least: As with all outreach campaigns, please remember to be human. Keep in mind that local engagements are the living extension of your brand in the real world. And if somehow this article wasn’t enough, we just finished up The Local Sponsorship Playbook. Every purchase comes with a 30-minute consultation with the author. We hope everyone chooses to get out, get local, and join the community in the channel that truly benefits everyone.



Rewriting the Beginner’s Guide to SEO

Posted by BritneyMuller


Many of you reading likely cut your teeth on Moz’s Beginner’s Guide to SEO. Since it was launched, it’s easily been our top-performing piece of content:

Most months see 100k+ views (the reverse plateau in 2013 is when we changed domains).

While Moz’s Beginner’s Guide to SEO still gets well over 100k views a month, the current guide itself is fairly outdated. This big update has been on my personal to-do list since I started at Moz, and we need to get it right because — let’s get real — you all deserve a bad-ass SEO 101 resource!

However, updating the guide is no easy feat. Thankfully, I have the help of my fellow Mozzers. Our content team has been a collective voice of reason, wisdom, and organization throughout this process and has kept this train on its tracks.

Despite the effort we’ve put into this already, it felt like something was missing: your input! We’re writing this guide to be a go-to resource for all of you (and everyone who follows in your footsteps), and want to make sure that we’re including everything that today’s SEOs need to know. You all have a better sense of that than anyone else.

So, in order to deliver the best possible update, I’m seeking your help.

This is similar to the way Rand did it back in 2007. And upon re-reading your many “more examples” requests, we’ve continued to integrate more examples throughout.

The plan:

  • Over the next 6–8 weeks, I’ll be updating sections of the Beginner’s Guide and posting them, one by one, on the blog.
  • I’ll solicit feedback from you incredible people and implement top suggestions.
  • The guide will be reformatted/redesigned, and I’ll 301 all of the blog entries that will be created over the next few weeks to the final version.
  • It’s going to remain 100% free to everyone — no registration required, no premium membership necessary.

To kick things off, here’s the revised outline for the Beginner’s Guide to SEO:

Click each chapter’s description to expand the section for more detail.

Chapter 1: SEO 101

What is it, and why is it important? ↓

  • What is SEO?
  • Why invest in SEO?
  • Do I really need SEO?
  • Should I hire an SEO professional, consultant, or agency?

Search engine basics:

  • Google Webmaster Guidelines basic principles
  • Bing Webmaster Guidelines basic principles
  • Guidelines for representing your business on Google

Fulfilling user intent

Know your SEO goals


Chapter 2: Crawlers & Indexing

First, you need to show up. ↓

How do search engines work?

  • Crawling & indexing
  • Determining relevance
  • Links
  • Personalization

How search engines make an index

  • Googlebot
  • Indexable content
  • Crawlable link structure
  • Links
  • Alt text
  • Types of media that Google crawls
  • Local business listings

Common crawling and indexing problems

  • Online forms
  • Blocking crawlers
  • Search forms
  • Duplicate content
  • Non-text content

Tools to ensure proper crawl & indexing

  • Google Search Console
  • Moz Pro Site Crawl
  • Screaming Frog
  • Deep Crawl

How search engines order results

  • 200+ ranking factors
  • RankBrain
  • Inbound links
  • On-page content: Fulfilling a searcher’s query
  • PageRank
  • Domain Authority
  • Structured markup: Schema
  • Engagement
  • Domain, subdomain, & page-level signals
  • Content relevance
  • Searcher proximity
  • Reviews
  • Business citation spread and consistency

SERP features

  • Rich snippets
  • Paid results
  • Universal results
    • Featured snippets
    • People Also Ask boxes
  • Knowledge Graph
  • Local Pack
  • Carousels

Chapter 3: Keyword Research

Next, know what to say and how to say it.

How to judge the value of a keyword

The search demand curve

  • Fat head
  • Chunky middle
  • Long tail

Four types of searches:

  • Transactional queries
  • Informational queries
  • Navigational queries
  • Commercial investigation

Fulfilling user intent

Keyword research tools:

  • Google Keyword Planner
  • Moz Keyword Explorer
  • Google Trends
  • AnswerThePublic
  • SpyFu
  • SEMRush

Keyword difficulty

Keyword abuse

Content strategy {link to the Beginner’s Guide to Content Marketing}


Chapter 4: On-Page SEO

Next, structure your message to resonate and get it published.

Keyword usage and targeting

Keyword stuffing

Page titles:

  • Unique to each page
  • Accurate
  • Be mindful of length
  • Naturally include keywords
  • Include branding

Meta data/Head section:

  • Meta title
  • Meta description
  • Meta keywords tag
    • No longer a ranking signal
  • Meta robots

Meta descriptions:

  • Unique to each page
  • Accurate
  • Compelling
  • Naturally include keywords

Heading tags:

  • Subtitles
  • Summary
  • Accurate
  • Use in order

Call-to-action (CTA)

  • Clear CTAs on all primary pages
  • Help guide visitors through your conversion funnels

Image optimization

  • Compress file size
  • File names
  • Alt attribute
  • Image titles
  • Captioning
  • Avoid text in an image

Video optimization

  • Transcription
  • Thumbnail
  • Length
  • “~3mo to YouTube” method

Anchor text

  • Descriptive
  • Succinct
  • Helps readers

URL best practices

  • Shorter is better
  • Unique and accurate
  • Naturally include keywords
  • Go static
  • Use hyphens
  • Avoid unsafe characters

Structured data

  • Microdata
  • RDFa
  • JSON-LD
  • Schema
  • Social markup
    • Twitter Cards markup
    • Facebook Open Graph tags
    • Pinterest Rich Pins

Structured data types

  • Breadcrumbs
  • Reviews
  • Events
  • Business information
  • People
  • Mobile apps
  • Recipes
  • Media content
  • Contact data
  • Email markup

Mobile usability

  • Beyond responsive design
  • Accelerated Mobile Pages (AMP)
  • Progressive Web Apps (PWAs)
  • Google mobile-friendly test
  • Bing mobile-friendly test

Local SEO

  • Business citations
  • Entity authority
  • Local relevance

Complete NAP on primary pages

Low-value pages


Chapter 5: Technical SEO

Next, translate your site into Google’s language.

Internal linking

  • Link positioning
  • Anchor links

Common search engine protocols

  • Sitemaps
    • Mobile
    • News
    • Image
    • Video
  • XML
  • RSS
  • TXT

Robots

  • Robots.txt
    • Disallow
    • Sitemap
    • Crawl Delay
  • X-robots
  • Meta robots
    • Index/noindex
    • Follow/nofollow
  • Noimageindex
  • None
  • Noarchive
  • Nocache
  • Nosnippet
  • Noodp/noydir
  • Log file analysis
  • Site speed
  • HTTP/2
  • Crawl errors

Duplicate content

  • Canonicalization
  • Pagination

What is the DOM?

  • Critical rendering path
  • Help robots find the most important code first

Hreflang/Targeting multiple languages

Chrome DevTools

Technical site audit checklist


Chapter 6: Establishing Authority

Finally, turn up the volume.

Link signals

  • Global popularity
  • Local/topic-specific popularity
  • Freshness
  • Social sharing
  • Anchor text
  • Trustworthiness
    • Trust Rank
  • Number of links on a page
  • Domain Authority
  • Page Authority
  • MozRank

Competitive backlinks

  • Backlink analysis

The power of social sharing

  • Tapping into influencers
  • Expanding your reach

Types of link building

  • Natural link building
  • Manual link building
  • Self-created

Six popular link building strategies

  1. Create content that inspires sharing and natural links
  2. Ego-bait influencers
  3. Broken link building
  4. Refurbish valuable content on external platforms
  5. Get your customers/partners to link to you
  6. Local community involvement

Manipulative link building

  • Reciprocal link exchanges
  • Link schemes
  • Paid links
  • Low-quality directory links
  • Tiered link building
  • Negative SEO
  • Disavow

Reviews

  • Establishing trust
  • Asking for reviews
  • Managing reviews
  • Avoiding spam practices

Chapter 7: Measuring and Tracking SEO

Pivot based on what’s working.

KPIs

  • Conversions
  • Event goals
  • Signups
  • Engagement
  • GMB Insights:
    • Click-to-call
    • Click-for-directions
  • Beacons

Which pages have the highest exit percentage? Why?

Which referrals are sending you the most qualified traffic?

Pivot!

Search engine tools:

  • Google Search Console
  • Bing Webmaster Tools
  • GMB Insights

Appendix A: Glossary of Terms

Appendix B: List of Additional Resources

Appendix C: Contributors & Credits


What did you struggle with most when you were first learning about SEO? What would you have benefited from understanding from the get-go?

Are we missing anything? Any section you wish wouldn’t be included in the updated Beginner’s Guide?

Thanks in advance for contributing.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 1 year ago from tracking.feedpress.it

The Beginner’s Guide to Structured Data for SEO: How to Implement Structured Data [Part 2]

Posted by bridget.randolph

Welcome to Part 2 of The Beginner’s Guide to Structured Data: How to Implement Structured Data for SEO. In Part 1, we focused on gaining a high-level understanding of what structured data is and how it can be used to support SEO efforts.

(If you missed Part 1, you can go check it out here).

In Part 2, we’ll be looking at the steps to identify opportunities and implement structured data for SEO on your website. Since this is an introductory guide, I’ll be focusing on the most basic types of markup you can add and the most common use cases, and providing resources with additional detail for the more technical aspects of implementation.

Is structured data right for you?

Generally speaking, implementing structured data for SEO is worthwhile for most people. However, it does require a certain level of effort and resources, and you may be asking yourself whether it’s worth prioritizing.

Here are some signs that it’s a good time to prioritize structured data for SEO:

  • Search is a key value-driving channel for your business
  • You’ve recently audited your site for basic optimization issues and you know that you’ve achieved a competitive baseline with your keyword targeting, backlinks profile, site structure, and technical setup
  • You’re in a competitive vertical and need your results to stand out in the SERPs
  • You want to use AMP (Accelerated Mobile Pages) as a way to show up in featured areas of the SERP, including carousels
  • You have a lot of article-style content related to key head terms (e.g. 10 chicken recipes) and you’d like a way to display multiple results for those terms in the SERP
  • You’re ranking fairly well (position 15 or higher) already for terms with significant search volume (5000–50,000 searches/month)*
  • You have solid development resources with availability on staff and can implement with minimal time and financial investment
  • You’re in any of the following verticals: e-commerce, publishing, educational products, events/ticketing, creative production, TV/movie/book reviews, job listings, local business

*What is considered significant volume may vary according to how niche your market is.

If you said yes to any of these statements, then implementing structured data is particularly relevant to you! And if these criteria don’t currently apply to you, of course you can still go ahead and implement; you might have great results. The above are just a few of the most common indicators that it’s a worthwhile investment.

Implementing structured data on your site

In this guide, we will be looking solely at opportunities to implement Schema.org markup, as this is the most extensive vocabulary for our purposes. Also, because it was developed by the search engine companies themselves, it aligns with what they support now and should continue to be the most supported framework going forward.

How is Schema.org data structured?

The Schema.org vocabulary is organized into different “Types” (Recipe, Product, Article, Person, Organization, etc.) that represent entities, kinds of data, and/or content types.

Each Type has its own set of “properties” that you can use to identify the attributes of that item. For example, a “Recipe” Type includes properties like “image,” “cookTime,” “nutritionInformation,” etc. When you mark up a recipe on your site with these properties, Google is able to present those details visually in the SERP as a rich snippet.


In order to mark up your content with Schema.org vocabulary, you’ll need to define the specific properties for the Type you’re indicating.

For example:

If you’re marking up a recipe page, you need to include the title and at least two other attributes. These could be properties like:

  • aggregateRating: The averaged star rating of the recipe by your users
  • author: The person who created the recipe
  • prepTime: The length of time required to prepare the dish for cooking
  • cookTime: The length of time required to cook the dish
  • datePublished: Date of the article’s publication
  • image: An image of the dish
  • nutritionInformation: Number of calories in the dish
  • review: A review of the dish
  • …and more.

Each Type has different “required” properties in order to work correctly, as well as additional properties you can include if relevant. (You can view a full list of the Recipe properties at Schema.org/Recipe, or check out Google’s overview of Recipe markup.)
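To make that concrete, here’s a minimal sketch of a Recipe block generated with Python’s json module. Every value (name, author, URL, rating) is a hypothetical placeholder; a real page would include whichever of the properties above actually apply:

```python
import json

# A minimal, hypothetical Recipe markup sketch: the name plus a few
# of the properties listed above. All values are placeholders.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Chicken Broth",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "prepTime": "PT15M",           # ISO 8601 duration: 15 minutes
    "cookTime": "PT2H",            # 2 hours
    "datePublished": "2018-01-15",
    "image": "https://example.com/broth.jpg",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "32",
    },
}

# Wrap it in the script tag that would go in the page <head>.
markup = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(recipe, indent=2)
print(markup)
```

Note the durations: Schema.org expects ISO 8601 duration strings (PT15M, PT2H) rather than free text like “15 minutes.”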

Once you know what Types, properties and data need to be included in your markup, you can generate the code.

The code: Microdata vs JSON-LD

There are two common approaches to adding Schema.org markup to your pages: Microdata (in-line annotations added directly to the relevant HTML) and JSON-LD (which uses a JavaScript script tag to insert the markup into the head of the page).

JSON-LD is Google’s recommended approach, and in general is a cleaner, simpler implementation… but it is worth noting that Bing does not yet officially support JSON-LD. Also, if you have a WordPress site, you may be able to use a plugin (although be aware that not all WordPress plugins work the way they’re supposed to, so it’s especially important to choose one with good reviews, and to test thoroughly after implementation).

Whatever option you choose to use, always test your implementation to make sure Google is seeing it show up correctly.

What does this code look like?

Let’s look at an example of marking up a very simple news article (Schema.org/NewsArticle).


Here’s the article content (excluding body copy), with my notes about what each element is:

[posted by publisher ‘Google’]
[headline]Article Headline
[author byline]By John Doe
[date published] Feb 5, 2015
[description] A most wonderful article
[image]
[company logo]

And here’s the basic HTML version of that article:

<div>
  <h2>Article headline</h2>
  <h3>By John Doe</h3>
  <div>
    <img src="https://google.com/thumbnail1.jpg"/>
  </div>
  <div>
    <img src="https://google.com/logo.jpg"/>
  </div>
</div>

If you use Microdata, you’ll nest your content inside the relevant meta tags for each piece of data. For this article example, your Microdata code might look like this (within the <body> of the page):

<div itemscope itemtype="http://schema.org/NewsArticle">
  <meta itemscope itemprop="mainEntityOfPage"  itemType="https://schema.org/WebPage" itemid="https://google.com/article"/>
  <h2 itemprop="headline">Article headline</h2>
  <h3 itemprop="author" itemscope itemtype="https://schema.org/Person">
    By <span itemprop="name">John Doe</span>
  </h3>
  <span itemprop="description">A most wonderful article</span>
  <div itemprop="image" itemscope itemtype="https://schema.org/ImageObject">
    <img src="https://google.com/thumbnail1.jpg"/>
    <meta itemprop="url" content="https://google.com/thumbnail1.jpg">
    <meta itemprop="width" content="800">
    <meta itemprop="height" content="800">
  </div>
  <div itemprop="publisher" itemscope itemtype="https://schema.org/Organization">
    <div itemprop="logo" itemscope itemtype="https://schema.org/ImageObject">
      <img src="https://google.com/logo.jpg"/>
      <meta itemprop="url" content="https://google.com/logo.jpg">
      <meta itemprop="width" content="600">
      <meta itemprop="height" content="60">
    </div>
    <meta itemprop="name" content="Google">
  </div>
  <meta itemprop="datePublished" content="2015-02-05T08:00:00+08:00"/>
  <meta itemprop="dateModified" content="2015-02-05T09:20:00+08:00"/>
</div>

The JSON-LD version would usually be added to the <head> of the page, rather than integrated with the <body> content (although adding it in the <body> is still valid).

JSON-LD code for this same article would look like this:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "NewsArticle",
  "mainEntityOfPage": {
    "@type": "WebPage",
    "@id": "https://google.com/article"
  },
  "headline": "Article headline",
  "image": {
    "@type": "ImageObject",
    "url": "https://google.com/thumbnail1.jpg",
    "height": 800,
    "width": 800
  },
  "datePublished": "2015-02-05T08:00:00+08:00",
  "dateModified": "2015-02-05T09:20:00+08:00",
  "author": {
    "@type": "Person",
    "name": "John Doe"
  },
   "publisher": {
    "@type": "Organization",
    "name": "Google",
    "logo": {
      "@type": "ImageObject",
      "url": "https://google.com/logo.jpg",
      "width": 600,
      "height": 60
    }
  },
  "description": "A most wonderful article"
}
</script>

This is the general style for Microdata and JSON-LD code (for Schema.org/Article). The Schema.org website has a full list of every supported Type and its Properties, and Google has created “feature guides” with example code for the most common structured data use cases, which you can use as a reference for your own code.

How to identify structured data opportunities (and issues)

If structured data has previously been added to your site (or if you’re not sure whether it has), the first place to check is the Structured Data Report in Google Search Console.

This report will tell you not only how many pages have been identified as containing structured data (and how many of these have errors), but may also be able to identify where and/or why the error is occurring. You can also use the Structured Data Testing Tool for debugging any flagged errors: as you edit the code in the tool interface, it will flag any errors or warnings.

If you don’t have structured data implemented yet, or want to overhaul your setup from scratch, the best way to identify opportunities is with a quick content audit of your site, based on the kind of business you have.

A note on keeping it simple

There are lots of options when it comes to Schema.org markup, and it can be tempting to go crazy marking up everything you possibly can. But best practice is to keep focused and generally use a single top-level Type on a given page. In other words, you might include review data on your product page, but the primary Type you’d be using is Schema.org/Product. The goal is to tell search engines what this page is about.

Structured data must be representative of the main content of the page, and marked up content should not be hidden from the user. Google will penalize sites which they believe are using structured data markup in scammy ways.

There are some other general guidelines from Google, including:

  • Add your markup to the page it describes (so Product markup would be added to the individual product page, not the homepage)
  • For duplicated pages with a canonical version, add the same markup to all versions of the page (not just the canonical)
  • Don’t block your marked-up pages from search engines
  • Be as specific as possible when choosing a Type to add to a page
  • Multiple entities on the same page must each be marked up individually (so for a list of products, each product should have its own Product markup added)
  • As a rule, you should only be adding markup for content which is being shown on the page you add it to
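The “mark up each entity individually” guideline can be sketched in code. Assuming a hypothetical category page listing two products, you would emit one Product object (and one script tag) per item, rather than a single combined block:

```python
import json

# Hypothetical product list from a category page; each item gets
# its own Product markup rather than one combined block.
products = [
    {"name": "Blue Widget", "url": "https://example.com/blue-widget"},
    {"name": "Red Widget", "url": "https://example.com/red-widget"},
]

blocks = []
for p in products:
    blocks.append({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": p["name"],
        "url": p["url"],
    })

# One <script> tag per entity.
scripts = [
    '<script type="application/ld+json">%s</script>' % json.dumps(b)
    for b in blocks
]
print(len(scripts))  # one block per product
```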

So how do you know which Schema.org Types are relevant for your site? That depends on the type of business and website you run.

Schema.org for websites in general

There are certain types of Schema.org markup which almost any business can benefit from, and there are also more specific use cases for certain types of business.

General opportunities to be aware of are:

  • Sitelinks Search Box: if you have search functionality on your site, you can add markup which enables a search box to appear in your sitelinks.


  • VideoObject: if you have video content on your site, this markup can enable video snippets in SERPs, with info about uploader, duration, a thumbnail image, and more.

A note about Star reviews in the SERP

You’ll often see recommendations about “marking up your reviews” to get star ratings in the SERP results. “Reviews” have their own type, Schema.org/Review, with properties that you’ll need to include; but they can also be embedded into other types using that type’s “review” property.

You can see an example of this above, in the Recipes image, where some of the recipes in the SERP display a star rating. This is because they have included the aggregate user rating for that recipe in the “review” property within the Schema.org/Recipe type.

You’ll see a similar implementation for other properties which have their own type, such as Schema.org/Duration, Schema.org/Date, and Schema.org/Person. It can feel really complicated, but it’s actually just about organizing your information in terms of category > subcategory > discrete object.

If this feels a little confusing, it might help to think about it in terms of how we define a physical thing, like an ingredient in a recipe. Chicken broth is a dish that you can make, and each food item that goes into making the chicken broth would be classified as an ingredient. But you could also have a recipe that calls for chicken broth as an ingredient. So depending on whether you’re writing out a recipe for chicken broth, or a recipe that includes chicken broth, you’ll classify it differently.

In the same way, attributes like “Review,” “Date,” and “Duration” can be their own thing (Type), or a property of another Type. This is just something to be aware of when you start implementing this kind of markup. So when it comes to “markup for reviews,” unless the page itself is primarily a review of something, you’ll usually want to implement Review markup as a property of the primary Type for the page.
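In code, the distinction looks like this: the same review data can either be the top-level Type of a page that is itself a review, or sit inside the “review” property of the page’s primary Type. A sketch in Python, with all values hypothetical:

```python
import json

# The same hypothetical review data, used two ways.
review = {
    "@type": "Review",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "reviewRating": {"@type": "Rating", "ratingValue": "4"},
    "reviewBody": "Rich, savory, and easy to make.",
}

# 1) A page that IS a review: Review is the top-level Type.
review_page = dict(review)
review_page["@context"] = "https://schema.org"

# 2) A product page that CONTAINS a review: Review is nested
#    inside the page's primary Type.
product_page = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Chicken Broth Concentrate",
    "review": review,
}

print(json.dumps(product_page, indent=2))
```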


In addition to this generally applicable markup, there are certain Schema.org Types which are particularly helpful for specific kinds of businesses:

  • E-commerce
    • including online course providers
  • Recipes Sites
  • Publishers
  • Events/Ticketing Sites
    • including educational institutions which offer courses
  • Local Businesses
  • Specific Industries (small business and larger organizations)
  • Creative Producers

Schema.org for e-commerce

If you have an e-commerce site, you’ll want to check out:

  • Product: this allows you to display product information, such as price, in the search result. You can use this markup on an individual product page, or an aggregator page which shows information about different sellers offering an individual product.
  • Offer: this can be combined with Schema.org/Product to show a special offer on your product (and encourage higher CTRs).
  • Review: if your site has product reviews, you can aggregate the star ratings for each individual product and display it in the SERP for that product page, using Schema.org/aggregateRating.
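Combining the three Types above, a hypothetical product page’s JSON-LD might be sketched like this (all names, prices, and ratings are placeholders):

```python
import json

# Hypothetical product-page markup: Product with a nested Offer
# and an aggregate star rating.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Deluxe Widget",
    "image": "https://example.com/deluxe-widget.jpg",
    "offers": {
        "@type": "Offer",
        "price": "49.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "89",
    },
}
print(json.dumps(product, indent=2))
```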

Things to watch out for…

  • Product markup is designed for individual products, not lists of products. If you have a category page and want to mark it up, you’ll need to mark up each individual product on the page with its own data.
  • Review markup is designed for reviews of specific items, goods, services, and organizations. You can mark up your site with reviews of your business, but you should do this on the homepage as part of your organization markup.
  • If you are marking up reviews, they must be generated by your site, rather than via a third-party source.
  • Course markup should not be used for how-to content, or for general lectures which do not include a curriculum, specific outcomes, or a set student list.

Schema.org for recipes sites

For sites that publish a lot of recipe content, Recipe markup is a fantastic way to add additional context to your recipe pages and get a lot of visual impact in the SERPs.

Things to watch out for…

If you’re implementing Recipe Rich Cards, you’ll want to be aware of some extra guidelines:

Schema.org for publishers

If you have a publisher site, you’ll want to check out the following:

  • Article and its subtypes,
    • NewsArticle: this indicates that the content is a news article
    • BlogPosting: similar to Article and NewsArticle, but specifies that the content is a blog post
  • Fact Check: If your site reviews or discusses “claims made by others,” as Google diplomatically puts it, you can add a “fact check” to your snippet using Schema.org/ClaimReview markup.


  • CriticReview: if your site offers critic-written reviews of local businesses (such as a restaurant critic’s review), books, and/or movies, you can mark these up with Schema.org/CriticReview.
    • Note that this is a feature being tested, and is a knowledge box feature rather than a rich snippet enhancement of your own search result.


Things to watch out for…

Schema.org for events/ticketing sites

If your business hosts or lists events, and/or sells tickets, you can use:

  • Events: you can mark up your events pages with Schema.org/Event and get your event details listed in the SERP, both in a regular search result and as instant answers at the top of the SERP.

  • Courses: If your event is a course (i.e., instructor-led with a student roster), you can also use Schema.org/Course markup.
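A sketch of Event markup for a hypothetical multi-day workshop, with explicit start and end dates on a single Event object (all details are placeholders):

```python
import json

# Hypothetical single event running over several days: one Event
# object with start and end dates, not one object per day.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example SEO Workshop",
    "startDate": "2018-06-01T09:00",
    "endDate": "2018-06-03T17:00",
    "location": {
        "@type": "Place",
        "name": "Example Conference Center",
        "address": "123 Main St, Springfield",
    },
    "offers": {
        "@type": "Offer",
        "url": "https://example.com/tickets",
        "price": "99.00",
        "priceCurrency": "USD",
    },
}
print(json.dumps(event))
```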

Things to watch out for…

  • Don’t use Events markup to mark up time-bound non-events like travel packages or business hours.
  • As with products and recipes, don’t mark up multiple events listed on a page with a single usage of Event markup.
    • For a single event running over several days, you should mark this up as an individual event and make sure you indicate start and end dates;
    • For an event series, with multiple connected events running over time, mark up each individual event separately.
  • Course markup should not be used for how-to content, or for general events/lectures which do not include a curriculum, specific outcomes, and an enrolled student list.

Schema.org for job sites

If your site offers job listings, you can use Schema.org/JobPosting markup to appear in Google’s new Jobs listing feature.

Note that this is a Google aggregator feature, rather than a rich snippet enhancement of your own result (like Google Flights).

Things to watch out for…

  • Mark up each job post individually, and do not mark up a jobs listings page.
  • Include your job posts in your sitemap, and update your sitemap at least once daily.
  • You can include Review markup if you have review data about the employer advertising the job.

Schema.org for local businesses

If you have a local business or a store with a brick-and-mortar location (or locations), you can use structured data markup on your homepage and contact page to help flag your location for Maps data as well as note your “local” status:

  • LocalBusiness: this allows you to specify things like your opening hours and payment accepted
  • PostalAddress: this is a good supplement to getting all those NAP citations consistent
  • OrderAction and ReservationAction: if users can place orders or book reservations on your website, you may want to add action markup as well.
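Putting LocalBusiness and PostalAddress together, homepage markup for a hypothetical store might be sketched like this (all contact details are placeholders):

```python
import json

# Hypothetical homepage markup for a local store, combining the
# LocalBusiness and PostalAddress Types noted above.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee Shop",
    "telephone": "+1-555-0100",
    "openingHours": "Mo-Fr 08:00-18:00",
    "paymentAccepted": "Cash, Credit Card",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
}
print(json.dumps(business, indent=2))
```

Keeping the address fields here identical to your NAP citations elsewhere supports the consistency noted above.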

You should also get set up with Google My Business.

☆ Additional resources for local business markup

Here’s an article from Whitespark specifically about using Schema.org markup and JSON-LD for local businesses, and another from Phil Rozek about choosing the right Schema.org Type. For further advice on local optimization, check out the local SEO learning center and this recent post about common pitfalls.

Schema.org for specific industries

There are certain industries and/or types of organization which get specific Schema.org types, because they have a very individual set of data that they need to specify. You can implement these Types on the homepage of your website, along with your Brand Information.

These include LocalBusiness Types:

And a few larger organizations, such as:

Things to watch out for…

  • When you’re adding markup that describes your business as a whole, it might seem like you should add that markup to every page on the site. However, best practice is to add this markup only to the homepage.

Schema.org for creative producers

If you create a product or type of content which could be considered a “creative work” (e.g. content produced for reading, viewing, listening, or other consumption), you can use CreativeWork markup.

More specific types within CreativeWork include:

Schema.org new features (limited availability)

Google is always developing new SERP features to test, and you can participate in the testing for some of these. For some, the feature is an addition to an existing Type; for others, it is only being offered as part of a limited test group. At the time of this writing, these are some of the new features being tested:

Structured data beyond SEO

As mentioned in Part 1 of this guide, structured data can be useful for other marketing channels as well, including:

For more detail on this, see the section in Part 1 titled: “Common Uses for Structured Data.”

How to generate and test your structured data implementation

Once you’ve decided which Schema.org Types are relevant to you, you’ll want to add the markup to your site. If you need help generating the code, you may find Google’s Data Highlighter tool useful. You can also try this tool from Joe Hall. Note that these tools are limited to a handful of Schema.org Types.

After you generate the markup, you’ll want to test it at two stages of the implementation using the Structured Data Testing Tool from Google — first, before you add it to the site, and then again once it’s live. In that pre-implementation test, you’ll be able to see any errors or issues with the code and correct before adding it to the site. Afterwards, you’ll want to test again to make sure that nothing went wrong in the implementation.
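Before reaching for the testing tools, you can catch the crudest mistakes yourself. This sketch (a hypothetical helper, not a substitute for Google’s validator) just confirms a snippet parses as JSON and declares @context and @type:

```python
import json

# A rough pre-implementation sanity check: is the snippet valid
# JSON, and does it declare @context and @type?
snippet = """
{
  "@context": "http://schema.org",
  "@type": "NewsArticle",
  "headline": "Article headline"
}
"""

def basic_jsonld_check(text):
    data = json.loads(text)  # raises ValueError if the JSON is invalid
    missing = [k for k in ("@context", "@type") if k not in data]
    return data, missing

data, missing = basic_jsonld_check(snippet)
print(data["@type"], missing)  # NewsArticle []
```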

In addition to the Google tools listed above, you should also test your implementation with Bing’s Markup Validator tool and (if applicable) the Yandex structured data validator tool. Bing’s tool can only be used with a URL, but Yandex’s tool will validate a URL or a code snippet, like Google’s SDT tool.

You can also check out Aaron Bradley’s roundup of Structured Data Markup Visualization, Validation, and Testing Tools for more options.

Once you have live structured data on your site, you’ll also want to regularly check the Structured Data Report in Google Search Console, to ensure that your implementation is still working correctly.

Common mistakes in Schema.org structured data implementation

When implementing Schema.org on your site, there are a few things you’ll want to be extra careful about. Marking up content with irrelevant or incorrect Schema.org Types looks spammy, and can result in a “spammy structured markup” penalty from Google. Here are a few of the most common mistakes people make with their Schema.org markup implementation:

Mishandling multiple entities

Marking up categories or lists of items (Products, Recipes, etc) or anything that isn’t a specific item with markup for a single entity

  • Recipe and Product markup are designed for individual recipes and products, not for listings pages with multiple recipes or products on a single page. If you have multiple entities on a single page, mark up each item individually with the relevant markup.

Misapplying Recipes markup

Using Recipe markup for something that isn’t food

  • Recipe markup should only be used for content about preparing food. Other types of content, such as “diy skin treatment” or “date night ideas,” are not valid names for a dish.

Misapplying Reviews and Ratings markup

Using Review markup to display “name” content which is not a reviewer’s name or aggregate rating

  • If your markup includes a single review, the reviewer’s name must be an actual organization or person. Other types of content, like “50% off ingredients,” are considered invalid data to include in the “name” property.

Adding your overall business rating with aggregateRating markup across all pages on your site

  • If your business has reviews with an aggregateRating score, this belongs in the “review” property of your Organization or LocalBusiness markup, not on every page of the site.

Using overall service score as a product review score

  • The “review” property in Schema.org/Product is only for reviews of that specific product. Don’t combine all product or business ratings and include those in this property.

Marking up third-party reviews of local businesses with Schema.org markup

  • You should not use structured data markup on reviews which are generated via third-party sites. While these reviews are fine to have on your site, they should not be used for generating rich snippets. The only UGC review content you should mark up is reviews which are displayed on your website, and generated there by your users.

General errors

Using organization markup on multiple pages/pages other than the homepage

  • It might seem counter-intuitive, but Organization and LocalBusiness markup should only be used on the pages which are actually about your business (e.g. homepage, about page, and/or contact page).

Improper nesting

  • This is why it’s important to validate your code before implementing. Especially if you’re using Microdata tags, you need to make sure that the nesting of attributes and tags is done correctly.

So there you have it — a beginner’s guide to understanding and implementing structured data for SEO! There’s so much to learn around this topic that a single article or guide can’t cover everything, but if you’ve made it to the end of this series you should have a pretty good understanding of how structured data can help you with SEO and other marketing efforts. Happy implementing!


Fighting Review Spam: The Complete Guide for the Local Enterprise

Posted by MiriamEllis

It’s 105 degrees outside my office right now, and the only thing hotter in this summer of 2017 is the local SEO industry’s discussion of review spam. It’s become increasingly clear that major review sites represent an irresistible temptation to spammers, highlighting systemic platform weaknesses and the critical need for review monitoring that scales.

Just as every local brand, large and small, has had to adjust to the reality of reviews’ substantial impact on modern consumer behavior, competitive businesses must now prepare themselves to manage the facts of fraudulent sentiment. Equip your team and clients with this article, which will cover every aspect of review spam and includes a handy list for reporting fake reviews to major platforms.

What is review spam?

A false review is one that misrepresents the relationship of the reviewer to the business, misrepresents the nature of the interaction the reviewer had with the business, or breaks a platform guideline. Examples:

  • The reviewer is actually a competitor of the business he is reviewing; he’s writing the review to hurt a competitor and help himself
  • The reviewer is actually the owner, an employee, or a marketer of the business he is reviewing; he’s falsifying a review to manipulate public opinion via fictitious positive sentiment
  • The reviewer never had a transaction with the business he is reviewing; he’s pretending he’s a customer in order to help/hurt the business
  • The reviewer had a transaction, but is lying about the details of it; he’s trying to hurt the company by misrepresenting facts for some gain of his own
  • The reviewer received an incentive to write the review, monetary or otherwise; his sentiment stems from a form of reward and is therefore biased
  • The reviewer violates any of the guidelines on the platform on which he’s writing his review; this could include personal attacks, hate speech or advertising

All of the above practices are forbidden by the major review platforms and should result in the review being reported and removed.

What isn’t review spam?

A review is not spam if:

  • It’s left directly by a genuine customer who experienced a transaction
  • It represents the facts of a transaction with reasonable, though subjective, accuracy
  • It adheres to the policies of the platform on which it’s published

Reviews that contain negative (but accurate) consumer sentiment shouldn’t be viewed as spam. For example, it may be embarrassing to a brand to see a consumer complain that an order was filled incorrectly, that an item was cold, that a tab was miscalculated or that a table was dirty, but if the customer is correctly cataloging his negative experience, then his review isn’t a misrepresentation.

There’s some inherent complexity here, as the brand and the consumer can differ widely in their beliefs about how satisfying a transaction may have been. A restaurant franchise may believe that its meals are priced fairly, but a consumer can label them as too expensive. Negative sentiment can be subjective, so unless the reviewer is deliberately misrepresenting facts and the business can prove it, it’s not useful to report this type of review as spam as it’s unlikely to be removed.

Why do individuals and businesses write spam reviews?

Unfortunately, the motives can be as unpleasant as they are multitudinous:

Blackmail/extortion

There’s the case of the diner who was filmed putting her own hair in her food in hopes of extorting a free meal under threat of negative reviews as a form of blackmail. And then there’s blackmail as a business model, as this unfortunate business reported to the GMB forum after being bulk-spammed with 1-star reviews and then contacted by the spammer with a demand for money to raise the ratings to 5-stars.

Revenge

The classic case is the former employee of a business venting his frustrations by posing as a customer to leave a highly negative review. There are also numerous instances of unhappy personal relationships leading to fake negative reviews of businesses.

Protest or punishment

Consumer sentiment may sometimes appear en masse as a form of protest against an individual or institution, as the US recently witnessed following the election of President Trump and the ensuing avalanche of spam reviews his various businesses received.

It should be noted here that attempting to shame a business with fake negative reviews can have the (likely undesirable) effect of rewarding it with high local rankings, based on the sheer number of reviews it receives. We saw this outcome in the infamous case of the dentist who made national news and received an onslaught of shaming reviews for killing a lion.

Finally, there is the toxic reviewer, a form of Internet troll who may be an actual customer but whose personality leads them to write abusive or libelous reviews as a matter of course. While these reviews should definitely be reported and removed if they fail to meet guidelines, discussion is open and ongoing in the local SEO industry as to how to manage the reality of consumers of this type.

Ranking manipulation

The total review count of a business (regardless of the sentiment the reviews contain) can positively impact Google’s local pack rankings or the internal rankings of certain review platforms. For the sake of boosting rankings, some businesses owners review themselves, tell their employees to review their employer, offer incentives to others in exchange for reviews, or even engage marketers to hook them up to a network of review spammers.

Public perception manipulation

This is a two-sided coin. A business can either positively review itself or negatively review its competitors in an effort to sway consumer perception. The latter is a particularly prevalent form of review spam, with the GMB forum overflowing with at least 10,000 discussions of this topic. Given that respected surveys indicate that 91% of consumers now read online reviews, 84% trust them as much as personal recommendations and 86% will hesitate to patronize a business with negative reviews, the motives for gaming online sentiment, either positively or negatively, are exceedingly strong.

Wages

Expert local SEO Mike Blumenthal is currently doing groundbreaking work uncovering a global review spam network that’s responsible for tens or hundreds of thousands of fake reviews. In this scenario, spammers are apparently employed to write reviews of businesses around the world depicting sets of transactions that not even the most jet-setting globetrotter could possibly have experienced. As Mike describes one such reviewer:

“She will, of course, be educated at the mortuary school in Illinois and will have visited a dentist in Austin after having reviewed four other dentists … Oh, and then she will have bought her engagement ring in Israel, and then searched out a private investigator in Kuru, Philippines eight months later to find her missing husband. And all of this has taken place in the period of a year, right?”

The scale of this network makes it clear that review spam has become big business.

Lack of awareness

Not all review spammers are dastardly characters. Some small-timers are only guilty of a lack of awareness of guidelines or a lack of foresight about the potential negative outcomes of fake reviews to their brand. I’ve sometimes heard small local business owners state they had their family review their newly-opened business to “get the ball rolling,” not realizing that they were breaking a guideline and not considering how embarrassing and costly it could prove if consumers or the platform catch on. In this scenario, I try to teach that faking success is not a viable business model — you have to earn it.

Lack of consequences

Unfortunately, some of the most visible and powerful review platforms have become enablers of the review spam industry due to a lack of guideline enforcement. When a platform fails to identify and remove fake reviews, either because of algorithmic weaknesses or insufficient support staffing, spammers are encouraged to run amok in an environment devoid of consequences. For unethical parties, no further justification for manipulating online sentiment is needed than that they can “get away with it.” Ironically, there are consequences to bear for lack of adequate policing, and until they fall on the spammer, they will fall on any platform whose content becomes labeled as untrustworthy in the eyes of consumers.

What is the scope of review spam?

No one knows for sure, but as we’ve seen, the playing field ranges from the single business owner having his family write a couple of reviews on Yelp to the global network employing staff to inundate Google with hundreds of thousands of fake reviews. And we’ve seen two sides to the review spam environment:

  1. People who write reviews to help themselves (in terms of positive rankings, perception, and earnings for themselves either directly from increased visibility or indirectly via extortion, and/or in terms of negative outcomes for competitors).
  2. People who write reviews to hurt others (for the sake of revenge with little or no consequence).

The unifying motive of all forms of review spam is manipulation, creating an unfair and untrustworthy playing field for consumers, enterprises and platforms alike. One Harvard study suggests that 20% of Yelp reviews are fake, but it would be up to the major review platforms to transparently publicize the total number of spam reviews they receive. Just the segment I’ve seen as an individual local SEO has convinced me that review spam has now become an industry, just like “black hat” SEO once did.

How to spot spam reviews

Here are some basic tips:

Strange patterns:

A reviewer’s profile indicates that they’ve been in too many geographic locations at once. Or, they have a habit of giving 1-star reviews to one business and 5-star reviews to its direct competitor. While neither is proof positive of spam, think of these as possible red flags.

Strange language:

Numerous 5-star reviews that fawn on the business owner by name (e.g. “Bill is the greatest man ever to walk the earth”) may be fishy. If adulation seems to be going overboard, pay attention.

Strange timing:

Over the course of a few weeks, a business skyrockets from zero reviews to 30, 50, or 100 of them. Unless an onslaught of sentiment stems from something major happening in the national news, chances are good the company has launched some kind of program. If you suspect spam, you’ll need to research whether the reviews seem natural or could be stemming from some form of compensation.

Strange numbers:

The sheer number of reviews a business has earned seems inconsistent with its geography or industry. Some business models (restaurants) legitimately earn hundreds of reviews each year on a given platform, but others (mortuaries) are unlikely to have the same pattern. If a competitor of yours has 5x as many reviews as seems normal for your geo-industry, it could be a first indicator of spam.

Strange “facts”:

None of your staff can recall that a transaction matching the description in a negative review ever took place, or a transaction can be remembered but the way the reviewer is presenting it is demonstrably false. Example: a guest claims you rudely refused to seat him, but your in-store cam proves that he simply chose not to wait in line like other patrons.

Obvious threats:

If any individual or entity threatens your company with a negative review to extort freebies or money from you, take it seriously and document everything you can.

Obvious guideline violations:

Virtually every major review platform prohibits profane, obscene, and hateful content. If your brand is victimized by this type of attack, definitely report it.

In a nutshell, the first step to spotting review spam is review monitoring. You’ll want to manually check direct competitors for peculiar patterns, and, more importantly, all local businesses must have a schedule for regularly checking their own incoming sentiment. For larger enterprises and multi-location business models, this process must be scaled to minimize manual workloads and cover all bases.
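
The “strange timing” and “strange numbers” checks above lend themselves to simple automation as part of a monitoring routine. Here’s a minimal Python sketch, assuming a hypothetical list of `(reviewer_id, rating, timestamp)` records exported from whatever monitoring tool you use; the thresholds are illustrative only, not industry standards:

```python
from datetime import datetime, timedelta

# Hypothetical review record: (reviewer_id, rating, timestamp).
# Tune the thresholds to what's normal for your geo-industry.

def burst_flag(reviews, window_days=21, threshold=30):
    """'Strange timing': flag `threshold`+ reviews landing
    inside any single `window_days`-day window."""
    times = sorted(ts for _, _, ts in reviews)
    for i, start in enumerate(times):
        end = start + timedelta(days=window_days)
        if sum(1 for t in times[i:] if t <= end) >= threshold:
            return True
    return False

def volume_flag(review_count, typical_for_geo_industry, multiplier=5):
    """'Strange numbers': flag a total count 5x+ the geo-industry norm."""
    return review_count >= multiplier * typical_for_geo_industry
```

Neither check proves spam on its own; as noted above, these are red flags that surface listings worth a closer manual look.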

Scaling review management

On an average day, one Moz Local customer with 100 retail locations in the U.S. receives 20 reviews across the various platforms we track. Some are just ratings, but many feature text. Many are very positive. A few contain concerns or complaints that must be addressed quickly: acting to satisfy and retain an existing customer protects both reputation and budget while demonstrating responsiveness to the wider consumer public. And some could turn out to be spam.

Over the course of an average week for this national brand, 100–120 such reviews will come in, totaling up to more than 400 pieces of customer feedback in a month that must be assessed for signs of success at specific locations or emerging quality control issues at others. Parse this out to a year’s time, and this company must be prepared to receive and manage close to 5,000 consumer inputs in the form of reviews and ratings, not just for positive and negative sentiment, but for the purposes of detecting spam.
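
The back-of-the-envelope arithmetic above can be laid out explicitly (figures taken from the paragraph itself; the rounding is as approximate as the prose):

```python
reviews_per_day = 20                       # across all platforms tracked
reviews_per_week = 110                     # midpoint of the observed 100-120
reviews_per_month = reviews_per_week * 4   # 440, i.e. "more than 400"
reviews_per_year = reviews_per_month * 12  # 5,280, i.e. "close to 5,000"
```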

Spam detection starts with awareness, which can only come from the ability to track and audit a large volume of reviews to identify some of the suspicious hallmarks we’ve covered above. At the multi-location or enterprise level, the solution to this lies in acquiring review monitoring software and putting it in the hands of a designated department or staffer. Using a product like Moz Local, monitoring and detection of questionable reviews can be scaled to meet the needs of even the largest brands.

What should your business do if it has been victimized by review spam?

Once you’ve become reasonably certain that a review or a body of reviews violates the guidelines of a specific platform, it’s time to act. The following list contains links to the policies of 7 dominant review platforms that are applicable to all industries, and also contains tips and links outlining reporting options:

Google

Policy: https://support.google.com/business/answer/2622994?hl=en

Review reporting tips

Flag the review by mousing over it, clicking the flag symbol that appears and then entering your email address and choosing a radio button. If you’re the owner, use the owner response function to mention that you’ve reported the review to Google for guideline violations. Then, contact GMB support via their Twitter account and/or post your case in the GMB forum to ask for additional help. Cross your fingers!

Yelp

Policy: https://www.yelp.com/guidelines

Review reporting tips

Yelp offers these guidelines for reporting reviews and also advises owners to respond to reviews that violate guidelines. Yelp takes review quality seriously and has set high standards other platforms might do well to follow, in terms of catching spammers and warning the public against bad actors.

Facebook

Policy: https://www.facebook.com/communitystandards

Review reporting tips

Here are Facebook’s instructions for reporting reviews that fail to meet community standards. Note that you can only report reviews with text — you can’t report solo ratings. Interestingly, you can turn off reviews on Facebook, but to do so out of fear would be to forego the considerable benefits they can provide.

Yellow Pages

Policy: https://www.yellowpages.com/about/legal/terms-conditions#user-generated-content

Review reporting tips

In 2016, YP.com began showing TripAdvisor reviews alongside internal reviews. If review spam stems from a YP review, click the “Flag” link in the lower right corner of the review and fill out the form to report your reasons for flagging. If the review spam stems from TripAdvisor, you’ll need to deal with them directly and read their extensive guidelines. TripAdvisor states that they screen reviews for quality purposes, but that fake reviews can slip through. If you’re the owner, you can report fraudulent reviews from the Management Center of your TripAdvisor dashboard. Click the “concerned about a review” link and fill out the form. If you’re simply a member of the public, you’ll need to sign into TripAdvisor and click the flag link next to the review to report a concern.

SuperPages

Policy: https://my.dexmedia.com/spportal/jsp/popups/businessprofile/reviewGuidelines.jsp

Review reporting tips

The policy I’ve linked to (from Dex Media, which owns SuperPages) is the best I can find. It’s reasonably thorough but somewhat broken. To report a fake review to SuperPages, you’ll need either a SuperPages or Facebook account. Then, click the “flag abuse” link associated with the review and fill out a short form.

CitySearch

Policy: http://www.citysearch.com/aboutcitysearch/about_us

Review reporting tips

If you receive a fake review on CitySearch, email customerservice@citygrid.com. In your email, link to the business that has received the spam review, include the date of the review and the name of the reviewer and then cite the guidelines you feel the review violates.

FourSquare

Policy: https://foursquare.com/legal/terms

Review reporting tips

The “Rules and Conduct” section I’ve linked to in Foursquare’s TOS outlines their content policy. Foursquare is a bit different in the language they use to describe tips/reviews. They offer these suggestions for reporting abusive tips.

*If you need to find the guidelines and reporting options for an industry-specific review platform like FindLaw or HealthGrades, Phil Rozek’s definitive list will be a good starting point for further research.

Review spam can feel like being stuck between a rock and a hard place

I feel a lot of empathy in this regard. Google, Facebook, Yelp, and other major review platforms have the visibility to drive massive traffic and revenue to your enterprise. That’s the positive side of this equation. But there’s another side — the uneasy side that I believe has its roots in entities like Google originating their local business index via aggregation from third party sources, rather than as a print YellowPages-style, opt-in program, and subsequently failing to adequately support the millions of brands it was then representing to the Internet public.

To this day, there are companies that are stunned to discover that their business is listed on 35 different websites, and being actively reviewed on 5 or 10 of them when the company took no action to initiate this. There’s an understandable feeling of a loss of control that can be particularly difficult for large brands, with their carefully planned quality structures, to adjust to.

This sense of powerlessness is further compounded when the business isn’t just being listed and discussed on platforms it doesn’t control, but is being spammed. I’ve seen business owners on Facebook declaring they’ve decided to disable reviews because they feel so victimized and unsupported after being inundated with suspicious 1-star ratings which Facebook won’t investigate or remove. By doing so, these companies are choosing to forego the considerable benefits reviews drive because meaningful processes for protecting the business aren’t yet available.

These troubling aspects of the highly visible world of reviews can leave owners feeling like they’re stuck between a rock and a hard place. Their companies will be listed, will be reviewed, and may be spammed whether the brand actively participates or not, and they may or may not be able to get spam removed.

It’s not a reality from which any competitive enterprise can opt-out, so my best advice is to realize that it’s better to opt-in fully, with the understanding that some control is better than none. There are avenues for getting many spam reviews taken down, with the right information and a healthy dose of perseverance. Know, too, that every one of your competitors is in the same boat, riding a rising tide that will hopefully grow to the point of offering real-world support for managing consumer sentiment that impacts bottom-line revenue in such a very real way.

There ought to be a law

While legitimate negative reviews have legal protection under the Consumer Review Fairness Act of 2016, fraudulent reviews are another matter.

Section 5(a) of the Federal Trade Commission Act states:

“Unfair methods of competition in or affecting commerce, and unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful.”

Provisions like these are what allowed the FTC to successfully sue Sage Automotive Group for $3.6 million for deceptive advertising practices and deceptive online reviews, but it’s important to note that this appears to be the first instance in which the FTC has involved itself in bringing charges on the basis of fraudulent reviews. At this point, it’s simply not reasonable to expect the FTC to step in if your enterprise receives some suspicious reviews, unless your research uncovers a truly major case.

Lawsuits amongst platforms, brands, and consumers, however, are proliferating. Yelp has sued agencies and local businesses over the publication of fake reviews. Companies have sued their competitors over malicious, false sentiment, and they’ve sued their customers with allegations of the same.

Should your enterprise be targeted with spam reviews, some cases may be egregious enough to warrant legal action. In such instances, definitely don’t attempt to have the spam reviews removed by the host platform, as they could provide important evidence. Contact a lawyer before you take a step in any direction, and avoid using the owner response function to take verbal revenge on the person you believe has spammed you, as we now have a precedent in Dietz v. Perez for such cases being declared a draw.

In many scenarios, however, the business may not wish to become involved in a noisy court battle, and seeking removal can be a quieter way to address the problem.

Local enterprises, consumers, and marketers must advocate for themselves

According to one survey, 90% of consumers read fewer than 10 reviews before forming an opinion about a business. If some of those 10 reviews are the result of negative spam, the cost to the business is simply too high to ignore, and it’s imperative that owners hold not just spammers, but review platforms, accountable.

Local businesses, consumers, and marketers don’t own review sites, but they do have the power to advocate. A single business could persistently blog about spam it has documented. Multiple businesses could partner up to request a meeting with a specific platform to present pain points. Legitimate consumers could email or call their favorite platforms to explain that they don’t want their volunteer hours writing reviews to be wasted on a website that is failing to police its content. Marketers can thoughtfully raise these issues repeatedly at conferences attended by review platform reps. There is no cause to take an adversarial tone in this, but there is every need for squeaky wheels to highlight the costliness of spam to all parties, advocating for platforms to devote all possible resources to:

  • Increasing the sophistication of algorithmic spam detection
  • Increasing staffing for manual detection
  • Providing real-time support to businesses so that spam can be reported, evaluated and removed as quickly as possible

All of the above could begin to better address the reality of review spam. In the meantime, if your business is being targeted right now, I would suggest using every possible avenue to go public with the problem. Blog, use social media, report the issue on the platform’s forum if it has one. Do anything you can to bring maximum attention to the attack on your brand. I can’t promise results from persistence and publicity, but I’ve seen this method work enough times to recommend it.

Why review platforms must act aggressively to minimize spam

I’ve mentioned the empathy I feel for owners when it comes to review platforms, and I also feel empathy for the platforms, themselves. I’ve gotten the sense, sometimes, that different entities jumped into the review game and have been struggling to handle its emerging complexities as they’ve rolled out in real time. What is a fair and just policy? How can you best automate spam detection? How deeply should a platform be expected to wade into disputes between customers and brands?

With sincere respect for the big job review sites have on their hands, I think it’s important to state:

  • If brands and consumers didn’t exist, neither would review platforms. Businesses and reviewers should be viewed and treated as MVPs.
  • Platforms which fail to offer meaningful support options to business owners are not earning goodwill or a good reputation.
  • The relationship between local businesses and review platforms isn’t an entirely comfortable one. Increasing comfort could turn wary brands into beneficial advocates.
  • Platforms that allow themselves to become inundated with spam will lose consumers’ trust, and then advertisers’ trust. They won’t survive.

Every review platform has a major stake in this game, but, to be perfectly honest, some of them don’t act like it.

Google My Business Forum Top Contributor and expert local SEO Joy Hawkins recently wrote an open letter to Google offering them four actionable tips for improving their handling of their massive review spam problem. It’s a great example of a marketer advocating for her industry, and, of interest, some of Joy’s best advice to Google is taken from Yelp’s own playbook. Yelp may be doing the best of all platforms in combating spam, in that they have very strong filters and place public warnings on the profiles of suspicious reviewers and brands.

What Joy Hawkins, Mike Blumenthal, other industry experts, and local business owners seem to be saying to review platforms could be summed up like this:

“We recognize the power of reviews and appreciate the benefits they provide, but a responsibility comes with setting your platform up as a hub of reputation for millions of businesses. Don’t see spammed reputations as acceptable losses — they represent the livelihoods of real people. If you’re going to trade responsibly in representing us, you’ve got to back your product up with adequate quality controls and adequate support. A fair and trustworthy environment is better for us, better for consumers and better for you.”

Key takeaways for taking control of review spam

  • All local enterprises need to know that review spam is a real problem
  • Its scope ranges from individual spammers to global networks
  • Enterprises must monitor all incoming reviews, and scale this with software where necessary
  • Designated staff must be on the lookout for suspicious patterns
  • All major review platforms have some form of support for reporting spam reviews, but it’s not always adequate and may not lead to removal
  • Because of this, brands must advocate for better support from review platforms
  • Review platforms need to listen and act, because their stake in the game is real

Being the subject of a review spam attack can be a stressful event that I wish no brand ever had to face, but it’s my hope that this article has empowered you to meet a possible challenge with complete information and a smart plan of action.



What You Need to Know About Duplicate GMB Listings [Excerpt from the Expert’s Guide to Local SEO]

Posted by JoyHawkins

Recently, I’ve had a lot of people ask me how to deal with duplicate listings in Google My Business now that MapMaker is dead. Having written detailed instructions outlining different scenarios for the advanced local SEO training manual I started selling over at LocalU, I thought it’d be great to give Moz readers a sample of 5 pages from the manual outlining some best practices.


What you need to know about duplicate GMB listings

Before you start, you need to find out whether the listing is verified. If the listing has an “own this business” or “claim this business” option, it is not currently verified. If that label is missing, the listing is verified, and there is nothing you can do until you gain ownership or have it unverified (if you’re the one who owns it in GMB). This should be your first step before you proceed with anything below.

Storefronts

  • Do the addresses on the two listings match? If the unverified duplicate has the same address as the verified listing, you should contact Google My Business support and ask them to merge the two listings.
  • If the addresses do not match, find out if the business used to be at that address at some point in time.
    • If the business has never existed there:
      • Pull up the listing on Maps
      • Press “Suggest an edit”
      • Switch the toggle beside “Place is permanently closed” to Yes
      • Select “Never existed” as the reason and press submit. *Note: If there are reviews on the listing, you should get them transferred before doing this.

  • If the duplicate lists an address that is an old address (they were there at some point but have moved), you will want to have the duplicate marked as moved.

Service area businesses

  • Is the duplicate listing verified? If it is, you will first have to get it unverified or gain access to it. Once you’ve done that, contact Google My Business and ask them to merge the two listings.
  • If the duplicate is not verified, you can have it removed from Maps since service area businesses are not permitted on Google Maps. Google My Business allows them, but any unverified listing would follow Google Maps rules, not Google My Business. To remove it:
    • Pull up the listing on Maps
    • Press “Suggest an edit”
    • Switch the toggle beside “Place is permanently closed” to Yes
    • Select “Private” as the reason and press submit. *Note: If there are reviews on the listing, you should get them transferred before doing this.
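
The storefront and service-area decision paths above can be sketched as a single function. Everything here is illustrative: the parameter names are hypothetical, and the returned strings describe actions you take in the Maps/GMB interfaces, not API calls:

```python
def duplicate_listing_action(is_verified, is_service_area=False,
                             same_address=False, was_ever_at_address=False):
    """Map an unhandled duplicate GMB listing to the recommended next step."""
    if is_verified:
        # Verified duplicates must be unverified (or ownership gained) first.
        return "gain ownership or have it unverified, then ask GMB to merge"
    if is_service_area:
        # Unverified service area businesses aren't permitted on Maps at all.
        return "mark closed on Maps with reason 'Private'"
    if same_address:
        return "ask Google My Business support to merge the listings"
    if was_ever_at_address:
        return "have the duplicate marked as moved"
    # The business never existed at the duplicate's address.
    return "mark closed on Maps with reason 'Never existed'"
```

Whichever branch applies, remember the note above: if there are reviews on the listing, get them transferred before having it marked closed.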

Practitioner listings

Public-facing professionals (doctors, lawyers, dentists, realtors, etc.) are allowed their own listings separate from the office they work for, unless they’re the only public-facing professional at that office. In that case, they are considered a solo practitioner and there should only be one listing, formatted as “Business Name: Professional Name.”

Solo practitioner with two listings

This is probably one of the easiest scenarios to fix because solo practitioners are only supposed to have one listing. If you have a scenario where there’s a listing for both the practice and the practitioner, you can ask Google My Business to merge the two and it will combine the ranking strength of both. It will also give you one listing with more reviews (if each individual listing had reviews on it). The only scenario where I don’t advise combining the two is if your two listings both rank together and are monopolizing two of the three spots in the 3-pack. This is extremely rare.

Multi-practitioner listings

If the business has multiple practitioners, you are not able to get these listings removed or merged provided the practitioner still works there. While I don’t generally suggest creating listings for practitioners, they often exist already, leaving people to wonder what to do with them to keep them from competing with the listing for the practice.

A good strategy is to work on having multiple listings rank if you have practitioners that specialize in different things. Let’s say you have a chiropractor who also has a massage therapist at his office. The massage therapist’s listing could link to a page on the site that ranks highly for “massage therapy” and the chiropractor could link to the page that ranks highest organically for chiropractic terms. This is a great way to make the pages more visible instead of competing.

Another example would be a law firm. You could have the main listing for the law firm optimized for things like “law firm,” then have one lawyer who specializes in personal injury law and another lawyer who specializes in criminal law. This would allow you to take advantage of the organic ranking for several different keywords.

Keep in mind that if your goal is to have three of your listings all rank for the exact same keyword on Google, thus monopolizing the entire 3-pack, this is an unrealistic strategy. Google has filters that keep the same website from appearing too many times in the results and unless you’re in a really niche industry or market, it’s almost impossible to accomplish this.

Practitioners who no longer work there

It's common to find listings for practitioners who used to work for your business but no longer do. If you run across a listing for a former practitioner, contact Google My Business and ask them to mark the listing as moved to your practice listing. It's extremely important that you get them to move it to your office listing, not to the business where the practitioner now works (if they have been employed elsewhere). Here is a good case study that shows you why.

If the practitioner listing is verified, things get tricky, since Google My Business can't move it until it's unverified. If the listing is verified by the practitioner and they refuse to give you access or remove it, the next-best option is to get them to update the listing with their current employer's information. This isn't ideal and should be a last resort.

Listings for employees (not public-facing)

If you find a listing for a non-public-facing employee, it shouldn't exist on Maps. For example: an office manager of a law firm, a paralegal, a hygienist, or a nurse. To get the listing removed:

  • Pull up the listing on Maps
  • Press “Suggest an edit”
  • Switch the toggle beside "Place is permanently closed" to Yes
  • Select "Never existed" as the reason and press Submit.

Listings for deceased practitioners

This is always a terrible scenario to have to deal with, but I've run into many cases where people don't know how to get rid of listings for deceased practitioners. The solution is similar to what you'd do for someone who has left the practice, with one additional step. Since these listings are often verified and people usually don't have access to the deceased person's Google account, make sure you tell Google My Business support that the person is deceased and include a link to their obituary online so the support worker can confirm you're telling the truth. I strongly recommend contacting support via Facebook or Twitter, since you can easily include the link (it's much harder to do on a phone call).

Creating practitioner listings

If you’re creating a practitioner listing from scratch, you might run into issues if you’re trying to do it from the Google My Business dashboard and you already have a verified listing for the practice. The error you would get is shown below.

There are two ways around this:

  1. Create the listing via Google Maps. Do this by searching the address and then clicking "Add a missing place." Do not include the firm/practice name in the title of the listing, or your edit most likely won't go through, since it will be too similar to the existing listing for the practice. Once you get an email from Google Maps stating the listing has been successfully added, you'll be able to claim it via GMB.
  2. Contact GMB support and ask them for help.

We hope you enjoyed this excerpt from the Expert's Guide to Local SEO! The full guide (160+ pages) is available for purchase and download via LocalU below.

Get the Expert’s Guide to Local SEO

