The Nifty Guide to Local Content Strategy and Marketing

Posted by NiftyMarketing

This is my Grandma.

She helped raise me, and I love her dearly. That chunky baby with the Gerber cheeks is me. The scarlet letter “A” means nothing… I hope.

This is a rolled up newspaper. 

rolled up newspaper

When I was growing up, I was the king of mischief and had a hard time following parental guidelines. To ensure the lessons she wanted me to learn “sunk in,” my grandma would give me a soft whack with a rolled-up newspaper and say,

“Mike, you like to learn the hard way.”

She was right. I have
spent my life and career learning things the hard way.

Local content has been no different. I started out my career creating duplicate local doorway pages using “find and replace” with city names. After getting whacked by the figurative newspaper a few times, I decided there had to be a better way. To save others from the struggles I experienced, I hope the hard lessons I have learned about local content strategy and marketing help save you from fearing a rolled-up newspaper the way I do.

Lesson one: Local content doesn’t just mean the written word

local content ecosystem

Content is everything around you. It all tells a story. If you don’t have a plan for how that story is being told, then you might not like how it turns out. In the local world, even your brick and mortar building is a piece of content. It speaks about your brand, your values, your appreciation of customers and employees, and can be used to attract organic visitors if it is positioned well and provides a good user experience. If you just try to make the front of a building look good, but don’t back up the inside inch by inch with the same quality, people will literally say, “Hey man, this place sucks… let’s bounce.”

I had this experience proved to me recently while conducting an interview at Nifty for our law division. Our office is the beautifully designed brick-walled, mustache-decorated, animal-on-the-wall, leg-lamp-in-the-center-of-the-room kind of space you would expect from a creative company.

nifty offices idaho

Anywho, for our little town of Burley, Idaho, it is a unique space, and it helps set our business apart in our community. But the conference room has a fluorescent ballast lighting system that can buzz so loudly you literally can’t carry on a proper conversation at times, and I had to conduct the recent interviews in the dark because it was so bad.

I’m cheap and slow to spend money, so I haven’t gotten it fixed yet. The problem is I have two more interviews this week, and I am so embarrassed by the experience in that room that I am thinking of holding them offsite to ensure we don’t produce a bad content experience. What I need to do is just fix the light, but I will end up spending weeks going back and forth with the landlord about whose responsibility it is.

Meanwhile, the content experience suffers. Like I said, I like to learn the hard way.

Start thinking about everything through the frame of content and you will find that you make better decisions and fewer costly mistakes.

Lesson two: Scalable does not mean fast and easy growth

In every sales conversation I have had about local content, the question of scalability comes up. Usually, people want two things:

  1. Extremely Fast Production 
  2. Extremely Low Cost

While these two things would be great for every project, I have come to find that quality can rarely be achieved if you are optimizing for fast production and low cost. A better way to look at scale is as follows:

The rate of growth in revenue/traffic is greater than the cost of continued content creation.

A good local content strategy at scale will create a model that looks like this:

scaling content graph

Lesson three: You need a continuous local content strategy

This is where the difference between local content marketing and content strategy kicks in. Creating a single piece of content that does well is fairly easy to achieve. Building a true scalable machine that continually puts out great local content and consistently tells your story is not. This is a graph I created outlining the process behind creating and maintaining a local content strategy:

local content strategy

This process is not a one-time thing. It is not a box to be checked off. It is a structure that should become the foundation of your marketing program and will need to be revisited, re-tweaked, and replicated over and over again.

1. Identify your local audience

Most of you reading this will already have a service or product and hopefully local customers. Do you have personas developed for attracting and retaining more of them? Here are some helpful tools available to give you an idea of how many people fit your personas in any given market.

Facebook Insights

Pretend for a minute that you live in the unique market of Utah and have a custom wedding dress line focused on selling modest wedding dresses. It is definitely a niche product, but one that illustrates the idea of personas very well.

You have interviewed your customers and found a few interests they share. Putting that information into Facebook Insights will give you a plethora of data to help you build out your understanding of a local persona.

facebook insights data

From the interests of our customers, we are able to see that there are roughly 6k-7k currently engaged women in Utah who share similar interests with our customer base.

The location tab gives us a breakdown of specific cities and, understandably, Salt Lake City has the highest percentage, with Provo (home of BYU) in second place. You can also see pages this group likes, activity levels on Facebook, and household income along with spending habits. If you want to find more potential locations for future growth, you can open up the search to a region or country.

localized facebook insights data

From this data it’s apparent that Arizona would be a great expansion opportunity after Utah.

Nielsen PRIZM

Nielsen offers a free and extremely useful tool for local persona research called Zip Code Lookup that allows you to identify pre-determined personas in a given market.

Here is a look at my hometown, and the personas they have developed are dead-on.

Nielsen PRIZM data

Each persona can be expanded to learn more about the traits, income level, and areas across the country with other high concentrations of the same persona group.

You can also use the segment explorer to get a better idea of pre-determined persona lists and can work backwards to determine the locations with the highest density of a given persona.

Google Keyword Planner Tool

The Keyword Planner is fantastic for local research. Using our same Facebook Insights data from above, we can match keyword search volume against the audience size to determine how active our persona is in product research and purchasing. In the case of engaged women looking for dresses, it is a very active group, with potentially 20-30% actively searching online for a dress.

google keyword planner tool
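
To make that activity calculation concrete, here is a minimal sketch of the arithmetic in Python. The numbers are hypothetical placeholders standing in for whatever Facebook Insights and the Keyword Planner report for your own market, not real data.

```python
# Rough sketch of the "how active is this persona?" check described above.
# Both numbers are hypothetical placeholders, not real Insights/Keyword Planner exports.

audience_size = 6500          # e.g. engaged women in Utah matching the persona's interests
monthly_search_volume = 1800  # e.g. combined monthly searches for modest wedding dress terms

activity_rate = monthly_search_volume / audience_size
print(f"Roughly {activity_rate:.0%} of the persona is actively searching in a given month")
```

With placeholder figures like these, the ratio lands in the 20-30% range described above; swap in your own exports to see where your persona falls.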

2. Create goals and rules

I think the most important ideas for creating the goals and rules around your local content come from the must-read book Content Strategy for the Web.

You also need to ensure that everyone who will be working on things even remotely related to content has access to style and brand guides and, ultimately, understands the core purpose for what, why, and how everything is happening.

3. Audit and analyze your current local content

The point of this step is to determine how your current content stacks up against the goals and rules you established, and to determine the value of the current pages on your site. With tools like Siteliner (for finding duplicate content) and Screaming Frog (for identifying page titles, word counts, error codes, and many other things) you can gather a lot of information very quickly. Beyond that, there are a few tools that deserve a more in-depth look.
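
If you want to work through a crawl export programmatically rather than eyeballing it, here is a minimal sketch using pandas. It assumes you have exported Screaming Frog’s internal HTML report to CSV; the column names shown (“Address”, “Status Code”, “Word Count”, “Title 1”) match a typical export but may differ by version or configuration.

```python
import pandas as pd

# Sketch of a quick content audit over a crawler export.
# Assumes a Screaming Frog internal-HTML CSV; adjust column names to match your export.
crawl = pd.read_csv("internal_html.csv")

thin_pages = crawl[crawl["Word Count"] < 300]        # candidate thin local landing pages
error_pages = crawl[crawl["Status Code"] >= 400]     # pages returning error codes
missing_titles = crawl[crawl["Title 1"].isna()]      # pages with no title element

print(len(thin_pages), "thin pages,", len(error_pages), "errors,", len(missing_titles), "missing titles")

# Hand the shortlist to whoever is grading pages against your goals and rules.
thin_pages[["Address", "Word Count"]].to_csv("thin_pages_to_review.csv", index=False)
```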

BuzzSumo

With BuzzSumo you can see the social data and incoming links behind important pages on your site. This can give you a good idea of which locations or areas are getting more promotion than others and help identify what some of the causes could be.

BuzzSumo can also give you access to competitors’ information, where you might find some new ideas. In the following example you can see that one of Airbnb.com’s most shared pages was a motion graphic about its impact on Berlin.

Buzzsumo

URL Profiler

This is another great tool for scraping URLs on large sites, and it can return about every type of measurement you could want. For sites with thousands of pages, it can save hours of data gathering and spits out a nicely formatted CSV document that allows you to sort by things like word count, page authority, link counts, social shares, or about anything else you could imagine.

url profiler

4. Develop local content marketing tactics

This is how most of you look when marketing tactics are brought up.

monkey

Let me remind you of something with a picture. 

rolled up newspaper

Do not start with tactics. Do the other things first. It will ensure your marketing tactics fall in line with a much bigger organizational movement and process. With the warning out of the way, here are a few tactics that could work for you.

Local landing page content

Our initial concept of local landing pages has stood the test of time. If you are scared to even think about local pages with the upcoming doorway page update then please read this analysis and don’t be too afraid. Here are local landing pages that are done right.

Marriott local content

Marriott’s Burley local page is great. They didn’t just think about ensuring they had 500 unique words. They have custom local imagery of the exterior and interior, detailed information about the area’s activities, and even their own review platform that showcases both positive and negative reviews, with responses from local management.

If you can’t build your own review platform like that, might I recommend looking at Get Five Stars as a platform that could help you integrate reviews into your continuous content strategy.

Airbnb Neighborhood Guides

I not-so-secretly have a big crush on Airbnb’s approach to local, and these neighborhood guides are what started it. They only have roughly 21 guides thus far and handle one at a time, with Seoul being the most recent addition. The idea is simple: they looked at their hottest markets and built out guides not just for the city, but down to the specific neighborhood.

air bnb neighborhood guides

Here is a look at the imagery for Hell’s Kitchen in New York. They hire a local photographer to shoot the area, then take some of their currently popular listing data and reviews and integrate them into the page. This idea would never have flown if they only cared about creating content that was fast and easy for every market they serve.

Reverse infographicing

Every decently sized city has had a plethora of infographics made about it. People spent the time curating information and coming up with the concept, but the majority just made the image and didn’t think about crawlability or the page title from an SEO standpoint.

Here is an example of an image search for Portland infographics.

image search results portland infographics

Take an infographic and repurpose it into crawlable content with a new twist or timely additions. Infographics usually share their data sources in the footer, so you can easily find similar, newer, or additional information and create some seriously compelling data-based content. You can even link to or share the original infographic as part of it if you would like.

Become an Upworthy of local content

No one I know does this better than Movoto. Read the link for their own spin on how they did it and then look at these examples and share numbers from their local content.

60k shares in Boise by appealing to that hometown knowledge.

movoto boise content

65k shares in Salt Lake following the same formula.

movoto salt lake city content

It seems to work with video as well.

movoto video results

Think like a local directory

Directories understand where content should be housed. Not every local piece should live on the blog. Look at where TripAdvisor’s famous “Things to Do” page is listed: right on the main city page.

trip advisor things to do in salt lake city

Or look at how many timely, fresh, quality pieces of content Yelp is showcasing from their main city page.

yelp main city page

The key point to understand is that local content isn’t just about being unique on a landing page. It is about BEING local and useful.

Ideas of things that are local:

  • Sports teams
  • Local celebrities or heroes 
  • Groups and events
  • Local pride points
  • Local pain points

Ideas of things that are useful:

  • Directions
  • Favorite local sports
  • Granular details only “locals” know

The other point to realize, looking back at our definition of scale, is that you don’t need to take shortcuts that un-localize the experience for users. Figure out and test one location at a time until you have a winning formula, and then move forward at a speed that ensures a quality local experience.

5. Create a content calendar

I am not going to get into telling you exactly how or what your content calendar needs to include. That will largely be based on the size and organization of your team and every situation might call for a unique approach. What I will do is explain how we do things at Nifty.

  1. We follow the steps above.
  2. We schedule the big projects and timelines first. These could be months out or weeks out. 
  3. We determine the weekly deliverables, checkpoints, and publish times.
  4. We put all of the information as tasks assigned to individuals or teams in Asana.

asana content calendar

The information can then be viewed by individual, team, group of teams, due date, or any other way you would wish to sort. Repeatable tasks can be scheduled, and our entire operation is visible to as many people as need access to the information, on desktop or mobile devices. That is what works for us.

6. Launch and promote content

My personal favorite way to promote local content (other than the obvious ideas of sharing with your current followers or outreaching to local influencers) is to use Facebook ads to target the specific local personas you are trying to reach. Here is an example:

I just wrapped up playing Harold Hill in our community’s production of The Music Man. When you live in a small town like Burley, Idaho, you get the opportunity to play a lead role without having too much talent or a Glee-based upbringing. You also get the opportunity to do all of the advertising, set design, and costuming yourself, and sometimes you even get to pay for it.

For my advertising responsibilities, I decided to write a few blog posts and drive traffic to them. As any good Harold Hill would do, I used fear tactics.

music man blog post

I then created Facebook ads with the following stats: a cost of $0.06 per click, a 12.7% click-through rate, and natural organic sharing that led to thousands of visits in a small Idaho farming community where people still think a phone book is the only way to find local businesses.

facebook ads setup

Then we did it again.

For over a year there was a protestor in Burley who parked a red pickup covered with signs saying things like, “I wud not trust Da Mayor” or “Don’t Bank wid Zions”. Basically, you weren’t working hard enough if your name didn’t get on the truck during the year.

Everyone knew that ol’ red pickup, as it was parked on the corner of Main and Overland, which is one of the few stoplights in town. Then one day it was gone. We came up with the idea to bring the red truck back, put signs on it that said things like “I wud Not Trust Pool Tables” and “Resist Sins n’ Corruption” and other lines from The Music Man, and wrote another blog post complete with pictures.

facebook ads red truck

Then I created another Facebook Ad.

facebook ads set up

A little under $200 in ad spend resulted in thousands more visits to the site, which promoted the play and sold tickets to a generation that might not otherwise have been very familiar with the show.
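
As a rough sanity check using only the figures above: at about $0.06 per click, a little under $200 of spend works out to roughly $200 ÷ $0.06 ≈ 3,300 paid clicks, before counting any of the organic shares the ads triggered, which is consistent with the “thousands more visits” total.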

All of it was local targeting, and there was no way we could have driven that much traffic in a community like Burley without paying Facebook and creating click-bait ads in the hope that the promotion would lead to organic sharing.

7. Measure and report

This is another very personal step where everyone will have different needs. At Nifty we put together very custom weekly or monthly reports that cover the plan, the execution, and relevant stats such as traffic to specific content or locations, share data, revenue or lead data if available, analysis of what worked and what didn’t, and the plan for the following period.

There is no exact set of data that needs to be shared. Everyone will want something slightly different, which is why we moved away from automated reporting years ago (around the time we moved away from auto link building… hehe) and built our reports around our clients, even if it took added time.

I have always said that the product of an SEO or content shop is the report. That is what people buy, because it is likely all they will see or understand.

8. Refine and repeat the process

local content strategy - refine and repeat

From my point of view, this is by far the most important step, and it sums everything up nicely. This process model isn’t perfect. There will be things that are missed, things that need to be tweaked, and ways you will be able to improve your local content strategy and marketing all the time. The idea of the cycle is that it is never done. It never sleeps. It never quits. It never surrenders. You just keep perfecting the process until you reach the point that few locally-focused companies ever achieve: your local content reaches and grows your target audience every time you click the publish button.


What SEOs Need to Know About Topic Modeling & Semantic Connectivity – Whiteboard Friday

Posted by randfish

Search engines, especially Google, have gotten remarkably good at understanding searchers’ intent—what we
mean to search for, even if that’s not exactly what we search for. How in the world do they do this? It’s incredibly complex, but in today’s Whiteboard Friday, Rand covers the basics—what we all need to know about how entities are connected in search.

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re talking topic modeling and semantic connectivity. Those words might sound big and confusing, but, in fact, they are important to understanding the operations of search engines, and they have some direct influence on things that we might do as SEOs, hence our need to understand them.

Now, I’m going to make a caveat here. I am not an expert in this topic. I have not taken the math classes, stats classes, and programming classes required to truly understand this topic in a way that I would feel extremely comfortable explaining. However, even at a surface level of understanding, I feel like I can give some compelling information that hopefully you all, and myself included, can go research some more. We’re certainly investigating a lot of topic modeling opportunities and possibilities here at Moz. We’ve done so in the past, and we’re revisiting that again for some future tools, so the topic is fresh on my mind.

So here’s the basic concept. The idea is that search engines are smarter than just knowing that a word or phrase someone searches for, like “Super Mario Brothers,” is only supposed to bring back results that have exactly the words “Super Mario Brothers,” that perfect phrase, in the title and in the headline and in the document itself. That’s still an SEO best practice, because you’re trying to serve visitors who have that search query. But search engines are actually a lot smarter than this.

One of my favorite examples is how intelligent Google has gotten around movie topics. So try, for example, searching for “that movie where the guy is called The Dude,” and you will see that Google properly returns “The Big Lebowski” in the first ranking position. How do they know that? Well, they’ve essentially connected up “movie” and “The Dude” and said, “Aha, those things are most closely related to ‘The Big Lebowski.’ That’s what the intent of the searcher is. That’s the document we’re going to return, not a document that happens to have ‘that movie about the guy named The Dude’ in the title, exactly those words.”

Here’s another example. So this is Super Mario Brothers, and Super Mario Brothers might be connected to a lot of other terms and phrases. A search engine might understand that Super Mario Brothers is a little bit more semantically connected to Mario than it is to Luigi, then to Nintendo, then to Bowser, the jumping dragon guy, turtle with spikes on his back — I’m not sure exactly what he is — and Princess Peach.

As you go down the list, the search engine might actually apply a topic modeling algorithm, something like latent semantic indexing, which was an early model, or latent Dirichlet allocation, which is a somewhat later model, or even predictive latent Dirichlet allocation, which is later still. The particular model isn’t important, especially for our purposes.

What is important is to know that there’s probably some scoring going on. A search engine — Google, Bing — can understand that some of these words are more connected to Super Mario Brothers than others, and it can do the reverse. It can say that Super Mario Brothers is somewhat connected to video games and not at all connected to cat food. So if we find a page that happens to have the title element “Super Mario Brothers,” but most of the on-page content seems to be about cat food, well, maybe we shouldn’t rank that page, even if it has lots of incoming links with anchor text saying “Super Mario Brothers” or a very high PageRank or domain authority or those kinds of things.
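
To make the idea of “some scoring going on” a little more concrete, here is a toy sketch of one of the simplest possible connectivity scores: how often two terms co-occur in a small set of documents. It is only an illustration of the concept, not a reproduction of anything Google or Bing actually runs.

```python
# Toy co-occurrence score: of the documents that mention term_a,
# what fraction also mention term_b? Purely illustrative.

docs = [
    "super mario brothers stars mario and luigi on nintendo",
    "bowser kidnaps princess peach in super mario brothers",
    "best cat food brands for indoor cats",
]

def cooccurrence_score(term_a, term_b, documents):
    with_a = [d for d in documents if term_a in d]
    if not with_a:
        return 0.0
    return sum(term_b in d for d in with_a) / len(with_a)

print(cooccurrence_score("mario", "nintendo", docs))   # 0.5 -> related
print(cooccurrence_score("mario", "cat food", docs))   # 0.0 -> unrelated
```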

So search engines, Google, in particular, has gotten very, very smart about this connectivity stuff and this topic modeling post-Hummingbird. Hummingbird, of course, being the algorithm update from last fall that changed a lot of how they can interpret words and phrases.

So knowing that Google and Bing can calculate this relative connectivity, the connectivity between words, phrases, and topics, we want to know how they are doing it. That answer is actually extremely broad. It could come from co-occurrence in web documents. Sorry for turning my back on the camera. I know I’m supposed to move like this, but I just had to do a little twirl for you.

Distance between the keywords. I mean distance on the actual page itself. Does Google find “Super Mario Brothers” near the word “Mario” on a lot of the documents where the two occur, or are they relatively far apart? Maybe Super Mario Brothers does appear with cat food a lot, but they’re quite far away. They might look at citations and links between documents: boy, there are a lot of pages on the web that, when they talk about Super Mario Brothers, also link to pages about Mario, Luigi, Nintendo, etc.

They can look at the anchor text connections of those links. They could look at co-occurrence of those words biased by a given corpus, a set of corpora, or certain domains. So they might say, “Hey, we only want to pay attention to what’s on the fresh web right now, or in the blogosphere, or on news sites, or on trusted domains,” these kinds of things, as opposed to looking at all of the documents on the web. They might choose to do this across multiple different sets of corpora.

They can look at queries from searchers, which is a really powerful thing that we unfortunately don’t have access to. So they might see searcher behavior saying that a lot of people who search for Mario, Luigi, Nintendo are also searching for Super Mario Brothers.

They might look at searcher clicks, visits, history, all of that browser data that they’ve got from Chrome and from Android and, of course, from Google itself, and they might say those are corpora that they use to connect up words and phrases.

Probably there’s a whole list of other places that they’re getting this from. So they can build a very robust data set to connect words and phrases. For us, as SEOs, this means a few things.

If you’re targeting a keyword for rankings, say “Super Mario Brothers,” those semantically connected and related terms and phrases can help with a number of things. So if you could know that these were the right words and phrases that search engines connected to Super Mario Brothers, you can do all sorts of stuff. Things like inclusion on the page itself, helping to tell the search engine my page is more relevant for Super Mario Brothers because I include words like Mario, Luigi, Princess Peach, Bowser, Nintendo, etc. as opposed to things like cat food, dog food, T-shirts, glasses, what have you.

You can think about it in the links that you earn, the documents that are linking to you and whether they contain those words and phrases and are on those topics, the anchor text that points to you potentially. You can certainly be thinking about this from a naming convention and branding standpoint. So if you’re going to call a product something or call a page something or your unique version of it, you might think about including more of these words or biasing to have those words in the description of the product itself, the formal product description.

For an About page, you might think about the formal bio for a person or a company including those kinds of words, so that as you’re getting cited around the web or on your book cover jacket or in the presentation that you give at a conference, those words are included. They don’t necessarily have to be links. This is a potentially powerful thing: a lot of people who mention Super Mario Brothers tend to point to this page, Nintendo8.com, where I think you can actually play the original “Super Mario Brothers” live on the web. It’s kind of fun. Sorry to waste your afternoon with that.

Of course, these can also be additional keywords that you might consider targeting. This can be part of your keyword research in addition to your on-page and link building optimization.

What’s unfortunate is that right now there are not a lot of tools out there to help you with this process. There is a tool from Virante (Russ Jones, I think, did some funding internally to put this together), and it’s quite cool: nTopic.org. Hopefully, this Whiteboard Friday won’t bring that tool to its knees by sending tons of traffic over there. But if it does, maybe give it a few days and come back. It gives you a broad score, with a little more data if you register and log in. It’s got a plugin for Chrome and for WordPress. It’s fairly simplistic right now, but it might help you answer, “Is this page on the topic of the term or phrase that I’m targeting?”

There are many, many downloadable tools and libraries. In fact, Code.google.com has an LDA topic modeling tool specifically, and that might have been something that Google used back in the day. We don’t know.

If you do a search for topic modeling tools, you can find these. Unfortunately, almost all of them are going to require some web development background at the very least. Many of them rely on a Python library or an API. Almost all of them also require a training corpus in order to model things on. So you can think about, “Well, maybe I can download Wikipedia’s content and use that as a training model or use the top 10 search results from Google as some sort of training model.”
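
As a concrete illustration of the “Python library plus training corpus” route Rand mentions, here is a minimal sketch using gensim (one of the freely downloadable libraries, not the specific tool he names). The tiny hand-written corpus below is only a stand-in for whatever real training corpus you assemble.

```python
from gensim import corpora, models

# Minimal LDA sketch (pip install gensim). The toy corpus stands in for whatever
# real training corpus you choose, e.g. documents on the topics you care about.
training_texts = [
    "super mario brothers mario luigi nintendo bowser princess peach".split(),
    "nintendo platformer game mario levels power ups".split(),
    "cat food kitten nutrition dry food wet food".split(),
    "indoor cats prefer certain cat food brands".split(),
]

dictionary = corpora.Dictionary(training_texts)
bow_corpus = [dictionary.doc2bow(text) for text in training_texts]

lda = models.LdaModel(bow_corpus, num_topics=2, id2word=dictionary,
                      passes=20, random_state=42)

# Ask which learned topic a new page's words lean toward.
new_page = "super mario brothers nintendo game".split()
print(lda.get_document_topics(dictionary.doc2bow(new_page)))
print(lda.print_topics())
```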

This is tough stuff. This is one of the reasons why at Moz I’m particularly passionate about trying to make this something that we can help with in our on-page optimization and keyword difficulty tools, because I think this can be very powerful stuff.

What is true is that you can spot check this yourself right now. It is very possible to go look at things like related searches, look at the keyword terms and phrases that also appear on the pages that are ranking in the top 10 and extract these things out and use your own mental intelligence to say, “Are these terms and phrases relevant? Should they be included? Are these things that people would be looking for? Are they topically relevant?” Consider including them and using them for all of these things. Hopefully, over time, we’ll get more sophisticated in the SEO world with tools that can help with this.
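
If you want to semi-automate that spot check, here is a minimal sketch of the same idea. It assumes you have already saved the visible text of the top-ranking pages as local .txt files (fetching them is left out); it then counts which terms appear across the most of those pages so you can judge topical relevance yourself.

```python
import re
from collections import Counter
from pathlib import Path

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "on", "is", "it", "that"}

def common_terms(pages_dir, top_n=25):
    """Count how many of the saved pages each term appears in."""
    counts = Counter()
    for path in Path(pages_dir).glob("*.txt"):
        words = set(re.findall(r"[a-z']+", path.read_text(encoding="utf-8").lower()))
        counts.update(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

# e.g. the text of the current top 10 results for your target phrase, saved locally
print(common_terms("top10_pages"))
```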

All right, everyone, hope you’ve enjoyed this edition of Whiteboard Friday. Looking forward to some great comments, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com
