Is Australia the land of opportunity for your retail brand?

Australia has a resident population of more than 24 million and, according to eMarketer, the country’s ecommerce sales are predicted to reach A$32.56 billion by 2017. The country’s remote location in the APAC region means that, unlike in European countries or the USA, there has traditionally been a lack of global brands sold locally.

Of course, we also know that many expatriates, particularly from inside the Commonwealth, have made Australia their home and are keen to buy products they know and love from their country of origin.

All of these factors present a huge and potentially lucrative opportunity for non-Australian brands wanting to open up their new and innovative products to a fresh market, or compete for market share.

But it’s not just non-Australian retailers who are at an advantage here: Australia was late to the ecommerce party because native, established brands were trading well without it. As a result, Australian retailers’ ecommerce technology stacks are much more recent and not burdened by legacy systems. This makes it much easier to extend, or get started with, best-of-breed technologies and cash in on a market that’s booming. To put some of this into perspective, Magento’s innovative ecommerce platform currently takes 42% of Australia’s market share, and the world’s first adopter of Magento 2.0 was an Australian brand.

The GST loophole

At the moment, local retailers are campaigning against a rule that exempts purchases under A$1,000 from foreign websites from the 10% goods and services tax (GST). In 2013, Australian consumers made A$3.11 billion worth of such purchases.[1]

While the current GST break appears to favor non-Australian retailers, Australian-based brands such as Harvey Norman are also using it to their advantage by setting up ecommerce operations in Asia to enjoy the GST benefit.

Australian consumers have also countered the argument by saying that price isn’t always the motivator when it comes to making purchasing decisions.

It’s not a place where no man has gone before

Often, concerns around meeting local compliance and lack of overseas business knowledge prevent outsiders from taking the leap into cross-border trade. However, this ecommerce passport, created by Ecommerce Worldwide and NORA, is designed to support those considering selling in Australia. The guide provides a comprehensive look into everything from the country’s economy and trade status, to logistics and dealing with international payments.

Global expansion success stories are also invaluable sources of information. And it’s not just lower-end retailers making the move: online luxury fashion retailer Net-a-Porter names Australia as one of its biggest markets.

How tech-savvy are the Aussies?

One of the concerns you might have as a new entrant into the market is how you’ll reach and sell to your new audience, particularly without having a physical presence. The good news is that more than 80% of the country is digitally enabled and 60% of mobile phone users own a smartphone, so online is deeply rooted in the majority of Australians’ lives.[2]

Marketing your brand

Heard the saying “fire bullets, then cannonballs”? Either way, you’ll want to test the waters and gauge people’s reactions to your product or service.

It all starts with the website because, without it, you’re not discoverable or searchable, and you’ve nowhere to drive people to when running campaigns. SEO and SEM should definitely be a priority, and an online store that can handle multiple regions and storefronts, like Magento, will make your life easier. A mobile-first mentality and well thought-out UX will also place you in a good position.

Once your new web store is set up, you should be making every effort to collect visitors’ email addresses, perhaps via a popover. Why? Firstly, email is one of the top three priority areas for Australian retailers, because it’s a cost-effective, scalable marketing channel that enables true personalization.

Secondly, email marketing automation empowers you to deliver the customer experience today’s consumer expects, as well as enabling you to communicate with them throughout the lifecycle. Check out our ‘Do customer experience masters really exist?’ whitepaper for some real-life success stories.

Like the Magento platform, dotmailer is set up to handle multiple languages, regions and accounts, and is designed to grow with you.

In summary, there’s great scope for ecommerce success in Australia, whether you’re a native bricks-and-mortar retailer, a start-up or a non-Australian merchant. The barriers to cross-border trade are falling and Australia is one of APAC’s most developed regions in terms of purchasing power and tech savviness.

We recently worked with ecommerce expert Chloe Thomas to produce a whitepaper on cross-border trade, which goes into much more detail on how to market and sell successfully in new territories. You can download a free copy here.

[1] Australian Passport 2015: Cross-Border Trading Report

[2] Australian Passport 2015: Cross-Border Trading Report


Controlling Search Engine Crawlers for Better Indexation and Rankings – Whiteboard Friday

Posted by randfish

When should you disallow search engines in your robots.txt file, and when should you use meta robots tags in a page header? What about nofollowing links? In today’s Whiteboard Friday, Rand covers these tools and their appropriate use in four situations that SEOs commonly find themselves facing.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about controlling search engine crawlers, blocking bots, sending bots where we want, restricting them from where we don’t want them to go. We’re going to talk a little bit about crawl budget and what you should and shouldn’t have indexed.

As a start, what I want to do is discuss the ways in which we can control robots. Those include the three primary ones: robots.txt, meta robots, and—well, the nofollow tag is a little bit less about controlling bots.

There are a few others that we’re going to discuss as well, including Webmaster Tools (Search Console) and URL status codes. But let’s dive into those first few first.

Robots.txt lives at yoursite.com/robots.txt, it tells crawlers what they should and shouldn’t access, it doesn’t always get respected by Google and Bing. So a lot of folks when you say, “hey, disallow this,” and then you suddenly see those URLs popping up and you’re wondering what’s going on, look—Google and Bing oftentimes think that they just know better. They think that maybe you’ve made a mistake, they think “hey, there’s a lot of links pointing to this content, there’s a lot of people who are visiting and caring about this content, maybe you didn’t intend for us to block it.” The more specific you get about an individual URL, the better they usually are about respecting it. The less specific, meaning the more you use wildcards or say “everything behind this entire big directory,” the worse they are about necessarily believing you.
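For illustration, here’s a minimal robots.txt sketch (the paths are hypothetical placeholders). Per the point above, the specific-file rule is more likely to be respected than the wildcard one:

User-agent: *
Disallow: /old-press-release.html
Disallow: /staging/
Disallow: /*?sessionid=

The first rule blocks one specific URL, the second an entire directory, and the third a wildcard pattern, which the engines honor less reliably.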

Meta robots—a little different—that lives in the headers of individual pages, so you can only control a single page with a meta robots tag. That tells the engines whether or not they should keep a page in the index, and whether they should follow the links on that page, and it’s usually a lot more respected, because it’s at an individual-page level; Google and Bing tend to believe you about the meta robots tag.

And then the nofollow tag, that lives on an individual link on a page. It doesn’t tell engines where to crawl or not to crawl. All it’s saying is whether you editorially vouch for a page that is being linked to, and whether you want to pass the PageRank and link equity metrics to that page.
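For reference, nofollow is just an attribute on the individual anchor tag; a hypothetical example:

<a href="https://example.com/some-page" rel="nofollow">Some page</a>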

Interesting point about meta robots and robots.txt working together (or not working together so well)—many, many folks in the SEO world do this and then get frustrated.

What if, for example, we take a page like “blogtest.html” on our domain and we say “all user agents, you are not allowed to crawl blogtest.html”? Okay—that’s a good way to keep that page away from being crawled, but just because something is not crawled doesn’t necessarily mean it won’t be in the search results.

So then we have our SEO folks go, “you know what, let’s make doubly sure that doesn’t show up in search results; we’ll put in the meta robots tag:”

<meta name="robots" content="noindex, follow">

So, “noindex, follow” tells the search engine crawler they can follow the links on the page, but they shouldn’t index this particular one.

Then, you go and run a search for “blog test” in this case, and everybody on the team’s like “What the heck!? WTF? Why am I seeing this page show up in search results?”

The answer is, you told the engines that they couldn’t crawl the page, so they didn’t. But they are still putting it in the results. They’re actually probably not going to include a meta description; they might have something like “we can’t include a meta description because of this site’s robots.txt file.” The reason it’s showing up is because they can’t see the noindex; all they see is the disallow.

So, if you want something truly removed, unable to be seen in search results, you can’t just disallow a crawler. You have to say meta “noindex” and you have to let them crawl it.

So this creates some complications. Robots.txt can be great if we’re trying to save crawl bandwidth, but it isn’t necessarily ideal for preventing a page from being shown in the search results. I would not recommend, by the way, that you do what we think Twitter recently tried to do, where they tried to canonicalize www and non-www by saying “Google, don’t crawl the www version of twitter.com.” What you should be doing is rel canonical-ing or using a 301.

Meta robots—that can allow crawling and link-following while disallowing indexation, which is great, but it requires crawl budget: the engines still have to fetch the page to see the tag, so you conserve indexation, not crawling.

The nofollow tag, generally speaking, is not particularly useful for controlling bots or conserving indexation.

Webmaster Tools (now Google Search Console) has some special things that allow you to restrict access or remove a result from the search results. For example, if you have 404’d something or if you’ve told them not to crawl something but it’s still showing up in there, you can manually say “don’t do that.” There are a few other crawl protocol things that you can do.

And then URL status codes—these are a valid way to do things, but they’re going to obviously change what’s going on on your pages, too.

If you’re not having a lot of luck using a 404 to remove something, you can use a 410 to permanently remove something from the index. Just be aware that once you use a 410, it can take a long time if you want to get that page re-crawled or re-indexed, and you want to tell the search engines “it’s back!” 410 is permanent removal.

301—permanent redirect, we’ve talked about those here—and 302, temporary redirect.
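As a sketch of how you might issue those codes on an Apache server with mod_alias enabled (the paths and URLs here are placeholders, and other servers have their own equivalents):

# 410: the page is gone for good
Redirect gone /discontinued-product.html

# 301: permanent redirect to the page's new home
Redirect 301 /old-page.html https://www.example.com/new-page.html

# 302: temporary redirect
Redirect 302 /summer-sale.html https://www.example.com/holding-page.html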

Now let’s jump into a few specific use cases of “what kinds of content should and shouldn’t I allow engines to crawl and index” in this next version…

[Rand moves at superhuman speed to erase the board and draw part two of this Whiteboard Friday. Seriously, we showed Roger how fast it was, and even he was impressed.]

Four crawling/indexing problems to solve

So we’ve got these four big problems that I want to talk about as they relate to crawling and indexing.

1. Content that isn’t ready yet

The first one here is around, “If I have content of quality I’m still trying to improve—it’s not yet ready for primetime, it’s not ready for Google, maybe I have a bunch of products and I only have the descriptions from the manufacturer and I need people to be able to access them, so I’m rewriting the content and creating unique value on those pages… they’re just not ready yet—what should I do with those?”

My options around crawling and indexing? If I have a large quantity of those—maybe thousands, tens of thousands, hundreds of thousands—I would probably go the robots.txt route. I’d disallow those pages from being crawled, and then eventually as I get (folder by folder) those sets of URLs ready, I can then allow crawling and maybe even submit them to Google via an XML sitemap.

If I’m talking about a small quantity—a few dozen, a few hundred pages—well, I’d probably just use the meta robots noindex, and then I’d pull that noindex off of those pages as they are made ready for Google’s consumption. And then again, I would probably use the XML sitemap and start submitting those once they’re ready.
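As a minimal sketch, the XML sitemap for the pages you’ve finished might look like this (the URL and date are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list only the URLs whose content is ready for Google -->
  <url>
    <loc>https://www.example.com/products/rewritten-product.html</loc>
    <lastmod>2015-05-01</lastmod>
  </url>
</urlset>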

2. Dealing with duplicate or thin content

What about, “Should I noindex, nofollow, or potentially disallow crawling on largely duplicate URLs or thin content?” I’ve got an example. Let’s say I’m an ecommerce shop, I’m selling this nice Star Wars t-shirt which I think is kind of hilarious, so I’ve got starwarsshirt.html, and it links out to a larger version of an image, and that’s an individual HTML page. It links out to different colors, which change the URL of the page, so I have a gray, blue, and black version. Well, these four pages are really all part of this same one, so I wouldn’t recommend disallowing crawling on these, and I wouldn’t recommend noindexing them. What I would do there is a rel canonical.

Remember, rel canonical is one of those things that can be precluded by disallowing. So, if I were to disallow these from being crawled, Google couldn’t see the rel canonical back, so if someone linked to the blue version instead of the default version, now I potentially don’t get link credit for that. So what I really want to do is use the rel canonical, allow the indexing, and allow it to be crawled. If you really feel like it, you could also put a meta “noindex, follow” on these pages, but I don’t really think that’s necessary, and again that might interfere with the rel canonical.
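To make that concrete, each color variant would carry a canonical tag in its head pointing back at the default page; a sketch, with hypothetical variant filenames:

<!-- in the <head> of the blue, gray, and black variant pages -->
<link rel="canonical" href="https://www.example.com/starwarsshirt.html">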

3. Passing link equity without appearing in search results

Number three: “If I want to pass link equity (or at least crawling) through a set of pages without those pages actually appearing in search results—so maybe I have navigational stuff, ways that humans are going to navigate through my pages, but I don’t need those appearing in search results—what should I use then?”

What I would say here is, you can use the meta robots to say “don’t index the page, but do follow the links that are on that page.” That’s a pretty nice, handy use case for that.

Do NOT, however, disallow those in robots.txt—many, many folks make this mistake. If you disallow crawling on those pages, Google can’t see the noindex, and they don’t know that they can follow the links. Granted, as we talked about before, sometimes Google doesn’t obey the robots.txt, but you can’t rely on that; assume the disallow will prevent them from crawling and from ever seeing the noindex. So I would say the meta robots “noindex, follow” is the way to do this.

4. Search results-type pages

Finally, fourth, “What should I do with search results-type pages?” Google has said many times that they don’t like your search results from your own internal engine appearing in their search results, and so this can be a tricky use case.

Sometimes a search result page—a page that lists many types of results that might come from a database of types of content that you’ve got on your site—could actually be a very good result for a searcher who is looking for a wide variety of content, or who wants to see what you have on offer. Yelp does this: When you say, “I’m looking for restaurants in Seattle, WA,” they’ll give you what is essentially a list of search results, and Google does want those to appear because that page provides a great result. But you should be doing what Yelp does there, and make the most common or popular individual sets of those search results into category-style pages. A page that provides real, unique value, that’s not just a list of search results, that is more of a landing page than a search results page.

However, that being said, if you’ve got a long tail of these, or if you’d say “hey, our internal search engine, that’s really for internal visitors only—it’s not useful to have those pages show up in search results, and we don’t think we need to make the effort to make those into category landing pages.” Then you can use the disallow in robots.txt to prevent those.
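A hypothetical robots.txt rule for that case, assuming your internal search lives under /search (adjust the patterns to your own URL structure):

User-agent: *
Disallow: /search
Disallow: /*?q=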

Just be cautious here, because I have sometimes seen an over-swinging of the pendulum toward blocking all types of search results, and sometimes that can actually hurt your SEO and your traffic. Sometimes those pages can be really useful to people. So check your analytics, and make sure those aren’t valuable pages that should be served up and turned into landing pages. If you’re sure, then go ahead and disallow all your search results-style pages. You’ll see a lot of sites doing this in their robots.txt file.

That being said, I hope you have some great questions about crawling and indexing, controlling robots, blocking robots, allowing robots, and I’ll try and tackle those in the comments below.

We’ll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care!



The Importance of Being Different: Creating a Competitive Advantage With Your USP

Posted by TrentonGreener

“The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

While this quote has been credited to everyone from Francis Phillip Wernig, under the pseudonym Alan Ashley-Pitt, to Einstein himself, the powerful message does not lose its substance no matter whom you choose to credit. There is a very important yet often overlooked effect of not heeding this warning, one that can be applied to all aspects of life: from love and happiness to business and marketing, copying what your competitors are doing and failing to forge your own path can be a detrimental mistake.

While as marketers we are all acutely aware of the importance of differentiation, we’ve been trained for the majority of our lives to seek out the norm.

We spend the majority of our adolescent lives trying desperately not to be different. No one has ever been picked on for being too normal or not being different enough. We would beg our parents to buy us the same clothes little Jimmy or little Jamie wore. We’d want the same backpack and the same bike everyone else had. With the rise of the cell phone and later the smartphone, on hands and knees, we begged and pleaded for our parents to buy us the Razr, the StarTAC (bonus points if you didn’t have to Google that one), and later the iPhone. Did we truly want these things? Yes, but not just because they were cutting edge and nifty. We desired them because the people around us had them. We didn’t want to be the last to get these devices. We didn’t want to be different.

Thankfully, as we mature we begin to realize the fallacy that is trying to be normal. We start to become individuals and learn to appreciate that being different is often seen as beautiful. However, while we begin to celebrate being different on a personal level, it does not always translate into our business or professional lives.

We unconsciously and naturally seek out the normal, and if we want to be different—truly different in a way that creates an advantage—we have to work for it.

The truth of the matter is, anyone can be different. In fact, we all are very different. Even identical twins with the same DNA will often have starkly different personalities. As a business, the real challenge lies in being different in a way that is relevant, valuable to your audience, and creates an advantage.

“Strong products and services are highly differentiated from all other products and services. It’s that simple. It’s that difficult.” – Austin McGhie, Brand Is a Four Letter Word

Let’s explore the example of Revel Hotel & Casino. Revel is a 70-story luxury casino in Atlantic City that was built in 2012. There is simply not another casino of the same class in Atlantic City, but there might be a reason for this. Even if you’re not familiar with the city, a quick jump onto Atlantic City’s tourism website reveals that of the five hero banners that rotate, not one specifically mentions gambling, but three reference the boardwalk. This is further illustrated when exploring their internal linking structure. The beaches, boardwalk, and shopping all appear before a single mention of casinos. There simply isn’t as much of a market for high-end gamblers in the Atlantic City area; in the States, Las Vegas serves that role. So while Revel has a unique advantage, their ability to attract customers to their resort has not resulted in profitable earnings reports. In Q2 2012, Revel had a gross operating loss of $35.177M, and in Q3 2012 that increased to $36.838M.

So you need to create a unique selling proposition (also known as unique selling point and commonly referred to as a USP), and your USP needs to be valuable to your audience and create a competitive advantage. Sounds easy enough, right? Now for the kicker. That advantage needs to be as sustainable as physically possible over the long term.

“How long will it take our competitors to duplicate our advantage?”

You really need to explore this question and the possible solutions your competitors could utilize to play catch-up or duplicate what you’ve done. Look no further than Google vs Bing to see this in action. No company out there is going to just give up because your USP is so much better; most will pivot or adapt in some way.

Let’s look at a Seattle-area coffee company with which you may or may not be familiar. Starbucks has tried quite a few times over the years to level up their tea game with limited success, but the markets that Starbucks has really struggled to break into are the pastry, bread, dessert, and food markets.

Other stores had more success in these markets, and they thought that high-quality teas and bakery items were the USPs that differentiated them from the Big Bad Wolf that is Starbucks. And while they were right to think that their brick house would save them from the Big Bad Wolf for some time, this fable doesn’t end with the Big Bad Wolf in a boiling pot.

Never underestimate your competitor’s ability to be agile, specifically when overcoming a competitive disadvantage.

If your competitor can’t beat you by making a better product or service internally, they can always choose to buy someone who can.

After months of courting, on June 4th, 2012 Starbucks announced that they had come to an agreement to purchase La Boulange in order to “elevate core food offerings and build a premium, artisanal bakery brand.” If you’re a small-to-medium sized coffee shop and/or bakery that even indirectly competed with Starbucks, a new challenger approaches. And while those tea shops momentarily felt safe within the brick walls that guarded their USP, on the final day of that same year, the Big Bad Wolf huffed and puffed and blew a stack of cash all over Teavana. Making Teavana a wholly-owned subsidiary of Starbucks for the low, low price of $620M.

Sarcasm aside, this does a great job of illustrating the ability of companies—especially those with deep pockets—to be agile, and demonstrates that they often have an uncanny ability to overcome your company’s competitive advantage. In seven months, Starbucks went from a minor player in these markets to having all the tools they need to dominate tea and pastries. Have you tried their raspberry pound cake? It’s phenomenal.

Why does this matter to me?

Ok, we get it. We need to be different, and in a way that is relevant, valuable, defensible, and sustainable. But I’m not the CEO, or even the CMO. I cannot effect change on a company level; why does this matter to me?

I’m a firm believer that you effect change no matter what the name plate on your desk may say. Sure, you may not be able to call an all-staff meeting today and completely change the direction of your company tomorrow, but you can effect change on the parts of the business you do touch. No matter your title or area of responsibility, you need to know your company’s, client’s, or even a specific piece of content’s USP, and you need to ensure it is applied liberally to all areas of your work.

Look at this example SERP for “Mechanics”:

While yes, this search is very likely to be local-sensitive, that doesn’t mean you can’t stand out. Every single AdWords result, save one, has only the word “Mechanics” in the headline. (While the top of page ad is pulling description line 1 into the heading, the actual headline is still only “Mechanic.”) But even the one headline that is different doesn’t do a great job of illustrating the company’s USP. Mechanics at home? Whose home? Mine or theirs? I’m a huge fan of Steve Krug’s “Don’t Make Me Think,” and in this scenario there are too many questions I need answered before I’m willing to click through. “Mechanics; We Come To You” or even “Traveling Mechanics” illustrates this point much more clearly, and still fits within the 25-character limit for the headline.

If you’re an AdWords user, no matter how big or small your monthly spend may be, take a look at your top 10-15 keywords by volume and evaluate how well you’re differentiating yourself from the other brands in your industry. Test ad copy that draws attention to your USP and reap the rewards.

Now while this is simply an AdWords text ad example, the same concept can be applied universally across all of marketing.

Title tags & meta descriptions

As we alluded to above, not only do companies have USPs, but individual pieces of content can, and should, have their own USP. Use your title tag and meta description to illustrate what differentiates your piece of content from the competition, and do so in a way that attracts the searcher’s click. If you have already established a strong brand within a specific niche, great! Use it to your advantage. It’s much more likely, though, that you are competing against a strong brand; in these scenarios, ask yourself, “What makes our content different from theirs?” The answer you come up with is your content’s USP. Call attention to that in your title tag and meta description, and watch the CTR climb.
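Borrowing the mechanics example from earlier, a hypothetical title tag and meta description that lead with the USP might look like this:

<title>Traveling Mechanics - We Come to You | Example Auto Repair</title>
<meta name="description" content="Certified mechanics who repair your car at your home or office. No tow truck, no waiting room. Book online in 60 seconds.">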

I encourage you to hop into your own site’s analytics and look at your top 10-15 organic landing pages and see how well you differentiate yourself. Even if you’re hesitant to negatively affect your inbound gold mines by changing the title tags, run a test and change up your meta description to draw attention to your USP. In an hour’s work, you just may make the change that pushes you a little further up those SERPs.

Branding

Let’s break outside the world of digital marketing and look at the world of branding. Tom’s Shoes competes against some heavy hitters in Nike, Adidas, Reebok, and Puma just to name a few. While Tom’s can’t hope to compete against the marketing budgets of these companies in a fair fight, they instead chose to take what makes them different, their USP, and disseminate it every chance they get. They have labeled themselves “The One for One” company. It’s in their homepage’s title tag, in every piece of marketing they put out, and it smacks you in the face when you land on their site. They even use the call-to-action “Get Good Karma” throughout their site.

Now as many of us may know, partially because of the scandal it created in late 2013, Tom’s is not actually a non-profit organization. No matter how you feel about the matter, this marketing strategy has created a positive effect on their bottom line. Fast Company conservatively estimated their revenues in 2013 at $250M, with many estimates being closer to the $300M mark. Not too bad of a slice of the pie when competing against the powerhouses Tom’s does.

Wherever you stand on this issue, Tom’s Shoes has done a phenomenal job of differentiating their brand from the big hitters in their industry.

Know your USP and disseminate it every chance you get.

This is worth repeating. Know your USP and disseminate it every chance you get, whether that be in title tags, ad copy, on-page copy, branding, or any other segment of your marketing campaigns. Online or offline, be different. And remember the quote that we started with, “The one who follows the crowd will usually go no further than the crowd. Those who walk alone are likely to find themselves in places no one has ever been before.”

The amount of marketing knowledge that can be taken from this one simple statement is astounding. Heed the words, stand out from the crowd, and you will have success.



Should I Use Relative or Absolute URLs? – Whiteboard Friday

Posted by RuthBurrReedy

It was once commonplace for developers to code relative URLs into a site. There are a number of reasons why that might not be the best idea for SEO, and in today’s Whiteboard Friday, Ruth Burr Reedy is here to tell you all about why.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Let’s discuss some non-philosophical absolutes and relatives

Howdy, Moz fans. My name is Ruth Burr Reedy. You may recognize me from such projects as when I used to be the Head of SEO at Moz. I’m now the Senior SEO Manager at BigWing Interactive in Oklahoma City. Today we’re going to talk about relative versus absolute URLs and why they are important.

At any given time, your website can have several different configurations that might be causing duplicate content issues. You could have just a standard http://www.example.com. That’s a pretty standard format for a website.

But the main sources that we see of domain level duplicate content are when the non-www.example.com does not redirect to the www or vice-versa, and when the HTTPS versions of your URLs are not forced to resolve to HTTP versions or, again, vice-versa. What this can mean is if all of these scenarios are true, if all four of these URLs resolve without being forced to resolve to a canonical version, you can, in essence, have four versions of your website out on the Internet. This may or may not be a problem.

It’s not ideal for a couple of reasons. Number one, duplicate content is a problem because some people think that duplicate content is going to give you a penalty. Duplicate content is not going to get your website penalized in the same way that you might see a spammy link penalty from Penguin. There’s no actual penalty involved. You won’t be punished for having duplicate content.

The problem with duplicate content is that you’re basically relying on Google to figure out what the real version of your website is. Google is seeing URLs from all four versions of your website. They’re going to try to figure out which URL is the real URL and just rank that one. The problem with that is you’re basically leaving that decision up to Google when it’s something that you could take control of for yourself.

There are a couple of other reasons that we’ll go into a little bit later for why duplicate content can be a problem. But in short, duplicate content is no good.

However, just having these URLs not resolve to each other may or may not be a huge problem. When it really becomes a serious issue is when that problem is combined with injudicious use of relative URLs in internal links. So let’s talk a little bit about the difference between a relative URL and an absolute URL when it comes to internal linking.

With an absolute URL, you are putting the entire web address of the page that you are linking to in the link. You’re putting your full domain, everything in the link, including /page. That’s an absolute URL.

However, when coding a website, it’s a fairly common web development practice to instead code internal links with what’s called a relative URL. A relative URL is just /page. Basically what that does is it relies on your browser to understand, “Okay, this link is pointing to a page that’s on the same domain that we’re already on. I’m just going to assume that that is the case and go there.”
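Side by side, the two look like this (example.com is a placeholder):

<!-- absolute URL: the full address, protocol and domain included -->
<a href="https://www.example.com/page">Our page</a>

<!-- relative URL: the browser resolves it against whatever domain it's already on -->
<a href="/page">Our page</a>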

There are a couple of really good reasons to code relative URLs

1) It is much easier and faster to code.

When you are a web developer and you’re building a site with thousands of pages, coding relative versus absolute URLs is a way to be more efficient. You’ll see it happen a lot.

2) Staging environments

Another reason why you might see relative versus absolute URLs is some content management systems — and SharePoint is a great example of this — have a staging environment that’s on its own domain. Instead of being example.com, it will be examplestaging.com. The entire website will basically be replicated on that staging domain. Having relative versus absolute URLs means that the same website can exist on staging and on production, or the live accessible version of your website, without having to go back in and recode all of those URLs. Again, it’s more efficient for your web development team. Those are really perfectly valid reasons to do those things. So don’t yell at your web dev team if they’ve coded relative URLS, because from their perspective it is a better solution.

Relative URLs will also cause your page to load slightly faster. However, in my experience, the SEO benefits of having absolute versus relative URLs in your website far outweigh the teeny-tiny bit longer that it will take the page to load. It’s very negligible. If you have a really, really long page load time, there’s going to be a whole boatload of things that you can change that will make a bigger difference than coding your URLs as relative versus absolute.

Page load time, in my opinion, not a concern here. However, it is something that your web dev team may bring up with you when you try to address with them the fact that, from an SEO perspective, coding your website with relative versus absolute URLs, especially in the nav, is not a good solution.

There are even better reasons to use absolute URLs

1) Scrapers

If you have all of your internal links as relative URLs, it would be very, very, very easy for a scraper to simply scrape your whole website and put it up on a new domain, and the whole website would just work. That sucks for you, and it’s great for that scraper. But unless you are out there doing public services for scrapers, for some reason, that’s probably not something that you want happening with your beautiful, hardworking, handcrafted website. That’s one reason. There is a scraper risk.

2) Preventing duplicate content issues

But the other reason why it’s very important to have absolute versus relative URLs is that it really mitigates the duplicate content risk that can be presented when you don’t have all of these versions of your website resolving to one version. Google could potentially enter your site on any one of these four pages, which they’re the same page to you. They’re four different pages to Google. They’re the same domain to you. They are four different domains to Google.

But they could enter your site, and if all of your URLs are relative, they can then crawl and index your entire domain using whatever format these are. Whereas if you have absolute links coded, even if Google enters your site on the www. version, every internal link it then crawls points explicitly at one version, so Google is not going to assume that your pages also live at the other variants. That really cuts down on different versions of each page of your website. If you have relative URLs throughout, you basically have four different websites if you haven’t fixed this problem.

Again, it’s not always a huge issue. Duplicate content, it’s not ideal. However, Google has gotten pretty good at figuring out what the real version of your website is.

You do want to think about internal linking, when you’re thinking about this. If you have basically four different versions of any URL that anybody could just copy and paste when they want to link to you or when they want to share something that you’ve built, you’re diluting your internal links by four, which is not great. You basically would have to build four times as many links in order to get the same authority. So that’s one reason.

3) Crawl Budget

The other reason why it’s pretty important not to do this is crawl budget. I’m going to point it out like this instead.

When we talk about crawl budget, basically what that is, is every time Google crawls your website, there is a finite depth that they will crawl, a finite number of URLs, and then they decide, “Okay, I’m done.” That’s based on a few different things. Your site authority is one of them. Your actual PageRank, not toolbar PageRank, but how good Google actually thinks your website is, is a big part of that. But also how complex your site is, how often it’s updated, things like that are also going to contribute to how often and how deep Google is going to crawl your site.

It’s important to remember when we think about crawl budget that, for Google, crawl budget costs actual dollars. One of Google’s biggest expenditures as a company is the money and the bandwidth it takes to crawl and index the Web. All of that energy that’s going into crawling and indexing the Web lives on servers. That bandwidth comes from servers, and that means that using bandwidth costs Google actual, real dollars.

So Google is incentivized to crawl as efficiently as possible, because when they crawl inefficiently, it costs them money. If your site is not efficient to crawl, Google is going to save itself some money by crawling it less frequently and crawling fewer pages per crawl. That can mean that if you have a site that’s updated frequently, your site may not be updating in the index as frequently as you’re updating it. It may also mean that Google, while it’s crawling and indexing, may be crawling and indexing a version of your website that isn’t the version that you really want it to crawl and index.

So having four different versions of your website, all of which are completely crawlable to the last page, because you’ve got relative URLs and you haven’t fixed this duplicate content problem, means that Google has to spend four times as much money in order to really crawl and understand your website. Over time they’re going to do that less and less frequently, especially if you don’t have a really high authority website. If you’re a small website, if you’re just starting out, if you’ve only got a medium number of inbound links, over time you’re going to see your crawl rate and frequency impacted, and that’s bad. We don’t want that. We want Google to come back all the time, see all our pages. They’re beautiful. Put them up in the index. Rank them well. That’s what we want. So that’s what we should do.

There are a couple of ways to fix your relative versus absolute URLs problem

1) Fix what is happening on the server side of your website

You have to make sure that you are forcing all of these different versions of your domain to resolve to one version of your domain. For me, I’m pretty agnostic as to which version you pick. You should probably already have a pretty good idea of which version of your website is the real version, whether that’s www, non-www, HTTPS, or HTTP. From my view, what’s most important is that all four of these versions resolve to one version.

From an SEO standpoint, there is evidence to suggest, and Google has certainly said, that HTTPS is a little bit better than HTTP. From a URL length perspective, I like to not have the www. in there because it doesn’t really do anything. It just makes your URLs four characters longer. If you don’t know which one to pick, I would pick this one: HTTPS, no W’s. But whichever one you pick, what’s really most important is that all of them resolve to one version. You can do that on the server side, and that’s usually pretty easy for your dev team to fix once you tell them that it needs to happen.
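As one possible sketch for Apache with mod_rewrite, assuming you’ve picked the HTTPS, no-www version as canonical (nginx, IIS, or your CMS will each have their own equivalent):

# .htaccess: force HTTPS and strip www in a single 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]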

2) Fix your internal links

Great. So you fixed it on your server side. Now you need to fix your internal links, and you need to recode them for being relative to being absolute. This is something that your dev team is not going to want to do because it is time consuming and, from a web dev perspective, not that important. However, you should use resources like this Whiteboard Friday to explain to them, from an SEO perspective, both from the scraper risk and from a duplicate content standpoint, having those absolute URLs is a high priority and something that should get done.

You’ll need to fix those, especially in your navigational elements. But once you’ve got your nav fixed, also pull out your database or run a Screaming Frog crawl or however you want to discover internal links that aren’t part of your nav, and make sure you’re updating those to be absolute as well.

Then you’ll do some education with everybody who touches your website saying, “Hey, when you link internally, make sure you’re using the absolute URL and make sure it’s in our preferred format,” because that’s really going to give you the most bang for your buck per internal link. So do some education. Fix your internal links.

Sometimes your dev team is going to say, “No, we can’t do that. We’re not going to recode the whole nav. It’s not a good use of our time,” and sometimes they are right. The dev team has more important things to do. That’s okay.

3) Canonicalize it!

If you can’t get your internal links fixed or if they’re not going to get fixed anytime in the near future, a stopgap or a Band-Aid that you can kind of put on this problem is to canonicalize all of your pages. As you’re changing your server to force all of these different versions of your domain to resolve to one, at the same time you should be implementing the canonical tag on all of the pages of your website to self-canonicalize. On every page, you have a canonical page tag saying, “This page right here that they were already on is the canonical version of this page.” Or if there’s another page that’s the canonical version, then obviously you point to that instead.

But having each page self-canonicalize will mitigate both the risk of duplicate content internally and some of the risk posed by scrapers, because when they scrape, if they are scraping your website and slapping it up somewhere else, those canonical tags will often stay in place, and that lets Google know this is not the real version of the website.

In conclusion, relative links, not as good. Absolute links, those are the way to go. Make sure that you’re fixing these very common domain level duplicate content problems. If your dev team tries to tell you that they don’t want to do this, just tell them I sent you. Thanks guys.

Video transcription by Speechpad.com



The Long Click and the Quality of Search Success

Posted by billslawski

“On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the “Long Click”—this occurred when someone went to a search result, ideally the top one, and did not return. That meant Google had successfully fulfilled the query.”

~ Steven Levy, In the Plex: How Google Thinks, Works, and Shapes Our Lives

I often explore and read patents and papers from the search engines to try to get a sense of how they may approach different issues, and learn about the assumptions they make about search, searchers, and the Web. Lately, I’ve been keeping an eye open for papers and patents from the search engines where they talk about a metric known as the “long click.”

A recently granted Google patent uses the metric of a “Long Click” as the center of a process Google may use to track results for queries that were selected by searchers for long visits in a set of search results.

This concept isn’t new. In 2011, I wrote about a Yahoo patent in How a Search Engine May Measure the Quality of Its Search Results, where they discussed a metric that they refer to as a “target page success metric.” It included “dwell time” upon a result as a sign of search success (Yes, search engines have goals, too).


Another Google patent described assigning web pages “reachability scores” based upon the quality of pages linked to from those initially visited pages. In the post Does Google Use Reachability Scores in Ranking Resources?, I described how that patent might use a long click metric as a sign that visitors to a page are engaged by the content its links point to, including links to videos. Google tells us in that patent that it might consider a “long click” to have been made on a video if someone watches at least half the video or 30 seconds of it. The patent suggests that a high reachability score on a page may mean that page could be boosted in Google search results.


But the patent I’m writing about today is focused primarily upon looking at and tracking a search success metric like a long click or long dwell time. Here’s the abstract:

Modifying ranking data based on document changes

Invented by Henele I. Adams, and Hyung-Jin Kim

Assigned to Google

US Patent 9,002,867

Granted April 7, 2015

Filed: December 30, 2010

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media for determining a weighted overall quality of result statistic for a document.

One method includes receiving quality of result data for a query and a plurality of versions of a document, determining a weighted overall quality of result statistic for the document with respect to the query including weighting each version specific quality of result statistic and combining the weighted version-specific quality of result statistics, wherein each quality of result statistic is weighted by a weight determined from at least a difference between content of a reference version of the document and content of the version of the document corresponding to the version specific quality of result statistic, and storing the weighted overall quality of result statistic and data associating the query and the document with the weighted overall quality of result statistic.

This patent tells us that search results may be ranked in an order, according to scores assigned to the search results by a scoring function or process that would be based upon things such as:

  • Where, and how often, query terms appear in the given document,
  • How common the query terms are in the documents indexed by the search engine, or
  • A query-independent measure of quality of the document itself.

Last September, I wrote about how Google might identify a category associated with a query term based upon clicks, in the post Using Query User Data To Classify Queries. In a query for “Lincoln,” the results that appear in response might be about the former US President, the town of Lincoln, Nebraska, and the model of automobile. When someone searches for [Lincoln], it could be reasonable for Google to return all three of those responses among the top results. The patent I wrote about in that post told us that Google might collect information about “Lincoln” as a search entity, and track which category of results people clicked upon most when they performed that search, to determine what categories of pages to show other searchers. Again, that’s another “search success” metric based upon a past search history.

There likely is some value in working to find ways to increase the amount of dwell time someone spends upon the pages of your site, if you are already having some success in crafting page titles and snippets that persuade people to click on your pages when those appear in search results. These approaches can include such things as:

  1. Making visiting your page a positive experience in terms of things like site speed, readability, and scannability.
  2. Making visiting your page a positive experience in terms of things like the quality of the content published on your pages including spelling, grammar, writing style, interest, quality of images, and the links you share to other resources.
  3. Providing a positive experience by offering ideas worth sharing with others, and offering opportunities for commenting and interacting with others, and by being responsive to people who do leave comments.


Your ability to create pages that can end up in a “long click” from someone who has come to your site in response to a query is also a “search success” metric on the search engine’s part, and you both succeed. Just be warned that, as the most recent patent from Google on long clicks shows us, Google will be watching to make sure that the content of your page doesn’t change too much, and that people are continuing to click upon it in search results, and spend a fair amount of time upon it.

(Images for this post are from my Go Fish Digital Design Lead Devin Holmes @DevinGoFish. Thank you, Devin!)



Understand and Harness the Power of Archetypes in Marketing

Posted by gfiorelli1

Roger Dooley, neuromarketing expert, reminds us in his book Brainfluence that in 80% of cases we make a decision before being rationally aware of it.

Although Dooley explains this effect in terms of how our brain works, in my opinion, distinctly separating neuroscience and the theory of archetypes would be incorrect. On the contrary, I believe that these two aspects of the study of the human mind are complementary.

According to Jung, archetypes are “[…] forms or images of a collective nature which occur practically all over the Earth as constituents of myths and—at the same time—as individual products of unconscious”. He then added something that interests us greatly: “The [forms and images] are imprinted and hardwired into our psyches”.

Being able to design a brand personality around an archetype that connects unconsciously with our audience is a big first step toward brand loyalty, community creation, engagement, and conversions.

The Slender Man is the “Internet age” version of the archetype figure of the Shadow

Archetypes can be also used for differentiating our brand and its messaging from others in our same market niche and to give that brand a unique voice.

If we put users at the center of our marketing strategy, then we cannot limit ourselves to knowing how they search, how they talk on social media, what they like to share, or what their demographics are.

No, we should also understand the deep psychological reasons why they desire what they search for, why they talk the way they talk and share what they share, and what their psychological relation is with the environment and society they live in.

Knowing that, we can use archetypes to create a deep emotional connection with our audience and earn their strong positive attitude toward us thanks to the empathy that is created between them and us.

Narrative modes, then, help us shape structured brand storytelling that can guide and engage users, rather than simply selling or creating content narratives doomed to fail.

The 12 archetypes

Graph by Emily Bennet

The chart above presents the 12 Jungian archetypes (e.g., the Hero), the principal human desire to which each corresponds (e.g., leaving a mark on the world), and the main behavior each one uses to achieve that desire (e.g., mastery).


Remember: if the audience instinctively recognizes the archetypal figure of the brand and its symbolism, and connects with it, then your audience is more ready to like and trust what your brand proposes.

On the other hand, it is also a good exercise to experiment with archetypes that we would not assume to be our brand’s own, extending the practice of A/B testing to make sure we’re working with the correct archetype.

The Creator

In my last post I used Lego as an example of a brand that is winning at Internet marketing thanks to its holistic and synergistic use of offline and online marketing channels.

I also explained how part of its success is due to the fact that Lego was able to shape its messages and brand personality around the Creator archetype (sometimes called the “Builder”), which is embodied by their tagline, “let’s build”.

Creators tend to be nonconformist and to enjoy self-expression. A Creator brand, then, will empower its audience and reward it for expressing itself through the brand’s products.

The Ruler

The Ruler is the leader, the one setting the rules others will follow, even competitors. Usually it’s paired with an idea of exclusiveness and status growth.

A brand that presents itself as a Ruler is suggesting to their audience that they can be rulers too.

A classic example of Ruler brand is Mercedes:

The Caregiver

Altruism, compassion, generosity: Caregiver brands present themselves as someone to trust, because they care about and empathize with their audience.

The Caregiver is one of the most positive archetypes, and it is obviously used by nonprofit organizations or governmental institutions like UNICEF, but brands like Johnson & Johnson have also shaped their personality and messages around this figure.

The Innocent

The Innocent finds positive sides in everyone and everything. It sees beauty even in things that others would not consider, and feels at peace with its inner beauty.

Dove, obviously, is a good representation of the Innocent archetype.

The Sage

The Sage wants to know and understand things.

The Sage is deeply humanist and believes in the power of humankind to shape a better world through knowledge.

However, the Sage also has a shadowed side: intolerance to ideas other than their own.

In both respects, Google is a good example of a Sage brand.

The Explorer

The Explorer is adventurous, brave, and loves challenges. He tends to be an individualist too, and loves to challenge himself so as to find his real self.


Explorer brands prompt their audience to challenge themselves and to discover the Explorer within.

Red Bull is a classic example of these kinds of brands, but REI and Patagonia are even better representations.

The Hero

In many aspects, the Hero archetype is similar to the Explorer and Outlaw ones, with the difference that the Hero many times never wanted to be the hero, but injustice and external events obliged him to find the courage, braveness, and the honor to become one.

Nike, and also its competitor Adidas, shapes its brand voice around this archetypal figure.

The Magician

The Magician is clever, intelligent, and sometimes his ability can be considered supernatural.

The Magician is able to make the impossible possible. Because of that, some of the best-known technology brands use this archetype as their own to showcase their innovation and how they use their advanced knowledge creatively.

Apple—even if you are not an Apple fan—created a powerful brand by shaping it around this archetype. 

The Outlaw


The Outlaw is the rebel, the one who breaks the rules in order to free his true self.

The Outlaw goes against the canon and is very aware of the constraints society creates.

A great example of a brand that very well represents the Outlaw archetype is Betabrand.

The Everyman

For the Everyman, it is perfectly fine to be "normal," and happiness can come from simply sharing things with the people we love.

Brands targeting the Everyman audience (and painting themselves as such) craft their messages around the beauty of simple things and everyday real life.

Ikea is probably the brand that has best mastered this archetype over the past few years.

The Jester 

Fun, irreverent, energetic, impulsive, and against the established rules all at the same time, the Jester is also the only one able to tell the truth with a joke.

Jesters can be revolutionary too, and their motto could be “a laugh will bury you all.”


A brand that presents itself as the Jester wants to make our lives easier and more bearable, bringing us joy.

The Lover


Sensuality is the main characteristic of the Lover archetype, along with strong physicality, passion, and a need for deep, strong sensations.

But the Lover can also be the idealist, the romantic longing for the perfect love.

Archetypes and brand storytelling

Our brains, as many neuroscientists have shown, are hard-wired for stories (I suggest you watch this TEDx talk, too).

Therefore, once we have decided which archetypal figure best responds both to our audience and to our values as a brand, we must translate the psychology we have created for our brand into brand storytelling. That storytelling must then be attuned to the psychology of our audience, based on our psychographic analysis of them.

Good (brand) storytelling is very hard to achieve, and most of the time we see brands fail miserably when trying to tell branded stories.

Introducing the Theory of Literary (or Narrative) Modes

In order to help my clients find the correct narrative, I rely on something that usually is not considered by marketers: the Theory of Literary Modes.

I use this theory, first presented by Northrop Frye in his essay Anatomy of Criticism, because it is close to our "technical marketer" mindset.

In fact:

  1. The theory is based on an objective and "scientific" analysis of data (the literary corpus produced by humans);
  2. It refuses "personal taste" as a metric, which in web marketing would be like building a campaign around tactics you like without knowing whether your audience is interested in them. Even worse, it would be like saying "create great content" without defining what that means.

Moreover, the Theory of Literary Modes is deeply structured and relies strongly on semiotics, which is the natural evolution of how search engines like Google will come to comprehend the content published on the Internet. Semantic thinking is just the first step, as Isla McKetta explained well here on Moz a few months ago.

Finally, Northrop Frye also considers archetypes in this theory, because of the psychological and semiotic value of the symbolism attached to each archetypal figure.

Therefore, my decision to use the Theory of Literary Modes responds:

  1. To the need to translate ideal brand storytelling into something real that can instinctively connect with the brand's audience;
  2. To the need to make the content based on that storytelling process understandable by search engines as well.

The Theory of Literary Modes in marketing

To understand how this works in marketing, we need to dig a little deeper into the theory.

A literary work can be classified in two different but complementary ways:

1) Considering only the relation between the nature of the main character (the Hero) and the environment where he acts.

2) Considering also whether the Hero is rejected or accepted by society (Tragedy and Comedy).

In the first case, as represented in the schema above, if the Hero:

  1. Is higher by nature than the readers and acts in a completely different environment than theirs, we have a Romance;
  2. Is higher by nature than the readers, but acts in their same environment, we have an Epic;
  3. Is someone like the reader and acts in the reader's own environment, we are in the field of Realism;
  4. Is someone lower by nature than the readers and acts in a different or identical environment, we are in the realm of Irony, which is meant as "distance."

A fifth situation exists too: the Myth, where the Hero's nature is different from ours and he acts in an environment different from ours. The Hero, in this case, is the God.

If we also consider whether society rejects or accepts the Hero, we can discover the different versions of Tragedy and Comedy.

I will not go into the details of Tragedy, because we will not use its modes for brand storytelling (they are common only in specific cases of political marketing or propaganda; classic examples are the mythologies of Nazism and Communism).

On the contrary, the most common modes used in brand storytelling are related to Comedy, where the Hero, who usually is the target audience, is eventually accepted by society (the archetypal world designed by the brand).

In Comedy we have several sub-modes of storytelling:

  1. "The God Accepted." The Hero is a god or god-like person who must pass through trials in order to be accepted by society;
  2. The Idyll, where the Hero uses his skills to explore (or conquer) an ideal world and/or become part of an ideal society. The Western and its heir, the space opera (think of Interstellar), are classic examples.
  3. Comedy proper sees the hero trying to impose his own view of the world, fighting for it, and finally being rewarded with the acceptance of his worldview. A good example of this is any happy-ending biopic of an entrepreneur; Comedy is the exact opposite of melodrama.
  4. On a lower level we can find the Picaresque Comedy, where the hero is by nature inferior to society but – thanks to his cleverness – is able to elevate himself to society's level. Some technology companies use this narrative mode to tell their users that they can "conquer" their market niche despite not having the same economic resources as the big brands (this conquering usually involves the brand's tools).
  5. Finally, we have the Ironic mode of Comedy, which is quite complex to define.
    1. It can represent stories where the hero is actually an antihero, who ultimately fails in his integration into society.
    2. It can also be about inflicting pain on helpless victims, as in mystery novels.
    3. It can also be Parody.

Some examples

The Magician, gamification, and the Idyllic mode

Consider this brand plot:

The user (the Hero) can become part of a community of users only if he or she passes through a series of tasks, which will award prizes and more capabilities. If the user is able to pass through all the tasks, he will not only be accepted but also may have the opportunity to be among the leaders of the community itself.

And now consider sites that are strongly centered on communities, like GitHub and Codecademy. Consider also SaaS companies with freemium models, like Moz, or mobile games like Boom Beach, where you can unlock new weapons only if you pass a given trial (or buy them).

The Magician is usually the archetype of reference for these kinds of brands. The Hero (the user) will be able to master a complex art thanks to the help of a Master (the brand), which offers him instruments (i.e., tools, courses, weapons).

Trials are not necessarily tests. A trial can be doing something that is rewarded, for instance, with points (like commenting on a Moz blog post); the more points, the more recognition, with all the advantages that recognition may offer.

Gamification, then, assumes an even stronger meaning and narrative function when tied to an archetype and literary mode.
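To make the trial structure concrete, here is a minimal sketch of how such a points-and-trials loop might be modeled. The action names, point values, and tier thresholds are all hypothetical, not taken from any of the brands mentioned above.

```python
# Minimal sketch: a points-based "trial" loop in the Idyllic mode, where the
# Hero (the user) earns recognition and may eventually join the community's
# leaders. All action names, point values, and thresholds are hypothetical.

POINTS = {"comment": 2, "publish_post": 10, "answer_question": 5}
TIERS = [(0, "newcomer"), (50, "member"), (200, "leader")]

def score(actions):
    """Total points earned for a sequence of completed trials."""
    return sum(POINTS.get(action, 0) for action in actions)

def tier_for(points):
    """Return the highest tier whose threshold the user has passed."""
    current = TIERS[0][1]
    for threshold, name in TIERS:
        if points >= threshold:
            current = name
    return current

history = ["comment"] * 10 + ["publish_post"] * 3 + ["answer_question"] * 6
pts = score(history)
print(f"{pts} points -> {tier_for(pts)}")  # 80 points -> member
```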

Ikea, the Everyman, and the Comedic mode

Another example is Ikea, which we cited before when talking about the Everyman archetype.

In this case, the Hero is someone like you or me who is not an interior designer or decorator, or who, maybe, does not have the money to hire those professionals or to buy very expensive furniture and decorations.

But, faithful to its mission statements ("design for all," "design your own life"…), Ikea is there to help Everyman kinds of people like you and me in every way as we decorate our own homes.

On the practical side, this narrative is delivered through all the channels Ikea uses: website, mobile app, social media (look at its Twitter profile), and YouTube channel.

Betabrand, the Outlaw, and Picaresque Comedy

A third and last example is Betabrand.

In this case, both the brand and the audience are portrayed using the Outlaw archetype, and the brand narrative tends to use the Picaresque mode.

The Hero is the Betabrand community, which does not care what the mainstream concept of fashion is, and designs and crowdfunds "its fashion."

How to use archetypes and narrative modes in your brand storytelling

The first thing you must understand is which archetype best responds to your company's tenets and mission.

Usually this is not something an SEO can decide by him- or herself; it is something that the founders, CEO, and directors of a company can inform.

Oftentimes a small-to-medium business can achieve this with a long conversation among those company figures, in which they are asked to directly define the idealistic "why?" of their company.

In the case of bigger companies, defining an archetype can seem almost impossible, but the company's own history and hidden-treasure pages like "About Us" can offer clear inspiration.

Look at REI:

Clearly, the archetypal figure that best fits REI is the Explorer.

Then, by using the information we retrieve when creating the psychographics of our audience and buyer personas, matching it with the characteristics of each archetype, and comparing it with the brand's own core values, we can start to understand our archetype and narrative mode. If we look at REI's audience, we will see that it also has a certain affinity with the Everyman archetypal figure (which also explains why REI dedicates great attention to families as an audience).

Once we have defined the best archetype commonly shared by our company and our audience, we must translate this figure and its symbolism into brand storytelling, which on a website includes design, especially the following:

  • Color palette, because colors have a direct relation to psychological reactions (see this article, especially all the sources it links to)
  • Images, considering that in user-centric marketing the ideal is always to represent our targeted audience (or a credible approximation) as the main characters. I am talking about the so-called "hero shots" that Angie Schottmuller brilliantly discussed in the deck I embed here below:

If you want to dig deeper into the meaning and value of symbols worldwide, I suggest you become a member of Aras.org or buy the Book of Symbols curated by Aras.

  • Define the best narrative mode to use. REI, again, does this well, using the Idyllic mode, where the Hero explores and becomes part of an ideal society (the REI community, which literally means becoming a member of REI).

We should, then:

  1. Continue investigating the archetypal nature of our audience by conducting surveys;
  2. Analyze the demographic data Google Analytics offers us about our users;
  3. Use GA insights in combination with the data and demographic information offered by social networks' ad platforms, in order to create not only an interest graph of our audience but also to understand the psychology behind those interests;
  4. Run A/B tests to see whether symbols, images, and copywriting based on the targeted archetypes work better, and whether we have chosen the correct archetype (see the sketch below).
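To make that last step concrete, here is a minimal sketch of how an archetype A/B test could be evaluated. The scenario, conversion counts, and variant labels are hypothetical; the statistics are a standard chi-square test of independence using SciPy.

```python
# Minimal sketch: comparing an "Explorer" hero-shot variant against the
# current page in an A/B test. All numbers below are hypothetical.
from scipy.stats import chi2_contingency

# [conversions, non-conversions] per variant (hypothetical data)
control = [132, 4868]   # original page
variant = [171, 4829]   # page with Explorer-archetype imagery and copy

# Chi-square test of independence on the 2x2 contingency table
chi2, p_value, dof, expected = chi2_contingency([control, variant])

control_rate = control[0] / sum(control)
variant_rate = variant[0] / sum(variant)

print(f"Control conversion rate: {control_rate:.2%}")
print(f"Variant conversion rate: {variant_rate:.2%}")
print(f"p-value: {p_value:.4f}")

# A small p-value (commonly < 0.05) suggests the difference is unlikely to
# be due to chance; only then attribute the lift to the archetype change.
if p_value < 0.05:
    print("Significant difference: the archetype-driven variant behaves differently.")
else:
    print("No significant difference; keep testing before switching archetypes.")
```

If the p-value is large, keep collecting data or revisit the archetype hypothesis rather than declaring a winner.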


My Favorite 5 Analytics Dashboards – Whiteboard Friday

Posted by Sixthman

Finding effective ways of organizing your analytics dashboards is quite a bit easier if you can get a sense for what has worked for others. To that end, in today’s Whiteboard Friday the founder of Sixth Man Marketing, Ed Reese, shares his five favorite approaches.

UPDATE: At the request of several commenters, Ed has generously provided GA templates for these dashboards. Check out the links in his comment below!

For reference, here’s a still of this week’s whiteboard!

Video transcription

Hi, I’m Ed Reese with Sixth Man Marketing and Local U. Welcome to this edition of Whiteboard Friday. Today we’re going to talk about one of my favorite things in terms of Google Analytics — the dashboard.

So think of your dashboard like the dashboard on your car — what’s important to you and what’s important to your client. I have the new Tesla dashboard, you might recognize it. So, for my Tesla dashboard, I want navigation, tunes, calendar, everything and a bag of chips. You notice my hands are not on the wheel because it drives itself now. Awesome.

So, what's important? I have the top five dashboards that I like to share with my clients and create for them: the executive dashboard, one for the CMO on the marketing side, new markets, content, and a tech check. You can actually create dashboards to make sure that everything is working.

These on the side are some of the few that I think people don’t take a look at as often. It’s my opinion that we have a lot of very generic dashboards, so I like to really dive in and see what we can learn so that your client can really start using them for their advantage.

#1 – Executives

Let’s start with the executive dashboard. There is a lot of debate on whether or not to go from left to right or right to left. So in terms of outcome, behavior, and acquisition, Google Analytics gives you those areas. They don’t mark them as these three categories, but I follow Avinash’s language and the language that GA uses.

When you’re talking to executives or CFOs, it’s my personal opinion that executives always want to see the money first. So focus on financials, conversion rates, number of sales, number of leads. They don’t want to go through the marketing first and then get to the numbers. Just give them what they want. On a dashboard, they’re seeing that first.

So let's start with the result and then go back to behavior. Now, this is where a lot of people have very generic metrics — pages viewed, generic bounce rate, very broad metrics. To really dive in, I like focusing and using the filters to go to specific areas on the site. So if it's a destination like a hotel, "Oh, are they viewing the pages that helped them get there? Are they looking at the directional information? Are they viewing discounts and sorts of packages?" Think of the behavior on those types of pages you want to measure, and then reverse engineer. That way you can tell the executive, "Hey, this hotel reservation viewed these packages, which came from these sources, campaigns, search, and social." Remember, you're building it so that they can view it for themselves and really take advantage and see, "Oh, that's working, and this campaign from this source had these behaviors that generated a reservation," in that example.
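For instance, the kind of page-level filter described above can be prototyped outside GA with a simple regular expression over page paths. The path structure below is hypothetical; substitute the sections that matter on your own site.

```python
# Minimal sketch: the kind of page-path filter a dashboard widget applies,
# expressed as a regular expression. The paths here are hypothetical.
import re

# Only count pages under the package or discount sections of the site
PACKAGE_PAGES = re.compile(r"^/(packages|discounts)/")

pages = ["/packages/romance-getaway", "/blog/news", "/discounts/winter"]
matched = [p for p in pages if PACKAGE_PAGES.match(p)]
print(matched)  # ['/packages/romance-getaway', '/discounts/winter']
```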

#2 – CMO

Now, let’s look at it from a marketing perspective. You want to help make them look awesome. So I like to reverse it and start with the marketing side in terms of acquisition, then go to behavior on the website, and then end up with the same financials — money, conversion rate percentages, number of leads, number of hotel rooms booked, etc. I like to get really, really focused.

So when you’re building a dashboard for a CMO or anyone on the marketing side, talk to them about what metrics matter. What do they really want to learn? A lot of times you need to know their exact territory and really fine tune it in to figure out exactly what they want to find out.

Again, I’m a huge fan of filters. What behavior matters? So for example, one of our clients is Beardbrand. They sell beard oil and they support the Urban Beardsman. We know that their main markets are New York, Texas, California, and the Pacific Northwest. So we could have a very broad regional focus for acquisition, but we don’t. We know where their audience lives, we know what type of behavior they like, and ultimately what type of behavior on the website influences purchases.

So really think from a marketing perspective, “How do we want to measure the acquisition to the behavior on the website and ultimately what does that create?”

These are pretty common, so I think most people are using a marketing and executive dashboard. Here are some that have really made a huge difference for clients of ours.

#3 – New markets

Love new market dashboards. Let's say, for example, you're a hotel chain and you normally have people visiting your site from Washington, Oregon, Idaho, and Montana. Well, in our case we had those excluded, and we were looking at broader states: Hawaii, Alaska, Colorado, Texas. Not normally people who would come to this particular hotel.

Well, we discovered in the dashboard — and it was actually the client that discovered it — that we suddenly had a 6000% increase in Hawaii. They called me and said, "Are we marketing to Hawaii?" I said no. They said, "Well, according to the dashboard, we've had 193 room nights in the past 2 months." Like, "Wow, 193 room nights from Hawaii, what happened?" So we started reverse engineering that, and we found out that Allegiant Airlines suddenly had a direct flight from Honolulu to Spokane, and the hotel in this case was two miles from the airport. They could then do paid search campaigns in Hawaii. They can try to connect with Allegiant to co-op some advertising and some messaging. Boom. Would never have been discovered without that dashboard.
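If you want to catch spikes like that without waiting for someone to eyeball the dashboard, a small script comparing sessions by region across two periods can flag them automatically. This sketch assumes a CSV exported from GA; the column names and the 500% threshold are hypothetical and should be adapted to your own export.

```python
# Minimal sketch: flagging unusual regional growth from exported GA data.
# Column names ("region", "sessions_prev", "sessions_curr") are hypothetical;
# adjust them to match your actual export.
import pandas as pd

df = pd.read_csv("sessions_by_region.csv")

# Percent change per region between the two periods; clip avoids dividing
# by zero for regions with no prior traffic
df["pct_change"] = (
    (df["sessions_curr"] - df["sessions_prev"])
    / df["sessions_prev"].clip(lower=1) * 100
)

# Flag regions that grew dramatically (threshold is arbitrary; tune to taste)
spikes = df[df["pct_change"] > 500].sort_values("pct_change", ascending=False)

print(spikes[["region", "sessions_prev", "sessions_curr", "pct_change"]])
```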

#4 – Top content

Another example: top content. Again, going back to Beardbrand, they have a site called the Urban Beardsman, and they publish a lot of content for help articles, videos, and tutorials. Measuring that content is really important, because they're putting a lot of work into educating their market and the new people who are growing beards and using their product. They want to know, "Is it worth it?" They're hiring photographers, they're hiring writers, and we're able to see if people are reading the content they're providing. Ultimately, we're focusing much more on their content on the behavior side and then figuring out what that outcome is.

A lot of people have content or blog views as part of an overall dashboard, let's say for your CMO. I'm a big fan of, in addition to having that, also having a very specific content dashboard so you can see your top blog posts. Whatever content you provide, I want you to always know what it's driving on your website.

#5 – Tech check

One of the things that I’ve never heard anyone talk about before, that we use all the time, is a tech check. So we want to see a setup so we can view mobile, tablet, desktop, browsers. What are your gaps? Where is your site possibly not being used to its fullest potential? Are there any issues with shopping carts? Where do they fall off on your website? Set up any possible tech that you can track. I’m a big fan of looking both on the mobile, tablet, any type of desktop, browsers especially to see where they’re falling off. For a lot of our clients, we’ll have two, three, or four different tech dashboards. Get them to the technical person on the client side so they can immediately see if there’s an issue. If they’ve updated the website, but maybe they forgot to update a certain portion of it, they’ve got a technical issue, and the dashboard can help detect that.
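If you also want this breakdown outside the GA interface, the same device and browser segments can be pulled programmatically. The sketch below assumes a Universal Analytics view and the Reporting API v4, with the google-api-python-client and google-auth packages installed; the view ID and key-file path are placeholders.

```python
# Minimal sketch: pulling sessions and transactions split by device and
# browser from Universal Analytics, roughly mirroring a "tech check"
# dashboard. VIEW_ID and the key file path are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
KEY_FILE = "service-account-key.json"  # placeholder path
VIEW_ID = "XXXXXXXX"                   # placeholder GA view ID

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
            "metrics": [
                {"expression": "ga:sessions"},
                {"expression": "ga:transactions"},
            ],
            "dimensions": [
                {"name": "ga:deviceCategory"},
                {"name": "ga:browser"},
            ],
        }]
    }
).execute()

# Print each device/browser combination with its sessions and transactions,
# which makes gaps (e.g., a browser with sessions but no sales) obvious.
for row in response["reports"][0]["data"]["rows"]:
    device, browser = row["dimensions"]
    sessions, transactions = row["metrics"][0]["values"]
    print(f"{device:>8} | {browser:<20} sessions={sessions} transactions={transactions}")
```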

So these are just a few. I’m a huge fan of dashboards. They’re very powerful. But the big key is to make sure that not only you, but your client understands how to use them, and they use them on a regular basis.

I hope that’s been very helpful. Again, I’m Ed Reese, and these are my top five dashboards. Thanks.

Video transcription by Speechpad.com
