Well that’s a wrap! What a successful night for the ‘Go Global. Think Local’ seminar held at the Excelsior Hotel in Hong Kong on March 6, 2018. The event was organized by, and fielded presentations from, SmartOSC, Shopify Plus, Lumos and dotmailer – represented by Founder and President Tink Taylor. It gave the more than 50 attendees a platform to gain insights and give feedback on what works and what doesn’t when it comes to scaling a business with a customer-first mentality: offering personalized experiences for your customers while still developing cross-border ecommerce.
The seminar in Hong Kong included presentations from SmartOSC, Shopify Plus, Lumos and dotmailer
Tink Taylor kicked off the presentations, using this opportunity to educate brands on how to create human conversations at scale. It’s no longer effective to assume that all customers will respond to the same content, or that casting a wide net will generate the largest number of leads. To meet the rising expectations of consumers and their demands for a personalized experience, Tink said companies must begin incorporating the use of technology and data into their marketing to create “more natural, more human marketing.”
‘One size fits all’ marketing is becoming rightfully outdated, and although it’s still prevalent today, Tink explained that “times have changed. These one-size-fits-all adverts no longer work”. Consumers provide businesses with an enormous amount of data, but Tink explained that the exchange doesn’t stop there, as “consumers expect brands to speak to them about what they want and on their terms”.
Businesses need to be offering conversations across all channels, with the real value being drawn from “feeding the data from one channel into the others in combination”. This gives the consumer the power to choose how they are contacted and communicated with. Tink commented that if you do this correctly “it will be easy for you to have human conversations at scale”.
Tink Taylor, dotmailer Founder and President, addresses delegates in Hong Kong
Tink Taylor was followed by Jeff Chen from Lumos, Duong Bui from SmartOSC and Shopify Plus’ Leo Park.
I would like to thank all of those in attendance and dotmailer’s co-hosts; it was a brilliant event to be involved with and there’s no doubt that the information presented will help businesses in their quest to “Go Global. Think Local”.
In Part 1 we covered raising awareness, data audits and privacy notices, while in Part 2 we covered how GDPR deals with individuals’ rights, including subject access requests and legal basis. In the last instalment, we reviewed consent, marketing to children and data breaches. The last three things to think about are data protection impact assessments, data protection officers and international considerations.
10. Data Protection Impact Assessments
It has always been best practice to take a privacy-by-design approach when developing your data capture and processing strategies, and it is a key part of any technology implementation. Privacy impact assessments are fundamental to this approach, giving marketers a useful tool to properly consider the privacy risks that their data processing entails. All the GDPR does here is make privacy by design an express legal requirement and make PIAs (renamed in the regulation as Data Protection Impact Assessments, or DPIAs) mandatory in certain circumstances where the processing is likely to result in high risk to data subjects, such as:
where new technology is being deployed
where a processing activity is likely to significantly impact individuals
where there is large-scale processing on special categories of data
For most marketers, it will be the first two circumstances that are most likely to trigger a DPIA, but it is important to know the special categories of data in case they become relevant in the future.
In many if not most situations, the DPIA will indicate that the processing of the data is not high risk or if it is high risk, you will be able to address those risks. If you cannot mitigate the risk, you should contact the ICO for guidance on whether processing the data will comply with GDPR.
If you haven’t already, you should start to assess whether any DPIAs are warranted within your organisation, who will lead them and who else needs to be involved. There is great guidance published by both the UK ICO and the Article 29 Working Party on DPIAs and privacy by design.
11. Data Protection Officers
US President Harry S. Truman had a sign on his desk that read “the buck stops here.” It was his assurance that he was ultimately responsible for how the government operated under his administration. Historically, when it comes to data, the buck has not stopped anywhere, because the collection and processing of data has grown organically within businesses and other organisations. I was speaking recently with one head of CRM who told me their organisation currently has more than 80 marketing databases. It will come down to this CRM manager to get all of that data into a single place.
Every organisation should designate someone to “take the data buck” – to be ultimately responsible for data privacy and compliance. You should also think about where this role of Data Protection Officer (DPO) sits within the organisation and its overall governance structures, so that the person in this role has the freedom to act should the need arise. In many instances, the GDPR removes the ambiguity by specifying situations where a DPO is required, such as:
organisations that carry out large scale, regular and systematic monitoring of individuals
organisations that carry out large scale processing of special categories of data
12. International Considerations

The first thing to remember here is that Brexit will have little to no impact on GDPR. The government has confirmed on multiple occasions, including as recently as the Queen’s Speech on 21 June 2017, that GDPR will be the data protection law in the UK going forward. Moreover, the UK will still be an EU member when the law goes into effect on 25 May 2018.
If you operate in multiple EU member states, then you should determine which would be your lead data regulator. This is not meant to be a way to be under the auspices of the most favourable regulator. Your lead regulator should be the state where your central administration in the EU is based or the location where decisions about your data processing are taken. You can do this by mapping out where you take your data processing decisions and the country with the preponderance of those decisions is the one you should choose. If on the other hand you are not engaged in any cross border data processing, then your decision here is quite straightforward. Once again, the Article 29 Working Party has produced some guidance that will help you make the correct decision.
As I said at the beginning of part 1, data recently released by the DMA indicates that marketers are feeling less prepared for GDPR than they did in February. Marketers are also feeling less knowledgeable about GDPR in general and their four big concerns are:
Implementing a compliant system
I hope that this blog series has gone a little way to making you feel more prepared or at least has given you some things to think about and some things to start discussing internally. Over the coming weeks and months, dotmailer will be publishing useful guidance from recognised sources geared towards email marketers. Our approach is to keep our readers up to speed based on facts directly from this reputable guidance or vetted by the UK or other data regulators around Europe. In addition, our teams will be ready to help you implement the advice you receive from your professional advisors within the dotmailer environment.
In Part 1 we covered raising awareness, data audits and privacy notices. While in Part 2 we covered how GDPR deals with individuals’ rights including subject access requests and legal basis. In this week’s installment, we will be reviewing consent, marketing to children and data breaches.
Under the Privacy and Electronic Communications Regulations, email marketing is consent-based. GDPR, however, more fully defines how to get consent, with the following stipulations:
Freely given – give people genuine choice and control over how you use their data and “unbundle” consent from other terms and conditions; in other words, consent cannot be a precondition for a service unless it is necessary to deliver that service.
Specific – clearly explain exactly what people are consenting to in a way they can easily understand (i.e. not with a load of legal mumbo jumbo) and in a way that does not disrupt the user experience.
Informed – clearly identify yourself as the data controller, identify each processing operation you will be performing, collect separate consent for each unless this would be “unduly disruptive or confusing”, describe the reason behind each data processing operation, and notify people of their right to withdraw consent at any time.
Unambiguous – it must be clear that the person has consented and what they have consented to with an affirmative action (i.e. no pre-checked boxes). Therefore, silence would not be a valid form of consent.
In the last instalment, we talked about deciding on the legal basis you will use to process your marketing data. Consent is not your only option. That said, it is always a good idea to know the source of all of your data, how that data flows through your various systems and what consent you have for the processing of that data. The ICO has published detailed guidance on consent and has produced a consent checklist to help you review your current practices.
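To make the four stipulations concrete, here is a minimal sketch of the kind of consent record that would let you evidence each one later. The class and field names are hypothetical, not taken from any particular platform.

```python
# Hypothetical consent record; names and fields are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    email: str
    purpose: str                # one record per processing operation ("specific")
    source: str                 # where consent was captured ("informed")
    affirmative_action: bool    # e.g. ticked an unchecked box ("unambiguous")
    bundled_with_terms: bool    # must be False ("freely given")
    captured_at: datetime = field(default_factory=datetime.utcnow)
    withdrawn_at: Optional[datetime] = None

    def is_valid(self) -> bool:
        """Consent only holds if it was affirmative, unbundled and not withdrawn."""
        return (self.affirmative_action
                and not self.bundled_with_terms
                and self.withdrawn_at is None)
```

Keeping one record per processing operation also makes it straightforward to honour a withdrawal for one purpose without touching the others.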
For the first time, the GDPR specifically calls out the rights of children and offers special protection for their personal data in the digital world. If you offer what the GDPR calls “information society services” to children and you rely on consent to process their data, you may have to get the permission of the parent or guardian before processing that child’s data. The GDPR sets the age at which a child can consent for themselves at 16, but the UK may lower this to 13. One interesting thing to note is that the parent or guardian’s consent expires when the child reaches the age at which they can give consent, so you will have to refresh their consent at that milestone.
9. Data Breaches
The GDPR makes it the responsibility of all organisations to issue notifications for certain types of data breaches. You will have to notify the ICO if the breach is likely to result in a risk to the rights and freedoms of individuals, such as financial loss, loss of confidentiality or other significant economic or social harm. If the risk is high, you may also have to notify the individuals directly. Now is the time to think about your policies and procedures for identifying and managing data breaches.
So far, we have given you a lot to think about and we hope you have gotten started. Check back soon for our last instalment, where we will look at privacy by design, data protection officers and international considerations.
In Part 1, we covered raising awareness, data audits and privacy notices.
4. Individuals’ Rights
Just ‘getting ready’ for GDPR is not going to be good enough, because you may also have to prove to the regulator that you are ready for GDPR. One critical proof point will be the decisions you make in getting ready for GDPR, as well as what you will do going forward after its implementation. Get in the habit now of documenting all of your decisions and the deliberations that went into them (more on this under the Privacy by Design section). You will also need clearly defined and documented policies and procedures to comply with GDPR. These cannot be the kind of documents that are written and then live in a cupboard just in case something goes wrong; rather, they need to be distributed to staff in a useful format, with accompanying training, so that the processes become habit within your organisation.
One area that is very well suited to this is protecting individuals’ rights. Most of the rights under GDPR are not that different from those under the DPA, but now is a good time to ensure that you have your documentation in order. It is also a good time to ensure that your procedures will be compliant around things like correcting data and subject access requests.
5. Subject Access Requests
While we are on the topic of subject access requests, these are changing under GDPR. First, the downside: you will no longer be able to charge for them, and you will have to reply within 30 rather than 40 days. You will also have to provide some metadata along with the data subject’s own data, such as your data retention periods and many of the other things covered under the notices provision.
The good news is that you can charge for or refuse excessive requests (e.g. ones that are too frequent), and you can ask the data subject to specify the data they are looking for if you process large amounts of data. You will also be able to provide the data electronically in many cases.
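The shortened reply window is easy to track programmatically; here is a minimal sketch (the function name is my own, and the dates in the comment are illustrative):

```python
# Sketch of tracking the GDPR subject-access-request reply window
# (30 days under GDPR, down from the DPA's 40). Function name is illustrative.
from datetime import date, timedelta

def sar_deadline(received: date, window_days: int = 30) -> date:
    """Latest date by which a subject access request should be answered."""
    return received + timedelta(days=window_days)

# e.g. a request received on 1 June 2018 would be due by 1 July 2018
```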
6. Legal Basis
Under the GDPR, the legal basis for processing data is all-important, because individuals’ rights can change depending on the legal basis you determine for processing the data. It will be important for businesses to balance the requirements of consent against the legitimate-interests basis that the GDPR provides for. The other legal basis that many email marketers will rely on is the data subject’s consent.
That puts us halfway through the twelve things you should be thinking about to prepare for GDPR. Check back soon for the next two instalments.
Editor’s note: The materials and information above are not intended to convey or constitute legal advice. You should seek your own advice specific to your business’s requirements.
So, here we are. There are less than 12 months to go until the implementation date of the new General Data Protection Regulation (GDPR) on 25th May 2018.
It would be great to say that all UK businesses are well on their way to being ready, but data from the DMA released at an event this morning tells a different story.
Marketers are feeling less confident about GDPR than they did in February, when 68% of businesses said they were ‘on course’ or ‘ahead’ of plans to be GDPR-compliant by May 2018. Since that survey, the ICO and the Article 29 Working Party have issued guidance and discussion documents that bring businesses greater clarity around what GDPR compliance will entail. This greater clarity has caused respondents to reassess their positions:
Only 55% of companies feel they are now ‘on course’ or ‘ahead’ of plans to meet the May 2018 deadline.
Marketers’ perception of their knowledge as ‘good’ rather than ‘basic’ has slipped from 66% to 59%.
Marketers’ sense of being ‘extremely’ or ‘somewhat’ prepared has fallen from 71% to 61%.
What has not changed is marketers’ four big GDPR-related concerns:
Implementing a compliant system
So what should you be thinking about? Here are 12 things to get you started.
If you are the only person in your organization who is thinking about GDPR, you could be in big, big trouble. This is a major change to the legislative regime in which your business operates, so key people not only need to be made aware of the revisions your business will need to make, they also need to be made to care.
As one of the speakers at this morning’s DMA event pointed out, good data practitioners already have the proper use of data on their radar; much of what the GDPR contains could therefore be considered business as usual. By stressing that this attention to data now helps the business comply with the new GDPR regulations, you may be able to obtain more budget for your undertaking.
While I am sure this is true in some cases, I know that for many companies GDPR will represent a radical change in how they do business. It is critical that senior management is made aware of the impact sooner rather than later, and that all members of staff are trained and brought up to speed on the changes over the next twelve months.
While you are running your internal PR campaign, you can also be talking to all of the people who have databases squirrelled away here, there and everywhere; these will need to be examined. Among other things, you need to fully document:
What data you hold
Where you obtained it
When it was acquired
How often it is updated
All of the places it is stored within your organization
How the data flows from one place to another
Who has access to the data throughout its journey
How it is stored
Where it is stored
The retention policy for each datum
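To give a sense of what such documentation might look like, here is a hypothetical inventory entry covering the points above, written as a Python dictionary; every key and value is illustrative, not a standard schema.

```python
# One illustrative entry in a data-audit inventory; all values are made up.
inventory_entry = {
    "dataset": "newsletter_subscribers",
    "data_held": ["email", "first_name", "signup_ip"],
    "obtained_from": "website signup form",
    "acquired_on": "2017-03-14",
    "update_frequency": "on each profile edit",
    "stored_in": ["CRM", "email platform", "data warehouse"],
    "data_flows": ["CRM -> email platform (nightly sync)"],
    "access": ["marketing team", "data engineering"],
    "storage_location": "EU (Ireland)",
    "retention_policy": "delete 24 months after last engagement",
}
```

One entry per database makes gaps obvious: any field you cannot fill in is a question to answer before May 2018.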
One of the things that will most likely have to change for most UK businesses under GDPR is their privacy notices. Being open, honest and transparent with consumers about what data you are collecting, why, how you will be using it, and how you will take care of it has been a core principle of data protection law since the Data Protection Act 1998. What has changed, however, is that the legislators feel that data owners have not always done this to the best of their ability. They have therefore given us more detailed instructions as to what openness, honesty and transparency entail in practice. The Information Commissioner’s Office (ICO) has released a great code of practice on privacy notices.
Check back next week for items 4-12 of the 12 things to think about before GDPR 2018.
You might be thinking that landing pages aren’t your job; after all, your main concern is probably how many people are opening the campaign and clicking. Opens and clicks only take you so far, though: they don’t tell you how much revenue your email campaign generated or how effective your email message was for meeting company goals. And those are the things that help you to reveal the true worth of email marketing within your business which, incidentally, is proving to be a lot. The latest DMA Marketer Tracker report has revealed that the average ROI for email is £30.01 for every £1 spent; not to be sniffed at!
How do you ensure that your email campaigns perform well once the recipient has clicked the CTA button? The answer is to serve them a super-relevant landing page, otherwise they’re likely to bounce. Don’t get us wrong: there are times when it’s okay to direct people to product pages, for instance, if you’ve made it obvious that that’s where you’re sending the user. But on most occasions it’s wise to think carefully about the onward journey.
If you’re wondering when you might adopt a dedicated landing page for your campaign, we’ve compiled a list of four common use cases. If time’s a worry, you should also check out the dotmailer landing pages add-on, which offers the seamless functionality of our drag-and-drop EasyEditor. (P.S. you can also download a free copy of our latest ‘Get more from your landing pages’ guide).
Collecting additional data
You might have someone’s name and email address but what else do you know about them? Landing pages are the ideal place to embed a form to gather more data on your contacts; for example, you might encourage them to provide their preferences for ongoing email content or register their interest for a soon-to-be-launched event or product.
Targeted offers and offerings
If you’re sending out a specific offer or set of offers, a dedicated page with more information than was previously available in the email can aid conversions. What’s more, ‘exclusive’ landing pages can be safely hidden from the search engines without damaging the rankings of permanent pages, so only those with the link can view them.
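Hiding a page from search engines is usually done with a robots meta tag or an X-Robots-Tag response header. Here is a minimal sketch; the helper names are my own, and how you attach the header or tag depends on your web stack.

```python
# Two common ways to keep an 'exclusive' landing page out of search indexes.
# Helper names are illustrative; wire them into your framework's responses.
from typing import Dict, Optional

def noindex_headers(extra: Optional[Dict[str, str]] = None) -> Dict[str, str]:
    """HTTP response headers telling crawlers not to index or follow the page."""
    headers = {"X-Robots-Tag": "noindex, nofollow"}
    if extra:
        headers.update(extra)
    return headers

def robots_meta_tag() -> str:
    """Equivalent directive placed in the landing page's <head>."""
    return '<meta name="robots" content="noindex, nofollow">'
```

Either mechanism keeps the page reachable for anyone with the link while excluding it from search results.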
Tracking individuals’ interest and intents
Landing pages facilitate an understanding of which customers are the most engaged with your brand by tracking the re-engagements of existing leads. This also means you can collect more information on customers’ preferences and online behavior, which is handy for sales.
Measuring success of marketing campaigns
Each landing page serves as a data asset for your marketing campaign, enabling you to get insight into its performance. A landing page created for a specific marketing campaign will allow you to understand the strength of your proposition – for example, the email might’ve done a great job of luring them in but the detailed landing page could be a total turn-off. On the other hand, the landing page could seal the deal and you’ll want to replicate those successes in future campaigns.
Why not get your hands on our free landing pages guide for the latest landing page advice – including 10 tips for optimization:
On 12 August we were accepted for the U.S. Department of Commerce’s voluntary privacy certification program. The news is a great milestone for dotmailer, because it recognizes the years of work we’ve put into protecting our customers’ data and privacy. For instance, just look at our comprehensive trust center and involvement in both the International Association of Privacy Professionals (IAPP) and Email Sender & Provider Coalition (ESPC).
To become certified our Chief Privacy Officer, James Koons, made the application to the U.S. Department of Commerce, who audited dotmailer’s privacy statement. (Interesting fact: James actually completed the application process while on vacation climbing Mt. Rainier in Washington state!)
By self-certifying and agreeing to the Privacy Shield Principles, we make our commitment enforceable by the Federal Trade Commission (FTC).
What does it mean for you (our customers)?
As we continue to expand globally, this certification sets one more important privacy precedent. The recently finalized EU-U.S. Privacy Shield aims to provide businesses with stronger protection for the exchange of transatlantic data. If you haven’t seen it already, you might be interested in reading about the recent email privacy war between Microsoft and the U.S. government.
As a certified company, it means we must provide you with adequate privacy protection – a requirement for the transfer of personal data outside of the European Union under the EU Data Protection Directive. Each year, we must self-certify to the U.S. Department of Commerce’s International Trade Administration (ITA), to ensure we adhere to the Privacy Shield Principles.
What does our Chief Privacy Officer think?
James Koons, who has 20 years’ experience in the information systems and security industry, explained why he’s pleased about the news: “I am delighted that dotmailer has been recognized as a good steward of data through the Privacy Shield Certification.
“As a company that has a culture of privacy and security as its core, I believe the certification simply highlights the great work we have already been doing.”
What happened to the Safe Harbour agreement?
The EU-U.S. Privacy Shield replaces the former Safe Harbour agreement for transatlantic data transfers.
Want to know more about what the Privacy Shield means?
Every two years, Moz surveys the brightest minds in SEO and search marketing with a comprehensive set of questions meant to gauge the current workings of Google’s search algorithm. This year’s panel of experts possesses a truly unique set of knowledge and perspectives. We’re thankful on behalf of the entire community for their contribution.
In addition to asking the participants about what does and doesn’t work in Google’s ranking algorithm today, one of the most illuminating groups of questions asks the panel to predict the future of search – how the features of Google’s algorithm are expected to change over the next 12 months.
Amazingly, almost all of the factors that are expected to increase in influence revolve around user experience, including:
The experts predicted that more traditional ranking signals, such as those around links and URL structures, would largely remain the same, while the more manipulative aspects of SEO, like paid links and anchor text (which is subject to manipulation), would largely decrease in influence.
The survey also asks respondents to weight the importance of various factors within Google’s current ranking algorithm (on a scale of 1-10). Understanding these areas of importance helps to inform webmasters and marketers where to invest time and energy in working to improve the search presence of their websites.
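The averaged scores quoted in the sections that follow presumably come from rolling up per-respondent 1-10 ratings; a sketch of that roll-up with made-up numbers:

```python
# Illustrative roll-up of per-respondent 1-10 ratings into averaged
# influence scores; these numbers are invented, not Moz's raw data.
from statistics import mean

ratings = {
    "keyword present in title element": [9, 8, 8, 9, 7],
    "keyword in bold/italic elements": [4, 5, 3, 4, 5],
}

averages = {factor: round(mean(scores), 2) for factor, scores in ratings.items()}
```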
On-page keyword features
These features describe use of the keyword term/phrase in particular parts of the HTML code on the page (title element, H1s, alt attributes, etc).
Highest influence: Keyword present in title element, 8.34
Lowest influence: Keyword present in specific HTML elements (bold/italic/li/a/etc), 4.16
Titles are still very powerful. Overall, it’s about focus and matching query syntax. If your post is about airplane propellers but you go on a three paragraph rant about gorillas, you’re going to have a problem ranking for airplane propellers.
Keyword usage is vital to making the cut, but we don’t always see it correlate with ranking, because we’re only looking at what already made the cut. The page has to be relevant to appear for a query, IMO, but when it comes to how high the page ranks once it’s relevant, I think keywords have less impact than they once did. So, it’s a necessary but not sufficient condition to ranking.
In my experience, most problems with organic visibility are related to on-page factors. When I look for an opportunity, I check for two strong signals: presence of the keyword in the title and in the main content. Having both can speed up your visibility, especially on long-tail queries.
It’s very easy to link keyword-rich domains with their success in Google’s results for the given keyword. I’m always mindful of other signals that align with the domain name and may have contributed to its success. These include inbound links, mentions, and local citations.
These features describe link metrics for the individual ranking page (such as number of links, PageRank, etc).
Highest influence: Raw quantity of links from high-authority sites, 7.78
Lowest influence: Sentiment of the external links pointing to the page, 3.85
High-quality links still rule rankings. The way a brand can earn links has become more important over the years, whereas link schemes can hurt a site more than ever before. There is a lot of FUD slinging in this respect!
Similar to my thoughts on content, I suspect link-based metrics are going to be used increasingly with a focus on verisimilitude (whether content is actually true or not) and relationships between nodes in the Knowledge Graph. Google’s recent issues with things such as the snippet results for “evolution” highlight the importance of only pulling things that are factually correct for featured parts of a SERP. Thus, just counting traditional link metrics won’t cut it anymore.
These features describe elements that indicate qualities of branding and brand metrics.
Highest influence: Search volume for the brand/domain, 6.54
Lowest influence: Popularity of business’s official social media profiles, 3.99
This is clearly on deck to change very soon with the reintegration of Twitter into Google’s Real-Time Results. It will be interesting to see how this affects the “Breaking News” box and trending topics. Social influencers, quality and quantity of followers, RTs, and favorites will all be a factor. And what’s this?! Hashtags will be important again?! Have mercy!
It’s already noticeable; brands are more prominently displayed in search results for both informational and commercial queries. I’m expecting Google will be paying more attention to brand-related metrics from now on (and certainly more initiatives to encourage site owners to optimize for better entity detection).
These features relate to third-party metrics from social media sources (Facebook, Twitter, Google+, etc) for the ranking page.
Highest influence: Engagement with content/URL on social networks, 3.87
Lowest influence: Upvotes for the page on social sites, 2.7
Social ranking factors are important in a revamped Query Deserves Freshness algorithm. Essentially, if your content gets a lot of natural tweets, shares, and likes, it will rank prominently for a short period of time, until larger and more authoritative sites catch up.
Social popularity has several factors to consider: (1) Years ago, Google and Bing said they take into account the authority of a social profile sharing a link and the popularity of the link being shared (retweets/reshares), and there was more complexity to social signals that was never revealed even back then. (2) My experience has been that social links and shares have more power for newsy/fresh-type content. For example, a lot of social shares for a dentist’s office website wouldn’t be nearly as powerful (or relevant to consider) as a lot of social shares for an article on a site with a constant flow of fresh content.
Honestly, I do not think that the so-called “social signals” have any direct influence on the Google Algorithm (that does not mean that a correlation doesn’t exist, though). My only doubt is related to Twitter, because of the renewed contract between Google and Twitter itself. That said, as of now I do not consider Twitter to offer any ranking signals, except for very specific niches related to news and “news-able” content, where QDF plays a fundamental role.
These elements describe non-keyword-usage, non-link-metrics features of individual pages (such as length of the page, load speed, etc).
Highest influence: Uniqueness of the content on the page, 7.85
Lowest influence: Page contains Open Graph data and/or Twitter cards, 3.64
By branching mobile search off of Google’s core ranking algorithm, having a “mobile-friendly” website is probably now less important for desktop search rankings. Our clients are seeing an ever-increasing percentage of organic search traffic coming from mobile devices, though (particularly in retail), so this is certainly not an excuse to ignore responsive design – the opposite, in fact. Click-through rate from the SERPs has been an important ranking signal for a long time and continues to be, flagging irrelevant or poor-quality search listings.
I believe many of these will be measured within the ecosystem, rather than absolutely. For example, the effect of bounce rate (or rather, bounce speed) on a site will be relative to the bounce speeds on other pages in similar positions for similar terms.
I want to answer these a certain way because, while I have been told by Google what matters to them, what I see in the SERPs does not back up what Google claims they want. There are a lot of sites out there with horrible UX that rank in the top three. While I believe it’s really important for conversion and to bring customers back, I don’t feel as though Google is all that concerned, based on the sites that rank highly. Additionally, Google practically screams “unique content,” yet sites that more or less steal and republish content from other sites are still ranking highly. What I think should matter to Google doesn’t seem to matter to them, based on the results they give me.
These features describe link metrics about the domain hosting the page.
Highest influence: Quantity of unique linking domains to the domain, 7.45
Lowest influence: Sentiment of the external links pointing to the site, 3.91
Quantity and quality of unique linking domains at the domain level are still among the most significant factors in determining how a domain will perform as a whole in the organic search results, and are among the best SEO “spot checks” for determining whether a site will be successful relative to other competitor sites with similar content and selling points.
Throughout this survey, when I say “no direct influence,” this is interchangeable with “no direct positive influence.” For example, I’ve marked exact match domain as low numbers, while their actual influence may be higher – though negatively.
Topical relevancy has, in my opinion, gained much ground as a relevant ranking factor. Although I find it most at play when at page level, I am seeing significant shifts at overall domain relevancy, by long-tail growth or by topically-relevant domains linking to sites. One way I judge such movements is the growth of the long-tail relevant to the subject or ranking, when neither anchor text (exact match or synonyms) nor exact phrase is used in a site’s content, yet it still ranks very highly for long-tail and mid-tail synonyms.
These features relate to the entire root domain, but don’t directly describe link- or keyword-based elements. Instead, they relate to things like the length of the domain name in characters.
Highest influence: Uniqueness of content across the whole site, 7.52
Lowest influence: Length of time until domain name expires, 2.45
Character length of domain name is, in my opinion, another correlative rather than causative factor. Google doesn’t need to rule long domain names out directly – longer domain names simply tend to earn fewer clicks, so they get filtered out quickly.
A few points:
Google’s document inception date patents describe how Google might handle freshness and maturity of content for a query.
The “trust signal” pages sound like a site quality metric that Google might use to score a page on the basis of site quality.
Some white papers from Microsoft on web spam signals identified multiple hyphens in subdomains as evidence of web spam.
The length of time until the domain expires was cited as a potential signal in Google’s patent on information retrieval through historic data, and was refuted by Matt Cutts after domain sellers started trying to use that information to sell domain extensions to “help the SEO” of a site.
I think that page speed only becomes a factor when it is significantly slow. I think that having error pages on the site doesn’t matter, unless there are so many that it greatly impacts Google’s ability to crawl.
Mobile will continue to increase, with directly-related factors increasing as well. Structured data will increase, along with more data partners and user segmentation/personalization of SERPs to match query intent, localization, and device-specific need states.
I really think that over the next 12-18 months we are going to see a larger impact of structured data in the SERPs. In fact, we are already seeing this. Google has teams that focus on artificial intelligence and machine learning. They are studying “relationships of interest” and, at the heart of what they are doing, are still looking to provide the most relevant result in the quickest fashion. Things like schema that help “educate” the search engines as to a given topic or entity are only going to become more important as a result.
Finally, we leave you with this infographic created by Kevin Engle which shows the relative weighting of broad areas of Google’s algorithm, according to the experts.
What’s your opinion on the future of search and SEO? Let us know in the comments below.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Ever mass-deleted a bunch of impersonal emails from your inbox? Brand fatigue is a real threat to your marketing strategy. In today’s Whiteboard Friday, Rand discusses why brands become “background noise” and how you can avoid it.
For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat a little bit about why no one is paying attention to your brand, to your marketing. It’s the perilous pitfall of brand fatigue.
Brand fatigue sucks
So you have all had this happen to you. I promise you have. It’s happened in your email. It’s happened in your social streams. It’s happened through advertising in the real world, online and offline.
I’ll give you an illustration. So I sign up for this newsletter. I decide, “Hey, I want to get some houseplants. My house has no greenery in it.” So I sign up for Green Dude Houseplants’ newsletter. What do I get? Well, I get a, “Welcome to Our Newsletter.” Oh, okay.
And then maybe the next day I get, “Meet Our New Hires.” Meet our new hires? I’m sure that your new hires are very important to you and your team, but I just got introduced to your brand. I’m not sure I care that much. To me, you’re all new hires. You might as well be, right? I don’t know you or the team yet.
“Best Summer Ever Event,” okay, maybe, maybe an event. “Edible Backyard Gardens,” you know, I don’t have a backyard. I was signing up for a houseplant newsletter because the plants go in my house. “See Us at the Garden Show,” I don’t want to go to the garden show. I was going to buy from you. That’s why I’m online.
How to cause brand fatigue
It’s not just the value of the messaging. It’s also the frequency at which it happens. You’ve seen this. I’m on an email list that I signed up for, I think it’s called FounderDating. It’s here in Seattle. I think it’s in San Francisco too. I thought it was a really cool idea when I signed up for it. Then I have just been inundated with messages from them. I think some of them are actually worthy of my participation, like I should have gone to the forum. I should have replied. I should have checked out what this particular person wanted. But I get so much email from them that I’ve just begun to hit Delete as soon as I get it.
We’ve actually had this problem at Moz too. If you’re a Moz subscriber, you probably get a new email every time a new crawl is completed, and a campaign is set up, and you have new rankings data. Some of that’s really important, right? Like if you’re paying attention to this particular site’s rankings and you want to see every time you get an update, well yeah, you need that email. But it’s actually kind of tough to opt in to which ones you want and with what frequency and control it all from one place.
We have found that our email open rates, engagement rates have actually drifted way, way down over time because, probably, we’ve inundated you with so much email. This is a big mistake that Moz has made in our email marketing, but a lot of brands make it in tons of places. So I want to help you avoid that.
1) Too many messages on a medium
Brand fatigue happens when there are too many messages, just too many raw messages on a medium. You start to see the same brand, the same name, the same person again and again. Their logo, their colors, the association you have, it just becomes background noise. Your brain goes into this mode where it just filters it out because it can’t handle the volume of stuff that’s coming through. It needs a filtration mechanism. So it starts to identify and associate your brand or your logo or your name or a person’s name with “filter.” Filter that out. That goes in the background.
2) Value provided is too low or infrequent to deserve attention
It also happens when the value provided is too low or too infrequent to deserve attention. So this might be what I’m talking about with FounderDating. One out of every maybe five or six messages, I’m like, “Oh yeah, that was interesting. I should pay attention to that.” But when it becomes too infrequent, that same filtration happens.
Too few of the high-value messages means you’re not going to pay attention; you’re not going to engage with that brand, with that company anymore. All of us marketers will see that in the engagement rates. No matter the medium, we can look at our numbers and see that those are going down on a percentage basis, and that gets really frustrating.
3) The messaging can’t be effectively tuned or controlled by the user
So this is the problem that Moz is having, where we don’t have that one email control center where you say how often you want exactly which messages updating you of which notifications about which campaigns, and newsletters, and so on. So your message frequency is either high or very high, and you think, “I don’t like any of those options.”
How NOT to cause brand fatigue
Now, I do have some solutions and suggestions. But it’s platform by platform.
Start very conservative and highly personal with your email marketing. In fact, I would actually recommend personally sending all the messages out to your first few hundred users if you possibly can, because you will develop a great rapport individually, person by person. That will give you a sense for what your audiences like and what kind of messaging they prefer, and they’ll know they can reply directly to you.
You’ll create that highly-engaged experience through email that will mean that, as you scale, you have the experience from the past to tell you how often you can and can’t email people, what they care about and don’t, what they filter and don’t, what they’re looking for from you, etc. You can then watch your open, unsubscribe and engagement rates through your email program. No matter what program you might be using, you can almost always see these.
Then you can watch for, “Oh, we had a spike.” That spike is a good thing. That means that people were highly engaged on this email. Let’s figure out what resonated there. Let’s go talk to folks. Let’s reach out to the people who engaged with it and just say, “Hey, why did you love this? What did you love about it? What can we do to give you more value like this?”
Or you watch for dips. Then you can say, “Oh man, the last three email newsletters that we’ve sent out, we’ve seen successive declines in engagement and open rates, and we’ve seen a rise in unsubscribe rates. We’re doing something wrong. What’s going on? What’s the root cause? Is it who we’re acquiring? Is it new people that signed up, or is it old-timers who are getting frustrated with the new stuff we’re sending out? Does this fit with our strategy? What can we fix?”
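That "three newsletters in a row" warning sign is easy to automate. Here is a minimal sketch, assuming you can export per-send open rates from your email program; the `flag_engagement_trend` helper and the sample numbers are hypothetical:

```python
def flag_engagement_trend(open_rates, dips=3):
    """Return True when the most recent `dips` sends each show a
    successive decline in open rate -- the signal Rand describes as
    a cue to investigate the root cause."""
    recent = open_rates[-(dips + 1):]
    if len(recent) < dips + 1:
        return False  # not enough history to judge a trend yet
    declines = [b < a for a, b in zip(recent, recent[1:])]
    return all(declines)

history = [0.34, 0.33, 0.29, 0.26, 0.22]  # hypothetical open rates per send
print(flag_engagement_trend(history))  # True: three successive declines
```

The same shape of check works for unsubscribe rates (flag successive rises instead of declines) or for spotting the positive spikes worth digging into.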
Be careful. The thing that sucks about brand fatigue is that a lot of platforms, email included, have algorithmic systems set up to penalize you for this. With email, if you get high unsubscribes and low engagement, that will actually kill your long-term chances for email marketing success, because whatever your targets are using, whether that’s Gmail, Yahoo Mail, Microsoft’s various mail programs, or an installed mail client, you will no longer be able to break through those email filters.
The email filter that Gmail has says, “Hey, a lot of people click Unsubscribe and Report Spam. Let’s put this in the Promotions tab.” Or, “Hey, a lot of people are clicking Report Spam. You know what? Let’s just block this sender entirely.” Or, “Gosh, this person has in the past not engaged very much with these messages. We’re going to not make them high priority anymore.” Gmail has that automatic high priority system. So you’re getting algorithmically turned into noise even if you might have had something that your customers really cared about.
Blog or other content platform
This is a really interesting one. I would strongly urge you to read Trevor Klein’s post on the Moz blog about the experiment that we and HubSpot ran around how much content to produce, and whether lowering or increasing content volume had positive effects. There are some fascinating results from that study.
But the valuable thing to me in that is if you don’t test, you’ll never know. You’ll never know the limits of what your audience wants, what will frustrate them, what will delight them. I recommend you don’t create content unless you can have a great answer for the question, “Who will help amplify this and why?” I don’t mean, like, “Oh, well I think people who really like houseplants will help amplify this.” That’s not a great answer.
A great answer is, “Oh, you know, I know this guy named Jerry. Jerry runs a Twitter account that’s all about gardening. Jerry loves our houseplants. He’s a big fan of this. He’s particularly interested in flowering cacti. I know if we publish this post, Jerry will help amplify it.” That’s a great answer. You have 10 Jerrys, great. Hit Publish. Go for it. You don’t? Why are you making it?
Watch your browse rate and your conversion rate. By conversion rate, I don’t necessarily mean conversion all the way to whatever you’re selling, your ecommerce store products or your subscription or whatever that is. Conversion rate could be conversion rate to an email newsletter, or to following you on a social platform, or whatever.
You can watch time on site and amplification per post to essentially get a sense for like, “Hey, as we’re producing content, are we seeing the metrics that would indicate that our content marketing is being successful?” If the answer to that is no, well we need to retool it. It turns out there’s actually no prize for hitting Publish.
You might think that your job as a content producer or a content marketer is to make content every day or content every week. That’s not your job. Your job is to have success with the metrics that are going to predict and correlate to the strategies you need as a business to acquire customers, to grow your marketing channels, to grow your brand’s impact, to help people, whatever it is that your mission is.
I highly recommend finding your audiences’ sweet spot for both focus and frequency. If you do those things, you’re going to do a great job with avoiding brand fatigue around your content.
Twitter, Facebook, and other social media
Last one is social. I’ll talk specifically about Twitter and Facebook, because most things can be classified in there, even things like Instagram and LinkedIn and the fading, sadly, Google+ and those sorts of things.
Twitter, generally speaking, is more forgiving as a platform. Facebook has more of those algorithmic elements to punish you for low engagement.
So, for example, I’ve had this happen on my personal Facebook page where I’ve published a few things that people just didn’t really find interesting. This is on my Rand Fishkin Facebook page, different from the Moz one. It turns out that that meant that it was much harder for me next time, even with content that people were very engaged around, to reach them.
Facebook essentially had pushed in. They were like, “You know what? That’s three or four posts in a row from Rand Fishkin that people did not like, didn’t engage with. The next one we’re going to set the bar much higher for him to have to climb back up before we decide, ‘Hey, we’ll show that to more and more people.'”
Lately I’ve been having more success getting a higher percentage of my audience into the impression count of people who are actually seeing my posts on Facebook by getting better engagement there. But that’s a very challenging platform.
Users of both, however, are pretty sensitive, nearly equally sensitive. It’s not like Facebook users are more sensitive. It’s just that Facebook’s platform is more sensitive because Facebook doesn’t show you all the content you could possibly see.
Twitter is just a super simplistic newsfeed algorithm. It’s basically just who posted most recently. So Twitter has that real-time kind of thing. So I would still say for both of these, aim to only share stuff that gets high engagement, especially as your brand.
Personal account, do whatever you want, test whatever you want. But as your brand’s account, you want that high engagement over and over again because that will predict more people paying attention to you when you do post, going back and looking through your old social posts, subscribing to you, following you, all that sort of thing, considering you a leader.
You can watch both Twitter Analytics and your Facebook page’s stats to see if you’re having a dip or a spike, where you’re having success, where you’re not.
I actually love using Twitter, and to a lesser extent LinkedIn or Google+, to see what gets very high engagement, and then I know, “Okay, I should re-share that on Twitter because my audience on Twitter is very temporal.” Two hours from now there’s going to be less than 1% overlap between who sees a Twitter post now and who sees a Twitter post 2 hours from now, and that’s a great test bed for Facebook as well.
So if I see something doing extremely well on Twitter or on Google+ or on LinkedIn, I go, “Aha, that’s the kind of thing I should post on Facebook. That will increase my engagement there. Now I can go post and get more engagement next time and build up my authority in Facebook’s newsfeed algorithm.”
So with all of this stuff, hopefully, as you’re producing content, sharing content, building an email subscription, building a blog platform, you’re going to have a little less brand fatigue and a little more engagement from your users.
I look forward to chatting with you all in the comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.
We’re excited to announce the results of Moz’s biannual Search Engine Ranking Correlation Study and Expert Survey, a.k.a. Ranking Factors.
Moz’s Ranking Factors study helps identify which attributes of pages and sites have the strongest association with ranking highly in Google. The study consists of two parts: a survey of professional SEOs and a large correlation study.
This year, with the help of Moz’s data scientist Dr. Matt Peters, new data partners, and over 150 search marketing professionals, we were able to study more data points than in any year past. All together, we measured over 170 correlations and collected over 15,000 data points from our panel of SEO experts.
We want to especially thank our data partners. SimilarWeb, Ahrefs, and DomainTools each gave us unparalleled access, and their data was essential to helping make this study a success. It’s amazing and wonderful when different companies—even competitors—can come together for the advancement of knowledge.
You can see all of our findings within the study now. In the coming days and weeks we’ll dive into deeper analysis as to what we can learn from these correlations.
Search Engine Ranking Correlation Study
Moz’s Ranking Correlation Study measures which attributes of pages and websites are associated with higher rankings in Google’s search results. This means we look at characteristics such as:
Page load speed
…and over 170 other attributes
To be clear, the study doesn’t tell us if Google actually uses these attributes in its core ranking algorithm. Instead, it shows which features of pages and sites are most associated with higher rankings. It’s a fine, but important, distinction.
While correlation studies can’t prove or disprove which attributes Google considers in its algorithm, they do provide valuable hints. In fact, many would argue that correlation studies are even more important than causation-focused analysis when working with today’s increasingly complex algorithms.
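To make the methodology concrete, ranking correlation studies like this typically use Spearman rank correlation: correlate the rank order of a feature's values with the rank order of search positions. The sketch below is a self-contained illustration, not Moz's actual pipeline, and the link counts are invented; in practice you would use a library such as `scipy.stats.spearmanr` over thousands of SERPs:

```python
def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation computed on ranks
    (with average ranks assigned to ties)."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0.0] * len(vs)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and vs[order[j + 1]] == vs[order[i]]:
                j += 1  # extend over a run of tied values
            avg = (i + j) / 2 + 1  # average 1-based rank for the tie group
            for k in order[i:j + 1]:
                r[k] = avg
            i = j + 1
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical SERP: linking domains per result vs. its position
linking_domains = [120, 85, 60, 30, 10]
positions       = [1, 2, 3, 4, 5]  # position 1 = top result
print(round(spearman(linking_domains, positions), 2))  # -1.0
```

A strongly negative value here means more linking domains go with better (numerically lower) positions — which is why correlation write-ups usually report the sign flipped so that "higher correlation" reads as "associated with ranking higher."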
The features in the chart below describe link metrics to the individual ranking page (such as number of links, PageRank, etc.) and their correlation to higher rankings in Google.
Despite rumors to the contrary, links continue to show one of the strongest associations with higher rankings out of all the features we studied. While this doesn’t prove how Google uses links in its algorithm, this information combined with statements from Google and the observations of many professional marketers leads us to strongly believe that links remain hugely important for SEO.
We continue to see lower correlations between on-page keyword use and rankings. This is likely because Google has become smarter about understanding what pages mean (through related keywords, synonyms, close variants, and entities) without relying on exact keyword phrases. We believe matching user intent is of the utmost importance.
While page length, hreflang use, and total number of links all show moderate association with Google rankings, we found that using HTTPS has a very low positive correlation. This could indicate it’s the “tie-breaker” Google claims. Negatively associated factors include server response time and the total length of the URL.
Despite rumors to the contrary, the data continues to show some of the highest correlations between Google rankings and the number of links to a given page.
While there exists a decent correlation between exact-match domains (domains where the keyword matches the domain exactly, e.g. redwidgets.com) and rankings, this is likely due to the prominence of anchor text, keyword usage, and other signals, instead of an algorithmic bias in favor of these domains.
While not quite as strong as page-level link metrics, the overall links to a site’s root and subdomain showed a reasonably strong correlation to rankings. We believe links continue to play a prominent role in Google’s algorithm.
Always controversial, the number of social shares a page accumulates tends to show a positive correlation with rankings. Although there is strong reason to believe Google doesn’t use social share counts directly in its algorithm, there are many secondary SEO benefits to be gained through successful social sharing.
While correlation data can provide valuable insight into the workings of Google’s algorithm, we often learn much more by gathering the collective wisdom of search marketing experts working at the top of their game.
What ranking factors or correlations stand out to you? Leave your thoughts in the comments below.