As marketers, we want to influence our customers and clients to follow the path to conversion. But this can be a challenge for all of us – this is where Nathalie Nahai, the web psychologist, can help. She teaches global audiences about the link between behavioral sciences and the digital space, helping you build a better understanding of how to persuade your audience to take the desired path.
We were so impressed with Nathalie that we invited her to speak at this year’s dotmailer Summit where she’ll be bringing together the latest insights from the world of psychology, neuroscience and behavioral economics to explain the underlying dynamics and motivations behind consumer behavior.
In this blog we posed Nathalie a series of questions – read on to find out some secret hacks, interesting facts and a brief insight into what you’ll be taking away from the dotmailer Summit.
Can you tell us a little more about yourself and how you found yourself drawn to web psychology – from what we’ve seen you have a really fascinating background, so it’ll be interesting to see who or what inspired you on your journey?
“Thank you, it’s been a rather unpredictable trajectory!”
“Having studied psychology at university, upon leaving I went straight to Central St Martins to explore fine art, something I have always had a passion for. During my time at CSM I’d been recording music on the side, and I thought it would come in handy to know how to develop websites to help promote myself. So, I went to some classes and as I progressed I ended up taking on freelance work.”
“I began thinking about joining a design agency, when a good friend of mine (who was just leaving agency life for something more entrepreneurial) suggested I hold off for a bit and explore some co-working spaces instead. I found a lovely place to work from where the organiser asked me to run some psychology-related workshops, and the penny started to drop. If psychology could shed light on the factors that influence our behaviours in the physical world, surely it could provide some insight into what shapes our decisions online, too.”
“I looked for books and postgraduate courses on the subject, but at the time I couldn’t find any resources that combined research from the fields of psychology, computer science, human-computer-interaction, marketing, ethics, cross-cultural studies, behavioural economics and UX (the latter two subjects having not yet hit the mainstream). Frustrated by the lack of an integrated, multi-disciplinary approach, I decided to write a book that would allow me to draw these threads together into something I would personally find useful to read, and after a couple of years of trawling through countless studies, Webs Of Influence was conceived – and that’s where it all started.”
What do you think the key messages are in what you do? And how do you think you help to empower people?
“There are a few key messages in what I do… Firstly, we’re not rational agents and our decision-making is open to influence, both on- and off-line. Secondly, to understand and connect with people in a meaningful way, we have to understand their psychological context, which includes universal, cultural and individual differences. Thirdly – and this is the most important – we have an ethical responsibility to use these insights to create mutually beneficial experiences, which means being transparent (not using dark patterns), delivering on what we promise (providing real value), creating a great customer experience (building trust over time), and respecting people’s privacy (not using covert forms of tracking to follow, coerce or manipulate users into taking certain actions).”
Can you give a small indication into what you will be covering at the dotmailer Summit – perhaps the key takeaways people can expect to leave with?
“People can expect to leave with specific, actionable principles that they can use straight away to create a more exciting, rewarding and engaging user experience that customers will want to come back to experience again.”
There has been a lot of talk recently about how our technology is impacting us and what we can do about it – have you got any key thoughts on this subject? And how you see this influencing our lives and our future?
“Yes, I have several thoughts on this! I think that the most important and pressing issue in this debate is having a space (or spaces) in which we can share, discuss and learn about what’s at stake, and what our choices might be for shaping the world in which we want to live.”
“We’re starting to see a greater interest in how technology can be designed and used to influence and manipulate behaviours, questions which, in my opinion, go to the heart of what it means to be human. Personally, I want to live in a society in which the individual is sovereign – we would own our own data by default and be able to choose with whom to share it, and we would be free from surveillance outside of public spaces – whether physically (via cameras and microphones in the home, or through biometric sensors which are fast becoming reality), or virtually (through the content we share and the activities we engage in online).”
What’s your favorite social medium to engage in?
“It used to be Twitter, but it feels as though it’s become so noisy that I now tend to engage more with Instagram, which I use to connect with a smaller, closer community.”
Any tips or hacks on what obvious mistakes sites make that discourage customers from buying?
“Yes – in a bid to stay on top of design trends, brands will often create websites, content and apps that look great but actually deflect attention away from the all-important call to action. A great example is when brands use auto-playing videos on their websites: the motion draws attention away from the CTA and often lowers conversion rates as a result. If you have to use video, reduce the amount of background motion involved, so that users have the chance to locate and understand the call to action.”
Have you come across any interesting facts about global user behavior that you could share with us?
“Yes – high load times frustrate users no matter where they’re from! More seriously though, one of the most important factors that will impact the success of a business is trust. If you can provide customers with something they value in a way that is frictionless and even joyful, not only will they be more likely to return to use your service and recommend you to their peers, they will also be more forgiving when you make mistakes – which in the amplified world of social media, can go a long way to protecting your reputation.”
And lastly has there been anything that has truly inspired you lately?
“Yes, although it’s on a completely different note! I’ve been studying academic drawing at Barcelona Academy of Art, and I’m finding the whole process extraordinary (if you want to check out some of my work you can find it here on Instagram).”
Thank you so much Nathalie for sharing some insights into web psychology, as well as some general inspiration. We’re looking forward to welcoming you on stage on the 19th April at the dotmailer Summit 2018.
Now more than ever, brands need to be data-driven and offer a highly personalized experience. To get noticed in the inbox – over and above everybody else – it’s important to send timely, contextual emails that are meaningful to subscribers.
This is where user tracking software comes in: unlocking the power of web insight enables you to enhance the relevancy of your email campaigns. dotmailer’s web insight tool tracks the website behaviour of customers and prospects after they’ve clicked through from your emails. It gathers rich behavioural insights from your site visitors:
Duration of a user’s visit
Point of drop out
This enriched data helps you quantify the impact of your campaigns, identify highly engaged subscribers and follow up with timely, appropriate content.
There are two key ways that our web behavioural tool can give your email marketing results a dramatic lift:
Building segments that target individuals based on their web activity; for instance, those who’ve viewed pages which indicate a strong intent to buy or enquire.
Powering your automations; i.e. a browse abandonment program that prompts the completion of a form or a cart recovery program that encourages the placement of an order.
Forest Holidays has recently enabled the dotmailer web behaviour tool to great effect. At present, user tracking is being leveraged to target individuals who’ve abandoned their basket by sending them a personalized, well-timed email. The results speak for themselves: a significant uplift in engagement and a 5% COS in the first 30 days of implementation.
Your welcome email is the first email that your customers are likely to receive from you. It typically has the highest engagement of any email you are likely to send, and it’s your opportunity to show off what you do and how great you are. It is also a way to thank your customers for buying from you and begin building a relationship with them; yet too many retailers miss this great opportunity. You only get one chance to make a first impression.
Office, Sweaty Betty and Hotel Chocolat didn’t send a welcome email at all after I signed up. There was no thank you, no offer, and no attempt to capture extra data. This is a lost opportunity for these three companies.
Charles Tyrwhitt, Reebok and Urban Outfitters did send an email after I signed up, but their subject lines and content don’t come across like a welcome email and can easily be missed; for instance: “15% off your Reebok gear”, “Start Urban Outfitting” and “Hurry, your £10 offer is waiting!” However, their emails are on brand and offer an incentive to take action.
Reebok’s welcome email
If UO replaced its hero image with an animated GIF, it would probably see an increase in engagement with its emails.
Urban Outfitters’ email has a nice graphic but a GIF would be more eye-catching
Diesel, Footlocker, Havaianas, Hugo Boss and Uniqlo also sent a timely email shortly after I signed up. Yet their emails need a lot of work: they are text-heavy, off-brand and not particularly engaging. Diesel’s email copy is confusing and pushes you to create an account.
Lack of branding let Diesel’s email down
All of these brands create a poor initial experience. Footlocker, Fossil and Hugo Boss’s emails are double opt-in emails. This is good for data quality, but it comes at the expense of a great customer experience. At least Fossil and Hugo Boss’s actual welcome emails are on brand. But since signing up and confirming my subscription, Footlocker hasn’t sent me a single email.
The welcome email from Hugo Boss
Adidas, Allsaints, Cath Kidston, FootAsylum, Forever21, Jack Wills, Kuoni, Levi’s and Schuh all sent what are, in my opinion, good welcome emails. They had clear subject lines that welcomed or thanked the user, and the copy and design of these emails are on brand and welcome the user to the company.
Levi’s welcome email is image-heavy and on brand
Some of the brands, like Adidas and Forever 21, included a discount to encourage the customer to engage further, and followed best-practice elements to create a positive customer experience.
Adidas offers an incentive in its welcome email
However, the outstanding winner of the welcome email contest is FootAsylum.
The email has great use of microcopy throughout.
It contains a clear benefit statement of being a subscriber, which also sets expectations about what you’re likely to receive.
They use a great call to action: “Stop Reading. Start Shopping!”
They are also the only company to use their welcome email to collect further data by having a very obvious preference centre within the body of the email.
Finally, the email is clearly on brand.
FootAsylum wins the welcome email contest
Tips for welcome emails
Make sure you send it immediately after the customer signs up.
Keep the subject line clear and obvious that it’s a welcome email.
Set expectations for what the customer will receive and how frequently.
Provide a benefit statement for signing up.
Use this email as an opportunity to find out more about your customers.
Use preheader text as a follow on from your subject line.
Provide a safe sender message to encourage customers to add your email to their safe senders.
The top three reasons were People, Product and Opportunity. I met the people who make up our business and heard their stories from the past 18 years, learned about the platform and market-leading status they had built in the UK, and saw that I could add value with my U.S. high-growth business experience. I’ve been working with marketers, entrepreneurs and business owners for years across a series of different roles, and saw that I could apply what I’d learned from that and the start-up space to dotmailer’s U.S. operation. dotmailer has had clients in the U.S. for 12 years and we’re positioned to grow the user base of our powerful and easy-to-use platform significantly. I knew I could make a difference here, and what closed the deal for me was the people. Every single person I’ve met is deeply committed to the business, to the success of our customers and to making our solution simple and efficient. We’re a great group of passionate people and I’m proud to have joined the dotfamily.
Dan Morris, dotmailer’s EVP for North America in the new NYC office
Tell us a bit about your new role
dotmailer has been in business and in this space for more than 18 years. We were a web agency, then a systems integrator, and we got into the email business that way, ultimately building the dotmailer platform that thousands of people use daily. This means we know this space better than anyone, and we have solutions that align closely with our customers’ needs and are flexible enough to grow with them. My role is to take all that experience and the platform and grow our U.S. presence.

My early focus has been on identifying the right team to execute our growth plans. We want to be the market leader in the U.S. in the next three years – just like we’ve done in the UK – so getting the right people in the right spots was critical. We quickly assessed the skills of the U.S. team and made the changes necessary to provide the right focus on customer success. Next, we set out to completely rebuild dotmailer’s commercial approach in the U.S. We simplified our offers to three bundles, so that pricing and what’s included in those bundles is transparent to our customers. We’ve heard great things about this already from clients and partners.

We’re also increasing our resources on customer success and support. We’re intensely focused on ease of on-boarding, ease of use and speed of use. We consistently hear how easy and smooth a process it is to use dotmailer’s tools. That’s key for us – when you buy a dotmailer solution, we want to onboard you quickly and make sure you have all of your questions answered right away so that you can move right into using it. Customers are raving about this, so we know it’s working well.
What early accomplishments are you most proud of from your dotmailer time so far?
I’ve been at dotmailer for eight months now and I’m really proud of all we’ve accomplished together. We spent a lot of time assessing where we needed to restructure and where we needed to invest. We made the changes we needed: we invested in our partner program, localized tech support and customer on-boarding, and added customer success team members. We have the right people in the right roles and it’s making a difference. We have a commercial approach that is clear, with the complete transparency that we wanted to provide our customers. We’ve got a more customer-focused approach and we’re on-boarding customers quickly so they’re up and running faster. We have happier customers than ever before, and that’s the key to everything we do.
You’ve moved the U.S. team to a new office. Can you tell us why and a bit about the new space?
I thought it was very important to create a NY office space that was tied to branding and other offices around the world, and also had its own NY energy and culture for our team here – to foster collaboration and to have some fun. It was also important for us that we had a flexible space where we could welcome customers, partners and resellers, and also hold classes and dotUniversity training sessions. I’m really grateful to the team who worked on the space because it really reflects our team and what we care about. At any given time, you’ll see a training session happening, the team collaborating, a customer dropping in to ask a few questions or a partner dropping in to work from here. We love our new, NYC space.
Guests and the team at dotmailer’s new NYC office warming party
What did you learn from your days in the start-up space that you’re applying at dotmailer?
The start-up space is a great place to learn. You have to know where every dollar is going and coming from, so every choice you make needs to be backed up with a business case for that investment. You try lots of different things to see if they’ll work and you’re ready to turn those tactics up or down quickly based on an assessment of the results. You also learn things don’t have to stay the way they are, and can change if you make them change. You always listen and learn – to customers, partners, industry veterans, advisors, etc. to better understand what’s working and not working. dotmailer has been in business for 18 years now, and so there are so many great contributors across the business who know how things have worked and yet are always keen to keep improving. I am constantly in listening and learning mode so that I can understand all of the unique perspectives our team brings and what we need to act on.
What are your plans for the U.S. and the sales function there?
On our path to being the market leader in the U.S., I’m focused on three things going forward:

1 – I want our customers to be truly happy. It’s already a big focus in the dotmailer organization, and we’re working hard to understand their challenges and goals so we can take product and service to the next level.

2 – Creating an even more robust program around partners and resellers, and further building out our channel partners to continuously improve sales and customer service programs. We recently launched a certification program to ensure partners have all the training and resources they need to support our mutual customers.

3 – We have an aggressive growth plan for the U.S. and I’m very focused on making sure our team is well trained, and that we remain thoughtful and measured as we take the steps to grow. We want to always keep an eye on what we’re known for – tools that are powerful and simple to use – and make sure everything else we offer remains accessible and valuable as we execute our growth plans.
What are the most common questions that you get when speaking to a prospective customer?
The questions we usually get are around price, service level and flexibility: how much does dotmailer cost? How well are you going to look after my business? How will you integrate with my existing stack and support my plans for future growth? We now have three transparent bundle options, with specifics around what’s included published right on our website. We have introduced a customer success team that’s focused only on taking great care of our customers, and we’re hearing stories every day that tell me this is working. And we have all of the tools to support our customers as they grow and to integrate with their existing stacks – often integrating so well that you can use dotmailer from within Magento, Salesforce or Dynamics, for example.
Can you tell us about the dotmailer differentiators you highlight when speaking to prospective customers that seem to really resonate?
In addition to the ones above – ease of use, speed of use and the ability to scale with you. With dotmailer’s tiered program, you can start with a lighter level of functionality and grow into more advanced functionality as you need it. The platform itself is so easy to use that most marketers are able to build campaigns in minutes that would have taken hours on other platforms. Our customer success team is also with you all the way if ever you want or need help. We’ve built a very powerful platform, we have a fantastic team to help you with personalized service as an extended part of your own team, and we’re ready to grow with you.
How much time is your team on the road vs. in the office? Any road warrior tips to share?
I’ve spent a lot of time on the road; one year I attended 22 tradeshows! My top tip when flying is to be willing to give up your seat for families or groups once you’re at the airport gate, as you’ll often be rewarded with a better seat for helping the airline make the family or group happy. Win-win! Since joining dotmailer, I’m focused on being in the office and present for the team and customers as much as possible. I can usually be found in our new NYC office, where I spend a lot of time with our team, in customer meetings, in trainings and other hosted events, sales conversations or marketing meetings. I’m here to help the team, clients and partners succeed, and will always do my best to say yes! Once prospective customers see how quickly and efficiently they can execute tasks with dotmailer solutions vs. their existing solutions, it’s a no-brainer for them. I love seeing and hearing their reactions.
Tell us a bit about yourself – favorite sports team, favorite food, guilty pleasure, favorite band, favorite vacation spot?
I’m originally from Yorkshire in England, and grew up just outside York. I moved to the U.S. about seven years ago to join a very fast-growing startup; we took it from 5 to well over 300 people, which was a fantastic experience. I moved to NYC almost two years ago, and I love exploring this great city – there’s so much to see and do. Outside of dotmailer, my passion is cars, and I also enjoy skeet shooting, almost all types of music, and travel – my goal is to get to India, Thailand, Australia and Japan in the near future.
At dotmailer we try our best to keep the bad guys out, but if they already have your password, there is very little we can do to detect, and stop them logging in as you…unless, of course, you have already turned on two-factor authentication (2FA). Two-factor in most cases is something you know (your username/password), and something you have (a single use access code or authentication link).
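dotmailer’s own implementation isn’t public, but the “something you have” factor – a single-use access code – can be sketched server-side roughly like this. All names, the six-digit format and the five-minute expiry are assumptions for illustration only:

```python
import secrets
import time

CODE_TTL_SECONDS = 300  # illustrative: codes expire after five minutes

_pending = {}  # user -> (code, issued_at)

def issue_code(user):
    """Generate a 6-digit single-use code and remember when it was issued."""
    code = f"{secrets.randbelow(10**6):06d}"
    _pending[user] = (code, time.time())
    return code  # in practice this would be delivered out-of-band, e.g. via SMS

def verify_code(user, submitted):
    """Accept the code at most once, and only while it is still fresh."""
    entry = _pending.pop(user, None)  # single use: removed on any attempt
    if entry is None:
        return False
    code, issued_at = entry
    if time.time() - issued_at > CODE_TTL_SECONDS:
        return False
    # constant-time comparison avoids leaking how many digits matched
    return secrets.compare_digest(code, submitted)
```

The key properties are the ones the post describes: the code is random, short-lived, delivered on a separate channel, and useless after one attempt.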
But how can they get my password in the first place?
There are various ways an attacker may have access to your login details, but some of the possible methods include:
Malware

If the computer you use to log in to your online accounts is infected with malware, it is possible that your keystrokes and even screen captures are being logged and sent back to the bad guys… yep, including your passwords and other authentication details.
Snooping on the network
If an attacker has access to the network from which you are logging on to an online service (e.g. a public Wi-Fi hotspot), in some cases it may be possible to capture the data as it passes to the server… yep, including your password and other authentication details. This is where looking for HTTPS in your browser address bar becomes very important. At dotmailer, all authentication data passes over a secure channel, thus protecting you from this sort of attack.
Password reuse

It’s really important not to use the same password across different services. We’ve seen an awful lot of very big data breaches in the news recently, and the attackers have been using the stolen authentication details from these breaches to try to log on to other online services… with what seems to be a great deal of success! This sadly means that many people are still using the same password everywhere they go online. This is one of the reasons why your dotmailer password is set to expire and you are asked for a new one every 90 days, and why you should be choosing something completely different every time. Simply incrementing that number at the end of your password is not cool!
Phishing

As we get better at using good passwords and preventing malware infections, sometimes the bad guys just find it easier to ask us for them. At dotmailer, our support team will never contact you asking for your password.
If one of the above unfortunate events were to happen, 2FA adds another layer of defense, as the attacker would also need access to the authentication link or SMS code. In reality, that would mean having access to your mailbox or mobile phone. We’ve already seen that an attacker may have obtained your password via a compromised computer or network, which is why we would always recommend using an “out-of-band” channel such as SMS to deliver the 2FA token where possible. dotmailer offers SMS 2FA to all customers. It’s simple to set up, and it’s free!
Without access to the authentication token, the attacker could of course try and brute force the code, but that is where our other controls such as failed login account lockouts kick in.
How to turn on 2FA in dotmailer
Log in to your account, click the user icon in the top right and select Account:
In the resulting window, click the “Account Settings” tab and scroll down to the “Security” section. Tick the two-factor authentication box, enter your mobile phone number and hit “Save settings” at the bottom of the page.
Done! Congratulations, you have just gone one step further in protecting your valuable data.
Now you have protected your dotmailer account, check out TurnOn 2FA and see which of your other online services offer a similar feature, and SWITCH IT ON!
Note: If you are a managed user, you will need to ask your account administrator to do this for you. For obvious security reasons, you will not be able to disable this feature without help from our support team.
Without a site that’s precision engineered for a good user experience and high conversion rates, all of your email marketing efforts could be going to waste.
Of course, you do want to market your business, and email campaigns are statistically proven to be one of the most successful ways to do so – according to McKinsey, email is 40 times more effective at acquiring new customers than Facebook and Twitter. VentureBeat also reported this year that, according to its research, email generates a better return on investment than any other channel. So how do you make sure your site is secure and effective enough to keep customers there once your campaign has enticed them this far?
For the majority of visitors, your homepage will be the first impression you get to make. A great homepage factors in a number of different ingredients to create the biggest positive impact on the user, chief among which are the following.
An engaging design

A great homepage is above all engaging, instantly connecting a potential customer with the brand. Engagement comes from a website having personality and a clear message; a customer should feel comfortable with the design and want to interact with it. Take the example below – Mardon, an international seafood import and export company. The large cinematic image captures attention, while the well-positioned brand and informative footer let the user know who they’re interacting with. The elements on the page come together to create a beautiful, simple-looking design with the feel of a company you can trust and, as a result, want to engage with.
mardon.com designed and built by Nublue.co.uk
The human element
In most cases (and where relevant), adding a human element to your homepage will encourage a positive reaction from users. Featuring real people helps customers relate to your business and products more effectively. We believe in this philosophy so much that our own staff feature heavily throughout our site. Using actual staff members allows you to showcase your people – your greatest asset.
The www.nublue.co.uk homepage
Excellent user experience
Once you’ve made an excellent first impression, you’ll need a functional and user-friendly website that ensures customers aren’t left frustrated by complex navigation or slow load times. That means simple, intuitive menus and navigation, alongside a website that’s fast enough to keep your users from having to wait.
Load speed is critical. According to surveys by Gomez.com, 79% of online shoppers who have trouble with website performance say they won’t return to the site to buy again. Other statistics suggest consumers will abandon a site that doesn’t load within three seconds. High-performance hosting is vital to addressing this issue, and your site could benefit from the use of a CDN (content delivery network). CDNs improve speed by offloading your site’s static content – such as images and CSS – freeing up your hosting package to serve only the dynamic parts of your site. The result is a faster, smoother-running site, regardless of the user’s location.
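As a rough sketch of the offloading idea, a template helper can point static assets at a CDN hostname while dynamic pages stay on the origin. The host `cdn.example.com` and the function name are placeholders, not a real configuration:

```python
# Illustrative sketch: rewrite static asset paths to a (hypothetical) CDN host,
# leaving dynamic pages to be served by the origin hosting package.

CDN_HOST = "https://cdn.example.com"
STATIC_EXTENSIONS = (".css", ".js", ".png", ".jpg", ".gif", ".svg")

def asset_url(path):
    """Point static files at the CDN; leave dynamic paths on the origin."""
    if path.endswith(STATIC_EXTENSIONS):
        return CDN_HOST + path
    return path
```

In practice the same effect is usually achieved in the web server or platform configuration rather than hand-rolled, but the principle is the same: browsers fetch heavy static files from edge servers close to them.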
Better conversion processes
Trying to get people to make a purchase from an email isn’t easy and the fewer potential stumbling blocks you put in a customer’s way the better.
For ecommerce sites, an optimised checkout with guest login will produce a much smoother, simpler and more effective conversion process. Offering guest login at checkout gives the user an option of either signing up for an account or checking out without doing so, and prevents losing any sales at the last moment.
Another best practice is to introduce ‘trust signals’ so that customers feel confident buying from your site. Trust signals range from visible reviews and testimonials onsite to things like SSL certificates – visible in the URL bar – which use encryption to prevent third parties from seeing or accessing a customer’s personal details in transit between their browser and your server.
An expert design and development team will implement the best features and functionality, using a user-centric approach to ‘reverse-engineer’ your site – effectively creating the best and simplest customer journey, improving both customer experience and conversion rates, and making it as quick and easy as possible for customers to buy from you.
When sending an email campaign, it’s vital that your website is not the weak link in the marketing chain and that leads are clicking through to a secure and effective page. At the heart of an effective website is a full understanding of your audience and the expertise to clearly guide them through the actions you want them to take. This is ultimately accomplished through user friendly navigation and beautiful and engaging design.
The site’s features and functionality need careful thought, and to get a website that’s fast enough you’ll need a tailored, high-performance hosting solution – ideally one that includes a CDN.
This post was created by Nublue, a web hosting and Magento ecommerce agency and partner of dotmailer.
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: Google Using Points To Boost User Reviews, Beef Up Maps Content Nov 13, 2015 by Greg Sterling A couple of years ago, Google created a program in the mold of the…
Every two years, Moz surveys the brightest minds in SEO and search marketing with a comprehensive set of questions meant to gauge the current workings of Google’s search algorithm. This year’s panel of experts possesses a truly unique set of knowledge and perspectives. We’re thankful on behalf of the entire community for their contribution.
In addition to asking the participants about what does and doesn’t work in Google’s ranking algorithm today, one of the most illuminating groups of questions asks the panel to predict the future of search – how the features of Google’s algorithm are expected to change over the next 12 months.
Amazingly, almost all of the factors expected to increase in influence revolve around user experience.
The experts predicted that more traditional ranking signals, such as those around links and URL structures, would largely remain the same, while the more manipulative aspects of SEO, like paid links and anchor text (which is subject to manipulation), would largely decrease in influence.
The survey also asks respondents to weight the importance of various factors within Google’s current ranking algorithm (on a scale of 1-10). Understanding these areas of importance helps to inform webmasters and marketers where to invest time and energy in working to improve the search presence of their websites.
On-page keyword features
These features describe use of the keyword term/phrase in particular parts of the HTML code on the page (title element, H1s, alt attributes, etc).
Highest influence: Keyword present in title element, 8.34
Lowest influence: Keyword present in specific HTML elements (bold/italic/li/a/etc), 4.16
Titles are still very powerful. Overall, it’s about focus and matching query syntax. If your post is about airplane propellers but you go on a three paragraph rant about gorillas, you’re going to have a problem ranking for airplane propellers.
Keyword usage is vital to making the cut, but we don’t always see it correlate with ranking, because we’re only looking at what already made the cut. The page has to be relevant to appear for a query, IMO, but when it comes to how high the page ranks once it’s relevant, I think keywords have less impact than they once did. So, it’s a necessary but not sufficient condition to ranking.
In my experience, most problems with organic visibility are related to on-page factors. When I look for an opportunity, I try to check for two strong things: presence of the keyword in the title and in the main content. Having both can speed up your visibility, especially on long-tail queries.
It’s very easy to link keyword-rich domains with their success in Google’s results for the given keyword. I’m always mindful of other signals that align with the domain name which may have contributed to its success. These include inbound links, mentions, and local citations.
These features describe link metrics for the individual ranking page (such as number of links, PageRank, etc).
Highest influence: Raw quantity of links from high-authority sites, 7.78
Lowest influence: Sentiment of the external links pointing to the page, 3.85
High-quality links still rule rankings. The way a brand can earn links has become more important over the years, whereas link schemes can hurt a site more than ever before. There is a lot of FUD slinging in this respect!
Similar to my thoughts on content, I suspect link-based metrics are going to be used increasingly with a focus on verisimilitude (whether content is actually true or not) and relationships between nodes in Knowledge Graph. Google’s recent issues with things, such as the snippet results for “evolution,” highlight the importance of them only pulling things that are factually correct for featured parts of a SERP. Thus, just counting traditional link metrics won’t cut it anymore.
These features describe elements that indicate qualities of branding and brand metrics.
Highest influence: Search volume for the brand/domain, 6.54
Lowest influence: Popularity of business’s official social media profiles, 3.99
This is clearly on deck to change very soon with the reintegration of Twitter into Google’s Real-Time Results. It will be interesting to see how this affects the “Breaking News” box and trending topics. Social influencers, quality and quantity of followers, RTs, and favorites will all be a factor. And what’s this?! Hashtags will be important again?! Have mercy!
It’s already noticeable; brands are more prominently displayed in search results for both informational and commercial queries. I’m expecting Google will be paying more attention to brand-related metrics from now on (and certainly more initiatives to encourage site owners to optimize for better entity detection).
These features relate to third-party metrics from social media sources (Facebook, Twitter, Google+, etc) for the ranking page.
Highest influence: Engagement with content/URL on social networks, 3.87
Lowest influence: Upvotes for the page on social sites, 2.7
Social ranking factors are important in a revamped Query Deserves Freshness algorithm. Essentially, if your content gets a lot of natural tweets, shares, and likes, it will rank prominently for a short period of time, until larger and more authoritative sites catch up.
Social popularity has several factors to consider: (1) Years ago, Google and Bing said they take into account the authority of a social profile sharing a link and the popularity of the link being shared (retweets/reshares), and there was more complexity to social signals that was never revealed even back then. (2) My experience has been that social links and shares have more power for newsy/fresh-type content. For example, a lot of social shares for a dentist’s office website wouldn’t be nearly as powerful (or relevant to consider) as a lot of social shares for an article on a site with a constant flow of fresh content.
Honestly, I do not think that the so-called “social signals” have any direct influence on the Google Algorithm (that does not mean that a correlation doesn’t exist, though). My only doubt is related to Twitter, because of the renewed contract between Google and Twitter itself. That said, as of now I do not consider Twitter to offer any ranking signals, except for very specific niches related to news and “news-able” content, where QDF plays a fundamental role.
These elements describe non-keyword-usage, non-link-metrics features of individual pages (such as length of the page, load speed, etc).
Highest influence: Uniqueness of the content on the page, 7.85
Lowest influence: Page contains Open Graph data and/or Twitter cards, 3.64
By branching mobile search off of Google’s core ranking algorithm, having a “mobile-friendly” website is probably now less important for desktop search rankings. Our clients are seeing an ever-increasing percentage of organic search traffic coming from mobile devices, though (particularly in retail), so this is certainly not an excuse to ignore responsive design – the opposite, in fact. Click-through rate from the SERPs has been an important ranking signal for a long time and continues to be, flagging irrelevant or poor-quality search listings.
I believe many of these will be measured within the ecosystem, rather than absolutely. For example, the effect of bounce rate (or rather, bounce speed) on a site will be relative to the bounce speeds on other pages in similar positions for similar terms.
I want to answer these a certain way because, while I have been told by Google what matters to them, what I see in the SERPs does not back up what Google claims they want. There are a lot of sites out there with horrible UX that rank in the top three. While I believe it’s really important for conversion and to bring customers back, I don’t feel as though Google is all that concerned, based on the sites that rank highly. Additionally, Google practically screams “unique content,” yet sites that more or less steal and republish content from other sites are still ranking highly. What I think should matter to Google doesn’t seem to matter to them, based on the results they give me.
These features describe link metrics about the domain hosting the page.
Highest influence: Quantity of unique linking domains to the domain, 7.45
Lowest influence: Sentiment of the external links pointing to the site, 3.91
Quantity and quality of unique linking domains at the domain level is still among the most significant factors in determining how a domain will perform as a whole in the organic search results, and is among the best SEO “spot checks” for determining if a site will be successful relative to other competitor sites with similar content and selling points.
Throughout this survey, when I say “no direct influence,” this is interchangeable with “no direct positive influence.” For example, I’ve marked exact match domain as low numbers, while their actual influence may be higher – though negatively.
Topical relevancy has, in my opinion, gained much ground as a relevant ranking factor. Although I find it most at play when at page level, I am seeing significant shifts at overall domain relevancy, by long-tail growth or by topically-relevant domains linking to sites. One way I judge such movements is the growth of the long-tail relevant to the subject or ranking, when neither anchor text (exact match or synonyms) nor exact phrase is used in a site’s content, yet it still ranks very highly for long-tail and mid-tail synonyms.
These features relate to the entire root domain, but don’t directly describe link- or keyword-based elements. Instead, they relate to things like the length of the domain name in characters.
Highest influence: Uniqueness of content across the whole site, 7.52
Lowest influence: Length of time until domain name expires, 2.45
Character length of domain name is another correlative yet not causative factor, in my opinion. They don’t need to rule these out – it just so happens that longer domain names get clicked on, so they get ruled out quickly.
A few points:
Google’s document inception date patents describe how Google might handle freshness and maturity of content for a query.
The “trust signal” pages sound like a site quality metric that Google might use to score a page on the basis of site quality.
Some white papers from Microsoft on web spam signals identified multiple hyphens in subdomains as evidence of web spam.
The length of time until the domain expires was cited as a potential signal in Google’s patent on information retrieval through historic data, and was refuted by Matt Cutts after domain sellers started trying to use that information to sell domain extensions to “help the SEO” of a site.
I think that page speed only becomes a factor when it is significantly slow. I think that having error pages on the site doesn’t matter, unless there are so many that it greatly impacts Google’s ability to crawl.
Mobile will continue to increase, with directly-related factors increasing as well. Structured data will increase, along with more data partners and user segmentation/personalization of SERPs to match query intent, localization, and device-specific need states.
I really think that over the next 12-18 months we are going to see a larger impact of structured data in the SERPs. In fact, we are already seeing this. Google has teams that focus on artificial intelligence and machine learning. They are studying “relationships of interest” and, at the heart of what they are doing, are still looking to provide the most relevant result in the quickest fashion. Things like schema that help “educate” the search engines as to a given topic or entity are only going to become more important as a result.
Finally, we leave you with this infographic created by Kevin Engle which shows the relative weighting of broad areas of Google’s algorithm, according to the experts.
What’s your opinion on the future of search and SEO? Let us know in the comments below.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
We’re excited to announce the results of Moz’s biennial Search Engine Ranking Correlation Study and Expert Survey, a.k.a. Ranking Factors.
Moz’s Ranking Factors study helps identify which attributes of pages and sites have the strongest association with ranking highly in Google. The study consists of two parts: a survey of professional SEOs and a large correlation study.
This year, with the help of Moz’s data scientist Dr. Matt Peters, new data partners, and over 150 search marketing professionals, we were able to study more data points than in any year past. All together, we measured over 170 correlations and collected over 15,000 data points from our panel of SEO experts.
We want to especially thank our data partners. SimilarWeb, Ahrefs, and DomainTools each gave us unparalleled access, and their data was essential to helping make this study a success. It’s amazing and wonderful when different companies—even competitors—can come together for the advancement of knowledge.
You can see all of our findings within the study now. In the coming days and weeks we’ll dive into deeper analysis as to what we can learn from these correlations.
Search Engine Ranking Correlation Study
Moz’s Ranking Correlation Study measures which attributes of pages and websites are associated with higher rankings in Google’s search results. This means we look at characteristics such as:
Page load speed
…and over 170 other attributes
To be clear, the study doesn’t tell us if Google actually uses these attributes in its core ranking algorithm. Instead, it shows which features of pages and sites are most associated with higher rankings. It’s a fine, but important, distinction.
While correlation studies can’t prove or disprove which attributes Google considers in its algorithm, they do provide valuable hints. In fact, many would argue that correlation matters even more than causation when working with today’s increasingly complex algorithms.
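To make the method concrete, here is a minimal pure-Python sketch of Spearman’s rank correlation, the family of statistic such studies typically report. The link counts and search positions below are invented for illustration:

```python
# Sketch: Spearman's rank correlation in pure Python. The link counts
# and search positions are made-up illustration data, not study data.

def rank(values):
    """Assign 1-based ranks, averaging ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # group tied values together
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied 1-based positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = rank(xs), rank(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# More linking domains lining up with better (lower) positions
# shows up as a strongly negative rho.
links = [120, 80, 200, 15, 60]
positions = [2, 3, 1, 5, 4]
rho = spearman(links, positions)
```

A rho near ±1 only says the feature and rankings move together; as the study itself stresses, it says nothing about whether Google actually uses the feature.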
The features in the chart below describe link metrics to the individual ranking page (such as number of links, PageRank, etc.) and their correlation to higher rankings in Google.
Despite rumors to the contrary, links continue to show one of the strongest associations with higher rankings out of all the features we studied. While this doesn’t prove how Google uses links in its algorithm, this information combined with statements from Google and the observations of many professional marketers leads us to strongly believe that links remain hugely important for SEO.
We continue to see lower correlations between on-page keyword use and rankings. This could likely be because Google is smarter about what pages mean (through related keywords, synonyms, close variants, and entities) without relying on exact keyword phrases. We believe matching user intent is of the utmost importance.
While page length, hreflang use, and total number of links all show moderate association with Google rankings, we found that using HTTPS has a very low positive correlation. This could indicate it’s the “tie-breaker” Google claims. Negatively associated factors include server response time and the total length of the URL.
Despite rumors to the contrary, the data continues to show some of the highest correlations between Google rankings and the number of links to a given page.
While there exists a decent correlation between exact-match domains (domains where the keyword matches the domain exactly, e.g. redwidgets.com) and rankings, this is likely due to the prominence of anchor text, keyword usage, and other signals, instead of an algorithmic bias in favor of these domains.
While not quite as strong as page-level link metrics, the overall links to a site’s root and subdomain showed a reasonably strong correlation to rankings. We believe links continue to play a prominent role in Google’s algorithm.
Always controversial, the number of social shares a page accumulates tends to show a positive correlation with rankings. Although there is strong reason to believe Google doesn’t use social share counts directly in its algorithm, there are many secondary SEO benefits to be gained through successful social sharing.
While correlation data can provide valuable insight into the workings of Google’s algorithm, we often learn much more by gathering the collective wisdom of search marketing experts working at the top of their game.
What ranking factors or correlations stand out to you? Leave your thoughts in the comments below.
There’s no doubt that quite a bit has changed about SEO, and that the field is far more integrated with other aspects of online marketing than it once was. In today’s Whiteboard Friday, Rand pushes back against the idea that effective modern SEO doesn’t require any technical expertise, outlining a fantastic list of technical elements that today’s SEOs need to know about in order to be truly effective.
For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to do something unusual. I don’t usually point out these inconsistencies or sort of take issue with other folks’ content on the web, because I generally find that that’s not all that valuable and useful. But I’m going to make an exception here.
There is an article by Jayson DeMers, who I think might actually be here in Seattle — maybe he and I can hang out at some point — called “Why Modern SEO Requires Almost No Technical Expertise.” It was an article that got a shocking amount of traction and attention. On Facebook, it has thousands of shares. On LinkedIn, it did really well. On Twitter, it got a bunch of attention.
Some folks in the SEO world have already pointed out some issues around this. But because of the increasing popularity of this article, and because I think there’s, like, this hopefulness from worlds outside of kind of the hardcore SEO world that are looking to this piece and going, “Look, this is great. We don’t have to be technical. We don’t have to worry about technical things in order to do SEO.”
Look, I completely get the appeal of that. I did want to point out some of the reasons why this is not so accurate. At the same time, I don’t want to rain on Jayson, because I think that it’s very possible he’s writing an article for Entrepreneur, maybe he has sort of a commitment to them. Maybe he had no idea that this article was going to spark so much attention and investment. He does make some good points. I think it’s just really the title and then some of the messages inside there that I take strong issue with, and so I wanted to bring those up.
First off, some of the good points he did bring up.
One, he wisely says, “You don’t need to know how to code or to write and read algorithms in order to do SEO.” I totally agree with that. If today you’re looking at SEO and you’re thinking, “Well, am I going to get more into this subject? Am I going to try investing in SEO? But I don’t even know HTML and CSS yet.”
Those are good skills to have, and they will help you in SEO, but you don’t need them. Jayson’s totally right. You don’t have to have them, and you can learn and pick up some of these things, and do searches, watch some Whiteboard Fridays, check out some guides, and pick up a lot of that stuff later on as you need it in your career. SEO doesn’t have that hard requirement.
And secondly, he makes an intelligent point that we’ve made many times here at Moz, which is that, broadly speaking, a better user experience is well correlated with better rankings.
You make a great website that delivers great user experience, that provides the answers to searchers’ questions and gives them extraordinarily good content, way better than what’s out there already in the search results, generally speaking you’re going to see happy searchers, and that’s going to lead to higher rankings.
But not entirely. There are a lot of other elements that go in here. So I’ll bring up some frustrating points around the piece as well.
First off, there’s no acknowledgment — and I find this a little disturbing — that the ability to read and write code, or even HTML and CSS, which I think are the basic place to start, is helpful or can take your SEO efforts to the next level. I think both of those things are true.
So being able to look at a web page, view source on it, or pull up Firebug in Firefox or something and diagnose what’s going on and then go, “Oh, that’s why Google is not able to see this content. That’s why we’re not ranking for this keyword or term, or why even when I enter this exact sentence in quotes into Google, which is on our page, this is why it’s not bringing it up. It’s because it’s loading it after the page from a remote file that Google can’t access.” These are technical things, and being able to see how that code is built, how it’s structured, and what’s going on there, very, very helpful.
Some coding knowledge also can take your SEO efforts even further. I mean, so many times, SEOs are stymied by the conversations that we have with our programmers and our developers and the technical staff on our teams. When we can have those conversations intelligently, because at least we understand the principles of how an if-then statement works, or what software engineering best practices are being used, or they can upload something into a GitHub repository, and we can take a look at it there, that kind of stuff is really helpful.
Secondly, I don’t like that the article overly reduces all of this information that we have about what we’ve learned about Google. So he mentions two sources. One is things that Google tells us, and others are SEO experiments. I think both of those are true. Although I’d add that there’s sort of a sixth sense of knowledge that we gain over time from looking at many, many search results and kind of having this feel for why things rank, and what might be wrong with a site, and getting really good at that using tools and data as well. There are people who can look at Open Site Explorer and then go, “Aha, I bet this is going to happen.” They can look, and 90% of the time they’re right.
So he boils this down to, one, write quality content, and two, reduce your bounce rate. Neither of those things are wrong. You should write quality content, although I’d argue there are lots of other forms of quality content that aren’t necessarily written — video, images and graphics, podcasts, lots of other stuff.
And secondly, that just doing those two things is not always enough. So you can see, like many, many folks look and go, “I have quality content. It has a low bounce rate. How come I don’t rank better?” Well, your competitors, they’re also going to have quality content with a low bounce rate. That’s not a very high bar.
Also, frustratingly, this really gets in my craw. I don’t think “write quality content” means anything. You tell me. When you hear that, to me that is a totally non-actionable, non-useful phrase that’s a piece of advice that is so generic as to be discardable. So I really wish that there was more substance behind that.
The article also makes, in my opinion, the totally inaccurate claim that modern SEO really is reduced to “the happier your users are when they visit your site, the higher you’re going to rank.”
Wow. Okay. Again, I think broadly these things are correlated. User happiness and rank is broadly correlated, but it’s not a one to one. This is not like a, “Oh, well, that’s a 1.0 correlation.”
I would guess that the correlation is probably closer to like the page authority range. I bet it’s like 0.35 or something correlation. If you were to actually measure this broadly across the web and say like, “Hey, were you happier with result one, two, three, four, or five,” the ordering would not be perfect at all. It probably wouldn’t even be close.
There’s a ton of reasons why sometimes someone who ranks on Page 2 or Page 3 or doesn’t rank at all for a query is doing a better piece of content than the person who does rank well or ranks on Page 1, Position 1.
Then the article suggests five and sort of a half steps to successful modern SEO, which I think is a really incomplete list. So Jayson gives us:
Good on-site experience
Writing good content
Getting others to acknowledge you as an authority
Rising in social popularity
Earning local relevance
Dealing with modern CMS systems (which, he notes, are mostly SEO-friendly)
The thing is there’s nothing actually wrong with any of these. They’re all, generally speaking, correct, either directly or indirectly related to SEO. The one about local relevance, I have some issue with, because he doesn’t note that there’s a separate algorithm for sort of how local SEO is done and how Google ranks local sites in maps and in their local search results. Also not noted is that rising in social popularity won’t necessarily directly help your SEO, although it can have indirect and positive benefits.
I feel like this list is super incomplete. Okay, I brainstormed a list just off the top of my head in the 10 minutes before we filmed this video. The list was so long that, as you can see, I filled up the whole whiteboard and then didn’t have any more room. I’m not going to bother to erase and try to be absolutely complete.
But there’s a huge, huge number of things that are important, critically important for technical SEO. If you don’t know how to do these things, you are sunk in many cases. You can’t be an effective SEO analyst, or consultant, or in-house team member, because you simply can’t diagnose the potential problems, rectify those potential problems, identify strategies that your competitors are using, be able to diagnose a traffic gain or loss. You have to have these skills in order to do that.
I’ll run through these quickly, but really the idea is just that this list is so huge and so long that I think it’s very, very, very wrong to say technical SEO is behind us. I almost feel like the opposite is true.
We have to be able to understand things like:
Content rendering and indexability
Disabling crawling and/or indexing of thin or incomplete or non-search-targeted content. We have a bunch of search results pages. Should we use rel=prev/next? Should we robots.txt those out? Should we disallow from crawling with meta robots? Should we rel=canonical them to other pages? Should we exclude them via the protocols inside Google Webmaster Tools, which is now Google Search Console?
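Those options can be checked programmatically. As a small sketch (the rules and URLs are invented), Python’s stdlib robots.txt parser shows which URLs a Disallow rule blocks from crawling; remember that robots.txt only blocks crawling, while a meta robots noindex requires the page to stay crawlable so the directive can be seen:

```python
# Sketch: testing crawl rules with Python's stdlib robots.txt parser.
# The rules and URLs are invented for illustration.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /search/",  # thin internal search-results pages
    "Disallow: /tag/",     # thin tag archives
]

rp = RobotFileParser()
rp.parse(rules)

blocked = rp.can_fetch("*", "https://example.com/search/widgets")
allowed = rp.can_fetch("*", "https://example.com/blog/widgets")
```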
Managing redirects, domain migrations, content updates. A new piece of content comes out, replacing an old piece of content, what do we do with that old piece of content? What’s the best practice? It varies by different things. We have a whole Whiteboard Friday about the different things that you could do with that. What about a big redirect or a domain migration? You buy another company and you’re redirecting their site to your site. You have to understand things about subdomain structures versus subfolders, which, again, we’ve done another Whiteboard Friday about that.
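A migration redirect plan is often just a lookup table. Here’s a minimal sketch (all paths hypothetical) that resolves each legacy URL to its final destination and flags chains, since a single 301 hop per URL is the usual goal:

```python
# Sketch: a domain-migration redirect map. Paths are hypothetical.
# Each legacy URL should reach its new home in one 301 hop.

REDIRECTS = {
    "/old-blog/post-1": "/blog/post-1",
    "/old-blog/post-2": "/blog/post-2",
    "/products.php?id=7": "/products/blue-widget",
}

def resolve(path):
    """Follow the map to a final destination; error on loops/long chains."""
    hops = 0
    while path in REDIRECTS:
        path = REDIRECTS[path]
        hops += 1
        if hops > 5:
            raise ValueError("redirect loop or excessive chain")
    return path, hops
```

Auditing the map for entries where `hops > 1` is a quick way to catch chains before the migration ships.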
Proper error codes, downtime procedures, and not-found pages. If your 404 pages all turn out to be 200 pages, well, now you’ve made a big error there, and Google could be crawling tons of 404 pages that they think are real pages, because you’ve given them a status code of 200. Or you’ve used a 404 code when you should have used a 410, which means permanently removed, to get the page completely out of the index, as opposed to having Google revisit it and keep it in the index.
Downtime procedures. So there’s specifically a… I can’t even remember. It’s a 5xx code that you can use. Maybe it was a 503 or something that you can use that’s like, “Revisit later. We’re having some downtime right now.” Google urges you to use that specific code rather than using a 404, which tells them, “This page is now an error.”
Disney had that problem a while ago, if you guys remember, where they 404ed all their pages during an hour of downtime, and then their homepage, when you searched for Disney World, was, like, “Not found.” Oh, jeez, Disney World, not so good.
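The status-code logic above can be sketched as a tiny classifier. This is illustrative only: the heuristics (like sniffing the body for “not found” to spot soft 404s) are deliberately simplified, and a 503 for planned downtime should ideally carry a Retry-After header:

```python
# Sketch: classifying responses along the lines described above.
# A 200 whose body says "not found" is a soft 404; 410 means gone
# for good; 503 signals temporary downtime. Heuristics simplified.

def classify(status, body="", headers=None):
    headers = headers or {}
    if status == 200 and "not found" in body.lower():
        return "soft-404: serve a real 404/410 instead of a 200"
    if status == 404:
        return "not found: crawlers may revisit and retry"
    if status == 410:
        return "gone: permanently removed, drop from the index"
    if status == 503:
        hint = " with Retry-After" if "Retry-After" in headers else ""
        return f"temporary downtime{hint}: crawlers should come back later"
    return "ok" if 200 <= status < 300 else f"status {status}"
```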
International and multi-language targeting issues. I won’t go into that. But you have to know the protocols there. Duplicate content, syndication, scrapers. How do we handle all that? Somebody else wants to take our content, put it on their site, what should we do? Someone’s scraping our content. What can we do? We have duplicate content on our own site. What should we do?
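For the international piece, the mechanics usually come down to emitting a complete, bidirectional set of hreflang alternates (every page lists all alternates, including itself). A minimal sketch, with invented locales and URLs:

```python
# Sketch: generating hreflang alternate links for one page in a
# multi-language set. Locales and URLs are invented for illustration.

def hreflang_tags(alternates):
    """alternates: dict mapping locale (or 'x-default') -> absolute URL."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{loc}" href="{url}" />'
        for loc, url in sorted(alternates.items())
    )

tags = hreflang_tags({
    "en-gb": "https://example.com/uk/widgets",
    "en-us": "https://example.com/us/widgets",
    "x-default": "https://example.com/widgets",
})
```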
Diagnosing traffic drops via analytics and metrics. Being able to look at a rankings report, being able to look at analytics connecting those up and trying to see: Why did we go up or down? Did we have less pages being indexed, more pages being indexed, more pages getting traffic less, more keywords less?
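That kind of diagnosis often starts with a simple per-page diff between two periods. A toy sketch with made-up numbers:

```python
# Sketch: surfacing the landing pages driving an overall traffic drop
# by diffing organic visits between two periods. Numbers are made up.

def biggest_movers(before, after, top=3):
    """Return (page, delta) pairs, biggest losses first."""
    pages = set(before) | set(after)
    deltas = {p: after.get(p, 0) - before.get(p, 0) for p in pages}
    return sorted(deltas.items(), key=lambda kv: kv[1])[:top]

before = {"/a": 1000, "/b": 500, "/c": 50}
after  = {"/a": 400,  "/b": 520, "/c": 0}
movers = biggest_movers(before, after)
```

Cross-referencing the biggest losers against rankings and index coverage usually narrows the cause quickly.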
Understanding advanced search parameters. Today, just today, I was checking out the related parameter in Google, which is fascinating for most sites. Well, for Moz, weirdly, related:oursite.com shows nothing. But for virtually every other site, well, most other sites on the web, it does show some really interesting data, and you can see how Google is connecting up, essentially, intentions and topics from different sites and pages, which can be fascinating, could expose opportunities for links, could expose understanding of how they view your site versus your competition or who they think your competition is.
Then there are tons of parameters, like inurl: and inanchor:, and da, da, da, da. inanchor: doesn’t work anymore, never mind about that one.
I have to go faster, because we’re just going to run out of these. Like, come on. Interpreting and leveraging data in Google Search Console. If you don’t know how to use that, Google could be telling you, you have all sorts of errors, and you don’t know what they are.
Leveraging topic modeling and extraction. Using all these cool tools that are coming out for better keyword research and better on-page targeting. I talked about a couple of those at MozCon, like MonkeyLearn. There’s the new Moz Context API, which will be coming out soon, around that. There’s the Alchemy API, which a lot of folks really like and use.
Identifying and extracting opportunities based on site crawls. You run a Screaming Frog crawl on your site and you’re going, “Oh, here’s all these problems and issues.” If you don’t have these technical skills, you can’t diagnose that. You can’t figure out what’s wrong. You can’t figure out what needs fixing, what needs addressing.
Using rich snippet format to stand out in the SERPs. This is just getting a better click-through rate, which can seriously help your site and obviously your traffic.
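Rich snippets are typically earned with structured data. As a hedged sketch (placeholder values, and eligibility is always up to the engine), here is schema.org Article markup emitted as JSON-LD:

```python
# Sketch: emitting schema.org Article markup as JSON-LD, one common
# route to rich results. All field values are placeholders.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2015-08-21",
}
json_ld = '<script type="application/ld+json">%s</script>' % json.dumps(article)
```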
Applying Google-supported protocols like rel=canonical, meta description, rel=prev/next, hreflang, robots.txt, meta robots, x robots, NOODP, XML sitemaps, rel=nofollow. The list goes on and on and on. If you’re not technical, you don’t know what those are, you think you just need to write good content and lower your bounce rate, it’s not going to work.
Using APIs from services like AdWords, Mozscape, Ahrefs, Majestic, SEMrush, or the Alchemy API. Those APIs can do powerful things for your site. There are some powerful problems they could help you solve if you know how to use them. It’s actually not that hard to write something, even inside a Google Doc or Excel, to pull from an API and get some data in there. There’s a bunch of good tutorials out there. Richard Baxter has one, Annie Cushing has one, I think Distilled has some. So really cool stuff there.
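The spreadsheet part really is that small. Here’s a sketch that flattens a hypothetical link-metrics API response into rows; the payload shape is invented, since each real API has its own schema:

```python
# Sketch: flattening a hypothetical link-metrics API response into
# spreadsheet-style rows. The payload shape is invented; real APIs
# (Mozscape, Majestic, etc.) each define their own schemas.
import json

payload = json.loads("""
{"results": [
  {"url": "https://example.com/", "links": 1200, "authority": 61},
  {"url": "https://example.com/blog", "links": 300, "authority": 48}
]}
""")

rows = [[r["url"], r["links"], r["authority"]] for r in payload["results"]]
```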
Diagnosing page load speed issues, which goes right to what Jayson was talking about. You need that fast-loading page. Well, if you don’t have any technical skills, you can’t figure out why your page might not be loading quickly.
Diagnosing mobile friendliness issues
Advising app developers on the new protocols around App deep linking, so that you can get the content from your mobile apps into the web search results on mobile devices. Awesome. Super powerful. Potentially crazy powerful, as mobile search is becoming bigger than desktop.
Okay, I’m going to take a deep breath and relax. I don’t know Jayson’s intention, and in fact, if he were in this room, he’d be like, “No, I totally agree with all those things. I wrote the article in a rush. I had no idea it was going to be big. I was just trying to make the broader points around you don’t have to be a coder in order to do SEO.” That’s completely fine.
So I’m not going to try and rain criticism down on him. But I think if you’re reading that article, or you’re seeing it in your feed, or your clients are, or your boss is, or other folks are in your world, maybe you can point them to this Whiteboard Friday and let them know, no, that’s not quite right. There’s a ton of technical SEO that is required in 2015 and will be for years to come, I think, that SEOs have to have in order to be effective at their jobs.
All right, everyone. Look forward to some great comments, and we’ll see you again next time for another edition of Whiteboard Friday. Take care.