How Much Has Link Building Changed in Recent Years?

Posted by Paddy_Moogan

I get asked this question a lot. It’s mainly asked by people who are considering buying my link building book and want to know whether it’s still up to date. This is understandable given that the first edition was published in February 2013 and our industry has a deserved reputation for always changing.

I find myself giving the same answer, even though I've been asked it probably dozens of times in the last two years: "not that much". I don't think this is solely due to the book itself standing the test of time, although I'll happily take a bit of credit for that 🙂 I think it's more a sign of our industry as a whole not changing as much as we'd like to think.

I started to question whether I was right, and honestly, it's one of the reasons it has taken me over two years to release the second edition of the book.

So I posed this question to a group of friends not so long ago, some via email and some via a Facebook group. I was expecting to be called out by many of them because my position was that in reality, it hasn’t actually changed that much. The thing is, many of them agreed and the conversations ended with a pretty long thread with lots of insights. In this post, I’d like to share some of them, share what my position is and talk about what actually has changed.

My personal view

Link building hasn't changed as much as we think it has.

The core principles of link building haven’t changed. The signals around link building have changed, but mainly around new machine learning developments that have indirectly affected what we do. One thing that has definitely changed is the mindset of SEOs (and now clients) towards link building.

I think the last big change to link building came in April 2012 when Penguin rolled out. This genuinely did change our industry and put to bed a few techniques that should never have worked so well in the first place.

Since then, we’ve seen some things change, but the core principles haven’t changed if you want to build a business that will be around for years to come and not run the risk of being hit by a link related Google update. For me, these principles are quite simple:

  • You need to deserve links – either an asset you create or your product
  • You need to put this asset in front of a relevant audience who have the ability to share it
  • You need consistency – one new asset every year is unlikely to cut it
  • Anything that scales is at risk

For me, the move towards user data driving search results + machine learning has been the biggest change we’ve seen in recent years and it’s still going.

Let’s dive a bit deeper into all of this and I’ll talk about how this relates to link building.

The typical mindset for building links has changed

I think that most SEOs are coming round to the idea that you can’t get away with building low quality links any more, not if you want to build a sustainable, long-term business. Spammy link building still works in the short-term and I think it always will, but it’s much harder than it used to be to sustain websites that are built on spam. The approach is more “churn and burn” and spammers are happy to churn through lots of domains and just make a small profit on each one before moving onto another.

For everyone else, it’s all about the long-term and not putting client websites at risk.

This has led to many SEOs embracing different forms of link building and generally starting to use content as an asset when it comes to attracting links. A big part of me feels that it was actually Penguin in 2012 that drove the rise of content marketing amongst SEOs, but that's a post for another day…! For today though, this goes some way towards explaining the trend we see below.

Slowly but surely, I’m seeing clients come to my company already knowing that low quality link building isn’t what they want. It’s taken a few years after Penguin for it to filter down to client / business owner level, but it’s definitely happening. This is a good thing but unfortunately, the main reason for this is that most of them have been burnt in the past by SEO companies who have built low quality links without giving thought to building good quality ones too.

I have no doubt that it’s this change in mindset which has led to trends like this:

The thing is, I don’t think this was by choice.

Let’s be honest. A lot of us used the kind of link building tactics that Google no longer like because they worked. I don’t think many SEOs were under the illusion that it was genuinely high quality stuff, but it worked and it was far less risky to do than it is today. Unless you were super-spammy, the low-quality links just worked.

Fast forward to a post-Penguin world and things are far riskier. For me, it's because of this that we see trends like the above. As an industry, we had the easiest link building methods taken away from us and we're left with fewer options. One of the main options is content marketing which, if you do it right, can lead to good quality links and importantly, the types of links you won't be removing in the future. Get it wrong and you'll lose budget and lose the trust of your boss or client in the power of content when it comes to link building.

There are still plenty of other methods to build links and sometimes we can forget this. Just look at this epic list from Jon Cooper. Even with this many tactics still available to us, it’s hard work. Way harder than it used to be.

My summary here is that as an industry, our mindset has shifted but it certainly wasn’t a voluntary shift. If the tactics that Penguin targeted still worked today, we’d still be using them.

A few other opinions…

I definitely think too many people want the next easy win. As someone surfing the edge of what Google is bringing our way, here's my general take—SEO, in broad strokes, is changing a lot, *but* any given change is more and more niche and impacts fewer people. What we're seeing isn't radical, sweeping changes that impact everyone, but a sort of modularization of SEO, where we each have to be aware of what impacts our given industries, verticals, etc.

Dr. Pete

 

I don’t feel that techniques for acquiring links have changed that much. You can either earn them through content and outreach or you can just buy them. What has changed is the awareness of “link building” outside of the SEO community. This makes link building / content marketing much harder when pitching to journalists and even more difficult when pitching to bloggers.

Link building has to be more integrated with other channels and struggles to work in its own environment unless supported by brand, PR and social. Having other channels supporting your link development efforts also creates greater search signals and more opportunity to reach a bigger audience which will drive a greater ROI.

Carl Hendy

 

SEO has grown up in terms of more mature staff and SEOs becoming more ingrained into businesses so there is a smarter (less pressure) approach. At the same time, SEO has become more integrated into marketing and has made marketing teams and decision makers more intelligent in strategies and not pushing for the quick win. I’m also seeing that companies who used to rely on SEO and building links have gone through IPOs and the need to build 1000s of links per quarter has rightly reduced.

Danny Denhard

Signals that surround link building have changed

There is no question about this one in my mind. I actually wrote about this last year in my previous blog post where I talked about signals such as anchor text and deep links changing over time.

Many of the people I asked felt the same, here are some quotes from them, split out by the types of signal.

Domain level link metrics

I think domain level links have become increasingly important compared with page level factors, i.e. you can get a whole site ranking well off the back of one insanely strong page, even with sub-optimal PageRank flow from that page to the rest of the site.

Phil Nottingham

I’d agree with Phil here and this is what I was getting at in my previous post on how I feel “deep links” will matter less over time. It’s not just about domain level links here, it’s just as much about the additional signals available for Google to use (more on that later).

Anchor text

I’ve never liked anchor text as a link signal. I mean, who actually uses exact match commercial keywords as anchor text on the web?

SEOs. 🙂

Sure there will be natural links like this, but honestly, I struggle with the idea that it took Google so long to start turning down the dial on commercial anchor text as a ranking signal. They are starting to turn it down though, slowly but surely. Don't get me wrong, it still matters and it still works. But like pure link spam, the bar is a lot lower now in terms of what constitutes too much.

Rand feels that they matter more than we’d expect and I’d mostly agree with this statement:

Exact match anchor text links still have more power than you’d expect—I think Google still hasn’t perfectly sorted what is “brand” or “branded query” from generics (i.e. they want to start ranking a new startup like meldhome.com for “Meld” if the site/brand gets popular, but they can’t quite tell the difference between that and https://moz.com/learn/seo/redirection getting a few manipulative links that say “redirect”)

Rand Fishkin

What I do struggle with though, is that Google still haven’t figured this out and that short-term, commercial anchor text spam is still so effective. Even for a short burst of time.

I don’t think link building as a concept has changed loads—but I think links as a signal have, mainly because of filters and penalties but I don’t see anywhere near the same level of impact from coverage anymore, even against 18 months ago.

Paul Rogers

New signals have been introduced

It isn’t just about established signals changing though, there are new signals too and I personally feel that this is where we’ve seen the most change in Google algorithms in recent years—going all the way back to Panda in 2011.

With Panda, we saw a new level of machine learning where it almost felt like Google had found a way of incorporating human reaction / feelings into their algorithms. They could then run this against a website and answer questions like the ones included in this post. Things such as:

  • “Would you be comfortable giving your credit card information to this site?”
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?”
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?”

It is a touch scary that Google was able to run machine learning against answers to questions like this and write an algorithm to predict the answers for any given page on the web. They have though and this was four years ago now.

Since then, they've made various moves to utilize machine learning and AI to build out new products and improve their search results. For me, this was one of the biggest changes of all, and it went pretty unnoticed by our industry, at least until Hummingbird came along. I feel pretty sure that we have Ray Kurzweil to thank for at least some of that.

There seems to be more weight on theme/topic related to sites, though it’s hard to tell if this is mostly link based or more user/usage data based. Google is doing a good job of ranking sites and pages that don’t earn the most links but do provide the most relevant/best answer. I have a feeling they use some combination of signals to say “people who perform searches like this seem to eventually wind up on this website—let’s rank it.” One of my favorite examples is the Audubon Society ranking for all sorts of birding-related searches with very poor keyword targeting, not great links, etc. I think user behavior patterns are stronger in the algo than they’ve ever been.

– Rand Fishkin

Leading on from what Rand has said, it’s becoming more and more common to see search results that just don’t make sense if you look at the link metrics—but are a good result.

For me, the move towards user data driving search results + machine learning advances has been the biggest change we've seen in recent years, and it's still going.

Edit: since drafting this post, Tom Anthony released this excellent blog post on his views on the future of search and the shift to data-driven results. I’d recommend reading that as it approaches this whole area from a different perspective and I feel that an off-shoot of what Tom is talking about is the impact on link building.

You may be asking at this point, what does machine learning have to do with link building?

Everything. Because as strong as links are as a ranking signal, Google want more signals and user signals are far, far harder to manipulate than established link signals. Yes it can be done—I’ve seen it happen. There have even been a few public tests done. But it’s very hard to scale and I’d venture a guess that only the top 1% of spammers are capable of doing it, let alone maintaining it for a long period of time. When I think about the process for manipulation here, I actually think we go a step beyond spammers towards hackers and more cut and dry illegal activity.

For link building, this means that traditional methods of manipulating signals are going to become less and less effective as these user signals become stronger. For us as link builders, it means we can’t keep searching for that silver bullet or the next method of scaling link building just for an easy win. The fact is that scalable link building is always going to be at risk from penalization from Google—I don’t really want to live a life where I’m always worried about my clients being hit by the next update. Even if Google doesn’t catch up with a certain method, machine learning and user data mean that these methods may naturally become less effective and cost efficient over time.

There are of course other things such as social signals that have come into play. I certainly don't feel like these are a strong ranking factor yet, but with deals like this one between Google and Twitter being signed, I wouldn't be surprised if that ever-growing dataset is used at some point in organic results. The one advantage that Twitter has over Google is its breaking-news freshness. Twitter is still way quicker at breaking news than Google is—140 characters in a tweet is far quicker than Google News! Google know this, which is why I feel they've pulled this partnership back into existence after a couple of years apart.

There is another important point to remember here and it’s nicely summarised by Dr. Pete:

At the same time, as new signals are introduced, these are layers, not replacements. People hear social signals or user signals or authorship and want it to be the link-killer, because they already fucked up link-building, but these are just layers on top of on-page and links and all of the other layers. As each layer is added, it can verify the layers that came before it and what you need isn't the magic signal but a combination of signals that generally matches what Google expects to see from real, strong entities. So, links still matter, but they matter in concert with other things, which basically means it's getting more complicated and, frankly, a bit harder. Of course, no one wants to hear that.

– Dr. Pete

The core principles have not changed

This is the crux of everything for me. With all the changes listed above, the key is that the core principles around link building haven’t changed. I could even argue that Penguin didn’t change the core principles because the techniques that Penguin targeted should never have worked in the first place. I won’t argue this too much though because even Google advised website owners to build directory links at one time.

You need an asset

You need to give someone a reason to link to you. Many won’t do it out of the goodness of their heart! One of the most effective ways to do this is to develop a content asset and use this as your reason to make people care. Once you’ve made someone care, they’re more likely to share the content or link to it from somewhere.

You need to promote that asset to the right audience

I really dislike the stance that some marketers take when it comes to content promotion—build great content and links will come.

No. Sorry, but for the vast majority of us, that's simply not true. The exceptions are people who skydive from space or have huge existing audiences to leverage.

You simply have to spend time promoting your content or your asset for it to get shares and links. It is hard work and sometimes you can spend a long time on it and get little return, but it's important to keep working at it until you're at a point where you have two things:

  • A big enough audience where you can almost guarantee at least some traffic to your new content along with some shares
  • Enough strong relationships with relevant websites who you can speak to when new content is published and stand a good chance of them linking to it

Getting to this point is hard—but that’s kind of the point. There are various hacks you can use along the way but it will take time to get right.

You need consistency

Leading on from the previous point, it takes time and hard work to get links to your content—the types of links that stand the test of time and that you're not going to be removing in 12 months' time anyway! This means that you need to keep pushing content out and getting better each and every time. This isn't to say you should just churn content out for the sake of it, far from it. I am saying that with each piece of content you create, you will learn to do at least one thing better the next time. Try to give yourself the leverage to do this.

Anything scalable is at risk

Scalable link building is exactly what Google has been trying to crack down on for the last few years. Penguin was the biggest move and hit some of the most scalable tactics we had at our disposal. When you scale something, you often lose some level of quality, which is exactly what Google doesn’t want when it comes to links. If you’re still relying on tactics that could fall into the scalable category, I think you need to be very careful and just look at the trend in the types of links Google has been penalizing to understand why.

The part Google plays in this

To finish up, I want to briefly talk about the part that Google plays in all of this and shaping the future they want for the web.

I’ve always tried to steer clear of arguments involving the idea that Google is actively pushing FUD into the community. I’ve preferred to concentrate more on things I can actually influence and change with my clients rather than what Google is telling us all to do.

However, for the purposes of this post, I want to talk about it.

General paranoia has increased. My bet is there are some companies out there carrying out zero specific linkbuilding activity through worry.

Dan Barker

Dan’s point is a very fair one and just a day or two after reading this in an email, I came across a page related to a client’s target audience that said:

"We are not publishing guest posts on SITE NAME any more. All previous guest posts are now deleted. For more information, see www.mattcutts.com/blog/guest-blogging/."

I’ve reworded this as to not reveal the name of the site, but you get the point.

This is silly. Honestly, so silly. They are a good site, they publish good content, and they have good editorial standards. Yet they have ignored all of their own policies, hard work, and objectives to follow a blog post from Matt. I'm 100% confident that it wasn't sites like this one that Matt was talking about in that blog post.

This is, of course, from the publishers’ angle rather than the link builders’ angle, but it does go to show the effect that statements from Google can have. Google know this so it does make sense for them to push out messages that make their jobs easier and suit their own objectives—why wouldn’t they? In a similar way, what did they do when they were struggling to classify at scale which links are bad vs. good and they didn’t have a big enough web spam team? They got us to do it for them 🙂

I’m mostly joking here, but you see the point.

The most recent infamous mobilegeddon update, discussed here by Dr. Pete, is another example of Google pushing out messages that ultimately scared a lot of people into action. Although to be fair, I think that despite the apparently small impact so far, the broad message from Google is a very serious one.

Because of this, I think we need to remember that Google does have their own agenda and many shareholders to keep happy. I’m not in the camp of believing everything that Google puts out is FUD, but I’m much more sensitive and questioning of the messages now than I’ve ever been.

What do you think? I’d love to hear your feedback and thoughts in the comments.

Announcing the New & Improved Link Intersect Tool

Posted by randfish

Y’all remember how last October, we launched a new section in Open Site Explorer called “Link Opportunities?” While I was proud of that work, there was one section that really disappointed me at the time (and I said as much in my comments on the post).

Well, today, that disappointment is over, because we’re stepping up the Link Intersect tool inside OSE big time:

Literally thousands of sweet, sweet link opportunities are now yours at the click of a button

In the initial launch, Link Intersect used Freshscape (which powers Fresh Web Explorer). Freshscape is great for certain kinds of data – links and mentions that come from newly published pages that are in news sources, blogs, and feeds. But it’s not great for non-news/blogs/feed sources because it’s intentionally avoiding those!

For example, in the screenshot above, I wanted to see all the pages that link to SeriousEats.com and SplendidTable.org but don’t link to SmittenKitchen.com.

That’s 671 more, juicy link opportunities thanks to the hard work of the Moz Big Data and Research Tools teams.

How does the new Link Intersect work?

The tool looks at the top 250,000 links our index has pointing to each of the intersecting targets you enter, and the top 1 million links in our index pointing to the excluded URL.

Link Intersect then runs a differential comparison to determine which of the 250K links to each of the intersecting targets come from the same URL or root domain, and removes any of those that also appear among the top 1 million links pointing to the excluded URL, root domain, or subdomain.

This means that for sites and pages with massive quantities of links, we may not show every intersecting link we know about, but since the results are sorted in Page Authority order, you'll get the highest-quality, most important ones at the top.
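
If you like to think of it in code, the comparison is essentially an intersection followed by an exclusion. This is purely an illustrative sketch (it isn't Moz's actual implementation, and the sample URLs are made up), but it captures the logic:

  // Illustrative only: find pages that link to both intersect targets
  // but not to the excluded site. All data here is hypothetical.
  var linksToTargetA = ['http://foodnews.example/roundup', 'http://recipeblog.example/links'];
  var linksToTargetB = ['http://foodnews.example/roundup', 'http://cookingforum.example/thread'];
  var linksToExcluded = ['http://cookingforum.example/thread'];

  var opportunities = linksToTargetA.filter(function (url) {
    return linksToTargetB.indexOf(url) !== -1 &&   // links to both targets...
           linksToExcluded.indexOf(url) === -1;    // ...but not to the excluded site
  });

  console.log(opportunities); // ['http://foodnews.example/roundup']

In the real tool the same idea also runs at the root domain and subdomain level, and against the top 250K/1 million links rather than a handful.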

You can use Link Intersect to see three unique views on the data:

  • Pages that link to subdomains (particularly useful if you’re interested in shared links to sites on hosted subdomains like blogspot, wordpress, etc or to a specific subdomain section of a competitor’s site)
  • Pages that link to root domains (my personal favorite, as I find the results the most comprehensive)
  • Root domains that link to the root domains (great if you’re trying to get a broad sense of domain-level outreach/marketing targets)

Note that it's possible the root domains view will actually expose more links than pages because the domain-level link graph is easier and faster to sort through, so the 250K limit is less of a barrier.

Like most of the reports in Open Site Explorer, Link Intersect comes with a handy CSV Export option:

When it finishes (my most recent one took just under 3 minutes to run and email me), you’ll get a nice email like this one:

Please ignore the grammatical errors. I’m sure our team will fix those up soon 🙂

Why are these such good link/outreach/marketing targets?

Generally speaking, this type of data is invaluable for link outreach because these sites and pages are ones that clearly care about the shared topics or content of the intersecting targets. If you enter two of your primary competitors, you’ll often get news media, blog posts, reference resources, events, trade publications, and more that produce content in your topical niche.

They’re also good targets because they actually link out! This means you can avoid sifting through sites whose policies or practices mean they’re unlikely to ever link to you – if they’ve linked to those other two chaps, why not you, too?!

Basically, you can check the trifecta of link opportunity goodness boxes (which I’ve helpfully illustrated above, because that’s just the kind of SEO dork I am).

Link Intersect is limited only by your own creativity – so long as you can keep finding sites and pages on the web whose links might also be a match for your own site, we can keep digging through trillions of links, finding the intersects, and giving them back to you.

3 examples of Link Intersect in action

Let’s look at some ways we might put this to use in the real world:

#1: I’m trying to figure out who links to my two big competitors in the world of book reviews

First off, remember that Link Intersect works on a root domain or subdomain level, so we wouldn’t want to use something like the NYTimes’ review of books, because we’d be finding all the intersections to NYTimes.com. Instead, we want to pick more topically-focused domains, like these two:

You’ll also note that I’ve used a fake website as my excluded URL – this is a great trick for when you’re simply interested in any sites/pages that link to two domains and don’t need to remove a particular target.

#2: I’ve got a locally-focused website doing plumbing and need a few link sources to help boost my potential to rank in local and organic SERPs

In this instance, I’ll certainly look at pages linking to combinations of the top ranking sites in the local results, e.g. the 15 results for this query:

This is a solid starting point, especially considering how few links local sites often need to perform well. But we can get creative by branching outside of plumbing and exploring related fields like construction:

Focusing on better-linked-to industries and websites will give more results, so we want to try to broaden rather than narrow our categories and look for the most-linked-to sites in given verticals for comparisons.

#3: I’m planning some new content around weather patterns for my air conditioning website and want to know what news and blog sites cover extreme weather content

First, I’m going to start by browsing some search results for content in this field that’s received some serious link activity. By turning on my Mozbar’s SERPs overlay, I can see the sites and pages that have generated loads of links:

Now I can run a few combinations of these through the Link Intersect Tool:

While those domain names make me fear for humanity’s intelligence and future survival, they also expose a great link opportunity tactic I hadn’t previously considered – climate science deniers and the more politically charged universe of climate science overall.


I hope you enjoy the new Link Intersect tool as much as I have – I think it's one of the best things we've put in Open Site Explorer in the last few months, though what we're releasing in March might beat even that, so stay tuned!

And, as always, please do give us feedback and feel free to ask questions in the comments below or through the Moz Community Q+A.

Developing Innovative Content: What You Need to Know

Posted by richardbaxterseo

A few weeks ago, I attended a breakfast meeting with a bunch of entrepreneurs in the technology, space (yes, space travel), software, and engineering industries. I was blown away by the incredible talent of the speakers. You know, there are people out there building things like private satellite networks, bioprinting facilities, quantum computers and self-driving cars. I was completely transfixed by the incredibly future-facing, innovative and exceptionally inventive group in front of me. I also immediately wished I'd worked a little harder in my twenties.

After the presentations, one of the questions that came up during the Q&A session was: “what’s the next big thing?”

Wow. Have you ever thought about “the next big thing”?

Part of the magic of predicting innovation is that it’s really, really hard to get right. Those that can accurately predict the future (in my humble opinion) are those that tend to understand how people will respond to an idea once they’re exposed to it. I think predicting this is a very special skill indeed.

Then again, we're expected to be able to predict the outcome of our marketing, all the time. While predicting it is one thing, making it happen is a whole different ball game.

Competition for the attention of our customers is getting tougher

In our industry, when you really boil down what it is we do, we’re fixing things, making things, or we’re communicating things.

Most of the time, we’re building content that communicates: ideas, stories, news and guidance–you get the idea. The problem is, no matter which vertical you work in, we’re all competing for something: the attention of our customers.

As our customers get smarter, that competition is getting tougher and tougher.

The most successful marketers in our industry all have a special trait in common. They are good at finding new ways to communicate ideas. Take a look at classic presentations like this from Ross Hudgens to see just how powerful it can be to observe, imitate and develop an idea with astounding viral reach.

I particularly enjoy the idea of taking a piece of content and making improvements, be it through design, layout or simply updating what’s there. I like it because it’s actually pretty easy to do, and there’s growing evidence of it happening all over the Internet. Brands are taking a second look at how they’re developing their content to appeal to a wider audience, or to appeal to a viral audience (or both!).

For example, take a look at this beautiful travel guide to Vietnam (credit: travelindochina.com) or this long form guide to commercial property insurance (credit: Towergate Insurance / Builtvisible.com) for examples of brands in competitive verticals developing their existing content. In verticals where ordinary article content has been done to death, redeveloping the medium itself feels like an important next step.

Innovative isn’t the same thing as technical

I've felt for a long time that there's a conflict between our interpretation of "innovative" and "technical". As I've written before, those that really understand how the web works are at a huge advantage. Learn how it's built, and you'll find yourself able to make great things happen on your own, simply by learning and experimenting.

In my opinion though, you don’t have to be able to learn how to build your own site or be a developer. All you have to do is learn the vocabulary and build a broad understanding of how things work in a browser. I actually think we all need to be doing this, right now. Why?

We need more innovation in content marketing

I think our future depends on our industry's ability to innovate. Of course, you still need to have your basics in place. We'll always be T-Shaped marketers, executing a bit of technical SEO here, a bit of content strategy there. But, we're all SEOs and we know we need to acquire links, build audiences and generally think big about our ambitions. When your goal is to attract new followers, fans, links, and garner shares in their thousands, you need to do something pretty exciting to attract attention to yourself.

The vocabulary of content development

I’ve designed this post to be a primer on more advanced features found in innovative content development. My original MozCon 2014 presentation was designed to educate on some of the technologies we should be aware of in our content development projects and the process we follow to build things. We’ll save process for another post (shout in the comments if you think that would be useful!) and focus on the “what” for now.

At Builtvisible, we’re working hard on extending our in-house content development capabilities. We learn through sharing amazing examples with each other. Our policy is to always attempt to deconstruct how something might have been developed, that way, we’re learning. Some of the things we see on the web are amazing–they deserve so much respect for the talent and the skills that surface the content.

Here are some examples that I think demonstrate some of the most useful types of approach for content marketers. I hope that these help as much as they’ve helped us, and I hope you can form a perspective of what innovative features look like in more advanced content development. Of course, do feel welcome to share your own examples in the comments, too! The more, the merrier!

The story of EBoy

eBoy: the graphic design firm whose three co-founders and sole members are widely regarded as the “godfathers” of pixel art.

The consistent styling (as well as the beautifully written content) is excellent. Technically speaking, perhaps the most clever and elegant feature is the zoom of the image positioned on the Z axis in a <canvas> container (more on this in a moment).

An event listener (jQuery) sizes the canvas appropriately for the browser window, and the z-axis position shifts on scroll to create an elegant zoom effect.
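
I haven't seen The Verge's source, so treat this as a rough sketch of the general approach rather than their actual code: redraw an image on the <canvas> at a scale tied to the scroll position, and resize the canvas whenever the window changes (the element ID and image path below are hypothetical):

  // Rough sketch of a scroll-driven canvas zoom (not The Verge's actual code).
  var canvas = document.getElementById('hero');   // assumes <canvas id="hero">
  var ctx = canvas.getContext('2d');
  var img = new Image();
  img.src = '/images/eboy-artwork.png';           // hypothetical image path

  function draw() {
    var scale = 1 + window.scrollY / 1000;        // fake a z-axis zoom by scaling the image
    canvas.width = window.innerWidth;             // size the canvas to the viewport
    canvas.height = window.innerHeight;
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.drawImage(img, 0, 0, img.width * scale, img.height * scale);
  }

  img.onload = draw;
  $(window).on('scroll resize', draw);            // jQuery listeners, as described above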


View the example here: http://www.theverge.com/2014/6/17/5803850/pixel-perfect-the-story-of-eboy.

<canvas> is an HTML element which can be used to draw graphics using scripting (usually JavaScript). This can, for instance, be used to draw graphs, make photo compositions or create simple animations.

Colorizing the past

Take a look at Pixart Printing's Guide to Colourizing the Past (credit: Pixartprinting / Builtvisible.com) for a clever example of <canvas> in use. Here's one of the images (tip: mouse over and click the image):

The colorization feature takes advantage of the power of the canvas element. In this case, the color version of the image is applied to the canvas as a background image, with the black and white version on a layer above. Clicking (or touching, on mobile) erases portions of the top image, revealing the color version underneath.
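
A minimal sketch of that technique (the real piece is more polished, and the element ID and image path here are hypothetical): give the canvas the colour photo as its CSS background, draw the black-and-white version on top, then punch holes in it with the destination-out composite mode:

  // Sketch of the click-to-reveal effect. Element ID and paths are hypothetical.
  var canvas = document.getElementById('photo');  // <canvas> with the colour image as its CSS background
  var ctx = canvas.getContext('2d');
  var bw = new Image();
  bw.src = '/images/photo-bw.jpg';
  bw.onload = function () {
    ctx.drawImage(bw, 0, 0, canvas.width, canvas.height);  // black-and-white layer on top
  };

  canvas.addEventListener('click', function (e) {
    var rect = canvas.getBoundingClientRect();
    ctx.globalCompositeOperation = 'destination-out';      // drawing now erases pixels
    ctx.beginPath();
    ctx.arc(e.clientX - rect.left, e.clientY - rect.top, 20, 0, Math.PI * 2);
    ctx.fill();                                            // the colour background shows through
  });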

Chrome Experiments: Globe

Globe is a "simple" global data visualization of the Earth's population growth over a set range of dates. The 3D visualization is based on WebGL: a JavaScript API for rendering interactive 3D and 2D graphics within any compatible web browser without the use of plug-ins.


View the example here: http://globe.chromeexperiments.com/.

WebGL is a really exciting, emerging option available to content marketers who might want to experiment with immersive experiences or highly interactive, simulated environments.
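
You don't have to write raw WebGL to experiment, either; most people reach for a library like three.js (which, as far as I know, is what Globe and many other Chrome Experiments are built on). Here's a tiny, hedged sketch of a spinning wireframe sphere, assuming three.js is already loaded on the page:

  // Minimal three.js sketch: a spinning wireframe sphere. Assumes three.js is loaded.
  var scene = new THREE.Scene();
  var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
  camera.position.z = 3;

  var renderer = new THREE.WebGLRenderer();
  renderer.setSize(window.innerWidth, window.innerHeight);
  document.body.appendChild(renderer.domElement);

  var globe = new THREE.Mesh(
    new THREE.SphereGeometry(1, 32, 32),
    new THREE.MeshBasicMaterial({ color: 0x3399ff, wireframe: true })
  );
  scene.add(globe);

  (function animate() {
    requestAnimationFrame(animate);
    globe.rotation.y += 0.01;        // slow spin
    renderer.render(scene, camera);
  })();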

Some of my favourite WebGL examples include Hello Racer and Tweetopia, a 3D Twitter hashtag visualizer.

If you'd like to see more examples of WebGL in action, take a look at Chrome Experiments. Don't worry, this stuff works in the latest versions of Firefox and IE, too.

Polygon’s PS4 Review

You might have seen me cover this long form concept over at Builtvisible. Polygon’s Playstation 4 review is a fully featured “long form” review of Sony’s much loved gaming machine. The bit that I love is the SVG visualizations:

“What’s SVG?”, I hear you ask!

SVG is super-fast, sharp rendering of vector images inside the browser. Unlike image files (like .jpg, .gif, .png), SVG is XML based, light on file size, loads quickly and adjusts to responsive browser widths perfectly. SVG’s XML based schema lends itself to some interesting manipulation for stunning, easy to implement effects.

View Polygon’s example here: http://www.polygon.com/a/ps4-review

That line tracing animation you see is known as path animation. Essentially, the path attribute in the SVG's XML can be manipulated in the DOM with a little jQuery. What you'll get is a pretty snazzy animation to keep your users' eyes fixated on your content and yet another nice little effect to keep eyeballs engaged.
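
I can't say exactly how Polygon built theirs, but the usual way to get that tracing effect is the stroke-dasharray / stroke-dashoffset trick: hide the stroke behind a dash as long as the path, then animate the offset back to zero. A minimal sketch, assuming an SVG path with id="outline" already exists on the page:

  // Sketch of the classic SVG line-tracing animation (the "outline" ID is hypothetical).
  var path = document.getElementById('outline');
  var length = path.getTotalLength();

  // Hide the stroke behind a single dash the full length of the path.
  path.style.strokeDasharray = length;
  path.style.strokeDashoffset = length;

  path.getBoundingClientRect();                    // force a reflow so the transition runs
  path.style.transition = 'stroke-dashoffset 3s ease-in-out';
  path.style.strokeDashoffset = '0';               // the line appears to draw itself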

My favourite example of SVG execution is Lewis Lehe's Gridlocks and Bottlenecks. Gridlocks is an AngularJS, d3.js based visualization of the surprisingly technical and oft-misunderstood "gridlock" and "bottleneck" events in road traffic management.

It’s also very cool:

View the example here: http://setosa.io/blog/2014/09/02/gridlock/.

I have a short vocabulary list that I expect our team to be able to explain (certainly these questions come up in an interview with us!). I think that if you can explain what these things are, as a developing content marketer you’re way ahead of the curve:

  • HTML5
  • Responsive CSS (& libraries)
  • CSS3 (& frameworks)
  • JavaScript (& frameworks: jQuery, MooTools, Jade, Handlebars)
  • JSON (api post and response data)
  • webGL
  • HTML5 audio & video
  • SVG
  • HTML5 History API manipulation with pushState
  • Infinite Scroll

Want to learn more?

I've amassed a series of videos on web development that I think marketers should watch. Not necessarily to learn web development, but definitely to be able to describe what it is you'd like your own content to do. My favourite: I really loved Wes Bos's JS + HTML5 Video + Canvas tutorial. Amazing.

Innovation in content is such a huge topic, but I realize I've run out of space (this is already a 1,400-word post) for now.

In my follow up, I’d like to talk about how to plan your content when it’s a little more extensive than just an article, give you some tips on how to work with (or find!) a developer, and how to make the most of every component in your content to get the most from your marketing efforts.

Until then, I’d love to see your own examples of great content and questions in the comments!

Structural Optimization through FEA

FEA software can identify optimum material placement to satisfy all part requirements like geometry, loads, stiffness and constraints for multiple load cases…

How to Recover Lost Pageviews in pushState Experiences

Posted by GeoffKenyon

PushState and AJAX can be used in tandem to deliver content without requiring the entire page to refresh, providing a better user experience. The other week, Richard Baxter dove into the implications of pushState for SEO on Builtvisible. If you’re not familiar with pushState, you should spend some time to read through his post.
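
If you haven't worked with the pattern before, the idea is simple: fetch the new content with AJAX, swap it into the page, and push the new URL into the browser history so the address bar and back button still behave. A bare-bones sketch, with hypothetical selectors and URLs:

  // Bare-bones pushState + AJAX navigation sketch (selectors and URLs are hypothetical).
  $('a.ajax-nav').on('click', function (e) {
    e.preventDefault();
    var url = this.href;
    $.get(url, function (html) {
      $('#content').html($(html).find('#content').html());  // swap in the new content
      history.pushState({ url: url }, '', url);              // update the URL without a reload
    });
  });

  // Handle back/forward so the earlier content is restored.
  window.addEventListener('popstate', function (e) {
    if (e.state && e.state.url) {
      $.get(e.state.url, function (html) {
        $('#content').html($(html).find('#content').html());
      });
    }
  });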

If you’re not familiar with delivering content this way, you can check out these sites using pushState and AJAX to deliver content:

  • Time: When you scroll to the bottom of the article, a new article loads and the URL changes
  • Halcyon: When you click on a navigation link, the left hand panel doesn't refresh

While pushState is really cool and great for UX, there are analytics issues presented by this technology.

When the content on a page and its URL are updated using AJAX and pushState, in most cases the _trackPageview beacon is not fired and the pageview is not tracked. This artificially increases your bounce rate while reducing your pages per visit, time on site, and total pageviews, along with other metrics associated with pageviews.

How to tell if you’re having tracking problems

If you have a very high bounce rate or are generally curious to check if this is a problem for you, start by installing the GA Debugger extension for Chrome. Then go to the URL you want to investigate and open up the console (windows: control + shift + j, mac: command + option + j). Now, clear the console using the button at the left, and refresh the URL.

Once you refresh the page, you should see GA debugging show up in the console. To check that the initial page view is being tracked, you should see a “sent beacon” for a pageview.

Once you’ve established the initial pageview is tracked, click a link to load another page. If GA is properly tracking pageviews, you should see another pageview beacon being sent. If you don’t see this, then you have a problem.
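
As an aside, if you run Universal Analytics directly on the page rather than through GTM, you can also fix this in code by sending a virtual pageview whenever you push a new URL. Here's a sketch (the path is hypothetical; on the older ga.js library the equivalent call is _gaq.push(['_trackPageview', path])):

  // Sketch: fire a virtual pageview after pushState swaps in new content.
  // Assumes the standard Universal Analytics (analytics.js) snippet is installed.
  function trackVirtualPageview(path) {
    ga('set', 'page', path);   // update the tracker's page field
    ga('send', 'pageview');    // send the pageview beacon
  }

  // Call it right after history.pushState(...):
  history.pushState({ url: '/next-article/' }, '', '/next-article/');
  trackVirtualPageview('/next-article/');

The rest of this post covers the GTM route, which achieves the same thing without touching the site's templates.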

Capturing these pageviews with GTM

The good news is that even though this is a huge problem, it can easily be fixed with Google Analytics and Google Tag Manager.

Start by creating a new "History Listener" tag. Now set your firing rule to all pages and hit save. This will simply look for changes to the URL.

Now we’ll need to create a separate event to fire a pageview when the URL History Listener fires. To do this, create a new GA tag. 

If you already run Google Analytics from GTM, you’ll simply need to modify your existing tag. This tag should, by default, be set to track pageviews. 

At this point we’ll need to set the firing rules. First, we should make sure the tag is firing on all of our pages for our basic GA installation.

The firing rule for all pages should be a default option.

If you are already running GA via GTM, you’ll already have this set up. You’ll need to create a subsequent firing rule to fire a pageview for this URL History Listener.

To do this, click to add a new firing rule and then select “create new rule.” Name the rule, and then move on to conditions. The default rule should be [url] [contains]; we need to change this to [event] [equals]. Then we’ll set the condition to gtm.historyChange. Now click save.

Now you should be all set to hit publish on your updated tag container. Overnight, you should see a change in your pageviews and related metrics.
