
Olde Tyme Whiteboard Friday


Posted by randfish

Salutations, flappers and dappers! Today we come to you from the golden age of the Internet. That's right, the 1920s! SEOmoz's resident Whiskbroom has put on his finest glad rags and is going to give you some knowledge, strictly on the level.

OK, you got us. We're not actually in the 1920s. But some of those old techniques that we all used in the past still work in 2012 and into 2013. Let's all watch Rand tell us all about them.

What older techniques are you still using and seeing great results? Let us know in the comments below.

Video Transcription

"Howdy SEOmoz fans, and welcome to this special edition of Old Tyme Whiteboard Friday. Now, this week on Whiteboard Friday I want to talk to you about the major search engines: Lycos, Northern Light, HotBot, AltaVista, Infoseek, Yahoo, Webcrawler, and Dogpile. Now these fine search engines are going to help your visitors get to your website. The website is a very important page. It's on something called the Internet. My understanding is there are tubes, system of tubes that connect so you can get to them, and the way to get to the top of these search engines, none other than, starting with keywords.

Keywords are the cat's pajamas. You must have keywords. You want to repeat them as often as you can, stuff them into your titles, put them in your meta keywords tags, your meta descriptions tags, all over the page if you can. If I could have a page that was just a big list of keywords, I would do it.

Next, doorway pages. These are magical. The doorway page is a great way to stuff keywords into a page and yet show that only to the search engines and not have to force it upon your users, because, as we all know, visiting a doorway page can get a little, you know, rough. So you want to gazoozle a bit and show the search engine your doorway page.

Next, submissions. Submissions are very critical if you want to earn a happy cabbage. Now, to do direct submissions, you need to find all the search engines that I've listed up here, plus many hundreds of others. Remember that many hundreds of secondary search engines power these major search engines. You want to get into those so you can get into these.

And last, but not least, directories. Directories are critically important. Submitting to the directories, getting included in the directories, you can't be fimble-fambling around here. You've got to do the hard work and get in the directories.

All right, everyone. In addition to our Old Tyme Whiteboard Friday, we're going to do a little bit of serious Whiteboard Friday, but first a drink. It smells like heaven. Don't want to take too much at work here. There we go. Just take my handy . . . burns like heaven. I feel better already.

So old-tyme SEO had some weird things going on with it, but, in fact, there are some classics from the late '90s, from the early 2000s that still work today. We're going to help you with these.

First off, reciprocation. Actually, that feels ridiculous. Reciprocation, if you help other people out, they, in turn, will help you. I don't just mean this in terms of you link to someone else and they'll link to you, although that can be helpful. But what I mean is if you help someone out doing something, something on social media, something with their website, you can often get them to pay that back to you. I'll give you one of the best examples I've got.

We love to send tons and tons of traffic to other people's websites through the Moz Top Ten. When we do that, when we drive traffic from SEOmoz's email subscribers, about 250,000 people subscribe to the Moz Top Ten, that drives traffic to those sites, and then those sites all tell people, "Wow, I was in the Moz Top Ten. You should subscribe to it." Wonderful way to play reciprocation and to get something back for giving something out.

Being on the jiffy with your keyword research and targeting. So this is really interesting, because what I mean by on the jiffy is getting to a keyword before it makes its way into the common keyword research tools. This is mostly the AdWords search tool. Before Google has volume there, you can find phrases that have come out in news, new brand names and products, things that bloggers are talking about, things people are searching for and talking about on social media, trending items. Those things will have search volume next month, but they might not make their way into the keyword research tools for 30 or 60 days. That means you can jump the gun and be ahead of any of your competitors. Using search suggest for this is actually a really smart way to go too, because a lot of the time, those search suggest terms don't make their way into the AdWords keyword research tool.
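As a rough sketch of that "diffing" step, the snippet below compares phrases pulled from search suggest against whatever the keyword tool already reports. Every term and function name here is invented for illustration:

```python
# Sketch: find autocomplete suggestions that haven't yet surfaced in a
# keyword research tool's export. All data here is illustrative.

def fresh_suggestions(suggest_terms, tool_terms):
    """Return suggest terms absent from the keyword tool, preserving order."""
    known = {t.lower().strip() for t in tool_terms}
    return [t for t in suggest_terms if t.lower().strip() not in known]

# Terms scraped from search suggest for some seed keyword:
suggested = ["widget 2013 review", "widget vs gizmo", "buy widget online"]
# Terms the keyword tool currently reports volume for:
in_tool = ["buy widget online", "widget price"]

print(fresh_suggestions(suggested, in_tool))
# → ['widget 2013 review', 'widget vs gizmo']
```

The terms that fall out of the comparison are the ones you can target weeks before competitors see volume for them in the tools.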

Improving on the good works of others. I've been shocked to see, you know what, we have this inside our heads, as content producers, that we have to produce something very unique and different. But great artists steal, and it is just fine to take something else on the Web that's a good resource, that you think, "Man, that's solid but I could do it better," and do it better. We've had tons of success with this.

SEOmoz, when we first started out, I used to use Vaughns-1-pager around SEO ranking factors. Then I thought, "I wish there was a better one of that." We made our own ranking factors, and it worked out great. We got statistical data and the opinions of lots of SEOs and aggregated them, so it wasn't just me saying what I thought was important. That worked very, very well. It got us a ton of notoriety and citations and links.

Empathizing with the needs of your audience. This is one area where your distance from your customers hurts you. The further you are from your customers, the worse off you're going to be. But the closer you are, the better you can be. If you can spend time with your customers, talking to them, figuring out, "Hey, what do they need? What do they like? What are they missing? What do they not understand," not just about what you're doing, but about anything that's going on in your field, about any topic that a large percent of your customers are having, even if it doesn't really relate to what you sell or what you do, you can produce content and provide solutions, basic easy tools, a resource guide or a list. You can contract this out to somebody who might be an expert, to have them come in and produce the content for you, a video, a landing page that describes all these problems, a downloadable white paper, a research document. This kind of stuff works wonders in terms of not just getting engagement, but also targeting new keywords, reaching your audience, and making them delighted.

And finally, requesting action at the pinch of the game. So, a lot of the time we will do things that I think are a little bit foolish in the inbound marketing sphere. One of my favorite examples, worst examples too, is you get to a blog post and you look at the top and on the sidebar, and it's just filled with all these things asking you to share and subscribe and become my friend on Facebook. You kind of think to yourself, "I've never been here before. How do I know that I want to share this on LinkedIn, and pin it on Pinterest, and put it on Facebook?"

Ask in the pinch of the game. Once they've finished reading the article, then, at the bottom, right, that's the time to potentially ask. This happens all the time. For example, someone's just purchased something from you in an e-commerce store. One of my favorites was this store that I bought some supplies from, and they sent, in their email, in their thank you email and confirmation a, "If you had a great experience with our product, with our store, we'd love to get a link from you, and here's a little embed you can put on your site, saying that you're a customer." What a great time. Don't ask for it before you've done a good job for me. Ask for it after you've done a great job for me. That's the pinch of the game.

All right, everyone, I hope you've enjoyed Old Tyme Whiteboard Friday, and we will see you again next week for another edition, sans chopped mustache and ridiculous costume. Thanks everyone. Take care."

Video transcription by

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

SEOmoz Daily SEO Blog

Pitching Search Marketing In Traditional Marketing Terms


For those selling search marketing to customers, especially those customers new to the concept of search marketing, it’s often useful to pitch search marketing services in terms the customer already understands.

A lot of search marketing theory and practice is borrowed and adapted from direct marketing. Direct marketing concepts have been around since the 1960s, and may be more readily understood by some customers than some of the arcane terminology sometimes associated with SEO/SEM.

Here are some ideas on how to link search marketing and direct marketing concepts.

1. Targeting & Segmentation

A central theme of direct marketing is targeting.

On broadcast television, advertisers show one advertisement to many people, and hope it will be relevant to a small fraction of that audience. Most television advertising messages are wasted on people who aren’t interested in those messages. It’s a scattergun, largely untargeted approach.

Search marketing, a form of direct marketing, is targeted. Search marketers target their audience based on the specific keywords the audience use.

Search marketing is concerned with the most likely prospects – a small fraction of the total audience. Further, if we analyse the visitor behavior of people using specific keyword terms post-click, we can find out who are the hottest prospects amongst that narrowly defined group.

The widely accepted 20-80 rule says that 20% of your customers create 80% of your business. An example might be “luxury vacations France”, as opposed to “vacations France”. If we have higher margins on luxury travel, then segmenting to focus on the frequent luxury travel buyer, as opposed to a less frequent economy buyer whom we still might sell to, but at lower margins, might be more in line with business objectives. Defining, and refining, keyword terms can help us segment the target market.
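As a rough illustration of applying the 20-80 rule, the sketch below ranks customers by revenue and returns the smallest group covering 80% of it. The customer names and figures are invented:

```python
# Sketch of the 20-80 rule: which customers account for 80% of revenue?
# Figures are made up for illustration.

def top_revenue_customers(revenue_by_customer, share=0.8):
    """Return the smallest set of customers covering `share` of total revenue."""
    total = sum(revenue_by_customer.values())
    running, top = 0.0, []
    for name, rev in sorted(revenue_by_customer.items(),
                            key=lambda kv: kv[1], reverse=True):
        if running >= share * total:
            break
        top.append(name)
        running += rev
    return top

customers = {"luxury-a": 5000, "luxury-b": 4000, "economy-a": 700,
             "economy-b": 500, "economy-c": 300}
print(top_revenue_customers(customers))
# → ['luxury-a', 'luxury-b']
```

Here the two luxury buyers cover over 80% of revenue, which is the argument for segmenting keyword spend toward the “luxury vacations France” buyer first.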

2. Focus

Once you get a search visitor to your site, what happens next?

They start reading. Such a specific audience requires focused, detailed information, and a *lot* of it, or they will click back.

It is a mistake to pitch to an “average” audience at this point i.e. to lose focus. If we’ve done our job correctly, and segmented our visitors using specific keyword terms, we already know they are interested in what we offer.

To use our travel example above, the visitor who typed in “luxury vacations in France” wants to hear all about luxury vacations in France. They are unlikely to want a pitch about how wonderful France, as a country, is, as the keyword term suggests they’ve already made their mind up about destination. Therefore, a simplistic, generalized message selling French tourism is less likely to work.

Genuine buyers – who will spend thousands on such vacations – will want a lot of detail about luxury travel in France, as this is unlikely to be a trivial purchase they make often. That generally means offering long, detailed articles, not short ones. It means many options, not few. It means focusing on luxury travel, and not general travel.

Simple, but many marketers get this wrong. They go for the click, but don’t focus enough on the level of detail required by hot prospects i.e. someone most likely to buy.

3. Engagement

One advantage of the web is that we can spend a lot of time getting a message across once a hot prospect has landed on a site. This is not the case on radio. Radio placements only have seconds to get the message across. Likewise, television slots are commonly measured in 15 and 30 second blocks.

On the web, we can engage a visitor for long periods of time. The message becomes as long as the customer is prepared to hear it.

4. Personalized

The keyword tells you a lot about visitor intent. “Luxury travel France” is a highly targeted term that suggests a lot about the visitor, i.e. their level of spend and tastes. If we build keyword lists and themes associated with this term, we can personalize the sales message using various landing pages that talk specifically to the needs of the visitor. Examples might include “Five Star Hotels”, “Luxury Car Hire”, “Best Restaurants In Paris”, and so on. Each time they click a link, or reveal a bit more about themselves, we can start to personalize the message. Personalized marketing works well because the message is something the prospect is willing to hear. It’s specifically about them.

We can personalize the journey through the site, configuring customized pathways so we can market one-to-one. We see this at work on Amazon, which notes your search and order history and prompts you with suggestions based on that history. One-to-many marketing approaches, as used in newspapers, on radio and on television, typically aren’t focused and lack personalization. They may work well for products with broad appeal, but work less well for defined niches.
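One crude way to implement this kind of keyword-driven personalization is a simple lookup from keyword themes to landing pages. The sketch below is illustrative only; the theme names and URLs are invented:

```python
# Minimal sketch of keyword-driven personalization: route each incoming
# search phrase to the most specific landing page whose theme it matches.

THEMES = {                       # theme phrase -> landing page
    "five star hotel": "/france/luxury/hotels",
    "luxury car hire": "/france/luxury/car-hire",
    "restaurant":      "/france/luxury/dining",
}
DEFAULT_PAGE = "/france/luxury"  # generic page for unmatched phrases

def landing_page(query):
    q = query.lower()
    for theme, page in THEMES.items():
        if theme in q:
            return page
    return DEFAULT_PAGE

print(landing_page("five star hotel paris"))    # → /france/luxury/hotels
print(landing_page("luxury vacations france"))  # → /france/luxury
```

A real system would use broader theme matching than substring checks, but the shape of the idea is the same: the more the keyword reveals, the more specific the page.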

5. Active Response

We’re not just interested in views, impressions, or reach. We want the visitor to actively respond. We want them to take a desired, measurable action. This may involve filling out a form, using a coupon, giving us an email address, and/or making a purchase.

Active response helps make search marketing spends directly accountable and measurable.

6. Accountable

People either visit via a search term, or they don’t.

Whilst there can be some advantage in brand awareness i.e. a PPC ad that appears high on the page, but is only clicked a fraction of the time, the real value is in the click-thru. This is, of course, measurable, as the activity will show up in the site statistics, and can be traced back to the originating search engine.

Compare this with radio, television or print. It’s difficult to know where the customer came from, as their interaction may be difficult to link back to the advertising campaign.

Search marketing is also immediately measurable.
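For illustration, tracing a visit back to the originating search engine usually starts with the HTTP referrer. The sketch below is a rough, era-appropriate example; the engine list and query-parameter names are assumptions, and modern engines often strip the search phrase entirely:

```python
# Sketch: attribute a visit to its originating search engine by parsing
# the HTTP referrer and pulling out the search phrase.

from urllib.parse import urlparse, parse_qs

# Engine hostname fragment -> query parameter carrying the search phrase.
ENGINES = {"google": "q", "bing": "q", "search.yahoo": "p"}

def attribute(referrer):
    """Return (engine, search phrase), or None if not a known engine."""
    parsed = urlparse(referrer)
    for engine, param in ENGINES.items():
        if engine in parsed.netloc:
            phrase = parse_qs(parsed.query).get(param, [""])[0]
            return engine, phrase
    return None

print(attribute("http://www.google.com/search?q=luxury+vacations+france"))
# → ('google', 'luxury vacations france')
```

An analytics package does this same mapping at scale, which is what makes the click-thru traceable in a way a radio spot never is.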

7. Testable

Some keyword terms work, some do not. Some keyword terms only work when combined with landing page X, but not landing page Y. By “work” we tend to mean “achieves a measurable business outcome”.

Different combinations can be tried and compared against one another. Keywords can be tested using PPC. Once we’ve determined what the most effective keywords are in terms of achieving measurable business outcomes, we can flow these through to our SEO campaign. We can do the reverse, too. Use terms that work in our SEO campaigns to underpin our PPC campaigns.

This process is measurable, repeatable and ongoing. Language has near-infinite variety. There are many different ways to describe things, and the landing pages can be configured and written in near-infinite ways, too. We track using software tools to help determine patterns of behaviour, so we can keep feeding this back into our strategy in order to refine and optimize. We broaden keyword research in order to capture the significant percentage of search phrases that are unique.
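The test-and-compare loop can be sketched as below. The click and conversion counts are invented, and a real test would also check statistical significance before declaring a winner:

```python
# Sketch: compare keyword x landing-page combinations by conversion rate.

def best_combo(results):
    """results: {(keyword, page): (clicks, conversions)} -> best combination."""
    def rate(stats):
        clicks, conversions = stats
        return conversions / clicks if clicks else 0.0
    return max(results, key=lambda combo: rate(results[combo]))

results = {
    ("luxury vacations france", "landing-x"): (200, 14),  # 7.0% conversion
    ("luxury vacations france", "landing-y"): (180, 6),   # 3.3%
    ("vacations france",        "landing-x"): (900, 9),   # 1.0%
}
print(best_combo(results))  # → ('luxury vacations france', 'landing-x')
```

The winning combinations from PPC tests like this are the ones worth flowing through to the SEO campaign, and vice versa.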


Three Ways To Break Down A Market


Ford said “give the customer any color they want, so long as it is black”. This strategy worked for a while, because people just wanted a car. However, the market changed when GM decided they would offer a range of cars to suit different “purposes, purses and personalities”.

Between 1920 and 1923, Ford’s market share plummeted from 55 to 12 percent.

These days, auto manufacturers segment the market, rather than treat it as one homogeneous mass. There are cars for the rich, cars for the less well off, cars built for speed, and cars built for shopping.

Manufacturers do this because few manufacturers can cater to very large markets where the consumer has infinite choice. To be all things to all people is impossible, but to be the best for a smaller, well-defined group of people is a viable business strategy. It costs less to target, and therefore has less risk of failure. Search marketing is all about targeting, so let’s take a look at various ways to think about targeting in terms of the underlying marketing theory, which might give you a few ideas on how to refine and optimize your approach.

While there are many ways to break down a market, here are three main concepts.

Segments

Any market can be broken down into segments. A segment means “a group of people”. We can group people by various means; however, the most common forms of segmentation include:

Benefit segmentation: a group of people who seek similar benefits. For example, people who want bright white teeth would seek a toothpaste that includes whitener. People who are more concerned with tooth decay may choose a toothpaste that promises healthy teeth.

Demographic Segmentation: a group of people who share a similar age, gender, income, occupation, education, religion, race and nationality. For example, retired people may be more interested in investment services than a student would, as retired people are more likely to have capital to invest.

Occasion Segmentation: a group of people who buy things at a particular time. Valentine’s Day is one of the most popular days for restaurant bookings. People may buy orange juice when they think about breakfast time, but not necessarily at dinner. The reverse is true for wine.

Usage Segmentation: a group of people who buy certain volumes, or at specific frequencies. For example, a group of people might dine out regularly, vs those who only do so occasionally. The message to each group would be different.

Lifestyle segmentation: a group of people who may share the same hobbies, or live a certain way. For example, a group of people who collect art, or a group of people who are socialites.

The aim is to find a well-defined market opportunity that is still large enough to be financially viable. If one segment is not big enough, a business may combine segments – say, young people (demographic) who want whiter teeth (benefit). The marketing for this combined segment would be different – and significantly more focused – than the more general “those who want whiter teeth” (benefit) market segment, alone.
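Mechanically, a combined segment like the one above is just the intersection of two filters. A minimal sketch in Python, with invented customer records:

```python
# Sketch: combine a demographic filter (age) with a benefit filter
# (desired outcome) into one narrower segment. Records are invented.

customers = [
    {"name": "ana", "age": 22, "wants": "whiter teeth"},
    {"name": "ben", "age": 45, "wants": "whiter teeth"},
    {"name": "cho", "age": 24, "wants": "healthy teeth"},
    {"name": "dev", "age": 19, "wants": "whiter teeth"},
]

def combined_segment(people, max_age=30, benefit="whiter teeth"):
    """Young people (demographic) who want a given benefit."""
    return [p["name"] for p in people
            if p["age"] <= max_age and p["wants"] == benefit]

print(combined_segment(customers))  # → ['ana', 'dev']
```

Each extra filter shrinks the audience but sharpens the message, which is exactly the trade-off segmentation asks you to manage.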

How does this apply to search and internet marketing in general?

It’s all about knowing your customer. “Knowing the customer” is an easy thing to say, and something of a cliche, but these marketing concepts can help provide us with a structured framework within which to test our assumptions.

Perhaps that landing page I’ve been working on isn’t really working out. Could it be because I haven’t segmented enough? Have I gone too broad in my appeal? Am I talking the language of benefits when I should really be focusing on usage factors? What happens if I combine “demographics” with “occasion”?

Niches

Niches are similar to segments, but even more tightly defined based on unique needs. For example, “search engine marketing education” is a niche that doesn’t really fit usefully within segments such as demographics, lifestyle or occasion.

The advantage of niche targeting is that you may have few competitors and you may be able to charge high margins, as there is a consumer need, but very few people offer what you do. The downside is that the niche could weaken, move, or disappear. To mitigate this risk, businesses will often target a number of niches – the equivalent of running multiple web sites – reasoning that if one niche moves or disappears, then the other niches will take up the slack.

Search marketing has opened up many niches that didn’t previously exist, thanks to improved marketing efficiency. It doesn’t cost much to talk to people anywhere in the world. Previously, niches that required a global audience in order to be viable were prohibitively expensive to serve, given the cost of reaching people spread over such a wide geographic area.

To function well in a niche, smaller companies typically need to be highly customer focused and service oriented as small niche businesses typically can’t drive price down by ramping volume.

Cells

Cells are micro-opportunities. This type of marketing is often overlooked, but will become a lot more commonplace on the web due to the easy access to data.

For example, if you collect data about your customers buying habits, you might be able to identify patterns within that data that create further marketing opportunities.

If you discover that twenty people bought both an iPhone and a PC, then they may be in the market for software products that make it easy for the two devices to talk to each other. Instead of targeting the broader iPhone purchaser market, you might tailor the message specifically for the iPhone-plus-PC people, reasoning that they may be having trouble getting the two devices to perform certain functions, and would welcome a simple solution.
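That cell-mining step is, at its simplest, a set intersection over order history. A minimal sketch, with invented orders:

```python
# Sketch: mine order history for a "cell", i.e. customers who bought
# both an iPhone and a PC, using simple set intersection.

orders = [
    ("cust1", "iphone"), ("cust1", "pc"),
    ("cust2", "iphone"),
    ("cust3", "pc"), ("cust3", "iphone"),
    ("cust4", "headphones"),
]

def buyers_of(product, history):
    """The set of customers who bought a given product."""
    return {cust for cust, item in history if item == product}

cell = buyers_of("iphone", orders) & buyers_of("pc", orders)
print(sorted(cell))  # → ['cust1', 'cust3']
```

Real order data would need more cleaning, but the pattern scales: every pair of products you intersect is a candidate micro-opportunity.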


2012 Search Marketing Year in Review


2012 Search Marketing Year in Review was originally published on, home of expert search engine optimization tips.

2012 was an eventful year in search marketing — we laughed, we cried (some of us more than others when Penguin hit). But we made it. And we even survived an apocalypse. Each month here at Bruce Clay, Inc., we bring you an industry newsletter that dives into the issues that matter to marketers (Did you know that? Have you signed up?). Looking back on newsletter editions in 2012, we can see some of the events that shaped the year in search marketing. The following is a compilation of some of our most popular reading in the SEO Newsletter this year, starting with some big Google-focused events.

Google Algorithm Issues 2012

Woman Pruning a Tree

Did you have to take up link pruning as a new hobby after Google’s Penguin hit?

In April 2012, Google made some big changes to its algorithm, and sites with iffy link practices were hit the hardest. That sent site owners into a tailspin — many not knowing how to recover from the penalties and loss of rankings. Bruce Clay, Inc. knew that cleaning up a site’s link profile, including “pruning” the inbound links, was the way to help sites get back on the right path. In this step-by-step guide to link pruning back in May 2012, we showed you how to understand your link profile, identify the links you should be removing and how to handle removal requests. Since the article was written, Google gave webmasters some help getting rid of links as a last resort with the Link Disavow tool.

Related Topics – The Google Saga Continues

Back in April, Bruce Clay weighed in on the whole Google over-optimization issue, including:

  • What to watch out for when evaluating the long-term security of your SEO strategy.
  • Possible technologies Google is using to detect over-optimization.
  • Potential motivation for Google’s reinvigorated offense, including the future face of search results.
Warning Symbol

It was a lively time for search back towards the beginning of 2012. Google Webmaster Tools was sending warning messages left and right to webmasters. Matt Cutts mentioned something about an algo update to target aggressive SEO at SXSW in February. And what happened next was a panic over what many were calling an “over-optimization penalty.” In this article by Bruce Clay Australia, we looked at over-optimization and unnatural linking.

Woman Standing Next to Chalkboard

Is your site worthy of rankings? This is the question many site owners and marketers ask every day — especially in the light of Google’s crackdown in 2012. In this article on algorithm-proof SEO, we explored an approach to SEO that can help you avoid disastrous consequences of algorithm updates and keep your site healthy.

Letter Grade Drawn on a Chalkboard

It’s been known for some time that Google has a method for rating the quality of a site, but just how they do it has been somewhat of a mystery. Enter Google’s leaked quality rating manual for its human raters. The good news? It confirms what many marketers already suspected about what Google believes is quality.


Data and Tools in 2012

Caveman with a White Board

Did you practice “caveman analytics” in 2012? We learned tips from author and speaker Matt Bailey on how to get the most from the data available.

What were people talking about in the way of data and tools to help add context to our decisions in search marketing in 2012? Here we look at some of our more popular reads on reporting, including the launch of Google’s social reports in analytics; an interview with author and speaker Matt Bailey on context in data and asking the right questions; and how to mine data for SEO with BCI’s SEOToolSet.

Typewriter Report

Earlier in the year, Google released social reporting in its analytics tool. In this article, we explored how to find and read social reports.

Laptop and Magnifying Glass

Do you know how to get the most out of your data? This article about our SEOToolSet aimed to help marketers understand data that’s available for wiser decisions within SEO strategy.


Ecommerce, Mobile and Social

In 2012, Google’s Search Plus Your World (and the rise of Google Plus) was all the buzz. People wondered once again if it “killed SEO” (the running joke amongst search marketers who’ve heard this question one too many times). There was also a continued emphasis on the need for responsive design for mobile as Google announced official guidelines for mobile optimization. And an evergreen topic that’s always interesting to readers: how site navigation and information architecture work together to create a great ecommerce and shopping experience. These are just some of the more popular articles on the topics of mobile, social and ecommerce in 2012.

Baby and Cell Phone

Back in June, Google announced guidelines for optimizing mobile-ready sites. Bruce Clay Australia dives into this topic.

Magnet with Money

What’s the best way to design navigation for ecommerce sites? Bruce Clay India looked into sites that hit the mark.

Mouse Icon Over a Button with a Questions Mark

What is Google’s “Search, plus Your World”? And how does Google+ factor into it? In this article, we explored how the new search functionality works.


SEO Factors and Trends Report

Bruce Clay Australia put out a report at the head of the new year that talked trends over 2011 and predictions for SEO and other search marketing disciplines in 2012. You can download the report to see what transpired over 2011, and how similar or different it was to this year. And check out some highlights below …

Search Engine Optimization on a Chalkboard

Just some of the highlights from the report on search marketing in 2011 included:

  • A smarter Web, tailoring content to individual users.
  • The rise of Google+ and big changes to Facebook.
  • The continued importance of White Hat SEO.

2012 predictions included a focus on the user:

  • Serve them awesome content regularly.
  • Increase the number of touch points with them by integrating with social platforms.
  • Spend time creating advanced, cross-platform user-engagement strategies.
  • Allow them access to your information wherever they are through mobile sites and apps.
  • Make their lives easier by facilitating their access to your information, products and services.
  • Reinforce your local presence and geo-location services.

Hope you enjoyed this 2012 recap of search marketing hot topics. See you in 2013 for another lively year. My prediction? There won’t ever be a dull moment.

And if you’re interested in Bruce Clay’s predictions for the state of search, stay tuned for the January edition of the SEO Newsletter, set to hit mid-January.

Bruce Clay Blog

Change Your Offer Without Changing Your Product


Have you been selling a product or service for some time, but think you might need to do something new to keep up with the market? Offer something fresh?

One of the problems with making significant changes to your products or services is that it tends to carry a high level of risk. There is a risk you could alienate your existing prospects. There is always risk in starting over and trying something new and untested, as the untried and untested is more likely to fail.

But what if you could change your product or service without really changing it at all? Here are a few ideas on how to make changes by changing the pitch, without going to the effort, or taking the risk, of making fundamental changes.

Positioning

One of the great things about direct marketing, of which search marketing is a part, is that we’re not likely to be starting with products and services that have had an awareness and associations built up over many years – like Coca-Cola, for example. We get to modify the position, if we so choose.

Position, in marketing, means perception. Perception in the minds of the prospective customer. We can appeal to perceptions, or shape our product to fit perceptions, depending on what our prospects want.

For example, we could take the same car and market it to two different groups using positioning. To one group, we emphasize safety features above all else. To another group, we emphasize performance. The product doesn’t change, but the positioning does, and thus appeals to different groups of buyers. In reality, a car manufacturer probably wouldn’t do this, at least not in the same market, as it could send confusing messages.

However, on the web, we can often chop and change products, and target different groups, and one doesn’t necessarily need to overlap another.

Vertical Positioning

A vertical is a group of similar businesses and customers that engage in trade based on specific and specialized needs. They may be a subset of a larger market. For example, PPC is a vertical within internet marketing, itself a subset of general marketing.

In terms of positioning within vertical markets, imagine you’re a software developer in the search marketing space. If you were talking to a group of manufacturers, and want to talk about what you do in a way that is understandable to this audience, you might talk to them in broad terms about marketing.

If you were talking to a group of marketers you might talk more specifically about search marketing. If you were talking to a room full of search marketers, you might talk more specifically again about the PPC optimization software you’re working on. If you were talking to a room of PPC optimization software developers … and so on.

They are all part of the same market – and they might all need what you have – but each audience exists in different verticals, and so you change the message to suit. Changing vertical positioning is when you target a different vertical within the same market. An example of this might be a landlord who rents out a house to a single tenant changing to renting it out to students on a room-by-room basis with “shared facilities”. She’s still in the accommodation provision market, the product is the same, but it is pitched to a different niche.

Can you identify different verticals in your market to which your product or service might also appeal? Can you configure your product, without making fundamental changes, so that it appeals to the needs of a different niche within your market?

Positioning In Time

Positioning in time, sometimes described as horizontal positioning in direct marketing circles, refers to the point in time when a person buys something, and positioning the message to appeal to different buyers depending on where they are in the buy-cycle.

For example, if someone is genuinely new to your product, and doesn’t even know they want it, then you could pitch your advertising based on the benefits your product provides. If I wanted to sell, say, a revolutionary new power cell, I wouldn’t talk about specifications to someone unfamiliar with the product, I’d talk about the fact that it replaces the need to be on an electricity grid, so the buyer doesn’t need to pay line charges. I’d emphasize benefits.

If someone is already aware of these new power cells, and knows all the benefits, I would likely emphasize other aspects, such as features and price more than benefits, as the buyer should already understand them.

This type of positioning will be familiar to people who do a lot of PPC. The link text, message and landing page changes to accommodate buyers at different stages in the sales cycle. The product doesn’t change, but the message does.


Another way to reposition a product or service is to use an isolation technique. Take a single aspect of the product and make it a major part of the offer. For example, TIME magazine sells subscriptions to a magazine, but their advertising often focuses on the “free” gifts that accompany a subscription. This technique is often used when the main product itself is well known to the audience, and there’s not much new that can be said about it.

Many software companies who formerly sold their software now give their software away as part of a freeware model, but sell software support and maintenance services around it. They isolate an aspect that was always there – service – but now emphasize it, and push the actual product into the background. This tends to happen when the product becomes commodity and there are few ways to differentiate it without making significant changes.


Think about bundling products or services together to appeal to a different vertical.

For example, there might be a small market for individual electronic components, but a large market for a “phone tapping device”. Something Woz and Steve Jobs built a company on.

Music distribution companies, like Spotify, take individual tunes, bundle them together into a huge library, and sell subscriptions to it, as opposed to selling on a song-by-song basis, like iTunes does.

Individual garden plants and potting accessories might not be very interesting, but bundled together as a “kitchen greenhouse” they might appeal to an audience of foodies who don’t necessarily see themselves as gardeners.

Which Data Matters Most to Marketers? Take the Survey!


Posted by randfish

2012 was a year of triumphs and setbacks for marketers seeking the data to best accomplish their goals. Big improvements and additions in products like Google Analytics, GWMT, Bing Webmaster Tools, Mixpanel, KISSMetrics, Raven, and yes, SEOmoz PRO, too (along with dozens of others), helped many of us improve our reporting, auditing, and optimization efforts. But with the good came the bad, and setbacks like Google's expansion of keyword (not provided), the loss of referral data from iOS6, and kerfuffles over AdWords data appearing alongside rankings reared their heads, too.

When it comes to marketing data, I really like the concept behind Google's own mission statement: organize the world's information and make it universally accessible and useful. Unfortunately, I think the search giant has been falling short on a lot of the aspects that relate to our world, and thus it's up to third parties to pick up the slack. Moz is obviously part of that group, and we have plenty of self-interest there, but many other companies (and often Google and Bing themselves) are stepping in to help.

To help better understand the information that matters most to professionals in our field, we want to run a short survey focused specifically on data sources:

Data Sources Survey


We hope that this takes less than two minutes to complete, and that by aggregating broad opinions on the importance of data sources, we can better illustrate what matters most to marketers. In the spirit of transparency, we plan to share the results here on the Moz blog (possibly in an update to this post) in the next week or two.

Please help us out by taking the survey and by sharing it with your fellow marketers (or any professional you know who relies on marketing data).

Thanks very much!

*For those who have asked about SEOmoz's own plans regarding rankings vs. AdWords API data – we have removed AdWords search volume from our keyword difficulty tool (it was never part of the formula), and will be working on alternatives, possibly with the folks over at Bing. Like others in the field – Hubspot, Ginza, Conductor, Brightedge, Authority Labs, etc. – we plan to maintain rankings data in our software.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

SEOmoz Daily SEO Blog

Is Google Concerned About Amazon Eating Their Lunch?


Leveling The Playing Field

When monopolies state that they want to “level the playing field” it should be cause for concern.

Groupon is a great example of how this works. After they turned down Google’s buyout offer, Google responded by…

The same deal is slowly progressing in the cell phone market: “we are using compatibility as a club to make them do things we want.”

Leveling Shopping Search

Ahead of the Penguin update Google claimed that they wanted to “level the playing field.” Now that Google shopping has converted into a pay-to-play format & has opted out of participation, Google once again claims that they want to “level the playing field”:

“We are trying to provide a level playing field for retailers,” [Google’s VP of Shopping Sameer Samat] said, adding that there are some companies that have managed to do both tech and retail well. “How’s the rest of the retail world going to hit that bar?”

This quote is particularly disingenuous. For years you could win in search with a niche site by being more focused, having higher quality content & more in-depth reviews. But now even some fairly large sites are getting flushed down the ranking toilet while the biggest sites that syndicate their data displace them (see this graph for an example, as Pricegrabber is the primary source for Yahoo! Shopping).

How Google Drives Businesses to Amazon, eBay & Other Platforms

Google has spent much of the past couple years scrubbing smaller ecommerce sites off the web via the Panda & Penguin updates. Now if small online merchants want an opportunity to engage in Google’s search ecosystem they have a couple options:

  • Ignore it: flat out ignore search until they build a huge brand (it’s worth noting that branding is a higher level function & deep brand investment is too cost intensive for many small niche businesses)
  • Join The Circus: jump through an endless series of hoops, minimizing their product pages & re-configuring their shopping cart
  • PPC: operate at or slightly above the level of a non-functional thin phishing website & pay Google by the click via their new paid inclusion program
  • Ride on a 3rd Party Platform: sell on one of the larger platforms that Google is biasing their algorithms toward & hope that the platform doesn’t cut you out of the loop.

Ignoring search isn’t a lasting option, some of the PPC costs won’t back out for smaller businesses that lack a broad catalog to do repeat sales against to lift lifetime customer value, SEO is getting prohibitively expensive & uncertain. Of these options, a good number of small online merchants are now choosing #4.

Operating an ecommerce store is hard. You have to deal with…

  • sourcing & managing inventory
  • managing employees
  • technical / software issues
  • content creation
  • marketing
  • credit card fraud
  • customer service
  • shipping

Some services help to minimize the pain in many of these areas, but just as people showroom offline, many also do it online. And one of the biggest incremental costs added to ecommerce over the past couple of years has been SEO.

Google’s Barrier to Entry Destroys the Diversity of Online Businesses

How are the smaller merchants to compete with larger ones? Well, for starters, there are some obvious points of influence in the market that Google could address…

  • time spent worrying about Penguin or Panda is time that is not spent on differentiating your offering or building new products & services
  • time spent modifying the source code of your shopping cart to minimize pagecount & consolidate products (and various other “learn PHP on the side” work) is not spent on creating more in-depth editorial
  • time switching carts to one that has the newly needed features (for GoogleBot and ONLY GoogleBot) & aligning your redirects is not spent on outreach and media relations
  • time spent disavowing links that a competitor built into your site is not spent on building new partnerships & other distribution channels outside of search

Ecosystem instability taxes small businesses more than larger ones as they…

The presumption that size = quality is false; it's a fact Google only recognizes when it hits their own bottom line.

Anybody Could Have Seen This Coming

About a half-year ago we had a blog post about 'Branding & The Cycle' which stated:

algorithmically brand emphasis will peak in the next year or two as Google comes to appreciate that they have excessively consolidated some markets and made it too hard for themselves to break into those markets. (Recall how Google came up with their QDF algorithm only *after* Google Finance wasn’t able to rank). At that point in time Google will push their own verticals more aggressively & launch some aggressive public relations campaigns about helping small businesses succeed online.

Since that point in time Amazon has made so many great moves to combat Google:

All of that is on top of creating the Kindle Fire, gaining content streaming deals & their existing strong positions in books and e-commerce.

It is unsurprising to see Google mentioning the need to “level the playing field.” They realize that Amazon benefits from many of the same network effects that Google does & now that Amazon is leveraging their position atop e-commerce to get into the online ads game, Google feels the need to mix things up.

If Google was worried about book searches happening on Amazon, how much more worried might they be about a distributed ad network built on Amazon’s data?

Said IgnitionOne CEO Will Margiloff: “I’ve always believed that the best data is conversion data. Who has more conversion data in e-commerce than Amazon?”

“The truth is that they have a singular amount of data that nobody else can touch,” said Jonathan Adams, iCrossing’s U.S. media lead. “Search behavior is not the same as conversion data. These guys have been watching you buy things for … years.”

Amazon also has an opportunity to shift up the funnel, to go after demand-generation ad budgets (i.e. branding dollars) by using its audience data to package targeting segments. It’s easy to imagine these segments as hybrids of Google’s intent-based audience pools and Facebook’s interest-based ones.

Google is in a sticky spot with product search. As they aim to increase monetization by displacing the organic result set, they also lose what differentiates them from other online shopping options. If they just list big-box retailers, then users will learn to pick their favorite and cut Google out of the loop. Many shoppers have been trained to start at even before Google began polluting their results with paid inclusion:

Research firm Forrester reported that 30 percent of U.S. online shoppers in the third quarter began researching their purchase on, compared with 13 percent who started on a search engine such as Google – a reversal from two years earlier when search engines were more popular starting points.

Who will Google partner with in their attempt to disrupt Amazon? Smaller businesses, larger corporations, or a mix of both? Can they succeed? Thoughts?



An Updated Guide to Google Webmaster Tools


Posted by beammeup

With the recent Google Webmaster Tools security bug, I thought a deep dive into what GWT has to offer SEOs might be prudent since many SEOs may have logged in recently.

Google Webmaster Tools was once Google Webmaster Wasteland. But the past year has been a fruitful one as Webmaster Tools has rolled out improvements faster than Facebook does new privacy statements. Google Webmaster Tools (GWT) is now full of insightful data and metrics that you cannot get anywhere else. Some GWT data is useful, some is not. Let's dive in and take a look at each tool in GWT.

Guide to Google Webmaster Tools Index

Webmaster Tools Sections:
  • Configuration
  • Health
  • Traffic
  • Optimization
  • Labs

My Favorite Tools:
  1. Download Your Latest Links
  2. View Your Website Crawl Stats
  3. Submit To Index
  4. Webmaster Tools Data in Google Analytics
  5. Rich Snippets/Structured Data Test Tool

Webmaster Tools Home

When you first log in, you'll see a list of all websites in your Google Webmaster Tools account, as well as a few links to view all messages from Google, 'Preferences', 'Author Stats' (Labs), and a few miscellaneous links under 'Other Resources'.

Google Webmaster Tools Home

All Messages

Google used to rarely communicate with webmasters through messages. This year, some probably wish Google communicated a little less, given the number of "love notes" many SEOs have received. You might see a message here if:

  • Google thinks your site may have been hacked
  • Google detected unnatural links pointing to your site
  • Google thinks links pointing to your site are using techniques outside Google’s Webmaster Guidelines

You can set the messages email threshold to 'only important' or 'all messages' under the "Preferences" tab.

See it: View All Your Messages

Labs – Author Stats

Author stats in Google Webmaster Tools

Since authorship isn't tied to a single domain, Google shows authorship stats for all sites you write for as well as individual stats. You'll need a valid author profile (go Google+!) to see stats here. The stats are interesting, and good for verifying which URLs are showing your ugly mug in the SERPs.

See it: View your Author Stats

Other Resources – Rich Snippets/Structured Data

Structured Data Testing Tool

If you've never used the rich snippets testing tool, now known as "structured data", bookmark it now. It's a one-stop shop to test URLs and see if your author profile is linked correctly.

You can also use the tool to check whether you've set up or verified your:

  • Author Page
  • Name
  • Google+ Page as a Publisher
  • Any structured data detected (reviews, products, song titles, etc) in the form of microdata, microformats, or RDFa

See it: Test Your URLs for Structured Data
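As a reference point when testing, the authorship link the tool checks for generally looks like this (a minimal, hypothetical sketch; the Google+ profile URL is a placeholder you would replace with your own):

```html
<!-- Hypothetical authorship markup: the profile URL below is a placeholder. -->
<head>
  <link rel="author" href="">
</head>

<!-- Or, as a visible link inside the page body: -->
<a href="" rel="author">About the Author</a>
```

Either form lets the testing tool associate the page with the Google+ profile, provided the profile links back to the site.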

Specific Site Dashboard in Google Webmaster Tools

Once you select a site after logging in, you see the real meat of the tool. The site specific dashboard has a nice overview showing:

  • Crawl Errors
  • URL Errors
  • Site Errors
  • Health status of DNS, Server Connectivity & Robots.txt
  • Overview of # of Queries (plus clicks and impressions)
  • Sitemaps (including submitted URLs and indexed URLs)

GWT Site Dashboard

There are five major sections once you've selected a site: 'Configuration', 'Health', 'Traffic', 'Optimization', and 'Labs'. I find that the most insightful data is in the 'Health' and 'Traffic' sections, and in what you can get inside Google Analytics.

The 'Configuration' Section


Google Webmaster Tools Settings

Here you can target a specific country for your website, choose a preferred domain (www or non-www), and limit the crawl rate of Googlebot if you so choose.


Google Sitelinks

Google automatically chooses Sitelinks to display below your main URL on certain queries, usually brand-related ones. If there are URLs you wouldn't want showing as Sitelinks, you can "demote" them and Google won't show those demoted URLs.

URL Parameters

If you're having problems with duplicate content on your site because of variables/parameters in your URLs you can restrict Google from crawling them with this tool. Unless you're sure about what you're restricting, don't play with the settings here!

Change of Address

If you are switching your site to a whole new domain, do a 301 redirect, then make sure Google knows about it here.
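At the server level, a domain-wide 301 might look like this on Apache (a sketch, assuming mod_rewrite is enabled; the domain names are placeholders):

```apache
# Hypothetical .htaccess on the OLD domain: permanently redirect
# every URL to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^(.*)$$1 [R=301,L]
```

The `R=301` flag is what signals a permanent move; once this is live, use the Change of Address tool so Google processes the switch faster.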


Users

Ever taken like 20 minutes to add a new user to your Google Analytics account? No? OK, maybe that was just me. Luckily, adding a user to GWT is much easier. There are two main user types: 'Full user' and 'Restricted user'. Restricted users are good for clients if you want to give them mostly view-only access, with little ability to change settings or submit things (you probably don't want clients filing random reconsideration requests!).

adding users in GWT


This setting is a way for members of YouTube's Partner Program (probably not you) to link their YouTube Channel with Webmaster Tools. My guess is this section will get more settings in the future, but for now, it's very confusing. More details on the Google Webmaster Central blog here.

The 'Health' Section

Crawl Errors

Crawl errors shows you issues Googlebot had in crawling your site. This includes response codes (404s, 301s) as well as a graph of the errors over time. This is a fantastic resource for spotting broken links, as the URL shows up as a 404 error. You can see when Google first detected the error codes and download the table of errors into a spreadsheet.

google webmaster tools crawl errors

Crawl Stats

Pages crawled per day is a good SEO metric to track over time. You can get some insight from the chart, but this is a metric to check in on and record every week. Ideally you want that number continuing to climb, especially if you are adding new content.

google webmaster tools crawl stats

Blocked URLs, Fetch as Googlebot & Submit To Index

Fetch as Googlebot will return exactly what Google's spider "sees" on the URL you submit. This is handy for spotting hacked sites as well as seeing your site the way Google does. It's a good place to start an SEO audit.

The really neat feature that's new this year is "Submit to Index". Ever made a title tag change and wished Google would update its index faster to get those changes live? 'Submit to Index' does just that. 50 times a month you can submit a page to update in near real-time in Google's index. Very handy for testing on-page changes.

Here's Matt Cutts on how to use the 'Submit to Index' tool:

Index Status

Make sure and hit the 'Advanced' button here so you can see all the interesting index stats Google shows about your site. Keep an eye on the 'Not Selected' number: if it's rising, that could indicate that Google is not viewing your content favorably or that you have a duplicate content issue.

google webmaster tools index status


Malware

If Google has detected any malware on your site, you will see more information here. Google now often sends messages when malware is detected as well.

The 'Traffic' Section

Search Queries

These queries count when your site shows up in a search result, not just when someone clicks your site. So you may find some keyword opportunities where you are showing up but not getting clicks. I much prefer the interface in Google Analytics for this query data, and you may find a lot more queries showing up there than here.

Keep an eye on the CTR % for queries. If you have a known #1 ranking (your brand terms, for example) but an abnormally low position-1 CTR, that's a sign that someone might be bidding on your brand terms (which may or may not be good). If you have a high position but low CTR, it usually indicates that your meta descriptions and title tags may not be enticing enough. Can you add a verified author to the page? Or other structured data? That could help CTR rates.

google webmaster tools search queries

Links To Your Site

This is my favorite addition to GWT this year. The link data here keeps getting updated faster and faster. When this was first launched earlier this year the delay on finding links was around three weeks. I've seen the delay down to as little as one week now.

There are two ways to download lists of links, but the "Download Latest Links" is the more useful of the two.

"Download More Sample Links" just gives a list of the same links as the latest links, but in alphabetical order instead of most recent. The main report lists the domains linking to your site sorted by the number of links. Unfortunately, drilling down to the domain level doesn't give any really useful insights other than which of your pages are linked to (you can't see where on the domain they are linked from). You'll find domains listed here but not in the "Latest Links" report. Bummer.

google webmaster tools links to site

Internal Links

Pretty good report for diagnosing internal link issues. This tool is nothing fancy but URLs are sorted by most internal links. Use this to diagnose pages on your site that should be getting more internal link juice.

The 'Optimization' Section


Sitemaps

See a list of all the different types of sitemaps Google has found or that you have added, along with some stats about each one. You can also test a sitemap before submitting it, and Google will scan it to find any errors. Webmaster Tools shows stats here on web sitemaps, as well as Video, News, and Image sitemaps.

google webmaster tools sitemaps
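For reference, a bare-bones web sitemap of the kind you'd submit here looks something like this (a minimal sketch; the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical minimal sitemap: one URL entry with optional hints. -->
<urlset xmlns="">
  <url>
    <loc></loc>
    <lastmod>2012-12-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `lastmod`, `changefreq`, and `priority` are hints that Google may or may not use.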

Remove URLs

You can submit URLs (only for sites you control of course) that you wish removed from Google. Make sure and follow the removal requirements process.

HTML Improvements

Think of this as a basic On-Page SEO audit tool. Google will show you lists of URLs on your site that don't have unique Title Tags, or are missing Meta Descriptions. This is a handy tool for quick On-Page SEO issues when you first take over a new website. Click on any of the issues found to return a list of the URLs that need improvement.

google webmaster tools html improvements

Content Keywords

See a list of single keywords (not key phrases) that Google thinks your site is about. As long as you don't see spammy stuff here, you're good.

Structured Data

If you have some structured data on your site, such as a linked Google+ author or product review data, you can see stats about that data including the type of data found and the schema. This is useful to mass verify that all the pages you think are marked up correctly actually are.

google webmaster tools structured data tool

The 'Labs' Section

Custom Search

Ever wanted to build your own search engine? You can with Google Custom Search. If you have a collection of sites that you're always searching through using Google, you might consider using Google Custom search to build your own Google that just returns results from those sites. You can see how the custom search engine would work on just your own site using the preview tool here in Webmaster Tools.

Instant Previews

Input any URL on your site (or just leave blank and click 'Compare' to see the homepage) to see what the preview of the site might look like in a Google desktop search results set, or on a mobile SERP.

google webmaster tools instant preview

Site Performance

This tool got dropped by Google's spring cleaning in April 2012. I like using for testing site performance.

Webmaster Tools Data In Google Analytics

Connecting your Google Analytics account with your verified site profile in Google Webmaster tools brings some GWT data directly into your Google Analytics account. No need to login to two places.

To connect a verified GWT site to the correct analytics site, click the "manage site" dropdown:

google webmaster tools connection to Google Analytics

Once connected, GWT data shows up in the Standard Reporting section of Google Analytics under "Traffic Sources" -> "Search Engine Optimization".

Not all GWT data is available in GA. You'll only get three data sets in Google Analytics:

  • Queries
  • Landing Pages
  • Geographical Summary

Let's look at each of these and see what's worth looking at.


Queries

Queries are interesting because you can see some of the keywords that might be hidden under (not provided). This doesn't help with attribution, of course, but at least we can still use that data for keyword research. Darn you, (not provided).

What's really interesting is how many more queries show up in the query report in Google Analytics (which is supposed to be GWT data) than when you get the query data directly in Google Webmaster Tools. For example, for the timeframe Oct 28th-Nov 27th, we had 317 queries reported in Google Analytics:

analytics query data from webmaster tools

but only 93 in the Google Webmaster Tools 'Top queries' report:

google webmaster tools top queries

I'm not sure why there's such a big discrepancy between GWT queries and the GWT-sourced queries in Analytics. I definitely see more Google Images-type queries in the GA report and fewer in 'Top Queries' in GWT. Has anyone else noticed a big difference in query data?

Nonetheless, the query data can be interesting, and it's nice to have right in GA. I hope Google continues to provide more GWT data directly inside Google Analytics like this.

Landing Pages

You're better off getting your actual top landing pages list from Analytics, but you can see what GWT sees as your top pages sorted by impressions. The interesting nugget of info here is the CTR. That's not data you see in Analytics, and it could be insightful. I like comparing the CTR to the site average:

landing pages in google analytics

Geographical Summary

This section, again, is useful mostly for the CTR data. Looking at specific countries, you can see where it might be worth running more Facebook ads or doing some international SEO work.

What do you use Google Webmaster Tools For?

OK, I've ranted enough about what I like in GWT. What about you?

What data do you find useful in Google Webmaster tools?


The Cassandra Memorandum: Google in 2013


Posted by gfiorelli1

Apollo fell in love with a priestess and offered her the terrible gift of prophecy. She accepted the gift, but when Apollo asked her to lie with him, the daughter of Priam refused. The god, angry, cursed her: the young priestess would keep the gift of prophecy, but no one would ever believe her.

Her name was Cassandra.


Every day this month, I've seen Twitter posts with every kind of prediction about how web marketing disciplines will look in 2013.

I am not exempt from the desire to predict the future. The urge is something deeply human and, in an industry as uncertain as ours, sometimes it is a psychological necessity. However, some of those predictions are so improbable that this obscure prediction (written to be blatantly false by one of my Spanish friends) seems more plausible:

"Google launches Google Balrog. The name of this new Google algorithm is inspired by the name of the mythical creature imagined by Tolkien, because it will operate the same way.

It will be an algorithm that, wrapped in fire, will crawl the Web, penalizing and incinerating sites which do not include the anchor text "click here" at least seven times or do not include a picture of a kitten asleep in a basket.

If your site does not meet these minimums, the Balrog will come after you." (The translation is mine, from Ricardo's original post in Spanish.)

Every speculation about how something may evolve in the future should be based on the knowledge of the past, and, in the case of Google, we should not make the mistake of excluding elements like its acquisitions and technological evolution when predicting its future.

For example, Panda should be seen as a needed action that Google took in order to solve a problem caused by the technological advancement of Caffeine. In fact, with Caffeine (June 2010), Google was able to find new pages (or new information about existing pages) and could add them straight to the index.

As a negative side effect, gazillions of poor-quality pages flooded the SERPs, objectively deteriorating them. I'm sure the Search Quality team was already working on finding a solution to the poor results in the SERPs, but this particular Caffeine side effect accelerated the creation of the Panda algorithm.

Opening the prophecies book of Cassandra

If you visit the About Us page of Google, you will read this: Google's mission is to organize the world's information and make it universally accessible and useful.

That is the why of Google, and three words matter the most:

  1. Organize
  2. Accessible
  3. Useful

The how is represented by its algorithms; the what is composed by all the products Google has developed along the years (Search, Local, Mobile, Video, Voice, etc.).

The Golden Circle of Google



For many years, I've considered Google as a cataloguer of objects, which offers information on a variety of topics:

  • Written content
    • "Generic" content
    • Blog posts
    • News
    • Questions/Answers
    • Books
    • Patents
    • Academic articles
  • Photos and images
  • Videos
  • Apps

You've probably noticed that these are the vertical searches Google offers and that compose the Universal Search right now (I excluded Products because they are a paid option, and I consider Google+ status as “posts”).

Almost all of these "objects" have their own algorithms, which are frequently updated, as with the YouTube, Google News, and Google Images algorithm updates. And all of them have their flaws (for instance, the real value to be assigned to a link).

Until recent years, Universal Search seemed similar to a building constructed with Legos, but three important changes in 2011 altered the landscape. These changes kept developing in 2012 and – possibly – will be consolidated in 2013, which could truly unify all the vertical searches. These three changes are:

  1.
  2. Authorship
  3. Google+

We know that Google is using semantic closeness in its categorization of crawled information, and we know how strongly the concept of Entity is related to semantics.

However, this aspect of the crawling function has assumed a larger relevance after the implementation of The general acceptance of HTML5 as the new standard (pushed by Google since its beginning) and tools like Open Graph have helped boost relevance, as well.

The fact that Google offers the opportunity to verify the accuracy of rich snippets implementation (and is continuously updating the tool), has renamed it the Structured Data Testing Tool, and recently added the ability to highlight events structured data directly from Webmaster Tools makes me suspect that semantics is going to have even greater weight in how Google organizes the information it crawls.
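Event structured data of the kind the highlighter targets can also be marked up directly in the page; a minimal microdata sketch (the event name, date, and venue are invented for illustration):

```html
<!-- Hypothetical Event microdata -->
<div itemscope itemtype="">
  <span itemprop="name">SEO Meetup</span>
  <time itemprop="startDate" datetime="2013-02-15T19:00">Feb 15, 7pm</time>
  <span itemprop="location" itemscope itemtype="">
    <span itemprop="name">Community Hall</span>
  </span>
</div>
```

Pages marked up this way can be checked in the Structured Data Testing Tool before Google picks them up.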

cbs records inc Knowledge graph

The Knowledge Graph (and open data such as Freebase, since Google's acquisition of Metaweb in 2010), which recently rolled out in many regional Googles, is based on the semantic web and is improving very quickly. My advice is to start thinking seriously about semantics as a channel for possible web marketing and SEO actions.


AuthorRank has been one of the hot topics of 2012. People smarter than me wrote about it, and even created a tool around it.

In my 2011 "Wake up SEOs, the New Google is here" post, I presented my hypothesis that AuthorRank would become part of a more complex set of graphs, whose purpose is to organize and present the genuinely best content in the SERPs. We have not yet seen that "prophecy" become fact, but I am stubbornly convinced that this is the direction Google has taken. If not, why can we already use the "author" relation with a Google profile in posts, articles, videos, and books? In the future, I see AuthorRank extending to other objects as well, such as photos, images, and audio.

Today, I want to focus on an aspect of AuthorRank which (incomprehensibly) does not receive much attention: the rel="publisher" markup. It is rel="publisher" that connects a site (the publisher) with its authors. Even when those same authors leave the publisher to start working with another site, their AuthorRank will continue to influence the "PublisherRank," which makes it even stronger.

Relation between Publisher and Authors


During the last SMX Social Media Marketing Expo, Vic Gundotra told Danny Sullivan:

"I think people are just now starting to understand that Google+ is Google. At some point, we’re gonna have a billion users. And eventually, you’ll just say: 'Google has a billion users.'”

I am not going to discuss the value of Google+ as a social media channel, but we must admit that it is the irresistible evolution of Google for a number of reasons:

  • By its nature, Google+ is as much a profiling tool as a social network.
  • The fact that rel=”author” and rel=”publisher” are strictly related to Google profiles makes them part of the future of Google Search (and also Paid Search).
  • It is the easiest way for Google to obtain real social signals, not just through how users act on Google+, but also through users connecting many other social accounts to their Google+ profiles.

Google+ connected accounts


"You don’t need to be at your desk to need an answer."

You can find that phrase in the Ten Things We Know to Be True page of Google's corporate philosophy.

Google bought Android Inc. in 2005 and entered the mobile phone industry in 2008. Why the sudden surge into mobile? Aside from Android being a gold mine, Google's goal is making information universally accessible. Because more and more people are using mobile for search (especially local search), it was imperative for Google to be a player in the space:

Mobile vs. Desktop local search

Search on mobile is almost synonymous with Local Search, which can also (partly) explain why Google Now has been developed, along with the peculiar design of the mobile version of Google Search.

Google mobile search with Local Search icons

Therefore, it's time to stop thinking of mobile as an option and realize it's a priority.

For that same reason (strongly connected to the concept of accessibility), Google started giving instructions and suggestions about how to build mobile-optimized sites, with a special predilection for responsive design.
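Google's responsive-design recommendation boils down to serving one HTML document at one URL to every device and adapting the layout with CSS. A minimal sketch (class names and breakpoint are illustrative):

```html
<!-- The viewport tag lets mobile browsers report their true width
     instead of emulating a desktop screen -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .content { width: 60%; }
  /* Below 480px, let the content fill the screen */
  @media (max-width: 480px) {
    .content { width: 100%; }
  }
</style>
```

Because the same URL serves all devices, Googlebot only has to crawl one version of each page.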

Mobile accessibility also means Google Voice Search, which, <irony> what a surprise </irony>, draws on the Knowledge Graph, Schema, and Local Search.

In addition, we can't forget Project Glass. It is still a Google X project, but has been given to developers to start designing software/apps in preparation for its commercial release predicted for 2014.

Accessibility means giving information to users quickly, which explains why site speed is so important to Google – so much so that it released mod_pagespeed this past October and updated it just a few days ago.
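For the curious, enabling mod_pagespeed on Apache takes only a few directives. A minimal sketch (the module path varies by installation, and the filter choice here is just a conservative example):

```apache
# Load the module and turn it on
LoadModule pagespeed_module modules/mod_pagespeed.so
ModPagespeed on

# A conservative pair of rewriting filters
ModPagespeedEnableFilters collapse_whitespace,remove_comments
```

The module then rewrites outgoing HTML on the fly, so no changes to the site's source files are required.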

Lastly, WPO (Web Performance Optimization) is not exactly an SEO activity, but it affects SEOs, so it must be considered one of the priorities for 2013. The frequently troubled relationship between SEOs and developers/web designers will necessarily have to find a resolution in 2013; we will need to become better communicators and reach them where they are.


At the end of November, Matt Cutts gave a definition of SEO as Search Experience Optimization in his Google Webmaster Help video series:

Nothing really new, but yet another confirmation that SEO should focus on providing users with useful information.

Panda, Penguin, EMD, Layout Update… all of these updates were aimed at prioritizing the most useful sites available, and punishing those that Google considered useless.

Content marketing (the discipline that helps create useful information) has become a big priority for SEOs. However, there are still many in our industry who don't really understand what content marketing means or how SEO can be implemented into a content strategy. This is not the best place to discuss that topic, but check out the deck below for further explanation.

How to Build SEO into Content Strategy by Jonathon Colman

2013 will see the definitive adoption of content marketing into SEO, and those sites that do not integrate marketable content into their strategies will see themselves overtaken in the SERPs.

At the same time, we will also see an increase in content marketing spam: guest posts used as article marketing, infographic spam, or simply inconsistent content efforts. Sadly, SEOs tend to ruin initially good tactics because of the short-sighted tactical vision we sometimes have. It's possible that some of the Search Quality Team's actions will focus on those facets of spam, because they already have the tools for doing it.

Usefulness to Google does not just mean "content," hence filling every kind of site with zombie infographics just because "they are cool" is not the way to go. Usefulness is paired with personalization, as if Google was saying, "We will offer you the opportunity to find the information that is interesting to you based on your previous searches, your interests, the authority of the sources, and where you are."

For that reason, I consider the Venice update the most underestimated update of 2012. It completely changed the SERPs for almost every kind of query.

Moving forward, I recommend paying close attention to the Gmail Search Field experiment, and to why Google is putting effort into making international and multilingual SEO easier.

Cassandra's appendices: what updates might we see in 2013?

Between 2011 and 2012, we experienced three major updates: Panda, Penguin, and EMD.

The first update's goal was to clean the SERPs of useless content, defined as content that doesn't provide any real value to the users. The second aimed to clean the SERPs of content that ranked thanks to a "false popularity" obtained through low-quality link building actions, rather than ranking by value according to users. The third update's goal was to clean the SERPs of content that ranked solely because of an exaggerated boost obtained from its exact match domain. 

If you really think about it, the Penguin and EMD updates were a logical and even more necessary consequence of Panda. Panda caused a large number of sites to disappear from the SERPs. Other sites that survived Panda still won in the SERPs, mostly thanks to over-SEO'd link building strategies. After Penguin, we saw those sites replaced by sites relying only on the strength of their domain names, which led to the EMD update roll out.

Are the SERPs perfect now? Not quite. Domain crowding (and its counterpart, domain shrinking), which had been a minor issue since 2007 and was somewhat justified by the Brand update, is becoming a problem, especially in countries where the EMD update has not yet rolled out.

MozCast Metrics Domain Diversity


We know how much can still be done through rich snippets spam, the gaming of Local Search, and guest posting and infographic directory spam. In 2013, we may see the effects of Google's massive collection of spam-site data (helped along by the disavow tool); could this be the "linkapocalypse," or maybe even the real "Balrog" update? These are simply my assumptions, as I make every year when it comes to possible updates. What is sure is that we will see new releases of Panda and Penguin, and the extension of the EMD update to all regional versions of Google.

This is Google in 2013 for me, and I am not a prophet, just someone who likes to look at the past while trying to interpret the future. Am I right? Probably not.

But, just maybe, I am Cassandra.



Comparing Backlink Data Providers


Since Ayima launched in 2007, we’ve been crawling the web and building our own independent backlink data. Starting off with just a few servers running in our Director of Technology’s bedroom cupboard, we now have over 130 high-spec servers hosted across 2 in-house server rooms and 1 datacenter, using a storage platform similar to Yahoo’s former index.

Crawling the entire web still isn’t easy (or cheap) though, which is why very few data providers exist even today. Each provider makes compromises (even Google does in some ways) in order to keep its data as accurate and useful as possible for its users. The compromises differ between providers, though: some go for sheer index size, whilst others aim for freshness and accuracy. Which is best for you?

This article explores the differences between SEOmoz’s Mozscape, MajesticSEO’s Fresh Index, Ahrefs’ link data, and our own humble index. This analysis has been attempted before at Stone Temple and SEOGadget, but our Tech Team has used Ayima’s crawling technology to validate the data even further.

First of all, we need a website to analyze, something that we can’t accidentally “out”. Search Engine Land is the first that came to mind: it's very unlikely to have many spam links or paid link activity.

So let’s start off with the easy bit – who has the biggest result set for SEL?

The chart above shows MajesticSEO as the clear winner, followed by a very respectable result for Ahrefs. Does size matter though? Certainly not at this stage, as we only really care about links which actually exist. The SEOGadget post tried to clean the results using a basic desktop crawler, to see which results returned a “200” (OK) HTTP Status Code. Here’s what we get back after checking for live linking pages:

Ouch! So MajesticSEO’s “Fresh” index has the distinct smell of decay, whilst Mozscape and Ayima V2 show the freshest data (by percentage). Ahrefs has a sizeable decay like MajesticSEO, but still shows the most live linking pages overall. Now, the problem with stopping at this level is that it’s much more likely for a link to disappear from a page than for the page itself to disappear. Think about short-term event sponsors, 404 pages that return a 200, blog posts falling off the homepage, spam comments being moderated, etc. So our “Tenacious Tim” got his crawler out to check which links actually exist on the live pages:

Less decay this time, but at least we’re now dealing with accurate data. We can also see that Ayima V2 has a live link accuracy of 82.37%, Mozscape comes in at 79.61%, Ahrefs at 72.88%, and MajesticSEO at just 53.73%. From Ayima’s post-crawl analysis, our techies concluded that MajesticSEO’s crawler was counting URLs (references) and not actual HTML links in a page. So simply mentioning a URL somewhere on a web page was counted as an actual link. Their results also included URL references in JavaScript files, which won’t offer any SEO value. That doesn’t mean that MajesticSEO is completely useless, though; I’d personally use it more for “mention” detection outside of the social sphere. You can then find potential link targets who mention you somewhere but do not properly link to your site.
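The distinction Tim's crawler had to make (a real HTML link versus a bare URL reference) is easy to sketch. Here is a minimal, hypothetical Python check, not Ayima's actual crawler, that classifies a backlink claim using only the standard library:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collects the href of every real <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def link_status(page_html, target):
    """Classify a backlink claim: 'link' if an <a> tag points at the
    target, 'mention' if the URL only appears as plain text,
    'missing' otherwise."""
    auditor = LinkAuditor()
    auditor.feed(page_html)
    if any(target in href for href in auditor.hrefs):
        return "link"
    if target in page_html:
        return "mention"
    return "missing"

# A page that links, a page that merely mentions, and one that does neither:
linked  = '<p>Read <a href="http://searchengineland.com/">SEL</a></p>'
mention = '<p>Check out searchengineland.com for news</p>'
neither = '<p>No reference here</p>'

print(link_status(linked, "searchengineland.com"))   # link
print(link_status(mention, "searchengineland.com"))  # mention
print(link_status(neither, "searchengineland.com"))  # missing
```

A crawler that stops at the "mention" test, as MajesticSEO's apparently did, will inflate its live-link counts.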

Ahrefs wins the live links contest, finding 84,496 more live links than MajesticSEO and 513,733 more live links than SEOmoz’s Mozscape! I still wouldn’t use Ahrefs for comparing competitors or estimating the link authority needed to compete in a sector, though. Not all links are created equal, with Ahrefs showing both the rank-improving links and the crappy spam. I would definitely use Ahrefs as my main data source for “link cleanup” tasks, giving me a good balance of accuracy and crawl depth. Mozscape and Ayima V2 filter out the bad pages and unnecessarily deep sites by design, in order to improve their data accuracy and show the links that count. But when you need to know where the bad PageRank zero/null links are, Ahrefs wins the game.

So we’ve covered the best data for “mentions” and the best data for “link cleanup”; now how about the best for competitor comparison and market analysis? The chart below shows an even more granular filter: removing dead links, filtering by unique Class C IP blocks, and removing anything below PageRank 1. By using Google’s PageRank data, we can filter out links from pages that hold no value or that have been penalized in the past. Whilst some link data providers do offer their own alternatives to PageRank scores (most likely based on the original Google patent), these cannot tell whether Google has hit a site for selling links or for other naughty tactics.
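As a rough sketch of that filtering step (a simplified stand-in, not the actual pipeline), here is how backlinks might be deduplicated by Class C block and pruned by PageRank in Python:

```python
def class_c(ip):
    """First three octets of an IPv4 address: '192.0.2.10' -> '192.0.2'."""
    return ".".join(ip.split(".")[:3])

def filter_links(links, min_pagerank=1):
    """Keep the first link seen per Class C block, dropping anything
    below the PageRank threshold. `links` is a list of
    (source_ip, pagerank) tuples -- a toy stand-in for a real
    backlink record."""
    seen = set()
    kept = []
    for ip, pr in links:
        block = class_c(ip)
        if pr >= min_pagerank and block not in seen:
            seen.add(block)
            kept.append((ip, pr))
    return kept

links = [
    ("192.0.2.10", 4),    # kept
    ("192.0.2.55", 3),    # same Class C block -> dropped
    ("198.51.100.7", 0),  # PageRank 0 -> dropped
    ("203.0.113.9", 2),   # kept
]
print(filter_links(links))  # [('192.0.2.10', 4), ('203.0.113.9', 2)]
```

Keeping one link per Class C block is a crude proxy for "one link per hosting network," which is why it strips out link networks hosted on adjacent IPs.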

Whilst Ahrefs and MajesticSEO hit the top spots, the amount of processing power needed to clean their data to the point of being useful makes them untenable for most people. I would therefore personally only use Ayima V2 or Mozscape for comparing websites and analyzing market potential. Ayima V2 isn’t available to the public quite yet, so let’s give this win to Mozscape.

So in summary

  • Ahrefs – Use for link cleanup
  • MajesticSEO – Use for mentions monitoring
  • Mozscape – Use for accurate competitor/market analysis

Juicy Data Giveaway

One of the best parts of having your own index, is being able to create cool custom reports. For example, here’s how the big SEO websites compare against each other:

“Index Rank” is a ranking based on who has the most value-passing Unique Class C IP links across our entire index. The league table is quite similar to HitWise’s list of the top traffic websites, but we’re looking at the top link authorities.

Want to do something cool with the data? Here’s an Excel spreadsheet with the Top 10,000 websites in our index, sorted by authority: Top 10,000 Authority Websites.

Rob Kerry is the co-founder of Ayima, a global SEO Consultancy started in 2007 by the former in-house team of an online gaming company. Ayima now employs over 100 people on 3 continents and Rob has recently founded the new Ayima Labs division as Director of R&D.


