SEO Blog

Posts Tagged ‘Google’


Is Google Concerned About Amazon Eating Their Lunch?


Leveling The Playing Field

When monopolies state that they want to “level the playing field” it should be cause for concern.

Groupon is a great example of how this works. After they turned down Google’s buyout offer, Google responded by…

The same deal is slowly progressing in the cell phone market: “we are using compatibility as a club to make them do things we want.”

Leveling Shopping Search

Ahead of the Penguin update Google claimed that they wanted to “level the playing field.” Now that Google shopping has converted into a pay-to-play format & Amazon.com has opted out of participation, Google once again claims that they want to “level the playing field”:

“We are trying to provide a level playing field for retailers,” [Google’s VP of Shopping Sameer Samat] said, adding that there are some companies that have managed to do both tech and retail well. “How’s the rest of the retail world going to hit that bar?”

This quote is particularly disingenuous. For years you could win in search with a niche site by being more focused, having higher quality content & more in-depth reviews. But now even some fairly large sites are getting flushed down the ranking toilet while the biggest sites that syndicate their data displace them (see this graph for an example, as Pricegrabber is the primary source for Yahoo! Shopping).

How Google Drives Businesses to Amazon, eBay & Other Platforms

Google has spent much of the past couple years scrubbing smaller ecommerce sites off the web via the Panda & Penguin updates. Now if small online merchants want an opportunity to engage in Google’s search ecosystem they have a couple options:

  • Ignore it: flat out ignore search until they build a huge brand (it’s worth noting that branding is a higher level function & deep brand investment is too cost intensive for many small niche businesses)
  • Join The Circus: jump through an endless series of hoops, minimizing their product pages & re-configuring their shopping cart
  • PPC: operate at or slightly above the level of a non-functional thin phishing website & pay Google by the click via their new paid inclusion program
  • Ride on a 3rd Party Platform: sell on one of the larger platforms that Google is biasing their algorithms toward & hope that the platform doesn’t cut you out of the loop.

Ignoring search isn’t a lasting option; many of the PPC costs won’t back out for smaller businesses that lack a broad catalog of repeat-sale opportunities to lift lifetime customer value; and SEO is getting prohibitively expensive & uncertain. Of these options, a good number of small online merchants are now choosing the fourth: riding on a 3rd party platform.

Operating an ecommerce store is hard. You have to deal with…

  • sourcing & managing inventory
  • managing employees
  • technical / software issues
  • content creation
  • marketing
  • credit card fraud
  • customer service
  • shipping

Some services help to minimize the pain in many of these areas, but just as people showroom offline, many also do it online. And one of the biggest incremental costs added to ecommerce over the past couple of years has been SEO.

Google’s Barrier to Entry Destroys the Diversity of Online Businesses

How are the smaller merchants to compete with larger ones? Well, for starters, there are some obvious points of influence in the market that Google could address…

  • time spent worrying about Penguin or Panda is time that is not spent on differentiating your offering or building new products & services
  • time spent modifying the source code of your shopping cart to minimize pagecount & consolidate products (and various other “learn PHP on the side” work) is not spent on creating more in-depth editorial
  • time switching carts to one that has the newly needed features (for GoogleBot and ONLY GoogleBot) & aligning your redirects is not spent on outreach and media relations
  • time spent disavowing links that a competitor built into your site is not spent on building new partnerships & other distribution channels outside of search

Ecosystem instability taxes small businesses more than larger ones as they…

The presumption that size = quality is false, a fact Google only recognizes when it hits their own bottom line.

Anybody Could Have Seen This Coming

About a half-year ago we had a blog post about ‘Branding & The Cycle’ which stated:

algorithmically brand emphasis will peak in the next year or two as Google comes to appreciate that they have excessively consolidated some markets and made it too hard for themselves to break into those markets. (Recall how Google came up with their QDF algorithm only *after* Google Finance wasn’t able to rank). At that point in time Google will push their own verticals more aggressively & launch some aggressive public relations campaigns about helping small businesses succeed online.

Since that point in time Amazon has made a number of strong moves to combat Google.

All of that is on top of creating the Kindle Fire, gaining content streaming deals & their existing strong positions in books and e-commerce.

It is unsurprising to see Google mentioning the need to “level the playing field.” They realize that Amazon benefits from many of the same network effects that Google does & now that Amazon is leveraging their position atop e-commerce to get into the online ads game, Google feels the need to mix things up.

If Google was worried about book searches happening on Amazon, how much more worried might they be about a distributed ad network built on Amazon’s data?

Said IgnitionOne CEO Will Margiloff: “I’ve always believed that the best data is conversion data. Who has more conversion data in e-commerce than Amazon?”

“The truth is that they have a singular amount of data that nobody else can touch,” said Jonathan Adams, iCrossing’s U.S. media lead. “Search behavior is not the same as conversion data. These guys have been watching you buy things for … years.”

Amazon also has an opportunity to shift up the funnel, to go after demand-generation ad budgets (i.e. branding dollars) by using its audience data to package targeting segments. It’s easy to imagine these segments as hybrids of Google’s intent-based audience pools and Facebook’s interest-based ones.

Google is in a sticky spot with product search. As they aim to increase monetization by displacing the organic result set, they also lose what differentiates them from other online shopping options. If they just list big box retailers, then users will learn to pick their favorite and cut Google out of the loop. Many shoppers had been trained to start at Amazon.com even before Google began polluting their results with paid inclusion:

Research firm Forrester reported that 30 percent of U.S. online shoppers in the third quarter began researching their purchase on Amazon.com, compared with 13 percent who started on a search engine such as Google – a reversal from two years earlier when search engines were more popular starting points.

Who will Google partner with in their attempt to disrupt Amazon? Smaller businesses, larger corporations, or a mix of both? Can they succeed? Thoughts?


SEO Book.com

An Updated Guide to Google Webmaster Tools


Posted by beammeup

With the recent Google Webmaster Tools security bug, I thought a deep dive into what GWT has to offer SEOs might be prudent since many SEOs may have logged in recently.

Google Webmaster Tools was once Google Webmaster Wasteland. But the past year has been a fruitful one as Webmaster Tools has rolled out improvements faster than Facebook does new privacy statements. Google Webmaster Tools (GWT) is now full of insightful data and metrics that you cannot get anywhere else. Some GWT data is useful, some is not. Let's dive in and take a look at each tool in GWT.

Guide to Google Webmaster Tools Index

Webmaster Tools Sections:

  • Configuration
  • Health
  • Traffic
  • Optimization
  • Labs

My Favorite Tools:

  #1. Download Your Latest Links
  #2. View Your Website Crawl Stats
  #3. Submit To Index
  #4. Webmaster Tools Data in Google Analytics
  #5. Rich Snippets/Structured Data Test Tool

Webmaster Tools Home

When you first log in, you'll see a list of all websites in your Google Webmaster Tools account, as well as a few links to view all messages from Google, 'Preferences', 'Author Stats' (Labs), and a few miscellaneous links under 'Other Resources'.

Google Webmaster Tools Home

All Messages

Google used to rarely communicate with webmasters through messages. This year, given the number of "love notes" many SEOs have received, some probably wish Google communicated a little less. You might see a message here if:

  • Google thinks your site may have been hacked
  • Google detected unnatural links pointing to your site
  • Google thinks links pointing to your site are using techniques outside Google’s Webmaster Guidelines

You can set the message email threshold to 'only important' or 'all messages' under the "Preferences" tab.

See it: View All Your Messages

Labs – Author Stats

Author stats in Google Webmaster Tools

Since authorship isn't tied to a single domain, Google shows authorship stats for all sites you write for as well as individual stats. You'll need a valid author profile (go Google+!) to see stats here. The stats are interesting, and good for verifying which URLs are showing your ugly mug in the SERPs.

See it: View your Author Stats

Other Resources – Rich Snippets/Structured Data

If you've never used the rich snippets testing tool, now known as the Structured Data Testing Tool, bookmark it now. It's a one-stop shop to test URLs to see if your author profile is linked correctly.

You can also use the tool to check whether you've set up or verified your:

  • Author Page
  • Name
  • Google+ Page as a Publisher
  • Any structured data detected (reviews, products, song titles, etc) in the form of microdata, microformats, or RDFa

See it: Test Your URLs for Structured Data
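For reference, here is a rough sketch of the kind of markup the tool validates: a rel="author" link pointing at a Google+ profile plus a scrap of schema.org review microdata. All names, profile IDs, and URLs below are placeholders, not values from this post.

  <!-- Hypothetical article page: an authorship link plus review microdata
       that the Structured Data Testing Tool can pick up. -->
  <div itemscope itemtype="http://schema.org/Review">
    <h1 itemprop="name">Acme Widget Review</h1>
    <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
      Rating: <span itemprop="ratingValue">4</span> out of
      <span itemprop="bestRating">5</span>
    </span>
    <p>By <a rel="author" href="https://plus.google.com/112345678901234567890">Jane Doe</a></p>
  </div>
  <!-- Remember: the Google+ profile also needs the site listed under
       "Contributor to" for authorship to verify. -->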

Specific Site Dashboard in Google Webmaster Tools

Once you select a site after logging in, you see the real meat of the tool. The site specific dashboard has a nice overview showing:

  • Crawl Errors
  • URL Errors
  • Site Errors
  • Health status of DNS, Server Connectivity & Robots.txt
  • Overview of # of Queries (plus clicks and impressions)
  • Sitemaps (including submitted URLs and indexed URLs)

GWT Site Dashboard

There are five major sections once you've selected a site: 'Configuration', 'Health', 'Traffic', 'Optimization', and 'Labs'. I find that the most insightful data is in the 'Health' and 'Traffic' sections, and in what you can get inside Google Analytics.

The 'Configuration' Section

Settings

Google Webmaster Tools Settings

Here you can target a specific country for your website, choose a preferred domain (www or non-www), and limit the crawl rate of Googlebot if you so choose.

Sitelinks

Google Sitelinks

Google automatically chooses Sitelinks to display below your main URL on certain queries, usually brand-related ones. If you have certain URLs you wouldn't want showing as Sitelinks, you can "demote" them and Google won't show those demoted URLs.

URL Parameters

If you're having problems with duplicate content on your site because of variables/parameters in your URLs you can restrict Google from crawling them with this tool. Unless you're sure about what you're restricting, don't play with the settings here!
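To illustrate the kind of duplication this tool is meant to handle (the URLs below are hypothetical), a sort or session parameter can spawn several addresses for one page; a rel="canonical" tag is a complementary, code-level way to point Google at the preferred version:

  <!-- Three hypothetical URLs serving the same product list:
         http://www.example.com/widgets
         http://www.example.com/widgets?sort=price
         http://www.example.com/widgets?sort=price&sessionid=123
       A canonical tag in the <head> of each variation nominates the
       preferred URL (independent of the URL Parameters setting): -->
  <link rel="canonical" href="http://www.example.com/widgets" />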

Change of Address

If you are switching your site to a whole new domain, do a 301 redirect, then make sure Google knows about it here.
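As a minimal sketch (assuming an Apache server and placeholder domains), the redirect side of that move can be handled in the old domain's .htaccess before you file the change of address:

  # .htaccess on the old domain: permanently (301) redirect every URL
  # to the same path on the new domain. Domains are placeholders.
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
  RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]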

Users

Ever taken like 20 minutes to add a new user to your Google Analytics account? No? OK, maybe that was just me. Luckily, adding a user to GWT is much easier. There are two main user types: 'Full user' and 'Restricted user'. Restricted users are good for clients if you want to give them mostly view-only access, with little ability to change settings or submit things (you probably don't want clients filing random reconsideration requests!).

adding users in GWT

Associates

This setting is a way for members of YouTube's Partner Program (probably not you) to link their YouTube Channel with Webmaster Tools. My guess is this section will get more settings in the future, but for now, it's very confusing. More details on the Google Webmaster Central blog here.

The 'Health' Section

Crawl Errors

The Crawl Errors report shows you issues Googlebot had crawling your site. This includes response codes (404s, 301s) as well as a graph of the errors over time. This is a fantastic resource for spotting broken links, as the broken URL shows up as a 404 error. You can see when Google first detected the error codes and download the table of errors into a spreadsheet.

google webmaster tools crawl errors

Crawl Stats

Pages crawled per day is a good SEO metric to track over time. You can get some insight from the chart, but this is a metric to check in on and record every week. Ideally you want that number continuing to climb, especially if you are adding new content.

google webmaster tools crawl stats

Blocked URLs, Fetch as Googlebot & Submit To Index

Fetch as Googlebot will return exactly what Google's spider "sees" on the URL you submit. This is handy for spotting hacked sites as well as seeing your site the way Google does. It's a good place to start an SEO audit.

The really neat feature that's new this year is "Submit to Index". Ever made a title tag change and wished Google would update its index faster to get those changes live? 'Submit to Index' does just that. 50 times a month you can submit a page to update in near real-time in Google's index. Very handy for testing on-page changes.

Here's Matt Cutts on how to use the 'Submit to Index' tool:

Index Status

Make sure to hit the 'Advanced' button here so you can see all the interesting index stats Google shows about your site. Keep an eye on the 'Not Selected' number: if it is rising, that could indicate that Google is not viewing your content favorably or that you have a duplicate content issue.

google webmaster tools index status

Malware

If Google has detected any malware on your site you will see more information here. Google often sends messages now if Malware is detected as well.

The 'Traffic' Section

Search Queries

These queries are ones where your site showed up in a search result, not just ones where someone clicked through to your site. So you may find some keyword opportunities where you are showing up but not getting clicks. I much prefer the interface in Google Analytics for this query data, and you may find a lot more queries showing up there than here.

Keep an eye on the CTR % for queries. If you have a known #1 ranking (for your brand terms, for example) but an abnormally low position-1 CTR, that's a sign that someone might be bidding on your brand terms (which may or may not be good). If you have a high position but low CTR, it usually indicates that your meta descriptions and title tags may not be enticing enough. Can you add a verified author to the page? Or other structured data? That could help CTR.

google webmaster tools search queries

Links To Your Site

This is my favorite addition to GWT this year. The link data here keeps getting updated faster and faster. When this was first launched earlier this year the delay on finding links was around three weeks. I've seen the delay down to as little as one week now.

There are two ways to download lists of links, but the "Download Latest Links" is the more useful of the two.

"Download More Sample Links" just gives a list of the same links as the latest links but in alphabetical order instead of most recent. The main report lists the domains linking to your site sorted by the number of links. Unfortunately drilling down into the domain level doesn't give really any useful insights other than the pages that are linked too (but you can't see where they are linked from on the domain). You'll find domains listed here but not in the "Latest Links" report. Bummer.

google webmaster tools links to site

Internal Links

Pretty good report for diagnosing internal link issues. This tool is nothing fancy but URLs are sorted by most internal links. Use this to diagnose pages on your site that should be getting more internal link juice.

The 'Optimization' Section

Sitemaps

See a list of all the different types of sitemaps Google has found or that you have added, along with some stats about each one. You can also test a sitemap before submitting it, and Google will scan it for errors. Webmaster Tools shows stats here on Web sitemaps, as well as Video, News, and Image sitemaps.

google webmaster tools sitemaps
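For reference, a minimal web sitemap looks something like the following (URLs and dates are placeholders); this is the sort of file you can test here before submitting it:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2012-12-01</lastmod>
      <changefreq>weekly</changefreq>
    </url>
    <url>
      <loc>http://www.example.com/blog/sample-post/</loc>
      <lastmod>2012-11-27</lastmod>
    </url>
  </urlset>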

Remove URLs

You can submit URLs (only for sites you control of course) that you wish removed from Google. Make sure and follow the removal requirements process.

HTML Improvements

Think of this as a basic On-Page SEO audit tool. Google will show you lists of URLs on your site that don't have unique Title Tags, or are missing Meta Descriptions. This is a handy tool for quick On-Page SEO issues when you first take over a new website. Click on any of the issues found to return a list of the URLs that need improvement.

google webmaster tools html improvements
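As a quick illustration (the values are made up), the fix for most of what this report flags is simply giving each page its own head elements:

  <!-- Each page gets a unique, descriptive title and meta description. -->
  <head>
    <title>Blue Widgets - Prices &amp; Reviews | Example Store</title>
    <meta name="description"
          content="Compare prices and read customer reviews of blue widgets. Free shipping on orders over $25.">
  </head>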

Content Keywords

See a list of single keywords, not key phrases, which Google thinks your site is about. As long as you don't see spam stuff here, you're good.

Structured Data

If you have some structured data on your site, such as a linked Google+ author or product review data, you can see stats about that data including the type of data found and the schema. This is useful to mass verify that all the pages you think are marked up correctly actually are.

google webmaster tools structured data tool

The 'Labs' Section

Custom Search

Ever wanted to build your own search engine? You can with Google Custom Search. If you have a collection of sites that you're always searching through using Google, you might consider using Google Custom search to build your own Google that just returns results from those sites. You can see how the custom search engine would work on just your own site using the preview tool here in Webmaster Tools.

Instant Previews

Input any URL on your site (or just leave blank and click 'Compare' to see the homepage) to see what the preview of the site might look like in a Google desktop search results set, or on a mobile SERP.

google webmaster tools instant preview

Site Performance

This tool got dropped by Google's spring cleaning in April 2012. I like using webpagetest.org for testing site performance.

Webmaster Tools Data In Google Analytics

Connecting your Google Analytics account with your verified site profile in Google Webmaster tools brings some GWT data directly into your Google Analytics account. No need to login to two places.

To connect a verified GWT site to the correct analytics site, click the "manage site" dropdown:

google webmaster tools connection to Google Analytics

Once connected, GWT data shows up in the Standard Reporting section of Google Analytics under "Traffic Sources" -> "Search Engine Optimization".

Not all GWT data is available in GA. You'll only get three data sets in Google Analytics:

  • Queries
  • Landing Pages
  • Geographical Summary

Let's look at each of these and see what's worth looking at.

Queries

Queries are interesting because you can see some of the keywords that might be hidden under (not provided). This doesn't help with attribution of course, but at least we can still use that data for keyword research. Darn you (not provided).

What's really interesting is how many more queries show up in the query report in Google Analytics (which is supposed to be GWT data) than when you get the query data directly in Google Webmaster Tools. For example, for the timeframe Oct 28th-Nov 27th we had 317 queries reported in Google Analytics:

analytics query data from webmaster tools

but only 93 in the Google Webmaster Tools 'Top queries' report:

google webmaster tools top queries

I'm not sure why there is such a big discrepancy between GWT queries and the queries reported in Analytics from GWT. I definitely see more Google Images-type queries in the GA report and fewer in the 'Top Queries' report in GWT. Interesting discrepancy. Anyone else notice a big difference in query data?

Nonetheless the Query data can be interesting and it's nice to have right in GA. I hope that Google continues to provide more GWT data directly into Google Analytics like this.

Landing Pages

You're better off getting your actual top landing pages list from Analytics, but you can see what GWT sees as your top pages sorted by impressions. The interesting nugget of info here is the CTR. That's not data you see in Analytics, and it could be insightful. I like comparing the CTR to the site average:

landing pages in google analytics

Geographical Summary

This section, again, is really only useful for the CTR data. Looking at specific countries, you can see where it might be worth running more Facebook ads or doing some international SEO work.

What do you use Google Webmaster Tools For?

OK, I've ranted enough about what I like in GWT. What about you?

What data do you find useful in Google Webmaster tools?



SEOmoz Daily SEO Blog

The Cassandra Memorandum: Google in 2013


Posted by gfiorelli1

Apollo fell in love with a priestess and offered her the terrible gift of prophecy. She accepted the gift, but when Apollo asked her to lie with him, the daughter of Priam refused. The god, angry, cursed her: the young priestess would keep her gift of prophecy, but no one would ever believe her.

Her name was Cassandra.

Cassandra, by Anthony Frederick Augustus Sandys

Every day this month, I've seen Twitter posts with every kind of prediction about how web marketing disciplines will look in 2013.

I am not exempt from the desire to predict the future. The urge is something deeply human and, in an industry as uncertain as ours, sometimes it is a psychological necessity. However, some of those predictions are so improbable that this obscure prediction (written to be blatantly false by one of my Spanish friends) seems more plausible:

"Google launches Google Balrog. The name of this new Google algorithm is inspired by the name of the mythical creature imagined by Tolkien, because it will operate the same way.

It will be an algorithm that, wrapped in fire, will crawl the Web, penalizing and incinerating sites which do not include the anchor text "click here" at least seven times and do not include a picture of a kitten asleep in a basket.

If your site does not meet these minimums, the Balrog will go after you." (The translation is mine from Ricardo's original post in Spanish.)

Every speculation about how something may evolve in the future should be based on the knowledge of the past, and, in the case of Google, we should not make the mistake of excluding elements like its acquisitions and technological evolution when predicting its future.

For example, Panda should be seen as a needed action that Google took in order to solve a problem caused by the technological advancement of Caffeine. In fact, with Caffeine (June 2010), Google was able to find new pages (or new information about existing pages) and could add them straight to the index.

As a negative side effect, gazillions of poor-quality pages flooded the SERPs, objectively deteriorating them. I'm sure the Search Quality team was already working on finding a solution to the poor results in the SERPs, but this particular Caffeine side effect accelerated the creation of the Panda algorithm.

Opening the prophecies book of Cassandra

If you visit the About Us page of Google, you will read this: Google's mission is to organize the world's information and make it universally accessible and useful.

That is the why of Google, and three words matter the most:

  1. Organize
  2. Accessible
  3. Useful

The how is represented by its algorithms; the what is composed by all the products Google has developed along the years (Search, Local, Mobile, Video, Voice, etc.).

The Golden Circle of Google

 

Organized

For many years, I've considered Google as a cataloguer of objects, which offers information on a variety of topics:

  • Written content
    • "Generic" content
    • Blog posts
    • News
    • Questions/Answers
    • Books
    • Patents
    • Academic articles
  • Photos and images
  • Videos
  • Apps

You've probably noticed that these are the vertical searches Google offers and that compose the Universal Search right now (I excluded Products because they are a paid option, and I consider Google+ status as “posts”).

Almost all these “objects” have their own algorithms which are frequently updated, similar to the YouTube, Google News, and Google Images algorithm updates. And all of them have their flaws (for instance, the real value to be assigned to a link).

Until recent years, Universal Search seemed to be similar to a building constructed with Legos, but there were three important changes in 2011 that changed the landscape. These changes began developing in 2012 and possibly will be consolidated in 2013, which could really unify all the vertical searches. These three changes are:

  1. Schema.org
  2. Authorship
  3. Google+

Schema.org

We know that Google is using semantic closeness in its categorization of crawled information, and we know how the concept of an Entity is strongly related to semantics.

However, this aspect of the crawling function has assumed a larger relevance since the implementation of Schema.org. The general acceptance of HTML5 as the new standard (pushed by Google since its beginning) and tools like Open Graph have helped boost that relevance as well.

The fact that Google offers the opportunity to verify the accuracy of rich snippets implementations (and is continuously updating the tool), has renamed it the Structured Data Testing Tool, and recently added the ability to highlight event structured data directly from Webmaster Tools makes me suspect that semantics is going to have even greater weight in how Google organizes the information it crawls.
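As a small sketch of what that event markup can look like (all names, dates, and URLs below are invented for illustration):

  <!-- schema.org Event microdata of the kind the Structured Data Testing
       Tool and the Webmaster Tools event highlighter recognize. -->
  <div itemscope itemtype="http://schema.org/Event">
    <a itemprop="url" href="http://www.example.com/events/seo-meetup">
      <span itemprop="name">Example SEO Meetup</span>
    </a>
    <meta itemprop="startDate" content="2013-02-15T19:00">Feb 15, 7:00pm
    <span itemprop="location" itemscope itemtype="http://schema.org/Place">
      <span itemprop="name">Example Conference Center</span>
    </span>
  </div>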

Knowledge Graph result for "cbs records inc"

The Knowledge Graph (and open data such as Freebase, since Google's acquisition of Metaweb in 2010), which recently rolled out in many regional Googles, is based on the semantic web and is improving very quickly. My advice is to start thinking seriously about semantics as a channel for possible web marketing and SEO actions.

Authorship

AuthorRank has been one of the hot topics of 2012. People smarter than me wrote about it, and even created a tool around it.

In my 2011 “Wake up SEOs, the New Google is here” post, I presented my hypothesis that AuthorRank would become part of a more complex set of graphs, whose purpose is to organize and present the genuinely best content in the SERPs. We have not yet seen that “prophecy” become fact, but I am stubbornly convinced that this is the direction Google has taken. If not, why can we already use the ”author”/Google profile relation in posts, articles, videos, and books? In the future, I see AuthorRank becoming useful for other objects as well, such as photos, images, and audio.

Today, I want to focus on an aspect of AuthorRank which (incomprehensibly) does not receive much attention: the rel=”publisher” markup. It is rel=”publisher” that connects a site (the publisher) with its authors. Even when those authors leave the publisher to start working with another site, their AuthorRank will continue to influence the “PublisherRank,” which makes it even stronger.

Relation between Publisher and Authors
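For clarity, the relation sketched above boils down to two link elements plus a profile setting; the profile IDs and URLs below are placeholders:

  <!-- On the site's home page: tie the publisher (the site) to its
       Google+ brand page. -->
  <link rel="publisher" href="https://plus.google.com/100000000000000000001" />

  <!-- On each article: tie the byline to the writer's Google+ profile. -->
  Written by <a rel="author" href="https://plus.google.com/100000000000000000002">Ann Author</a>

  <!-- The author's Google+ profile must list the site under
       "Contributor to" for the verification loop to close. -->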

Google+

During the last SMX Social Media Marketing Expo, Vic Gundotra told Danny Sullivan:

"I think people are just now starting to understand that Google+ is Google. At some point, we’re gonna have a billion users. And eventually, you’ll just say: 'Google has a billion users.'”

I am not going to discuss the value of Google+ as a social media channel, but we must admit that it is the irresistible evolution of Google for a number of reasons:

  • Social is the nature of a profiler tool.
  • The fact that rel=”author” and rel=”publisher” are strictly related to Google profiles makes them part of the future of Google Search (and also Paid Search).
  • It is the easiest way for Google to obtain real social signals, not just through how users act on Google+, but also through users connecting many other social accounts to their G+ profile.

Google Plus connected accounts

Accessible

"You don’t need to be at your desk to need an answer."

You can find that phrase in the Ten Things We Know to Be True page of Google.com.

Google bought Android Inc. in 2005 and entered the mobile phone industry in 2008. Why the sudden surge into mobile? Aside from Android being a gold mine, Google's goal is making information universally accessible. Because more and more people are using mobile for search (especially for local), it was imperative for Google to be a player in the space:

Mobile vs. Desktop local search

Search on mobile is almost synonymous with Local Search, which can also (partly) explain why Google Now has been developed, along with the peculiar design of the mobile version of Google Search.

Google Mobile Search with Local Search icons

Therefore, it's time to stop thinking of mobile as an option and realize it's a priority.

For that same reason (strongly connected to the concept of accessibility), Google started giving instructions and suggestions about how to build mobile-optimized sites, with special predilection for the responsive design technology.
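A minimal responsive sketch, just to make the recommendation concrete (the breakpoint and class name are arbitrary):

  <!-- One URL and one set of HTML for all devices; the layout adapts. -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .sidebar { float: right; width: 30%; }
    @media (max-width: 640px) {
      .sidebar { float: none; width: 100%; } /* stack columns on small screens */
    }
  </style>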

Mobile accessibility also means Google Voice Search, which, <irony> what a surprise </irony>, draws on the Knowledge Graph, Schema, and Local Search.

In addition, we can't forget Project Glass. It is still a Google X project, but has been given to developers to start designing software/apps in preparation for its commercial release predicted for 2014.

Accessibility means giving information to users quickly, which explains why site speed is so important to Google, so much so that it released mod_pagespeed this last October and updated it just a few days ago.
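On an Apache server, enabling it is a short configuration exercise; the snippet below is only illustrative (the module path and filter list will vary by install):

  # pagespeed.conf (illustrative): load the module and turn on a few filters.
  LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
  ModPagespeed on
  ModPagespeedEnableFilters collapse_whitespace,combine_css,extend_cache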

Lastly, WPO (Web Performance Optimization) is not exactly an SEO activity, but it affects SEOs, so it must be considered one of the priorities for 2013. The frequently troubled relationship between SEOs and developers/web designers will have to find a resolution in 2013. We will need to start being better communicators and reach them where they are.

Useful

At the end of November, Matt Cutts gave a definition of SEO as Search Experience Optimization in his Google Webmaster Help video series:

Nothing really new, but yet another confirmation that SEO should focus on providing users with useful information.

Panda, Penguin, EMD, Layout Update… all of these updates were aimed at prioritizing the most useful sites available, and punishing those that Google considered useless.

Content marketing (the discipline that helps create useful information) has become a big priority for SEOs. However, there are still so many in our industry who don't really understand what content marketing really means and how SEO can be implemented into a content strategy. This is not the best place to discuss this topic, but check out the deck below for further explanation.

How to Build SEO into Content Strategy by Jonathon Colman

2013 will see the definitive adoption of content marketing into SEO, and those sites that do not integrate marketable content into their strategies will see themselves overtaken in the SERPs.

At the same time, we will also see an increase in content marketing spam: guest posts used as article marketing, infographic spam, or simply inconsistent content efforts. Sadly, SEOs tend to ruin a lot of tactics that start out good because of the short-sighted tactical vision we may have. It's possible that some of the Search Quality team's actions will be focused on those facets of spam, because they already have the tools for doing it.

Usefulness to Google does not just mean "content," hence filling every kind of site with zombie infographics just because "they are cool" is not the way to go. Usefulness is paired with personalization, as if Google was saying, "We will offer you the opportunity to find the information that is interesting to you based on your previous searches, your interests, the authority of the sources, and where you are."

For that reason, I consider the Venice update the most underestimated update of 2012. It completely changed the SERPs for almost every kind of query.

Moving forward, I recommend paying close attention to the Gmail Search Field experiment, and to why Google is putting effort into making international and multilingual SEO easier.

Cassandra's appendices: what updates might we see in 2013?

Between 2011 and 2012, we experienced three major updates: Panda, Penguin, and EMD.

The first update's goal was to clean the SERPs of useless content, defined as content that doesn't provide any real value to the users. The second aimed to clean the SERPs of content that ranked thanks to a "false popularity" obtained through low-quality link building actions, rather than ranking by value according to users. The third update's goal was to clean the SERPs of content that ranked solely because of an exaggerated boost obtained from its exact match domain. 

The Penguin and EMD updates were a logical consequence of Panda and even more necessary after it, if you really think about it. Panda resulted in a large number of sites disappearing from the SERPs. Other sites that survived Panda's ranking factors still won in the SERPs, mostly due to over-SEO'd link building strategies. After Penguin, we saw those sites replaced by sites relying only on the strength of their domain names, leading to the EMD update rollout.

Are the SERPs perfect now? Not quite. Domain crowding (and its counterpart, domain shrinking), which had been a minor issue since 2007 and was somewhat justified by the Brand Update, is becoming a problem, especially in countries where the EMD update has not yet rolled out.

MozCast Metrics Domain Diversity

Conclusion 

We know how much can still be done through Rich Snippets spam, the gaming of Local Search, and guest post and infographic directory spam. In 2013, we may see the effects of a massive collection of spam sites (although Google is working on it, thanks to the disavow tool); could this be the "linkapocalypse," or maybe even the real "Balrog" update? These are simply my assumptions, as they are every year when it comes to possible updates. What is sure is that we will see new releases of Panda and Penguin, and the extension of the EMD update to all regional Googles.

This is Google in 2013 for me. I am not a prophet, just someone who likes to look at the past while trying to interpret the future. Am I right? Probably not.

But, just maybe, I am Cassandra.



SEOmoz Daily SEO Blog

3 Reasons Google Won’t Offer Car Insurance Comparisons in the US Anytime Soon


The following is a guest column written by Rory Joyce from CoverHound.

Last week Google Advisor made its long-awaited debut in the car insurance vertical — in the UK. Given Google’s 2011 acquisition of BeatThatQuote.com, a UK comparison site, for 37.7 million pounds ($61.5 million US), it comes as little surprise that the company chose to enter the UK ahead of other markets. While some might suspect Google’s foray into the UK market is merely a trial balloon, and that an entrance into the US market is inevitable, I certainly wouldn’t hold my breath.

Here are three reasons Google will not be offering an insurance comparison product anytime soon in the US market:

1) High Opportunity Cost

Finance and insurance is the number one revenue-generating advertising vertical for Google, totaling $4 billion in 2011. While some of that $4 billion is made up of products like health insurance, life insurance and credit cards, the largest segment within the vertical is undoubtedly car insurance. The top 3 advertisers in the vertical as a whole are US carriers — State Farm, Progressive and Geico — spending a combined sum of $110 million in 2011.

The keyword landscape for the car insurance vertical is relatively dense. A vast majority of searches occur across 10-20 generic terms (i.e. “car insurance,” “auto insurance,” “cheap auto insurance,” “auto insurance quotes,” etc.). This is an important point because it helps explain the relatively high market CPC of car insurance keywords versus other verticals. All of the major advertisers are in the auction for a large majority of searches, resulting in higher prices. The top spot for head-term searches can reach CPCs well over $40. The overall average revenue/click for Google is probably somewhere around $30. Having run similar experiments with carrier click listing ads using SEM traffic, I can confidently assume that the click velocity (clicks per clicker) is around 1.5. So the average revenue per searcher who clicks is probably somewhere around $45 for Google.

Now, let’s speculate on Google’s potential revenues from advertisers in a comparison environment. Carriers’ marketing allowable is approximately $250 per new policy. When structuring pay-for-performance pricing deep in the funnel (or on a sold-policy basis), carriers are unlikely to stray from those fundamentals. In a fluid marketplace higher in the funnel (i.e. AdWords PPC), they very often are managing to a marginal cost per policy that far exceeds even $500 (see $40 CPCs). While it may seem like irrational behavior, there are two reasons they are able to get away with this:

a) They are managing to an overall average cost per policy, meaning all direct response marketing channels benefit from “free,” or unattributable sales. With mega-brands like Geico, this can be a huge factor.

b) There are pressures to meet sales goals at all costs. Google presents the highest intent of any marketing channel available to insurance marketers. If marketers need to move the needle in a hurry, this is where they spend.

Regardless of how Google actually structures the pricing, the conversion point will be much more efficient for the consumer, since they will be armed with rates, and thus there will be less conversion velocity for Google. The net-net here is a much more efficient marketplace, and one where Google can expect average revenue of about $250 per sold policy.

How does this match up against the $45 unit revenue they would significantly cannibalize? The most optimized and competitive carriers can convert as much as 10% of clicks into sales. Since Google would be presenting multiple policies, we can expect that in a fully optimized state they may see 50% higher conversion, and thus convert 15% of clicks into sales. Here is a summary of the math:

With the Advisor product, in an optimized state, Google will make about $37.50 ($250 x .15) per clicker. Each cannibalized lead will thus cost Google $7.50 of unit revenue ($45 – $37.50). Given the dearth of compelling comparison options in insurance (that can afford AdWords), consumers would definitely be intrigued, and so one can assume the penetration/cannibalization would be significant.

Of course there are other impacts to consider: How would this affect competition and average revenue for non-cannibalized clicks? Will responders to Advisor be incremental and therefore have zero opportunity cost?

2) Advisor Has Poor Traction in Other Verticals

Over the past couple of years, Google has rolled out its Advisor product in several verticals including: personal banking, mortgage, and flight search.

We know that at least the mortgage product didn’t work out very well. Rolled out in early 2011, it lasted less than a year before Google apparently shut the service down in January of 2012.

I personally don’t have a good grasp of the mortgage vertical, so I had a chat with a high-ranking executive at a leading mortgage site, an active AdWords advertiser. In talking to him it became clear that there are actually quite a few similarities between mortgage and insurance as they relate to Google, including:

  1. Both industries are highly regulated in the US, at the state level.
  2. Both verticals are competitive and lucrative. CPCs in mortgage can exceed $40.
  3. Like insurance, Google tested Advisor in the UK market first.

Hoping he could serve as my crystal ball for insurance, I asked, “So why did Advisor for Mortgage fail?” His response was, “The chief issue was that the opportunity cost was unsustainably high. Google needed to be as or more efficient than direct marketers who had been doing this for years. They underestimated this learning curve and ultimately couldn’t sustain the lost revenue as a result of click cannibalization.”

Google better be sure it has a good understanding of the US insurance market before entering, or else history will repeat itself, which brings me to my next point…

3) They Don’t Yet Have Expertise

Let’s quickly review some key differences between the UK and US insurance markets:

  1. Approximately 80% of car insurance is purchased through comparison sites in the UK vs under 5% in the US.
  2. There is one very business-friendly pricing regulatory body in the UK versus state-level, sometimes aggressive, regulation in the US.
  3. The UK is an efficient market for consumers; the US is not. This means margins are tighter for UK advertisers, as evidenced by the fact that CPCs in the UK are about a third of what they are in the US.

As you can see, these markets are completely different animals. Despite the seemingly low barriers for entry in the UK, Google still felt compelled to acquire BeatThatQuote to better understand the market. Yet, it still took them a year and a half post acquisition before they launched Advisor.

I spoke with an executive at a top-tier UK insurance comparison site earlier this week about Google’s entry. He mentioned that Google wanted to acquire a UK entity primarily for its general knowledge of the market, technology, and infrastructure (API integrations). He said, “Given [Google’s] objectives, it didn’t make sense for them to acquire a top tier site (ie – gocompare, comparethemarket, moneysupermarket, confused) so they acquired BeatThatQuote, which was unknown to most consumers but had the infrastructure in place for Google to test the market effectively.”

It’s very unlikely BeatThatQuote will be of much use for the US market. Google will need to build its product from the ground up. Beyond accruing the knowledge of a very complex, and nuanced market, they will need to acquire or build out the infrastructure. In the US there are no public rate APIs for insurance carriers; very few insurance comparison sites actually publish instant, accurate, real-time rates. Google will need to understand and navigate its way to the rates (though it’s not impossible). It will take some time to get carriers comfortable and then of course build out the technology. Insurance carriers, like most financial service companies, can be painfully slow.

Conclusion

I do believe Google will do something with insurance at some point in the US. Of the various challenges the company currently faces, I believe the high opportunity cost is the toughest to overcome. However, the market will shift. As true insurance comparison options continue to mature, consumers will be searching exclusively for comparison sites (see travel), and carriers will no longer be able to effectively compete at the scale they are now — driving down the market for CPCs and thus lowering the opportunity cost.

This opportunity cost is much lower, however, for other search engines, where average car insurance CPCs are lower. If I am Microsoft or Yahoo, I am seriously considering using my valuable real estate to promote something worthwhile in insurance. There is currently a big void for consumers when it comes to shopping for insurance. A rival search engine could instantly differentiate itself from Google in one of the biggest verticals. This may be one of their best opportunities to regain some market share.


SEO Book.com

Google Disavow Tool


Google launched a disavow links tool. Webmasters who want to tell Google which links they don’t want counted can now do so by uploading a list of links in Google Webmaster Tools.

If you haven’t received an “unnatural link” alert from Google, you don’t really need to use this tool. And even if you have received notification, Google are quick to point out that you may wish to pursue other avenues, such as approaching the site owner, first.

Webmasters have met with mixed success following this approach, of course. It’s difficult to imagine many webmasters going to that trouble and expense when they can now upload a txt file to Google.
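For those who do use it, the upload is a plain text file: one URL per line, a domain: prefix to disavow an entire domain, and # for comments. The entries below are placeholders:

  # disavow.txt - example entries only
  # A single spammy page:
  http://spam.example.com/directory/listing-page.html
  # Everything from a domain:
  domain:cheap-links.example.org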

Careful, Now

The disavow tool is a loaded gun.

If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.

Could the use of the tool be seen as an admission of guilt? Matt gives examples of “bad” webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.

Some webmasters have been victims of negative SEO. Some webmasters have had scrapers and autogen sites that steal their content, and then link back. There are legitimate reasons to disavow links. Hopefully, Google makes an effort to make such a distinction.

One wonders why Google simply don’t discount the links they already deem to be “bad”? Why the need for the webmaster to jump through hoops? The webmaster is still left to guess which links are “bad”, of course.

Not only is it difficult working out the links that may be a problem, it can be difficult getting a view of the entire link graph. There are various third party tools, including Google’s own Webmaster Central, but they aren’t exhaustive.

Matt mentioned that the link notification emails will provide examples of problem links; however, this list won’t be exhaustive. He also mentioned that you should pay attention to the more recent links, presumably because if you haven’t received notification up until now, then older links weren’t the problem. The issue with that assumption is that links that were once good can become bad over time:

  • That donation where you helped a good cause & were later mortified that “online casino” and “discount cheap viagra” followed your lead for purely altruistic reasons.
  • That clever comment on a well-linked PR7 page that is looking to cure erectile dysfunction 20 different ways in the comments.
  • Links from sources that were considered fine years ago & were later repositioned as spam (article banks anyone?)
  • Links from sites that were fine, but a number of other webmasters disavowed, turning a site that originally passed the sniff test into one that earns a second review revealing a sour stench.

This could all get rather painful if webmasters start taking out links they perceive to be a problem, but aren’t. I imagine a few feet will get blasted off in the process.

Webmasters Asked, Google Gaveth

Webmasters have been demanding such a tool since the unnatural link notifications started appearing. There is no question that removing established links can be as hard as, if not harder than, getting the links in the first place. Generally speaking, the cheaper the link was to get, the higher the cost of removal (relative to the original purchase price). If you are renting text link ads for $50 a month you can get them removed simply by not paying. But if you did a bulk submission to 5,000 high PR SEO-friendly directories…best of luck with that!

It is time consuming. Firstly, there’s the overhead of working out which links to remove, as Google doesn’t specify them. Once a webmaster has made a list of the links she thinks might be a problem, she then needs to go through the tedious task of contacting each site and requesting that a link be taken down.

Even with the best will in the world, this is an overhead for the linking site, too. A legitimate site may wish to verify the identity of the person requesting the delink, as the delink request could come from a malicious competitor. Once identity has been established, the site owner must go to the trouble of making the change on their site.

This is not a big deal if a site owner only receives one request, but what if they receive multiple requests per day? It may not be unreasonable for a site owner to charge for the time taken to make the change, as such a change incurs a time cost. If the webmaster who has incurred a penalty has to remove many links, from multiple sites, then such costs could quickly mount. Taken to the (il)logical extremes, this link removal stuff is a big business. Not only are there a number of link removal services on the market, but one of our members was actually sued for linking to a site (when the person who was suing them paid to place the link!)

What’s In It For Google?

Webmasters now face the prisoner’s dilemma and are doing Google’s job for them.

It’s hard to imagine this data not finding its way to the manual reviewers. If there are multiple instances of webmasters reporting paid links from a certain site, then Google have more than enough justification to take it out. This would be a cunning way around the “how do we know if a link is paid?” problem.

Webmasters will likely incorporate bad link checking into their daily activities. Monitoring inbound links wasn’t something you had to do in the past, as links were good, and those that weren’t didn’t matter, as they didn’t affect ranking anyway. Now, webmasters may feel compelled to avoid an unnatural links warning by meticulously monitoring their inbound links and reporting anything that looks odd. Google haven’t been clear on whether they would take such action as a result – Matt suggests they just reclassify the link & see it as a strong suggestion to treat it like the link has a nofollow attribute – but no doubt there will be clarification as the tool beds in. Google has long used a tiered index structure & enough complaints might lower the tier of a page or site, cause its ability to pass trust to be blocked, or cause the site to be directly penalized.

This is also a way of reaffirming “the law”, as Google sees it. In many instances, it is no fault of the webmaster that rogue sites link up, yet the webmaster will feel compelled to jump through Google’s hoops. Google sets the rules of the game. If you want to play, then you play by their rules, and recognize their authority. Matt Cutts suggested:

we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business.

Left unsaid in the above is that most people don’t have access to aggregate link data while they surf the web, that most modern systems of justice are based on the presumption of innocence rather than guilt, and that most rational people don’t presume a site is somehow shady simply for being linked to.

If the KKK links to Matt’s blog tomorrow that doesn’t imply anything about Matt. And when Google gets featured in an InfoWars article it doesn’t mean that Google desires that link or coverage. Many sketchy sites link to Adobe (for their flash player) or sites like Disney & Google for people who are not old enough to view them or such. Those links do not indicate anything negative about the sites being linked into. However, as stated above, search is Google’s monopoly to do with as they please.

On the positive side, if Google really do want sites to conform to certain patterns, and will reward them for doing so by letting them out of jail, then this is yet another way to clean up the SERPs. They get the webmaster on side and that webmaster doing link classification work for them for free.

Who, Not What

For a decade search was driven largely by meritocracy. What you did was far more important than who you were. It was much less corrupt than the physical world. But as Google chases brand ad dollars, that view of the search landscape is no longer relevant.

Large companies can likely safely ignore much of the fear-first approach to search regulation. And when things blow up they can shift the blame onto a rogue anonymous contractor of sorts. Smaller webmasters, meanwhile, walk on eggshells.

When the government wanted to regulate copyright issues Google claimed it would be too expensive and would kill innovation at small start-ups. Google then drafted their own copyright policy, from which they themselves are exempt. And now small businesses not only need to bear that cost but also need to police their link profiles, even as competitors can use Fiverr, ScrapeBox, splog link networks & various other sources to drip a constant stream of low-cost sludge in their direction.

Now more than ever, status is important.

Gotchas

No doubt you’ve thought of a few. A couple thoughts – not that we advocate them, but realize they will happen:

  • Intentionally build spam links to yourself & then disavow them (in order to make your profile look larger than it is & to ensure that competitor who follows everything you do – but lacks access to your disavow data – walks into a penalty).
  • Find sites that link to competitors and leave loads of comments for the competitor on them, hoping that the competitor blocks the domain as a whole.
  • Find sites that link to competitors & buy links from them into a variety of other websites & then disavow from multiple accounts.
  • Get a competitor some link warnings & watch them push to get some of their own clean “unauthorized” links removed.
  • The webmaster who parts on poor terms burning the bridge behind them, or leaving a backdoor so that they may do so at anytime.

If a malicious webmaster wanted to get a target site in the bad books, they could post obvious comment spam on it, pointing at their own site and other sites. If this activity doesn’t result in an unnatural linking notification, then all good. It’s a test of how Google values that domain. If it does result in an unnatural link notification, the webmaster could then disavow links from that site. Other webmasters will likely do the same. Result: the target site may get taken out.

To avoid this sort of hit, pay close attention to your comment moderation.

Please add your own to the comments! 🙂 Gotchas, that is, not rogue links.

Further opinions @ searchengineland and seoroundtable.


SEO Book.com
