SEO Blog



SEO Industry is Booming and SEO Jobs Abound


SEO Industry is Booming and SEO Jobs Abound was originally published on BruceClay.com, home of expert search engine optimization tips.

Last month, I reported on the nation’s high demand for analysts in the SEO industry. That demand is only continuing to grow; currently, there are nearly 1,000 SEO jobs available in the United States (as seen on career search site Indeed.com). With that many jobs available, thousands of SEO analysts will be submitting applications and preparing for interviews.  In order to help them prepare, we thought, hey, why not give SEO job applicants a cheat sheet sharing SEO analyst interview questions?

And this cheat sheet couldn’t come at a better time, as we’re on the hunt for two experienced SEO analysts. Have you got what it takes to work with and learn from the man who wrote the book on SEO? Apply on LinkedIn.

20 Questions Bruce Clay might ask prospective SEO analysts during a job interview (answers not provided):

  • What makes a website search friendly?
  • How do you define success when it comes to SEO?
  • How do you stay updated on industry news and algorithm changes?
  • What programming languages do you have experience with?
  • Regarding your previous SEO job, what did an average day look like?
  • How do you adapt to the needs of different clients?
  • How often do you communicate with clients?
  • Describe your SEO workflow.
  • What SEO tools do you regularly use?
  • How do you stay organized when working on an SEO project?
  • Who is Matt Cutts?
  • What is your favorite website and why?
  • What is your opinion on proper link building?
  • Have you or your clients ever received a manual link penalty from Google?
  • What’s an ideal page load time for a website?
  • Do you prefer to move a page with a 302 redirect or a 301 redirect?
  • Are you aware of the latest changes to Google’s algorithm and the latest updates to Panda and Penguin?
  • What do you know about content building and content marketing?
  • Do you have a sense of humor?
  • So, explain why I should hire you.

Whether you’re applying to Bruce Clay, Inc. or to one of the hundreds of other jobs in the SEO industry, we wish you luck and encourage you to prepare for interviews with the help of these questions!

If you need a break from optimizing your answers, check out this awesome infographic on the state of the SEO industry from Blue Caribu.

 


Have any tips for going on an interview for a position in the SEO industry? Share them in the comments!

Bruce Clay Blog

Disavowed: Secrets of Google’s Most Mysterious Tool


Posted by Cyrus Shepard

To many webmasters, Google’s Disavow Tool seems a lifesaver. If you’ve suffered a Google penalty or been plagued by shady link building, simply upload a file of backlinks you want to disavow, and BOOM – you’re back in good graces. Traffic city!

Or nothing happens at all.

Few Google products have produced more fear, rumors, and speculation. No one outside Google knows exactly how it works, and fewer still understand how to succeed with it. To better understand it, I used the tool myself to disavow thousands of links and talked with dozens of SEOs who had used it in attempts to recover from Google penalties.

How Dangerous Is Disavow?

When you first log into the Disavow Tool, Google does everything in its power to dissuade you from actually using it with scary messaging.

Do Not Disavow

What’s the worst that could happen?

To find out how much damage I could do, I performed an experiment: Disavowing every link pointing to my website. Over 35,000 of them.

In this case, no reconsideration request was filed. Would the disavow tool work on its own?

Experiment

Disavow 35,000 Links to a Single Website

URL: http://cyrusshepard.com

Process:

  1. Download all links from Google Webmaster Tools
  2. Upload 35,000 properly formatted links to Google's Disavow Tool
  3. Wait 2 Months

Results:


After 2 months, nothing happened. No drop in traffic.

The evidence suggests one of three possibilities:

  1. You must file a reconsideration request after disavowing your links, or…
  2. The disavow has built-in safeguards in order to protect you from disavowing good links, or…
  3. It takes longer than 2 months for Google to process all the links.

We’ve heard conflicting accounts from Googlers about whether the tool works automatically or whether you must file a reconsideration request for it to work. The data implies the latter, although some SEOs say they’ve seen results from using the Disavow without filing a reconsideration request.

Google also states they reserve the right to ignore your disavowed links if they think you made a mistake, much like rel=”canonical”.

Best Advice: Safeguards or not, you might still shoot yourself in the foot. Be careful disavowing links!

Can You Use Disavow for Penguin?

Can you use the Disavow Tool if you haven't received a manual penalty? For example, will it work for Penguin?

The answer: maybe.

Here's a reminder: Google updates like Panda and Penguin are part of Google's overall algorithm. They automatically affect your rankings without direct human intervention. On the other hand, a manual penalty is often applied when you violate Google's guidelines. These can be both applied and lifted manually by Google's Webspam team.

Google representatives, including Matt Cutts, have gone on record to say the Disavow Tool can help if you’ve been hit by Penguin (an algorithmic action), but they also suggest this applies only to links that violate Google’s Quality Guidelines.

Penguin and Google’s Unnatural Link Warnings often go hand in hand. So if you were hit by one, you are often hit by the other. Conversely, certain SEOs have claimed benefits from using the disavow on sites that were not penalized.

Best Advice: If you’ve been hit with a manual penalty, you need to file a reconsideration request if using the Disavow Tool. If you haven't been manually penalized, the benefits of using the tool are inconclusive.

Pro Tips for Reconsideration Requests

1. Remove First, Disavow Last

Google wants you to remove links first. Disavow is a last resort.

100% accuracy isn’t required, but effort counts.

Google’s Webspam team keeps a historical index of your backlink profile, so that when you file a reconsideration request they can see the links you’ve worked to remove.

2. Gather Your Links

You can use any source you want, but Google recommends downloading your Latest Links report directly from Webmaster Tools.

3. Find the Bad Links

You can do this two ways, with either automatic tools or manual analysis. Realistically, you should use both.

Best Manual Analysis Resource:

Best Link Removal Research Tools:

Link Removal Resources

4. Outreach, Outreach, Outreach

Next, you’re going to send emails to get those links removed. Lots of emails.

Resources for Link Removal Outreach:

5. Trust in Google Docs

When you document your efforts, don’t submit random links to the Webspam team; they may not click on them. By sharing all your evidence via Google Docs, you provide a level of protection that helps ensure the Webspam team sees your evidence.

6. When in Doubt, Disavow Entire Domains

Google’s Disavow Tool gives you 2 options when disavowing links: individual URLs or entire domains.

Many webmasters fail at their reconsideration requests the first time because they miss too many links. The fear is that you’ll disavow something valuable, but if you’ve been rejected time and time again, this one change often leads to success.

Here’s a screenshot from Dr. Pete’s post showing both formats.

Disavow Format

Best Advice: If you are rejected after disavowing individual URLs, try disavowing entire domains.

7. Formatting Counts

Google rejects many disavow files because of bad formatting, but webmasters usually never know. Guidelines state the file type should be .txt only and “must be encoded UTF-8 or 7-bit ASCII.”
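Per those guidelines, a valid disavow file is a plain-text list with one URL or one `domain:` entry per line; lines starting with # are comments. A minimal sketch (the domains below are made-up examples, not real sites):

```text
# Contacted the site owner twice; no response
http://spam.example.com/comments/page1.html
http://spam.example.com/comments/page2.html

# Disavow every link from this entire domain
domain:shadyseo.example.com
```

Save it as a .txt file encoded UTF-8 or 7-bit ASCII before uploading.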

8. Bonus: Extra "Removed" Links with Screaming Frog

Google’s link index of your site is rarely up to date; it undoubtedly includes links that no longer exist. To find dead links quickly, download a complete file of your latest links from Google Webmaster Tools and run it through Screaming Frog (use List Mode) or another crawler of your choice.

When the crawl finishes, export any links that return a 404 into a spreadsheet. Be sure to include these dead links as "Removed" when you submit your reconsideration request to Google; otherwise Google may not know about them.
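The same 404-filtering step can be sketched in a short script. The function names and the injectable status checker here are illustrative, not part of Screaming Frog or any tool mentioned above:

```python
# Sketch: filter a Webmaster Tools link export down to URLs that now 404,
# so they can be reported as "Removed" in a reconsideration request.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def http_status(url):
    """Return the HTTP status code for url (e.g. 404 for a missing page)."""
    try:
        return urlopen(url, timeout=10).status
    except HTTPError as e:
        return e.code          # urllib raises on 4xx/5xx; the code is on the exception
    except URLError:
        return None            # DNS failure, refused connection, etc.

def find_dead_links(urls, status_fn=http_status):
    """Keep only the URLs whose status is 404. status_fn is injectable for testing."""
    return [u for u in urls if status_fn(u) == 404]
```

In practice you would read the URLs from the CSV exported from Webmaster Tools and write the resulting 404 list to a spreadsheet.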

Conclusion

The Disavow Tool is useful, but damn tricky.

Someday, perhaps, Google can move away from tools like the Disavow. Today, even good SEOs can’t keep up with what’s considered a good link or a bad one, and Google continually cracks down on what it considers a “bad link.”

For successful marketers, it’s much more fulfilling to build new links than to disavow old ones.

I suppose that's Google's point, no?

Penalty Lifted

Additional Resources:

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


SEOmoz Daily SEO Blog

How to Do Keyword Research: A 6-Point Checklist


How to Do Keyword Research: A 6-Point Checklist was originally published on BruceClay.com, home of expert search engine optimization tips.

Open a landing page on your website. Describe what it offers in three to five words.

If you are able to think of not one, but several dozen different three- to five-word combinations that work equally well to describe your landing page and are not sure which one is the best choice, you’re not alone.

Words have synonyms. “How do I…” and “How to…” are painfully similar. How do you choose whether to call it “Backpacking Europe,” “How to Backpack in Europe,” or “European Backpacking Tips” when all three describe your landing page equally well? How do you know which one will resonate best with your target market and land the SERP position, traffic, and conversions you’re after?

Because there are often dozens of ways to say the exact same thing, optimizers turn to keyword research to help them base their language decisions on consumer and competitive data, rather than blindly guessing which option “feels most right.”

How to Do Keyword Research

The goal of this article is to make keyword research easy and accessible. There are lots of articles that dive deep into using Google tools for keyword research, and advanced keyword research techniques — this is not one of them.

This article goes back to basics to elaborate on six central pillars of keyword research, including:

1) Getting started with a brainstorm list

2) Acknowledging that you need a keyword research tool

3) Refining your list using suggested keyword phrases from an analysis tool

4) Verifying keyword phrase relevance

5) Looking at search volume to determine consumer demand

6) Analyzing the competitive space to make sure you and the searcher think the keywords mean the same things, and to decide if the space is too competitive

Big picture, the idea is to find extremely targeted keyword phrases with the right mix of high search volume and low competition. In the most basic terms, this means that, in a perfect world, your keyword phrase describes your content accurately, a lot of people are searching for that exact phrase, and there aren’t many authoritative competitors also optimizing for it.
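To make that volume-versus-competition trade-off concrete, here is a toy scoring sketch. The phrases, numbers, and formula are illustrative assumptions; the article prescribes no formula:

```python
# Toy heuristic: rank candidate phrases by monthly search volume relative
# to the number of competing pages. All figures below are made up.
candidates = {
    # phrase: (monthly_searches, competing_pages)
    "backpacking europe": (2900, 41000),
    "how to backpack in europe": (29, 380),
    "european backpacking tips": (170, 1200),
}

def opportunity(stats):
    volume, competing = stats
    return volume / (competing + 1)   # +1 guards against division by zero

ranked = sorted(candidates, key=lambda p: opportunity(candidates[p]), reverse=True)
```

The point of the sketch is only that raw volume alone is misleading: a lower-volume phrase can still be the better target once competition is factored in.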

 

Pillar One: The Brainstorm List

Most keyword research starts with a long list of hunches. Optimizers (even if it’s Joe from the mailroom doing your keyword research, once he starts working on SEO he technically becomes an “optimizer”) compile a list of words and phrases that they think their target demographic would use to describe their content, products, and services.

Sometimes these lists are based on persona research, but most of the time these lists are compiled of phrases pulled out of thin air and largely represent the words the company hopes people use to describe their products and content, not the actual words the demographic is using.

Do It

Start your keyword research process with one of these lists. Don’t set any limitations at this point. Think about how your consumers would ask for your products, services, and content in search queries. What kind of stems, like “How to” or “Where can I,” are appropriate? What about local modifiers like “Los Angeles County” or “Ventura, CA”? Or modifiers like “free”? What works best to describe your content?

 

Pillar Two: Selecting and Using a Keyword Research Tool

In your keyword research process you’re going to analyze your brainstorm list of theoretical keyword phrases to determine which have the right mix of demand, attainability, and relevance to earn top SERP results.

Since this post is about the data you pull from keyword research tools, not the tools themselves, I am going to keep this section high-level. But I did want to make it clear that, unless you are a mind reader, you are going to need a keyword research tool to mine keyword data.

Everything covered in this article you can do at a basic level with Google.com and the Google AdWords Keyword Tool. Paid tools like WordTracker and the Bruce Clay, Inc. SEOToolSet® offer more advanced keyword insights like in-depth competitive analysis, and search insights for the Bing and Yahoo! engines, as well as Google.

If you’re hungry for more information about tool possibilities now, check out our live blogging session from SMX Advanced: Advanced Keyword Research Tools.

Do It

While you can enter the words from your brainstorm list into your keyword tool one at a time or all at once, I recommend entering them in small batches of 10-12 related phrases, as this makes parsing through related keyword phrases (called “Keyword Ideas” in the AdWords Keyword Tool) much easier.

When using the AdWords Keyword Tool it’s important that you are searching for “Exact” keyword matches, and not “Broad” keyword matches. With “Exact” selected, the tool will tell you how many people searched for your exact phrase as you entered it, and then in the section below (Keyword Ideas) it will return a list of related search queries that are close to the phrases you entered.

"Exact" match selected in the Google AdWords Keyword Tool

Use these suggested phrases to add new ideas to your keyword brainstorm list, or to parse out not-so-great-in-hindsight ideas.

 

Pillar Three: Using Keyword Suggestions to Refine Your List

Nearly every keyword research tool will return suggested keyword phrases that are similar to your original phrase request. As mentioned, in the AdWords Keyword Tool these are called Keyword Ideas.

Gleaning insights from suggested keywords is truly invaluable as it allows you to understand the exact language your target demographic is using to search for your products. This information can help you build and refine your target keyword list, as well as your product and content roadmaps (read more about why SEOs use keyword phrases, including why they help marketers develop content and product strategies).

A Screen Capture of Google AdWords Keyword Tool Keyword Ideas

Pillar Four: Do the Keywords Accurately Describe the Content?

When you start looking at keyword suggestions it can be easy to fall into a high-volume drunken haze and forget that relevance means directly descriptive of your content or product — not loosely related to the idea of the content or the general needs of the target demographic.

Don’t approach your keyword like the Six Degrees of Kevin Bacon. If you identify a keyword phrase that doesn’t describe the topic on your landing page, but is related to your topic, or of related interest to your target demographic, create a new landing page with new content to work in that keyword phrase. Don’t try to fool humans or Google spiders by using phrases that do not exactly describe your content — use keyword research to inform content strategy!

Think to yourself: When the user searches this query, what are they looking for? What do they want? If they find my site, will their needs be met?

 

Pillar Five: Volume — How Many People Ask For It Like This?

Looking at the Search Volume of a keyword phrase will tell you how many times per month an exact keyword phrase was entered into a Google search. Some tools like the AdWords Keyword Tool will only tell you how many times the term was searched in Google; other tools like the SEOToolSet® will tell you how many times the term was searched in Google, Yahoo! and Bing.

When an exact keyword phrase has a high search volume it tells us two things. One, that there is a high consumer demand for this product or for information on this topic, and two, that right now — this month — this is the exact language that many people are using to try to find more information about the products and services you carry.

Identifying high volume search terms means identifying demand and pinpointing language trends. Optimize your page with the exact words the searcher uses and Google, recognizing your content as an extremely relevant choice, will consider you a contender for the SERP top three.

Do It

Look at the Local Monthly Searches column in the AdWords Keyword Tool. There is not a catch-all magic number that represents the perfect search volume. What constitutes the “right” search volume is going to be different from brand to brand, and objective to objective.

Although really specific long-tail keywords like “how to backpack in Europe” are going to have less search volume than broad keywords like “backpacking,” specific phrases that target exact need are significantly more likely to convert. I’d rather drive 100 qualified leads who spend time on my site and buy products than 10,000 clicks that immediately bounce (leave the site quickly without clicking any other content).

That said, whenever possible try to use the Keyword Ideas section to find words that are the best of both worlds: very specific long-tail keyword phrases with high search volume. For instance, in the example to the right we’d want to use “Backpacking Europe” (a phrase with 2,900 local monthly searches) rather than the very similar phrase “How to Backpack in Europe,” which has only 29.

A Screen Capture of Google AdWords Keyword Tool Local Search Volume

Pillar Six: Competitive Analysis — Does This Mean What I Think It Means?

Pillar six is all about looking at what your competitors are doing and analyzing what the competition for the keyword SERP space looks like.

To get an idea of a keyword’s competitive space do a search for the phrase you’re trying to rank for. Just enter the phrase into Google as if you were the searcher.

What do you see?

Do you see results that offer products and services similar to yours? Do you see highly competitive big brands in the top ten? Do you see ten results that have nothing to do with your content?

If you see results that offer products and services similar to yours…

That’s good! That means you’re in the right space. Now look at who else is ranking for your keyword phrase. Who are your competitors for the top ten, or top three? What are they doing? What language are they using? One key to beating your competitors in the SERPs is doing more things right than them, so take some time to think about what the website in the spot you want is doing well and what they’re neglecting. Do they have the keyword phrase in their Title, Description, and body copy? Sometimes you have to click on all ten links to get a 360-degree idea of where you stand.

To get an even more detailed view of how competitive your keyword phrase is, do a Google search for “allintitle:keyword” where keyword is your keyword phrase. This will tell you how many web pages include this exact phrase in their Title tag, which will give you an idea of how many other web pages are optimizing for that exact phrase.

Do you see highly authoritative big brands in the top ten?

I know your mother always told you never to give up, but sometimes you have to know when to fold ’em, because the competition for a keyword phrase is just too steep. The number of click-throughs you can expect decays sharply with every position below spot number one, so if you have a slim-to-none chance of beating Adobe, Wacom, and Microsoft for spots one, two, and three, I would recommend you spend your time targeting a different keyword phrase that you have a chance to rank highly for.

You will see more traffic and conversions ranking number one for a keyword phrase that has 1,500 monthly searches than sitting in spot 15 for a keyword phrase that has 10,000 monthly searches.

If you don’t see results that offer products and services similar to yours…

Sometimes keyword phrases can mean two totally different things depending on who you ask. If you perform a search for your keyword phrase and see returned results that have nothing to do with your content, or what you thought the keyword phrase was asking for, then you have uncovered a keyword phrase with two meanings.

These situations are sometimes hard to imagine, so here is an example. Say you’re optimizing for a page that teaches people how to use Ableton Live to create custom drum beats. When you search for your keyword phrase “how to create custom beats” (a long-tail keyword phrase that is, in theory, extremely relevant to your page content) you expect to see tutorials that show people how to make custom drum beats with music editing software. What you actually see is ten links that show people how to make custom Dr. Dre Beats headphones.

A Screen Capture of a Google Search for the phrase "how to make custom beats"

In the end the person who is looking for your content — how to create custom drum beats — is going to see these same results, be just as disappointed as you, and refine their search. Which means no traffic for you and wasted optimization effort.

Even if your web page does make it to number one, if it’s not in the right competitive space your click-through traffic will be minimal. (For example, did you even notice the Full Sail University link for music creation in the SERP example to the right? Most people will scan the page, see only headphones, and refine their search.)

Why You Should Use Science To Choose Better Keywords

In order to drive traffic your web pages need to compete for page one rank in the SERPs. And in order to compete for page-one rank, every one of your pages needs to have keyword-rich Meta Titles, Descriptions, and body content. If you’re going to invest all the time to use target phrases in your optimization efforts, why not spend just a little more time to make sure you’re using the right language?

Optimizing based on a hunch is not optimizing. Data is your friend.

Bruce Clay Blog

No Effort Longtail SEO Revenues, from FindTheBest


In our infographic about the sausage factory that is online journalism, we had a throwaway line about how companies were partnering with FindTheBest to auto-generate subdomains full of recycled content. Apparently, a person named Brandon who claims to work for FindTheBest didn’t think our information was accurate:

Hi Aaron,
My name is Brandon. I have been with FindTheBest since 2010 (right after our launch), and I am really bummed you posted this Infographic without reaching out to our team. We don’t scrape data. We have a 40 person+ product team that works very closely with manufacturers, companies, and professionals to create useful information in a free and fair playing field. We some times use whole government databases, but it takes hundreds-of-thousands of hours to produce this content. We have a product manager that owns up to all the content in their vertical and takes the creation and maintenance very seriously. If you have any questions for them about how a piece of content was created, you should go to our team page and shoot them a email. Users can edit almost any listing, and we spend a ton of time approving or rejecting those edits. We do work with large publishers (something I am really proud of), but we certainly do not publish the same exact content. We allow the publishers to customize and edit the data presentation (look, style, feel) but since the majority of the content we produce is the factual data, it probably does look a little similar. Should we change the data? Should we not share our awesome content with as many users as possible? Not sure I can trust the rest of your “facts”, but great graphics!

I thought it was only fair that we aired his view on the main blog.

…but then that got me into doing a bit of research about FindTheBest…

In the past, when searching for an issue related to our TV, I saw a SERP that looked like this:

Those mashed sites were subdomains on trusted sites like VentureBeat & TechCrunch.

Graphically the comparison pages appear appealing, but how strong is the editorial?

How does Find The Best describe their offering?

In a VentureBeat post (a FindTheBest content syndication partner) FTB’s CEO Kevin O’Connor was quoted as saying: “‘Human’ is dirty — it’s not scalable.”

Hmm. Is that a counter view to the above claimed 40 person editorial research team? Let’s dig in.

Looking at the top listed categories on the homepage of FindTheBest, I counted 497 different verticals. So with 40 people on the editorial team, each person would manage a dozen different verticals (if one doesn’t count all the outreach and partnership building as part of editorial & one ignores the parallel sites for death records, grave locations, find the coupons, find the company & find the listing).

Google shows that they have indexed 35,000,000 pages from FindTheBest.com, so this would mean each employee has “curated” about 800,000 pages (which is at least 200,000 pages a year over the past 4 years). Assuming they work 200 days a year that means they ensure curation of at least 1,000 “high quality” pages per day (and this is just the stuff in Google’s index on the main site…not including the stuff that is yet to be indexed, stuff indexed on 3rd party websites, or stuff indexed on FindTheCompanies.com, FindTheCoupons.com, FindTheListing, FindTheBest.es, FindTheBest.or.kr, or the death records or grave location sites).
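The back-of-the-envelope arithmetic above can be checked directly (the figures are those quoted in the paragraph; the 200-workday year is the stated assumption):

```python
# Reproducing the paragraph's back-of-the-envelope math.
indexed_pages = 35_000_000   # pages Google reports indexed from FindTheBest.com
editors = 40                 # claimed product/editorial team size
years = 4                    # roughly the time since the 2010 launch
work_days = 200              # assumed working days per year

pages_per_editor = indexed_pages // editors   # 875,000 pages each
per_year = pages_per_editor // years          # ~218,750 pages a year
per_day = per_year // work_days               # ~1,093 pages per day
```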

Maybe I am still wrong to consider it a bulk scrape job. After all, it is not unreasonable to expect that a single person can edit 5,000 pages of high quality content daily.

Errr….then again…how many pages can you edit in a day?

Where they lost me, though, was with the “facts” angle. Speaking of not trusting the rest of the “facts”: how crappy is the business information for SEO Book on FindTheBest, which claims that our site launched in 2011, we have $58,000 in sales, and we are a book wholesaler?

I realize I am afforded the opportunity to work for free to fix the errors of the scrape job, but if a page is full of automated incorrect trash then maybe it shouldn’t exist in the first place.

I am not saying that all pages on these sites are trash (some may be genuinely helpful), but I know if I automated content to the extent FTB does & then mass email other sites for syndication partnerships on the duplicate content (often full of incorrect information) that Google would have burned it to the ground already. They likely benefit from their CEO having sold DoubleClick to Google in the past & are exempt from the guidelines & editorial discrimination that the independent webmaster must deal with.

One of the ways you can tell if a company really cares about their product is by seeing if they dogfood it themselves.

Out of curiosity, I looked up FindTheBest on their FindTheCompany site.

They double-list themselves and neither profile is filled out.

That is like having two sentences of text on your “about us” page surrounded by 3 AdSense blocks. 😀

I think they should worry about fixing the grotesque errors before worrying about “sharing with as many people as possible” but maybe I am just old fashioned.

Certainly they took a different approach … one that I am sure that would get me burned if I tried it. An example sampling of some partner sites…

  • accountants.entrepreneur.com
  • acronyms.sciencedaily.com
  • alternative-fuel.cleantechnica.com
  • analytics-software.businessknowhow.com
  • antivirus.betanews.com
  • apps.edudemic.com
  • atvs.agriculture.com
  • autopedia.com/TireSchool/
  • autos.nydailynews.com
  • backup-software.venturebeat.com
  • bags.golfdigest.com
  • beer.womenshealthmag.com
  • best-run-states.247wallst.com
  • bestcolleges.collegenews.com
  • bikes.cxmagazine.com
  • bikes.triathlete.com
  • birds.findthelisting.com
  • birth-control.shape.com
  • brands.goodguide.com
  • breast-pumps.parenting.com
  • broker-dealers.minyanville.com
  • businessschools.college-scholarships.com
  • camcorders.techcrunch.com
  • cars.pricequotes.com
  • cats.petharbor.com
  • catskiing.tetongravity.com
  • chemical-elements.sciencedaily.com
  • comets-astroids.sciencedaily.com
  • companies.findthecompany.com
  • companies.goodguide.com
  • compare-video-editing-software.burnworld.com
  • compare.consumerbell.com
  • compare.guns.com
  • compare.roadcyclinguk.com
  • comparemotorbikes.motorbike-search-engine.co.uk
  • congressional-lookup.nationaljournal.com
  • courses.golfdigest.com
  • crm.venturebeat.com
  • cyclocross-bikes.cyclingdirt.org
  • dealers.gundigest.com
  • death-record.com
  • debt.humanevents.com
  • design-software.underworldmagazines.com
  • destination-finder.fishtrack.com
  • diet-programs.shape.com
  • digital-cameras.techcrunch.com
  • dinosaurs.sciencedaily.com
  • dirt-bikes.cycleworld.com
  • dogbreeds.petmd.com
  • dogs.petharbor.com
  • donors.csmonitor.com
  • e-readers.techcrunch.com
  • earmarks.humanevents.com
  • earthquakes.sciencedaily.com
  • ehr-software.technewsworld.com
  • fallacies.sciencedaily.com
  • fec-candidates.theblaze.com
  • fec-committees.theblaze.com
  • federal-debt.nationaljournal.com
  • fha-condos.realtor.org
  • fha.nuwireinvestor.com
  • financial-advisors.minyanville.com
  • findthebest.com
  • findthebest.motorcycleshows.com
  • findthecoupons.com
  • findthedata.com
  • firms.privateequity.com
  • franchises.fastfood.com
  • ftb.cebotics.com
  • game-consoles.tecca.com
  • game-consoles.venturebeat.com
  • gin.drinkhacker.com
  • golf-courses.bunkershot.com
  • gps-navigation.techcrunch.com
  • gps-navigation.venturebeat.com
  • green-cars.cleantechnica.com
  • guns.dailycaller.com
  • ham-radio.radiotower.com
  • hdtv.techcrunch.com
  • hdtv.venturebeat.com
  • headphones.techcrunch.com
  • headphones.venturebeat.com
  • high-chairs.parenting.com
  • highest-mountains.sciencedaily.com
  • hiv-stats.realclearworld.com
  • horsebreeds.petmd.com
  • hospital-ratings.lifescript.com
  • hr-jobs.findthelistings.com
  • inventors.sciencedaily.com
  • investment-advisors.minyanville.com
  • investment-banks.minyanville.com
  • iv-housing.dailynexus.com
  • laptops.mobiletechreview.com
  • laptops.techcrunch.com
  • laptops.venturebeat.com
  • lawschool.lawschoolexpert.com
  • locategrave.org
  • mammography-screening-centers.lifescript.com
  • mba-programs.dealbreaker.com
  • medigap-policies.findthedata.org
  • military-branches.nationaljournal.com
  • motorcycles.cycleworld.com
  • mountain-bikes.outsideonline.com
  • nannies.com
  • nobel-prize-winners.sciencedaily.com
  • nursing-homes.caregiverlist.com
  • nursing-homes.silvercensus.com
  • onlinecolleges.collegenews.com
  • phones.androidauthority.com
  • pickups.agriculture.com
  • planets.realclearscience.com
  • planets.sciencedaily.com
  • plants.backyardgardener.com
  • presidential-candidates.theblaze.com
  • presidents.nationaljournal.com
  • privateschools.parentinginformed.com
  • processors.betanews.com
  • project-management-software.venturebeat.com
  • projectors.techcrunch.com
  • pushcarts.golfdigest.com
  • recovery-and-reinvestment-act.theblaze.com
  • religions.theblaze.com
  • reviews.creditcardadvice.com
  • saving-accounts.bankingadvice.com
  • sb-marinas.noozhawk.com
  • sb-nonprofits.noozhawk.com
  • scheduling-software.venturebeat.com
  • scholarships.savingforcollege.com
  • schools.nycprivateschoolsblog.com
  • scooters.cycleworld.com
  • smartphones.techcrunch.com
  • smartphones.venturebeat.com
  • solarpanels.motherearthnews.com
  • sports-drinks.flotrack.org
  • stables.thehorse.com
  • state-economic-facts.nationaljournal.com
  • steppers.shape.com
  • strollers.parenting.com
  • supplements.womenshealthmag.com
  • tablets.androidauthority.com
  • tablets.techcrunch.com
  • tablets.venturebeat.com
  • tabletsandstuff.com/tablet-comparison-chart
  • tallest-buildings.sciencedaily.com
  • technology.searchenginewatch.com
  • telescopes.universetoday.com
  • tequila.proof66.com
  • texas-golf-courses.texasoutside.com
  • tires.agriculture.com
  • tractors.agriculture.com
  • tsunamies.sciencedaily.com
  • us-hurricanes.sciencedaily.com
  • video-cameras.venturebeat.com
  • volcanic-eruptions.com
  • waterheaters.motherearthnews.com
  • wetsuits.swellinfo.com
  • whiskey.cocktailenthusiast.com
  • whiskey.drinkoftheweek.com
  • white-house-visitors.theblaze.com
  • wineries.womenshealthmag.com



we have seen search results where a search engine didn’t robots.txt something out, or somebody takes a cookie cutter affiliate feed, they just warm it up and slap it out, there is no value add, there is no original content there and they say search results or some comparison shopping sites don’t put a lot of work into making it a useful site. They don’t add value. – Matt Cutts

That syndication partnership network also explains part of how FTB is able to get so many pages indexed by Google, as each of those syndication sources links back to FTB on (what I believe to be) every single page of the subdomains, and many of these subdomains are linked to from sitewide sidebar or footer links on the PR7 & PR8 tech blogs.

And so the PageRank shall flow 😉

Hundreds of thousands of hours (e.g. 200,000+) for 40 people is 5,000 hours per person. Considering that an average work year is about 2,000 hours, this would imply each employee spent 2.5 full years of work on this single aspect of the job. And that's if one ignores the (hundreds of?) millions of content pages on other sites.

How does TechCrunch describe the FTB partnership?

Here’s one reason to be excited: In its own small way, it combats the recent flood of crappy infographics. Most TechCrunch writers hate the infographics that show up in our inboxes— not because infographics have to be terrible, but because they’re often created by firms that are biased, have little expertise in the subject of the infographic, or both, so they pull random data from random sources to make their point.

Get that, folks? TechCrunch hosting automated subdomains of syndicated content means fewer bad infographics. And more cat lives saved. Or something like that.

How does FTB describe this opportunity for publishers?

The gadget comparisons we built for TechCrunch are sticky and interactive resources comprised of thousands of SEO optimized pages. They help over 1 million visitors per month make informed decisions by providing accurate, clear and useful data.

SEO optimized pages? Hmm.

Your comparisons will include thousands of long-tail keywords and question/answer pages to ensure traffic is driven by a number of different search queries. Our proprietary Data Content Platform uses a mesh linking structure that maximizes the amount of pages indexed by search engines. Each month—mainly through organic search—our comparisons add millions of unique visitors to our partner’s websites.

Thousands of long-tail keyword & QnA pages? Mesh linking structure? Hmm.

If we expand the “view more” section at the footer of the page, what do we find?

Holy Batman.

Sorry that font is so small; the text had to be reduced multiple sizes to fit on my extra-large monitor, and then reduced again to fit the width of our blog.

Each listing in a comparison has a number of associated questions created around the data we collect.

For example, we collect data on the battery life of the Apple iPad.

An algorithm creates the question “How long does the Apple iPad tablet battery last?” and answers it

So now we have bots asking themselves questions that they answer themselves & then stuffing that in the index as content?

Yeah, sounds like human-driven editorial.

After all, it’s not like there are placeholder tokens on the auto-generated stuff

{parent_field}

Ooops.

Looks like I was wrong on that.
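The template-driven Q&A generation described above, leaked tokens and all, can be sketched in a few lines of Python. This is my own illustration, not FTB's actual code; field names like `parent_field` and `battery_life` are hypothetical.

```python
# Hypothetical sketch of template-based Q&A generation from a data record.
# Field names ("product", "battery_life", "parent_field") are illustrative.

def generate_qna(template_q, template_a, record):
    """Fill question/answer templates from a data record.

    If a token is missing from the record it leaks through verbatim --
    which is exactly how a stray "{parent_field}" ends up on a live page.
    """
    class Fallback(dict):
        def __missing__(self, key):
            return "{" + key + "}"  # leave unknown tokens in place
    return (template_q.format_map(Fallback(record)),
            template_a.format_map(Fallback(record)))

record = {"product": "Apple iPad", "category": "tablet", "battery_life": "10 hours"}
q, a = generate_qna(
    "How long does the {product} {category} battery last?",
    "The {product} battery lasts about {battery_life}.",
    record,
)
print(q)  # How long does the Apple iPad tablet battery last?

# A field missing from the record leaks the raw token into the output:
q2, _ = generate_qna("What is the {parent_field} of the {product}?", "", record)
print(q2)  # What is the {parent_field} of the Apple iPad?
```

Run one record through thousands of templates and you get "thousands of long-tail keyword & QnA pages" with near-zero human effort.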

And automated “popular searches” pages? Nice!

As outrageous as the above is, they also included undisclosed affiliate links in the content, and provided badge-based “awards” for things like the best casual dating sites, to help build links into their site.

That in turn led to them getting a bunch of porn backlinks.

If you submit an article to an article directory and someone else picks it up & posts it to a sketchy site, you are a link spammer responsible for the actions of a third party.

But if you rate the best casual dating sites and get spammy porn links, you are wonderful.

Content farming never really goes away. It only becomes more corporate.


SEO Book

Link Madness


Link paranoia is off the scale. As the “unnatural link notifications” fly, the normally jittery SEO industry has moved deep into new territory, of late.

“I have started to wonder if some of these links (there are hundreds since the site is large) may be hurting my site in the Google Algo. I am considering changing most of my outbound links to rel=”nofollow”. It is not something I want to do but … “

We’ve got site owners falling to their knees, confessing to be link spammers, and begging for forgiveness. Even when they do, many sites don’t return. Some sites have been returned, but their rankings, and traffic, haven’t recovered. Many sites carry similar links, but get a free pass.

That’s the downside of letting Google dictate the game, I guess.

Link Removal

When site owners are being told by Google that their linking is “a problem,” they tend to hit the forums and spread the message, so the effect is multiplied.

Why does Google bother with the charade of “unnatural link notifications,” anyway?

If Google has found a problem with links to a site, then they can simply ignore or discount them, rather than send reports prompting webmasters to remove them. Alternatively, they could send a report saying they’ve already discounted them.

So one assumes Google’s strategy is a PR – as in public relations – exercise to plant a bomb between link buyers and link sellers. Why do that? Well, a link is a link, and one could conclude that Google must still have problems nailing down the links they don’t like.

So they get some help.

The disavow links tool, combined with a re-inclusion request, is pretty clever. If you wanted a way to get site owners to admit to being link buyers, and to point out the places from which they buy links, or you want to build a database of low quality links, for no money down, then you could hardly imagine a better system of outsourced labour.

If you’re a site owner, getting hundreds, if not thousands, of links removed is hardly straightforward. It’s difficult, takes a long time, and is ultimately futile.

Many site owners inundated with link removal requests have moved to charging removal fees, which in many cases is understandable, given it takes some time and effort to verify the true owner of a link, locate the link, remove it, and update the site.

As one rather fed-up sounding directory owner put it:

Blackmail? Google’s blackmailing you, not some company you paid to be listed forever. And here’s a newsflash for you. If you ask me to do work, then I demand to be paid. If the work’s not worth anything to you, then screw off and quit emailing me asking me to do it for free.

Find your link, remove it, confirm it’s removed, email you a confirmation, that’s 5 minutes. And that’s $29 US. Don’t like it? Then don’t email me. I have no obligation to work for you for free, not even for a minute. …. I think the best email I got was someone telling me that $29 was extortion. I then had to explain that $29 wasn’t extortion – but his new price of $109 to have the link removed, see, now THAT’S extortion.

if it makes you sleep at night, you might realize that you paid to get in the directory to screw with your Google rankings, now you get to pay to get out of the directory, again to screw your Google rankings. That’s your decision, quit complaining about it like it’s someone else’s fault. Not everyone has to run around in circles because you’re cleaning up the very mess that you made

Heh.

In any case, if these links really did harm a site – which is arguable – then it doesn’t take a rocket scientist to guess the next step: site owners would be submitting their competitors’ links to directories thick and fast.

Cue Matt Cutts on negative SEO….

Recovery Not Guaranteed

Many sites don’t recover from Google penalties, no matter what they do.

It’s conceivable that a site could have a permanent flag against it no matter what penance has been paid. Google takes into account your history in Adwords, so it’s not a stretch to imagine similar flags may continue to exist against domains in their organic results.

The most common reason is not what they’re promoting now; it’s what they’ve promoted in the past. Why would Google hold that against them? It’s probably because of the way affiliates used to churn and burn the domains they were promoting in years gone by…

This may be the reason why some recovered sites just don’t rank like they used to after they’ve returned. They may carry permanent negative flags.

However, the reduced rankings and traffic when/if a site does return may have nothing to do with low-quality links or previous behaviour. There are many other factors involved in ranking and Google’s algorithm updates aren’t sitting still, so it’s always difficult to pin down.

Which is why the SEO environment can be a paranoid place.

Do Brands Escape?

Matt Cutts is on record discussing big brands, saying they get punished, too. You may recall the case of Interflora UK.

Google may well punish big brands, but the punishment might be quite different to the punishment handed out to a no-brand site. It will be easier for a big brand to return, because if Google don’t show what Google users expect to see in the SERPs then Google looks deficient.

Take, for example, this report received – amusingly – by the BBC:

I am a representative of the BBC site and on Saturday we got a ‘notice of detected unnatural links’. Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these ‘unnatural links’?

If I was the BBC webmaster, I wouldn’t bother. Google isn’t going to dump the BBC sites as Google would look deficient. If Google has problems with some of the links pointing to the BBC, then perhaps Google should sort it out.

Take It On The Chin, Move On

Many of those who engaged in aggressive link tactics knew the deal. They went looking for an artificial boost in relevancy, and as a result of link building, they achieved a boost in the SERPs.

That is playing the game that Google, a search engine that factors in backlinks, “designed”. By design, Google rewards well-linked sites by ranking them above others.

The site owners enjoyed the pay-off at the expense of their less aggressive competitors. The downside – there’s always a downside – is that Google may spot the artificial boost in relevancy, now or in the future, and may slam the domain as a result.

That’s part of the game, too.

Some cry about it, but Google doesn’t care about crying site owners, so site owners should build that risk into their business case from the get go.

Strategically, there are two main ways of looking at “the game”:

Whack A Mole: Use aggressive linking for domains you’re prepared to lose. If you get burned, then that’s a cost of playing the game. Run multiple domains using different link graphs for each and hope that a few survive at any one time, thus keeping you in the game. If some domains get taken out, then take it on the chin. Try to get reinstated, and if you can’t, then torch them and move on.

Ignore Google: If you operate like Google doesn’t exist, then it’s pretty unlikely Google will slam you, although there are no guarantees. In any case, a penalty and a low ranking are the same thing in terms of outcome.

Take one step back. If your business relies on Google rankings, then that’s a business risk. If you rely entirely on Google rankings, then that’s a big business risk. I’m not suggesting it’s not a risk worth taking, but only you can answer what risks make sense for your business.

If the whack a mole strategy is not for you, and you want to lower the business risk of Google’s whims, then it makes sense to diversify the ways in which you get traffic so that if one traffic stream fails, then all is not lost. If you’re playing for the long term, then establishing brand, diversifying traffic, and treating organic SEO traffic as a bonus should be considerations. You then don’t need to worry about what Google may or may not do as Google aren’t fueling your engine.

Some people run both these strategies simultaneously, which is an understandable way of managing risk. Most people probably sit somewhere in the middle and hope for the best.

Link Building Going Forward

The effect of Google’s fear, uncertainty and doubt strategy is that a lot of site owners are going to be running scared or confused, or both.

Just what is acceptable?

Trouble is, what is deemed acceptable today might be unacceptable next week. It’s pretty difficult, if not impossible, for a site owner to wind the clock back once they’ve undertaken a link strategy, and who knows what will be deemed unacceptable in a year’s time.

Of course, Google doesn’t want site owners to think in terms of a “link strategy”, if the aim of said link strategy is to “inflate rankings”. That maxim has remained constant.

If you want to take a low-risk approach, then it pays to think of Google traffic as a bonus. Brett Tabke, founder of WebmasterWorld, used to keep a sticker on his monitor that said “Pretend The Search Engines Don’t Exist”, or words to that effect. I’m reminded of how useful that message still is today, as it’s a prompt to think strategically beyond SEO. If you disappeared from Google today, would your business survive? If the answer is no, then you should revise your strategy.

Is there a middle ground?

Here are a few approaches to link building that will likely stand the test of time and provide some resilience against Google’s whims. The key is having links for reasons besides SEO, even if part of their value is higher rankings.

1. Publisher

Publish relevant, valuable content, as determined by your audience.

It’s no longer enough to publish pages of information on a topic, the information must have demonstrable utility i.e. other people need to deem it valuable, reference it, visit it, and talk about it. Instead of putting your money into buying links, you put your money into content development and then marketing it to people. The links will likely follow. This is passive link acquisition.

It’s unlikely these types of links will ever be a problem, as the link graph is not going to look contrived. If any poor quality links slip into this link graph, then they’re not going to be the dominant feature. The other signals will likely trump them and therefore diminish their impact.

Build brand based on unique, high-quality information, market it to people via multiple channels, and the links tend to follow, which then boosts your ranking in Google. Provide a high degree of utility, first and foremost.

One problem with this model is that it’s easy for other people to steal your utility. This is a big problem and discourages investment in quality content. One way around it is to use some content as a loss-leader and lock the rest away behind paywalls. You give the outside world, and Google, just enough, but if they want the rest, then they’re going to need to sign up.

Think carefully about the return on giving the whole farm away to a crawler. Think about providing utility, not “content”.

2. Differentiation

There is huge first mover advantage when it comes to getting links.

If a new field opens up, and you get there first, or early, then it’s relatively easy to build a credible link graph. As a field expands, the next layer involves a lot of meta activity i.e. bloggers, media and other information curators writing about that activity. At the start of any niche, there aren’t many players to talk about, so the early movers get all the links.

As a field matures, you get a phenomenon Mike Grehan aptly characterised as “filthy linking rich”:

The law of “preferential attachment” as it is also known, wherein new links on the web are more likely to go to sites that already have many links, proves that the scheme is inherently biased against new and unknown pages. When search engines constantly return popular pages at the top of the pile, more web users discover those pages and more web users are likely to link to them.

Those who got all those links early on will receive more and more links over time because they are top of the results. They just need to keep doing what they’re doing. It becomes very difficult for late entrants to beat them unless they do something extraordinary. By definition, that probably means shifting the niche to a new niche.

If you’re late to a crowded field, then you need to think in terms of differentiation. What can you offer the rest do not? New content in such fields must be remarkable i.e worth remarking upon.

Is that field moving in a new direction? If so, can you pivot in that direction and be first mover in that direction? Look not where a niche currently is, but where it’s going, then position ahead of it.

“Same old, same old content” doesn’t get linked to, engaged with, ranked, or remarked upon – and why should it? The web is not short of content. The web has so much content that companies like Google have made billions trying to reduce it to a manageable set of ten links.

3. Brand

Brand is the ultimate Google-protection tactic.

It’s not that brands don’t get hammered by Google occasionally, because they do. But what tends to differ is the sentence handed down. The bigger the brand, the lighter the sentence, or the shorter the sentence, because no matter how much WalMart or The Office Of The President Of The United States Of America spams Google, Google must show such sites. I’m not suggesting these sites engage in aggressive SEO tactics, or need to, but we know they’ll always be in Google.

You don’t have to be a big brand. You do need search volume on your distinctive brand name. If you’re well known enough in your niche i.e. you attract significant type-in search volume, Google must show you or appear deficient.

This is not to say having a brand means you can get away with poor behavior. But the more type-in traffic for your brand, the more pressure there is on Google to rank you.

Links to a brand name will almost never look forced in the way a link in a footer to “cheap online pharmacy” looks forced. People know your name, and they link to you by name, they talk about you by name – naturally.

The more generic your site, the more vulnerable you are, as it’s very difficult to own a space when you’re aligning with generic keyword terms. The links are always going to look a little – or a lot – forced.

This is not to say you shouldn’t get links with keywords in them, but build a distinctive brand, too. The link graph will appear mostly natural – because it is. A few low quality links won’t trump the good signals created by a lot of natural brand links.

4. Engagement

The web is a place.

This place is filled with people, and there are relationships between people. On the web, relationships between people are almost always expressed as a link. It might be a Facebook link, a Twitter link, a comment link, a blog link, but they’re all links. It doesn’t matter if they’re crawlable or not, or if they’re no-followed or not; a link still indicates a relationship.

If Google is to survive, it must figure out these relationships.

That’s why all links – apart from negative SEO – are good links. The more signals of a real relationship, the better you *should* be ranked, because you are more relevant, in an objective sense.

So look for ways to foster relationships and engagement. It might be guest posting. It might be commenting on someone else’s site. It might be contributing to forums. It might be interviewing people. It might be accepting interviews. It might be forging relationships with the press. It might be forging relationships with business organisations. It might be contributing to charity. It might be running competitions. It might be attending conferences. It might be linking out to influential people.

It’s all networking.

And wherever you network, you should be getting links as a byproduct.

One potential problem:

Provide long – well, longer than 400 words – unique, editorial articles. Articles also need to get linked to and engaged with, and they need to be placed on sites where they’ll be seen, as opposed to content farms.

Ask yourself “am I providing genuine utility?”

5. Fill A Need

This is similar to differentiation, but a little more focused.

Think about the problems people have in a niche. The really hard problems to solve. “How to”, “tips”, “advice”, and so on.

Solving a genuine problem for people tends to make them feel a sense of obligation, especially if they would otherwise have to pay for that help. If you can twist that obligation towards getting a link, all the better. For example: “if this article/video/whatever helped you, no payment necessary! But it would be great if you linked to us/followed us on Twitter,” and so on. It doesn’t need to be that overt, although sometimes overt is what’s required. It fosters engagement. It builds your network. And it builds links.

Think about ways you can integrate a call-to-action that results in a link of some kind.

Coda

In other news, Caesars Palace bans Google 🙂


SEO Book

How to Make a Graphic-Text Mash-up to Promote Blog Content on Facebook


How to Make a Graphic-Text Mash-up to Promote Blog Content on Facebook was originally published on BruceClay.com, home of expert search engine optimization tips.

I’m in a few Google+ groups focused on SEO, social media and content marketing. The question of what stock photo service to use and where to get free images has come up a couple times. It got me thinking about the process I use to find, modify and use images in my day-to-day.

As a community manager and a blogger, I have 2 main needs for images:

  1. Including them in BCI blog posts to break up text and add visual interest
  2. Posting images to social media to share blog and other BCI content

What you’ll know by the end of reading this is:

  • Where I get images, both free and paid services
  • How to make a graphic-text mash-up using Google Drive that will get noticed in the midst of noisy Facebook, Twitter and Google+ streams


Free Images and Paid Stock Photo Services

The stock photo site I use is Dreamstime.com because the price is right and the selection passes muster. If you use advanced search to set the price slider bar to the lowest setting, you’ll find images available for 1 credit in the extra small size. Extra small is usually around 480 px by 320 px, which is fine for both my purposes (blog posts and social media posts).

panda on Dreamstime stock photo service

Credits will run you $1.36 each if you buy the smallest credit package, down to about $1 each if you buy 120 credits at a time; 250+ credit packages save you even more cents.

Other Stock Photo Services

I checked out some stock photo site comparisons to get an idea of what else is out there and how they stack up. In 6 Stock Photography Services Compared I learned that Stock.xchng is the most popular free stock photo library, yet it has a limited selection. Among the most popular paid services, iStockphoto has the most massive library and Getty Images has a complicated pricing and licensing scheme.

Getting Images for Free Online

As long as you’re not looking for high-res or print quality images, you’ve got good free options online.

Creative Commons

When using images with Creative Commons licenses, the attribution requirement adds a hurdle to the graphic mash-up use I describe later, since it adds another element to what must be included in the graphic. But CC images are great for blog posts.

panda on flickr

This panda image has a Creative Commons license that requires attribution. Flickr makes it easy to post the image to your blog by copying code that includes the required attribution.

For a long time, I used Creative Commons licensed photos on Flickr that allow commercial use and derivatives. For use in blog posts, Flickr makes it easy to use Creative Commons licensed images, and the “share” function gives you HTML code including the required attribution. The Creative Commons site search includes Flickr, Google Images, Open Clip Art Library and Pixabay for images, and a number of media and music sources as well.

Author Elizabeth Jolley and (younger) sister Madelaine Winifred in the garden, 1927

This image was found in The Commons using Flickr search. It was taken in 1927 and is part of the State Library of New South Wales collection.

You can also search Wikimedia‘s library of free images, a collection with Creative Commons copyrights, free documentation licenses or no copyright.

For free images you can also search EveryStockPhoto.com, a search engine for free photos across a number of sources and including a variety of license types.

Public Domain

You can also search Flickr’s collection The Commons, images that have passed into the public domain and belong to everyone, mostly due to their age. You’ll find awesome vintage photos, advertising, illustrations and art that have passed into public use and can give modern blog and social posts refreshing classic flair. Since they don’t have copyright or licensing requirements, you can use public domain images for the graphic mash-up use, which we get into next!


The Graphic Text Mash-up Promo

This is my little trick for sharing blog posts on Twitter, Facebook and Google+ to get a little more attention than straight text updates.

As you may have noticed, recent layout updates to Facebook and Google+ have put an emphasis on visual media. Skyrocketing mobile use of Facebook, along with other social media apps, was a big reason behind Facebook’s update last March. Images show up larger in the News Feed and may also get priority in the ranking algo. An update to Google+ around the same time also made images feature more heavily. And in the endlessly updating churn of a Twitter stream, a picture attachment makes tweets stand out and, as pictures are worth a thousand words, lets you extend your message past 140 characters.

The graphic should include these three vital components:

  1. Image to grab fan/follower attention within a feed or stream
  2. A link to drive a viewer to your site
  3. Text that promises a payoff from clicking through

You can opt to include a logo for branding purposes as well. Note that if there’s text in the logo, it counts toward your text-to-image ratio, which Facebook limits to 20% for ads and promoted content. More on that below.

Creating A Graphic with Google Drive

I use the drawing function in Google Drive to add text on top of images. It’s super easy and Google gives you a ton of font options as well as shapes, arrows and call-outs you can add to the drawing. Here you can see a graphic mash-up I created last week to promote our Thank You page series.

elvis says thank you

I shared this image on the BCI Facebook and Twitter accounts to promote a 2-part series on Thank You page optimization on the blog.

  1. Sign in to Google Drive at https://drive.google.com/ and create a Drawing.

  2. Insert an image that you own or one sanctioned for public use.

  3. Create a custom short link to the content. If you’ve got a registered Bitly account you can customize links, and in the Elvis example here you can see I created a custom link “typagecro,” which I chose to suggest “Thank You page CRO” (conversion rate optimization). Another bonus of a registered Bitly account is that you can track clicks on your short links.

  4. Insert text on top of the image. These are elements #2 and #3 in my list of three critical components.

    (#2) Include the custom short link, which a viewer can type into their address bar since it’s short and easy to understand. Of course, also include a hyperlink in the image caption or tweet.

    (#3) Include a promise of what’s to come in the full article, or hint at what the full content contains. If it’s a “Top 3 Reasons Why…” post, you may want to include the three reasons right there in the image with an invitation to get all the info in the full post. In the Elvis example I included a brief description of what was covered in each of the two parts of the Thank You page CRO series. Try to make this message seductive, whatever that means for you and your content.

  5. When the graphic is done, go to File > Download as > JPEG and save it.

A Quick Note About Design

I’ve taken one graphic design class, one web design class and a handful of painting and photography classes, so while I’m not a professional designer, I’ve been exposed to the rules of good composition. I think these are the basics to keep in mind when you’re creating mash-ups.

  • Make sure text is clearly legible. Black on white is best; white on black is hard for the eye to process. If the text is anything besides dark text on a light, not-busy background, make sure it’s legible in other ways, such as increasing font thickness or putting a background color behind the text.
  • Use no more than two font types. At least one should be extremely easy to read; sans serif fonts are generally easier to read online than serif fonts. The other font can be stylized, used as an accent and in small amounts.
  • For the most part, text should align left. It’s hard for the eye to follow a ragged left edge.
  • White space is a component of good design, especially in the modern aesthetic. While the graphic will likely be dense as you’re trying to communicate a lot in a little space, available white space should be a consideration in choosing the image.

If you want to get a background in some basics of design, I recommend Bootstrapping Design, a $39 ebook. It’s written for programmers, but I like it because it presents accessible design fundamentals for a non-artist set. Considering we’re in an age where everyone can publish online content, learning the basics of good design is an investment that will pay off.

Facebook Guidelines for Text in Images

12 percent of image is text

Acceptable

60 percent of image is text

Unacceptable

Shortly after Facebook’s update in March, it made a new rule limiting text in images used in ads, sponsored stories and Page cover photos to 20%. If you plan to “promote” the Facebook post including an image, pay to boost its visibility or turn it into an ad, the surface area of the image that includes text has to stay under 20%.
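As a rough sanity check before promoting a post, you can estimate that ratio yourself by comparing the bounding-box area of your text overlays to the total image area. This is my own approximation, not Facebook's actual measurement tool, and the box sizes below are made-up examples:

```python
# Rough sketch: estimate what fraction of an image is covered by text.
# This approximates, not replicates, Facebook's own grading of the 20% rule.

def text_area_ratio(image_w, image_h, text_boxes):
    """text_boxes: list of (width, height) bounding boxes for text overlays.
    Returns the fraction of total image area the text boxes cover."""
    text_area = sum(w * h for w, h in text_boxes)
    return text_area / (image_w * image_h)

# A 480x320 image with one 300x60 text block:
ratio = text_area_ratio(480, 320, [(300, 60)])
print(f"{ratio:.0%} text")  # 12% text -> under the 20% limit, acceptable
```

If the overlapping boxes push the result well past 0.20, shrink the text or move some of the message into the post caption instead.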

wall post on facebook

Text placement fail. The sides of landscape images are cropped in the viewable portion on a Facebook wall. Clicking on the image displays it in full for the viewer.

I’ll also note here that image posts as they’re displayed on a Facebook Wall favor portrait orientation and will cut off the left and right sides of landscape oriented images. If you’re using a landscape image, try to keep the text within the area that is “center square” to the height of the image. I’ve illustrated the center square in this drawing.

diagram of landscape image cut-off
Yep, made that in Google Drive, too.

Bruce Clay Blog

How to Build a Google Analytics Tracking Code

Posted by:  /  Tags: , , , ,

How to Build a Google Analytics Tracking Code was originally published on BruceClay.com, home of expert search engine optimization tips.

Need more input? As an optimizer who is regularly looking to learn more about how my recipients are interacting with content, I find myself regularly consuming analytics reports filled with Google Analytics tracking code data like Johnny Number 5 eats the Encyclopedia Britannica in the above clip from the 1986 gem Short Circuit.

Google Analytics tracking codes — also known as custom campaigns or UTM codes — are custom tracking parameters that communicate to Google Analytics granular information about how your referral traffic is interacting with your calls to action. To implement a UTM tracking code, simply add your desired parameters to the end of the URL you want to track insights for, like this:

http://www.YourWebsite.com/your-CRO-landing-page-article?utm_source=blog&utm_medium=viral&utm_campaign=CRO-0513-JThompson

UTM tracking codes can help you analyze traffic from banner ads, email newsletters, social media content, and any other campaign that links people to a property that you own (such as your website or your blog). You cannot use UTM tracking to analyze clicks to external websites, like YouTube or Link-To-Related-Content.com. To track click activity on links that send people to properties you don’t own, Bitly is a great free resource.
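The assembly is mechanical enough to script. Here's a minimal sketch in Python using only the standard library; the URL and parameter values simply mirror the example above:

```python
from urllib.parse import urlencode

def build_utm_url(base_url, source, medium, campaign):
    """Append UTM parameters to a landing-page URL."""
    params = {
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    }
    return base_url + "?" + urlencode(params)

url = build_utm_url(
    "http://www.YourWebsite.com/your-CRO-landing-page-article",
    source="blog",
    medium="viral",
    campaign="CRO-0513-JThompson",
)
print(url)
```

A nice side effect: `urlencode` percent-escapes anything unsafe, which hand-built codes tend to forget.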

How To Put Together a Google Analytics Tracking Code

UTM_Source=Awesome Google Analytics tracking code parameter

Bruce Clay, Inc. does not recommend or condone using “awesome” as a Google Analytics UTM code parameter. (But we may or may not find it amusing.)

There are five possible parameters you can set for each UTM tracking code: Source, Medium, Campaign, Content and Term. You don’t have to use all of them. For this blog post I am going to show you how to create a UTM tracking code for a link that directs people from a blog post to a page on my website. To keep it simple, I am only going to discuss the parameters needed for this scenario — Source, Medium and Campaign.

Note: When and how to use Term and Content parameters is really a whole separate blog post; leave a comment if you are interested in seeing us write about it.

The Medium (utm_medium=) is the most broad parameter and tells Google Analytics — big picture — how to classify the medium by which your link was presented to the user. For example, was the link presented in a Facebook wall post? Then the Medium might be “viral” because the link you posted to your Facebook wall is now spreading virally all over the Internet and, accordingly, was delivered via a “viral” medium. (If viral is too abstract for you, “social” could also work.) Was the link transmitted to the end user via an email newsletter? Then your Medium might be “email,” or even more specifically, “ConstantContact” or “CheetahMail” to identify the service that delivered your newsletter. In our example above, our link was a blog post, so we used &utm_medium=viral.

Getting one step more specific than Medium, the Source (utm_source=) tells Google Analytics where the click came from — where the person was when they clicked the link. In our example above (utm_source=blog) the person clicked on a link that was posted to my blog (so the Medium is “viral,” and the Source is “blog”). Other Source options might include Twitter, Facebook or newsletter (Medium equals “email” and Source equals “newsletter”).

The Campaign parameter (utm_campaign=) is one step more specific than Source, and the parameter where you can really start to get granular with your tracking. The Campaign is how you identify the specifics of a link, from the details of where it goes all the way down to the color and size of the call to action. In the example above I used &utm_campaign=CRO-0513-JThompson because I wanted to identify which of my silos encouraged the most clicks, the longest time on site, and — at the other end of the spectrum — the most site exits. I also wanted to collect data to help me determine which of my authors are being read the most, and whether an image call to action performs better than a text call to action. If this link was a banner ad I might have included the dimensions of the banner (for instance 320 or 160) to help determine which banner size encourages more clicks. If I wanted to test how well a link to free content performs versus how well a link to paid content performs I might have included “free” or “paid” as Campaign parameters.

Six Essential Google Analytics Tracking Code Details

  •  Every UTM tracking code starts with a question mark. For example: ?utm_. This question mark tells Google Analytics where your link URL ends and your tracking starts. If you don’t include the question mark Google will think your link is http://www.YourWebsite.com/your-CRO-landing-page-articleutm_source which, as an alteration of the URL permalink, will result in a 404 error. The question mark is important.
  • There are five possible parameters you can set for each UTM tracking code: Source, Medium, Campaign, Content and Term. The parameters you choose to use are strung together in one string (no spaces) and separated by ampersands (&). It doesn’t matter what order you list your parameters in, but your first parameter must start with a question mark and all the following parameters must start with ampersands. The & tells Google Analytics where one parameter ends and the next begins. If you forget the ampersand and write your code like &utm_medium=viralutm_campaign= Google Analytics will think that your Medium is “viralutm_campaign=” which, as you can imagine, will skew your Medium and Campaign data pretty badly.
  • Since the Google Analytics URL builder makes it easy for any of your team members to create and assign UTM tracking codes it is critical to have a discussion about UTM parameter conventions before anyone on your team starts creating UTM codes willy-nilly. I highly recommend creating a spreadsheet or other living document (a Google Drive spreadsheet works great) that clearly outlines conventions for Source, Medium, and Campaign. (If you are using Content and Term parameters regularly, make sure to add conventions for those parameters as well.) You may even consider taking your spreadsheet to the next level to establish a record of every link posted and its associated Campaign allocations. While a spreadsheet that documents every link your company pushes out is a larger commitment, these resources become invaluable as associates join and leave your team.

Note: If you’re crafty you’ve noticed the links in this blog post have not been amended to include Google Analytics UTMs. This is because the Bruce Clay, Inc. content team is currently developing our analysis goals and tracking conventions. Since I am a data-hungry Johnny Number 5 monster, I have been using Bitly as my personal one-man-band interim tracking convention because I can’t survive a minute without data. I do not recommend this, as it’s not scalable long-term.

  • UTM codes are case sensitive so Google Analytics will collect data for potatoes and Potatoes as two separate reports. This means, since Google Analytics does not have the human sensibility to tell you that there is a capitalized version of your Campaign floating around somewhere in your referral traffic data, you may be analyzing incomplete data if your team isn’t careful about capitalization.
  • Hyphens allow Google Analytics to understand each word individually; underscores are considered alphanumeric characters and connect words to make phrases (see dashes vs. underscores for more detail). For instance: sandals-coupon versus sandals_coupon. If you are building UTM codes for a newsletter send it might make sense to use an underscore to connect your newsletter identifier with the release date of the newsletter — for instance, DealerUpdates_2013July09-colorado. In this example you will be able to find data in Google Analytics for the specific term “DealerUpdates_2013July09,” which will tell you exactly how that specific dealer updates newsletter sent out on July 9, 2013 performed. You are also able to analyze how every email sent to your Colorado demographic performed, but because “DealerUpdates_2013July09” and “colorado” are separated by a hyphen the Colorado data will not be exclusive to the July 9 email.
  • Worth noting again, you must own a URL in order to attach UTM tracking to it. In other words, you can only use UTM tracking to assign parameters to links that go to your properties — your website, your blog, your app, etc. You cannot use UTM tracking to analyze clicks that go to external properties like Facebook.com or Other-Website.com.
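A small helper can enforce the question-mark, ampersand and capitalization details above so nobody has to remember them. This is just a sketch of one defensive convention (lowercasing everything); if your team prefers mixed-case names like DealerUpdates, bake that rule into your conventions spreadsheet instead:

```python
from urllib.parse import urlencode, urlsplit

def tag_url(url, **utm):
    """Attach utm_* parameters, normalizing case and separators.

    - lowercases values so "Potatoes" and "potatoes" don't end up
      as two separate Google Analytics reports
    - uses "?" for the first parameter, or "&" if the URL already
      carries a query string
    """
    params = {"utm_" + key: str(value).lower() for key, value in utm.items()}
    separator = "&" if urlsplit(url).query else "?"
    return url + separator + urlencode(params)

print(tag_url("http://www.YourWebsite.com/sandals-coupon",
              source="Newsletter", medium="Email", campaign="Sandals-Coupon"))
```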

Why Use Tracking Codes?

I consistently use Google Analytics tracking codes to measure where my referral traffic is coming from, which of my initiatives are meeting traffic goals, how my target markets prefer to receive communication, and the ebb and flow of industry based on seasonality.

They give you a granular snapshot of your referral traffic, how your consumers (and potential-consumers) are interacting with the calls to action you’re putting out there, and they are a great way to quench an unrelenting need for specific ROI data.

Are you a Johnny Number 5? How have Google Analytics UTM codes made your life easier?

Bruce Clay Blog

Why the Yahoo! Search Revenue Gap Won’t Close

Posted by:  /  Tags: , , , ,

In spite of Yahoo! accepting revenue guarantees for another year from Microsoft, recently there has been speculation that Yahoo! might want to get out of their search ad deal with Microsoft. I am uncertain if the back channeled story is used as leverage to secure ongoing minimum revenue agreements, or if Yahoo! is trying to set the pretext narrative to later be able to push through a Google deal that might otherwise get blocked by regulators.

When mentioning Yahoo!’s relative under-performance on search, it would be helpful to point out the absurd amount of their “search” traffic from the golden years that was various forms of arbitrage. Part of the reason (likely the primary reason) Yahoo! took such a sharp nose dive in terms of search revenues (from $551 million per quarter to as low as $357 million per quarter) was that Microsoft used quality scores to price down the non-search arbitrage traffic streams & a lot of that incremental “search” volume Yahoo! had went away.

There were all sorts of issues in place that are rarely discussed. Exit traffic, unclosable windows, forcing numerous clicks, iframes in email spam, raw bot clicks, etc. … and some of this was tied to valuable keyword lists or specific juicy keywords. I am not saying that Google has outright avoided all arbitrage (Ask does boatloads of it in paid + organic & Google at one point tested doing some themselves on credit cards keywords) but it has generally been a sideshow at Google, whereas it was the main attraction at Yahoo!.

And that is what drove down Yahoo!’s click prices.

Yahoo! went from almost an “anything goes” approach to their ad feed syndication, to the point where they made a single syndication partner Cyberplex’s Tsavo Media pay them $4.8 million for low quality traffic. There were a number of other clawbacks that were not made public.

Given that we are talking $4.8 million for a single partner & this alleged overall revenue gap between Google AdWords & Bing Ads is somewhere in the $100 million or so range, these traffic quality issues & Microsoft cleaning up the whoring of the ad feed that Yahoo! partners were doing is a big deal. It had a big enough impact that it caused some of the biggest domain portfolios to shift from Yahoo! to Google. I am a bit surprised to see it so rarely mentioned in these discussions.

Few appreciate how absurd the abuses were. For years Yahoo! not only required you to buy syndication (they didn’t have a Yahoo!-only targeting option until 2010 & that only came about as a result of a lawsuit) but even when you blocked a scammy source of traffic, if that scammy source was redirecting through another URL you would have no way of blocking the actual source, as mentioned by Sean Turner:

To break it down, yahoo gives you a feed for seobook.com & you give me a feed for turner.com. But all links that are clicked on turner.com redirect through seobook.com so that it shows up in customer logs as seobook.com. If you block seobook.com, it will block ads from seobook.com, but not turner.com. The blocked domain tool works on what domains display, not on where the feed is redirected through. So if you are a customer, there is no way to know that turner.com is sending traffic (since it’s redirecting through seobook.com) and no way to block it through seobook.com since that tool only works on the domain that is actually displaying it.

I found it because we kept getting traffic from gogogo.com. We had blocked it over and over and couldn’t figure out why they kept sending us traffic. We couldn’t find our ad on their site. I went to live.com and ran a site:gogogo.com search and found that it indexed some of those landing pages that use gogogo.com as a monetization service.

The other thing that isn’t mentioned is the long-term impact of a Yahoo! tie-up with Google. Microsoft pays Yahoo! an 88% revenue share (and further guarantees on top of that), provides the organic listings free, manages all the technology, and allows Yahoo! to insert their own ads in the organic results.

If Bing were to exit the online ad market, maybe Yahoo! could make an extra $100 million in the first year of an ad deal with Google, but if there is little to no competition a few years down the road, then when it comes time for Yahoo! to negotiate revenue share rates with Google, you know Google would cram down a bigger rake.

This isn’t blind speculation or theory, but aligned with Google’s current practices. Look no further than Google’s current practices with YouTube, where “partners” are paid different rates & are forbidden to mention their rates publicly: “The Partner Program forbids participants to reveal specifics about their ad-share revenue.”

Transparency is a one way street.

Google further dips into leveraging that “home team always wins” mode of negotiating rates by directly investing in some of the aggregators/networks which offer sketchy confidential contracts soaking the original content creators:

As I said, the three images were posted on yfrog. They were screenshots of an apparently confidential conversation had between MrWonAnother and a partner support representative from Machinima, in which the representative explained that the partner was locked indefinitely into being a Machinima partner for the rest of eternity, as per signed contract. I found this relevant, informative and honestly shocking information and decided to repost the images to obviouslybenhughes.com in hopes that more people would become aware of the darker side of YouTube partnership networks.

Negotiating with a monopoly that controls the supply chain isn’t often a winning proposition over the long run.

Competition (or at least the credible risk of it) is required to shift the balance of power.

The flip side of the above situation – where competition does help market participants get a better revenue share – can be seen in the performance of AOL in their ad negotiation in 2005. AOL’s credible threat to switch to Microsoft had Google invest a billion dollars into AOL, where Google later had to write down $726 million of that investment. If there was no competition from Microsoft, AOL wouldn’t have received that $726 million (and likely would have had a lower revenue sharing rate and missed out on some of the promotional AdWords credits they received).

The same sort of “shifted balance of power” was seen in the Mozilla search renewal with Google, where Google paid Mozilla 3X as much due to a strong bid from Microsoft.

The iPad search results are becoming more like phone search results, where ads dominate the interface & a single organic result is above the fold. And Google pushed their “enhanced” ad campaigns to try to push advertisers into paying higher ad rates on those clicks. It would be a boon for Google if they can force advertisers to pay the same CPC as desktop & couple it with that high mobile ad CTR.

Google owning Chrome + Android & doing deals with Apple + Mozilla means that it will be hard for either Microsoft or Yahoo! to substantially grow search marketshare. But if they partner with Google it will be a short term lift in revenues and dark clouds on the horizon.

I am not claiming that Microsoft is great for Yahoo!, or that they are somehow far better than Google, only that Yahoo! is in a far better position when they have multiple entities competing for their business (as highlighted in the above Mozilla & AOL examples).

SEO Book

How to Rank: 25 Step SEO Master Blueprint

Posted by:  /  Tags: , , ,

Posted by Cyrus Shepard

If you’re like most SEOs, you spend a lot of time reading. Over the past several years, I’ve spent 100s of hours studying blogs, guides, and Google patents. Not long ago, I realized that 90% of what I read each day doesn’t change what I actually do – that is, the basic work of ranking a web page higher on Google.

For newer SEOs, the process can be overwhelming.

To simplify this process, I created this SEO blueprint. It’s meant as a framework for newer SEOs to build their own work on top of. This basic blueprint has helped, in one form or another, 100s of pages and dozens of sites to gain higher rankings.

Think of it as an intermediate SEO instruction manual, for beginners.

Level: Beginner to Intermediate

Timeframe: 2 to 10 Weeks

What you need to know: The blueprint assumes you have basic SEO knowledge: you’re not scared of title tags, can implement a rel=canonical, and you’ve built a link or two. (If this is your first time to the rodeo, we suggest reading the Beginners Guide to SEO and browsing our Learn SEO section.)

How To Rank SEO Blueprint

Table of Contents


Keyword Research

1. Working Smarter, Not Harder

Keyword research can be simple or hard, but it should always be fun. For the sake of the Blueprint, let’s do keyword research the easy way.

The biggest mistakes people make with keyword research are:

  1. Choosing keywords that are too broad
  2. Keywords with too much competition
  3. Keywords without enough traffic
  4. Keywords that don’t convert
  5. Trying to rank for one keyword at a time

The biggest mistake people make is trying to rank for a single keyword at a time. This is the hard way. It’s much easier, and much more profitable, to rank for 100s or even 1,000s of long tail keywords with the same piece of content.

Instead of ranking for a single keyword, let’s aim our project around a keyword theme.

2. Dream Your Keyword Theme

Using keyword themes solves a whole lot of problems. Instead of ranking for one Holy Grail keyword, a better goal is to rank for lots of keywords focused around a single idea. Done right, the results are amazing.

Easy Keyword Research

I assume you know enough about your business to understand what type of visitor you’re seeking and whether you’re looking for traffic, conversions, or both. Regardless, one simple rule holds true: the more specific you define your theme, the easier it is to rank.

This is basic stuff, but it bears repeating. If your topic is football, you’ll find it hard to rank for “Super Bowl,” but slightly easier to rank for “Super Bowl 2014” – and easier yet to rank for “Best Super Bowl Recipes of 2014.”

Don’t focus on specific words yet – all you need to know is your broad topic. The next step is to find the right keyword qualifiers.

3. Get Specific with Qualifiers

Qualifiers are words that add specificity to keywords and define intent. They take many different forms.

  • Time/Date: 2001, December, Morning
  • Price/Quality: Cheap, Best, Most Popular
  • Intent: Buy, Shop, Find
  • Location: Houston, Outdoors, Online

The idea is to find as many qualifiers as possible that fit your audience. Here’s where keyword tools enter the picture. You can use any keyword tool you like, but favorites include Wordstream, Keyword Spy, SpyFu, and Bing Keyword Tool and Übersuggest.
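Before you touch any keyword tool, you can expand a broad topic against your qualifier lists mechanically. A quick sketch — the topic and qualifiers are illustrative, not recommendations:

```python
from itertools import product

topic = "super bowl recipes"
quality = ["best", "easy", "cheap"]   # price/quality qualifiers
years = ["2014", "2015"]              # time/date qualifiers

# Every qualifier combination around the theme:
candidates = [f"{q} {topic} {y}" for q, y in product(quality, years)]
for phrase in candidates:
    print(phrase)
```

Feed the resulting list into Übersuggest or the AdWords Keyword Tool as the next step.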

For speed and real-world insight, Übersuggest is an all-time SEO favorite. Run a simple query and export over 100 suggested keywords pulled from Google’s own Autocomplete feature – in other words, based on actual Google searches.

Did I mention it’s free?

4. Finding Diamonds in the Google Rough

At this point you have a few dozen, or a few hundred keywords to pull into Google Adwords Keyword Tool.

Pro Tip #1: While it’s possible to run over a hundred keyword phrases at once in Google’s Keyword Tool, you get more variety if you limit your searches to 5-10 at a time.

Ubersuggest and Google Keyword Tool

Using “Exact” search types and “Local Monthly” search volume, we’re looking for 10-15 closely related keyword phrases with decent search volume, but not too much competition.

Pro Tip #2: Be careful trusting the “Competition” column in Google Adwords Keyword Tool. This refers to bids on paid search terms, not organic search.
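Once you've exported the numbers, the "decent search volume, not too much competition" filter is easy to automate. A sketch with invented figures (the real export columns from the Keyword Tool will differ):

```python
# Each row: (keyword, "Exact" local monthly searches, difficulty 0-100).
# The numbers are made up for the sketch.
rows = [
    ("super bowl recipes", 12000, 72),
    ("best super bowl recipes 2014", 880, 34),
    ("super bowl guacamole recipe", 1300, 28),
    ("super bowl sock colors", 30, 5),
]

MIN_VOLUME = 500       # enough traffic to bother with
MAX_DIFFICULTY = 45    # low-to-moderate competition

shortlist = [keyword for keyword, volume, difficulty in rows
             if volume >= MIN_VOLUME and difficulty <= MAX_DIFFICULTY]
print(shortlist)
```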

5. Get Strategic with the Competition

Now that we have a basic keyword set, you need to find out if you can actually rank for your phrases. You have two basic methods of ranking the competition:

  1. Automated tools like the Keyword Difficulty Tool
  2. Eyeballing the SERPs

If you have an SEOmoz PRO membership (or even a free trial) the Keyword Difficulty Tool calculates – on a 100 point scale – a difficulty score for each individual keyword phrase you enter.

Keyword Difficulty Tool

Keyword phrases in the 60-70+ range are typically competitive, while keywords in the 30-40 range might be considered low to moderately difficult.

To get a better idea of your own strengths, take the most competitive keyword you currently rank #1 or #2 for, and run it through the tool.

Even without automated tools, the best way to size up the competition is to eyeball the SERPs. Run a search query (non-personalized) for your keywords and ask yourself the following questions:

  • Are the first few results optimized for the keyword?
  • Is the keyword in the title tag? In the URL? On the page?
  • What’s the Page and/or Domain Authority of the URL?
  • Are the first few results authorities on the keyword subject?
  • What’s the inbound anchor text?
  • Can you deliver a higher quality resource for this keyword?

You don’t actually have to rank #1 for any of your chosen words to earn traffic, but you should be comfortable cracking the top five.
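If you want to make the eyeball checklist a little more repeatable, you can turn it into a crude score for each of the top results. The fields and weights below are invented for illustration, not a standard formula:

```python
def serp_strength(result):
    """Crude competitiveness score for one ranking page."""
    score = 0
    if result["kw_in_title"]:
        score += 2                              # optimized title tag
    if result["kw_in_url"]:
        score += 1                              # keyword in the URL
    score += result["domain_authority"] // 10   # 0-10 points for authority
    return score

top_results = [
    {"kw_in_title": True, "kw_in_url": True, "domain_authority": 85},
    {"kw_in_title": True, "kw_in_url": False, "domain_authority": 40},
]
print([serp_strength(r) for r in top_results])  # higher = harder to displace
```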

With keyword themes, the magic often happens from keywords you never even thought about.

Case Study: Google Algo Update

When SEOmoz launched the Google Algorithm Change History (run by Dr. Pete) we used a similar process for keyword research to explore the theme “Google Algorithm” and more specifically, “Google Algorithm Change.”

According to Google’s search tool, we could expect no more than a couple thousand visits a month – best case – for these exact terms. Fortunately, because the project was well received and because we optimized around a broad keyword theme of “Google Algorithm,” the Algo Update receives lots of traffic outside our pre-defined keywords.

This is where the long tail magic happens:

Long Tail Keywords

How can you improve your chances of ranking for more long tail keywords? Let’s talk about content, architecture, on-page optimization and link building.


Content

6. Creating Value

Want to know the truth? I hate the word content. It implies words on a page, a commodity to be produced, separated from the value it creates.

Content without value is spam.

In the Google Algorithm Update example above, we could have simply written 100 articles about Google’s Algorithm and hoped to rank. Instead, the conversation started by asking how we could create a valuable resource for webmasters.

For your keyword theme, ask first how you can create value.

Value is harder to produce than mere words, but value is rewarded 100x more. Value is future proof & algorithm proof. Value builds links by itself. Value creates loyal fans.

Value takes different forms. It’s a mix of:

  1. Utility
  2. Emotional response
  3. Point of view (positive or negative)
  4. Perceived value, including fame of the author

Your content doesn’t have to include all 4 of these characteristics, but it should excel in one or more to be successful.

A study of the New York Times found key characteristics of content to be influential in making the Most Emailed list.

New York Times Most Emailed
Source: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1528077

7. Driving Your Content Vehicle

Here’s a preview: the Blueprint requires you create at least one type of link bait, so now is a good time to think about the structure of your content.

What’s the best way to deliver value given your theme? Perhaps it’s an

  • Infographic
  • Video series
  • A new tool
  • An interview series
  • Slide deck
  • How-to guide
  • Q&A
  • Webinar or simple blog post

Perhaps, it’s all of these combined.

The more ways you find to deliver your content and the more channels you take advantage of, the better off you’ll be.

Not all of your content has to go viral, but you want to create at least one “tent-pole” piece that’s better than anything else out there and you’re proud to hang your hat on.

If you need inspiration, check out Distilled's guide to Viral Linkbait or QuickSprout’s Templates for Content Creation.

8. Title – Most Important Work Goes Here

Spend two hours, minimum, writing your title.

Sound ridiculous? If you’re an experienced title writer like Rand Fishkin, you can break this rule. For the rest of us, it’s difficult to overstate the value delivered by a finely crafted title.

Write 50 titles or more before choosing one.

Study the successful titles on Inbound.org, Mashable, Wired, or your favorite publication.

Headline Formulas Work

Whatever you do, read this fantastic post by Dan Shure and the headline resources at CopyBlogger.

9. Length vs. Depth – Why it Matters

How long should your content be? A better question is: How deep should it be? Word count by itself is a terrible metric to strive for, but depth of content helps you to rank in several ways.

  1. Adds uniqueness threshold to avoid duplicate content
  2. Deeper topic exploration makes your content “about” more
  3. Quality, longer content is correlated with more links and higher rankings

I. Uniqueness

At a minimum, your content needs to meet a minimum uniqueness threshold in order for it to rank. Google reps have gone on record to say a couple sentences is sometimes sufficient, but in reality a couple hundred words is much safer.

II. Long Tail Opportunities

Here’s where the real magic happens. The deeper your content and the more in-depth you can explore a particular topic, the more your content becomes “about.”

The more your content is “about”, the more search queries it can answer well.

The more search queries you can answer well, the more traffic you can earn.

Google’s crawlers continuously read your content to determine how relevant it is to search queries. They evaluate paragraphs, subject headings, photographs and more to try to understand your page. Longer, in-depth content usually sends more relevancy signals than a couple short sentences.

III. Depth, Length, and Links

Numerous correlation studies have shown a positive relationship between rankings and number of words in a document.

“The length in HTML and the HTML within the <body> tag were the highest correlated factors, in fact with correlations of .12 they could be considered somewhat if not hugely significant.

While these factors probably are not implemented within the algorithm, they are good signs of what Google is looking for; quality content, which in many cases means long or at least sufficiently lengthy pages.”

– Mark Collier The Open Algorithm

This could be attributed to longer, quality content earning more links. John Doherty examined the relationship between the length of blog posts on SEOmoz and the number of links each post earned, and found a strong relationship.

Links based on wordcount
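If you want to test the length-to-links relationship on your own blog, a Pearson correlation over your posts is enough. The word counts and link counts below are invented; the coefficient is computed by hand so the sketch runs on any Python 3:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

word_counts = [400, 900, 1500, 2200, 3100]  # words per post
links_earned = [2, 5, 9, 11, 18]            # linking root domains per post
r = pearson(word_counts, links_earned)
print(round(r, 2))
```

A value near 1 suggests your longer posts really are earning more links; remember correlation isn’t causation.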

10. Content Qualities You Can Bank On

If you don’t focus on word count, how do you add quality “depth” to your content?

SEOs have written volumes about how Google might define quality including metrics such as reading level, grammar, spelling, and even Author Rank. Most is speculation, but it’s clear Google does use guidelines to separate good content from bad.

My favorite source for clues comes from the set of questions Google published shortly after the first Panda update. Here are a few of my favorites.

Google Panda Questions

11. LDA, nTopic, and Words on the Page

Google is a machine. It can’t yet understand your page like a human can, but it’s getting close.

Search engines use sophisticated algorithms to model your sentences, paragraphs, blocks, and content sections. Not only do they want to understand your keywords, but also your topic, intent, and expertise as well.

How do you know if your content fits Google’s model of expectations?

For example, if your topic is “Super Bowl Recipes,” Google might expect to see content about grilling, appetizers, and guacamole. Content that addresses these topics will likely rank higher than pages that talk about what color socks you’re wearing today.

Words matter.

SEOs have discovered that using certain words around a topic associated with concepts like LDA and nTopic are correlated with higher rankings.

Virante offers an interesting standalone keyword suggestion tool called nTopic. The tool analyzes your keywords and suggests related keywords to improve your relevancy scores.

nTopic

12. Better than LDA – Poor Man's Topic Modeling

Since we don’t have access to Google’s computers for topic modeling, there’s a far simpler way to structure your content that I find far superior to worrying about individual words:

Use the keyword themes you created at the beginning of this blueprint.

You’ve already done the research using Google’s keyword tool to find closely related keyword groups. Incorporating these topics into your content may help increase your relevancy to your given topic.

Example: Using the Google Algorithm project cited above, we found during keyword research that certain keywords related to our theme show up repeatedly, time and time again. If we conducted this research today, we would find phrases like “Penguin SEO” and “Panda Updates” frequently in our results.

Google suggests these terms via the keyword tool because they consider them closely related. So any content that explored “Google Algorithm Change” might likely include a discussion of these ideas.

Poor Man's Topic Modeling

Note: This isn't real LDA, simply a way of adding relevant topics to your content that Google might associate with your subject matter.
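In practice, the "poor man's" version boils down to checking which of your researched theme terms actually appear in a draft. A sketch — the term list would come from your own keyword research:

```python
# Terms that keyword research surfaced around the "Google Algorithm" theme:
theme_terms = ["panda update", "penguin seo", "ranking factors",
               "algorithm change", "manual penalty"]

draft = """Google's latest algorithm change follows the Panda update
pattern, and recovering from a manual penalty still takes months."""

text = draft.lower()
covered = [term for term in theme_terms if term in text]
missing = [term for term in theme_terms if term not in text]
print("covered:", covered)
print("consider working in:", missing)
```

The "missing" list isn't a mandate to stuff keywords; it's a prompt to ask whether those subtopics deserve a paragraph.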

13. Design Is 50% of the Battle

If you have any money in your budget, spend it on design. A small investment with a designer typically pays outsized dividends down the road. Good design can:

  • Lower bounce rate
  • Increase page views
  • Increase time on site
  • Earn more links
  • Establish trust

… All of which can help earn higher rankings.

“Design doesn’t just matter, it’s 50% of the battle.”
-Rand Fishkin

Dribbble.com

Dribbble.com is one of our favorite sources of design inspiration.


Architecture

Here’s the special secret of the SEO Blueprint: you’re not making a single page to rank; you’re making several.

14. Content Hubs

Very few successful websites consist of a single page. Google determines context and relevancy not only by what’s on your page, but also by the pages around it and linking to it.

The truth is, it’s far easier to rank when you create Content Hubs exploring several topics in depth focused around a central theme.

Using our “Super Bowl Recipes” example, we might create a complete section of pages, each exploring a different recipe in depth.

Content Hub for SEO

15. Linking the Hub Together

Because your pages now explore different aspects of the same broad topic, it makes sense to link them together.

  • Your page about guacamole relates to your page about nachos.
  • Your page about link building relates to your page about infographics.
  • Your page about Winston Churchill relates to major figures of World War II.

Linking Your Content Hub

It also helps them to rank by distributing PageRank, anchor text, and other relevancy signals.

16. Find Your Center

Content Hubs work best with a “hub” or center. Think of the center as the master document that acts as an overview or gateway to all of your individual content pages.

The hub is the authority page. Often, the hub is a link bait page or a category-level page. It’s typically the page with the most inbound links and often serves as a landing page for other sections of your site.

Center of the SEO  Content Hub

For a great example of Hub Pages, check out:


On-Page Optimization

17. Master the Basics

You could write an entire book about on-page optimization. If you’re new to SEO, one of the best ways to learn is by using SEOmoz’s On-Page Report Card (free, registration required). The tool grades 36 separate on-page SEO elements, then gives you a report with suggestions on how to fix each one. Working your way through these issues is an excellent way to learn (and is often used by agencies and companies as a way to teach SEO principles).

On-Page Tool

Beyond the basics, let’s address a few slightly more advanced tactics to take advantage of your unique keyword themes and hub pages, in addition to areas where beginners often make mistakes.

18. Linking Internally for the Reasonable Surfer

Not all links are created equal (one of the greatest SEO blog posts ever written!). So when you interlink the pages within your content hub, keep in mind a few important points.

  1. Links from inside unique content pass more value than navigation links.
  2. Links higher up the page pass more value than links further down.
  3. Links in HTML text pass more weight than image links.

When interlinking your content, it’s best to keep links prominent and “editorial” – naturally link to your most important content pages higher up in the HTML text.
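As a sketch, here is what the difference between an editorial in-content link and a navigation link looks like in markup (the URLs and page names are hypothetical):

```html
<!-- Editorial link: inside unique body content, high on the page.
     Under the reasonable surfer model, this passes more value. -->
<p>
  Our <a href="/recipes/guacamole">game-day guacamole recipe</a>
  pairs perfectly with these nachos.
</p>

<!-- Navigation link: boilerplate sidebar/footer markup.
     Still crawlable, but typically passes less value. -->
<nav>
  <ul>
    <li><a href="/recipes/">All Recipes</a></li>
  </ul>
</nav>
```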

19. Diversify Your Anchor Text – Naturally

If Google’s Penguin update taught us anything, it’s that over-thinking anchor text is bound to get us in trouble.

When you link naturally and editorially to other places on the web, you naturally diversify your anchor text. The same should hold true when you link internally.

Don’t choose your anchor text to fit your keywords; choose your anchor text to fit the content around it.

Practically speaking, this means linking internally with a mix of partial-match keywords and related phrases. Don’t be scared to link occasionally without good keywords in the anchor; the link can still pass relevancy signals. When it comes to linking, it’s safer to under-do it than over-do it.

Choose Descriptive Anchor Text

Source: Google's SEO Starter Guide

20. Title Tags – Two Quick Tips

We assume you know how to write a compelling title tag. Even today, keyword usage in the title tag is one of the most highly correlated on-page ranking factors that we know.

That said, Google is getting strict about over-optimizing title tags, and appears to be further cracking down on titles “written for SEO.” Keep this in mind when crafting your title tags.

I. Avoid Boilerplates

It used to be common to tack on your business phrase or main keywords to the end of every title tag, like so:

  • Plumbing Supplies – Chicago Plumbing and Fixtures
  • Pipes & Fittings – Chicago Plumbing and Fixtures
  • Toilet Seat Covers – Chicago Plumbing and Fixtures

While we don’t have much solid data, many SEOs now assert that “boilerplate” titles tacked onto the end of every tag are no longer a good idea. Brand names and unique descriptive information are okay, but making every title as unique as possible is the rule of the day.

II. Avoid Unnecessary Repetition

Google also appears (at least to many SEOs) to be cracking down on what’s considered the lower threshold of “keyword stuffing.”

In years past, a common rule of thumb was never to repeat your keyword more than twice in the title. Today, to be on the safe side, consider not repeating your keywords more than once.
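To illustrate both tips, here's a hypothetical before-and-after (the business and keywords are made up):

```html
<!-- Before: boilerplate suffix plus a repeated keyword -->
<title>Plumbing Supplies - Cheap Plumbing Supplies | Chicago Plumbing and Fixtures</title>

<!-- After: unique, descriptive, keyword used once -->
<title>Plumbing Supplies for DIY Bathroom Repairs | Chicago Plumbing</title>
```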

21. Over-Optimization: Titles, URLs, and Links

Writing for humans not only gets you more clicks (which can lead to higher rankings), but hardly ever gets you in trouble with search engines.

As SEOs, we're often tempted to go for a "perfect score," which means exactly matching our title tags, URLs, inbound anchor text, and more. Unfortunately, this isn't natural in the real world, and Google recognizes it.

Diversify. Don’t over-optimize.

22. Structured Data

Short and simple: make structured data part of every webpage. While structured data hasn’t yet proven to be a large ranking factor, its future-facing value can be seen today in rich snippet SERPs and social media sharing. In some verticals, it’s an absolute necessity.

rich snippets

There’s no rule of thumb about what structured data to include, but the essentials are:

  • Facebook Open Graph tags
  • Twitter Cards
  • Authorship
  • Publisher
  • Business information
  • Reviews
  • Events

To be honest, if you’re not creating pages with structured data, you’re probably behind the times.

For an excellent guide to microdata and Schema.org, check out this fantastic resource from SEOGadget.
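As a sketch, the first two essentials above, Open Graph tags and Twitter Cards, are simply meta tags in the page head (the titles and URLs here are placeholders):

```html
<head>
  <!-- Facebook Open Graph tags -->
  <meta property="og:type" content="article">
  <meta property="og:title" content="10 Super Bowl Guacamole Recipes">
  <meta property="og:url" content="http://example.com/recipes/guacamole">
  <meta property="og:image" content="http://example.com/img/guacamole.jpg">

  <!-- Twitter Card tags -->
  <meta name="twitter:card" content="summary">
  <meta name="twitter:title" content="10 Super Bowl Guacamole Recipes">
</head>
```

The remaining items on the list (authorship, business information, reviews, events) follow the same pattern with Schema.org vocabularies in the page body.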


Building Links

23. The 90/10 Rule of Link Building

This blueprint contains 25 steps to rank your content, but only the last three address link building. Why so few? Because 90% of your effort should go into creating great content, and 10% into link building.

If you have a hard time building links, it may be because you have these numbers reversed.

Creating great content first solves a ton of problems down the line:

  1. Good content makes link building easier
  2. It attracts higher-quality links in less time
  3. It builds links on its own, even while you're asleep or on vacation

If you’re new to marketing or relatively unknown, you may need to spend more than 10% of your time building relationships, but don’t let that distract you from crafting the type of content that folks find so valuable they link to you without you even asking.

90-10 Rule of Link Building

24. All Link Building is Relationships – Good & Bad

This blueprint doesn't go into link building specifics, as there are hundreds of ways to build quality links for every good project. That said, here are a few of my must-have link building resources:

  1. Jon Cooper's Complete List of Link Building Strategies
  2. StumbleUpon Paid Discovery
  3. Citation Labs
  4. Promoted Tweets
  5. Ontolo
  6. eReleases – Press releases not for links, but for exposure
  7. BuzzStream
  8. Paddy Moogan's excellent Link Building Book

These resources give you the basic tools and tactics for a successful link building campaign, but keep in mind that all good link building is relationship building.

Successful link builders understand this and foster each relationship and connection. Even a simple outreach letter can be elevated to an advanced form of relationship building with a little effort, as this Whiteboard Friday by Rand so graciously illustrates.
25. Tier Your Link Building… Forever

The truth is, for professionals, link building never ends. Each content and link building campaign layers on top of previous campaigns, building up the web as a whole like layers of fine Greek baklava.

For example, this post could be considered linkbait for SEOmoz, but it also links generously to several other content pieces within the Moz family, and externally as well, spreading both the link love and the relationship building as far as possible at the same time.

SEOmoz links generously to other sites: the link building experience is not just about search engines, but the people experience, as well. We link to great resources, and build links for the best user experience possible. When done right, the search engines reward exactly this type of experience with higher rankings.

For an excellent explanation of why you should link out to external sites when warranted, read AJ Kohn's excellent work, Time to Long Click.

One of my favorite posts on SEOmoz was 10 Ugly SEO Tools that Actually Rock. Not only was the first link on the page directed to our own SEO tools, but we linked to and praised our competitors as well.

Linkbait at its finest.



SEOmoz Daily SEO Blog

How to Create a Thank You Page Call To Action


How to Create a Thank You Page Call To Action was originally published on BruceClay.com, home of expert search engine optimization tips.

This is a two-part series on Thank You page conversion rate optimization.

Part 1: How to Create a Thank You Page that Engages and Converts

Part 2: How to Create a Thank You Page Call To Action

If you read the first installment in my two-part How to Create a Thank You Page That Engages and Converts series you know that Thank You pages are the pages that leads (aka, potential customers) are directed to after they complete a designated task.

Emergen-C Thank You Page Example

This Emergen-C Thank You page is spot-on with a personality-rich message, social share opt-in buttons, plus several call to action links that direct leads to browse products, download coupons, explore retail locations, and share their own story.

And you also understand that by taking the action that led to the Thank You page your customer, or potential customer, has basically tapped you on the shoulder to let you know they are interested in your product or service.

Now what?

Now is your chance to make an impression and inspire action that keeps your lead engaged. Now is your chance to funnel traffic to the pages you want them to see, your chance to use persona information to offer exactly the right offer at the right time, and your chance to bridge the gap between you and your leads with social media opt-ins that keep you connected long after they’ve left the site and forgotten about your form.

Now it’s all about the call to action.

The first post in this series established the foundation for creating a Thank You page that converts. Taking that foundation to the next level, in this post I use an “if this, then that” format to show you how to use your goals (the decisions your business has made about what it wants to get out of the page) to craft compelling calls to action that encourage leads to take the next steps you want them to take.

 

Creating Goal-Based Thank You Page Calls To Action

Again, which call to action you choose will depend 100 percent on your unique brand goals. In my opinion, the best “best practice” for Thank You page calls to action is to be thoughtful up front about what you want to accomplish, and then purposeful about how you direct your leads’ attention toward actions that help accomplish these goals.

That said, without further ado, here are four example goals, and correlating calls to action you might consider for each scenario.

Content Marketing Institute Thank You Page Example

Here, the Content Marketing Institute says thank you for signing up for their email list with an ebook download and keeps you engaged with four links to read more popular content.

Goal Scenario One: Keep them on your website and engaged with content right now.

Call to Action Option A: Offer links to three content pages that you want to see have increased traffic numbers, or three of your most popular pieces of content.

When selecting your content links, you might also consider offering a variety of content that represents the interests of different market segments to help guide your persona research. For instance, if your target markets include PPC, SEO, and PR professionals, include three articles — one PPC-focused, one SEO-focused, and one PR-focused — then analyze which link gets clicked the most.

If your leads are all clicking on the PPC article and no one clicks on the PR article, you can begin to make some “people who take this action are more interested in this topic/product/etc.” correlations. For instance, “people who sign up for the newsletter are more interested in PPC than PR.” You can use this information to help you choose audience-relevant links to include on your Thank You page, and to guide your Internet marketing optimization strategy as a whole.

Advanced option: Establish several Thank You pages that each correlate to targeted entry points for a more custom page experience. For instance, a Request a Quote button on your Denver page that links to a Denver-specific Thank You page, and a Request a Quote button on your Michigan page that links to a Michigan-specific Thank You page.

This strategy will allow you to use entry-point information to create content calls to action that are based more closely on the interests of the lead and what they were thinking about when they filled out the form. For instance, if your lead just requested a quote for homeowners insurance from the Denver insurance page, you may offer them links to articles about Denver home safety or natural disaster prevention. (This is where really knowing your demographic and their needs can be incredibly beneficial. If you own an insurance company that focuses on the city of Denver, you should have a better idea of their needs and interests than I do…)
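As a minimal sketch, entry-specific Thank You pages can be as simple as pointing each page's form (or its post-submit redirect) at a different URL; all paths and field names here are hypothetical:

```html
<!-- On the Denver landing page -->
<form action="/quote" method="post">
  <!-- hypothetical hidden field the form handler reads to pick the redirect -->
  <input type="hidden" name="thank_you_url" value="/thank-you/denver">
  <button type="submit">Request a Quote</button>
</form>

<!-- On the Michigan landing page -->
<form action="/quote" method="post">
  <input type="hidden" name="thank_you_url" value="/thank-you/michigan">
  <button type="submit">Request a Quote</button>
</form>
```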

 

Goal Scenario Two: Keep them engaged with your brand and your content offline.

Call to action: Use your persona information (i.e., the information you know about who your customer is, how they communicate, why they filled out the form, and what their needs are) to offer them a content download (usually an ebook, PDF, or a self-contained slide-deck presentation).

 

Goal Scenario Three: Encourage them to make a purchase.

Example of an Amazon.com Thank You Page

Setting the bar high, on this post-purchase Thank You page Amazon gives order details, offers social sharing options, and includes links to related products to keep you shopping and engaged.

Call to action option A: Offer them a coupon code to encourage online shopping, or a physical coupon download if you want to encourage brick-and-mortar sales. Sometimes making the coupon a limited-time offer helps motivate immediate action. For example, a 30 percent off coupon that expires in 48 hours. If you set up entry-specific Thank You pages, this is a great time to use what you already know about your lead’s interests to offer him or her the perfect deal. For instance, if they signed up for your newsletter through your snowboards page, you know they are interested in winter sports, so you can offer them a special coupon good for 50 percent off any [insert winter sport item that you want to push sales numbers up for].

Call to action option B: Offer them links to product landing pages on your website that you’d like to see increased traffic to, or take this opportunity to plug promotions, outlets, or other sales you have going on. I’ll say it again: Thank You pages are a great opportunity to funnel traffic! Not only do you get to narrow the traffic focus to three options of your choosing, but you have a better chance of seeing conversion from qualified leads who have already expressed an interest in your product.

 

Toms Thank You Page Lightbox

TOMS says thank you, extends a Keep Shopping call to action to keep you on the website, and asks you to Stay Connected with three straightforward social media opt-in buttons.

Goal Scenario Four: Extend your marketing reach to keep in touch with them after they leave the website.

Call to action: Ask them to follow you on Twitter, Like you on Facebook, sign up for your blog RSS feed, etc. Make sure to use active language and highlight specific benefits whenever possible. For instance, “Follow us on Facebook for weekly tips and tricks.” Remember never to promise anything you can’t actually deliver on. Something broad and actionable like “Connect with us on Facebook and Twitter” also works.

It is also important to keep in mind that you have to make it easy for them to take action. One-click opt-in is ideal. Many websites offer free social media buttons that can be added to your own site with plug-and-play code.
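As a sketch, even without a third-party button service, simple opt-in links are only a few lines of HTML (the account names and feed path are placeholders):

```html
<!-- One-click follow links; replace "yourbrand" with your own accounts -->
<a href="https://twitter.com/intent/follow?screen_name=yourbrand">
  Follow us on Twitter for weekly tips
</a>
<a href="https://www.facebook.com/yourbrand">
  Like us on Facebook
</a>
<a href="/feed" type="application/rss+xml">
  Subscribe to our blog RSS feed
</a>
```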

 

This sampling only represents the tip of the goal-and-call-to-action iceberg. What calls to action are working for your Thank You pages? Do you have any favorites you’d like to share?

Bruce Clay Blog
