SEO Blog

Why Google Analytics Tagging Matters – Whiteboard Friday


Posted by RachaelGerson

When Google Analytics doesn't know where a traffic source comes from, it assumes the traffic is direct and lumps it in with your direct visits. This happens frequently with social shares, as many of us make the mistake of not tagging our links accordingly.

In today's Whiteboard Friday, Rachael Gerson sheds some light on "dark social" and explains why tagging in Google Analytics improves the accuracy of your referrals. Take credit for the work that you're doing, and tag your links!


Video Transcription

"Hi, everyone. I'm Rachael Gerson. I'm the head of analytics at SEER Interactive. We're a digital marketing agency in Philadelphia, although we are growing and spreading across the world. Although we're primarily known for our SEO, we actually have an amazing paid search team and a really talented analytics team. I want to share our story with you. The timing on this story is actually really convenient because it ties with what I wanted to talk to you about.

My sister wrote a blog post last night. She has a new blog. No one ever goes to it. I think I may be the only person who knows it exists. She wrote the post. I read it this morning and went, "This is really good content. I'm going to share this." And I put it out on Twitter.

She saw me share it, and she put it on Facebook and thought, "Okay. Let's see what happens." In the last 8 hours, she's gotten 74,000 page views to this one blog post. I'm looking at the real-time traffic right now, down here. There are 1,500 people on the site. This thing is blowing up. It's going viral.

We can see it spreading through Twitter. We can see it spreading through Facebook. We can see it being referred by random sites, but we're also seeing a lot of traffic come in as direct. Since no one knows this blog exists, I highly doubt they're typing in the 40 plus characters of the URL to go directly to this page. They're not. It's being shared socially. This is the idea of dark social.

It's not a new idea, but it's a fascinating one, and that's what I wanted to talk to you about today: this idea of dark social, that content, if it's good content, spreads socially and organically.

Dark social sounds like a bad thing. It's not. It's actually really awesome and really fun to dig into. Let's say that someone read this post earlier, and they shared it on Twitter, Facebook, whatever. We kind of know where that came from for the most part. They may have texted it to a friend or copied a link and sent it in chat. In both cases, when the person clicks on the link and goes to the site, they come in as direct.

Direct is Google Analytics' version of, "We have no idea what this is, so let's call it direct and throw it in that bucket." We know it's not direct. That's our dark, organic social. It's spreading organically in all different ways, and we're getting traffic because of it. It's pretty amazing.

I wanted to talk to you about the analysis I'm doing on the dark social side because it's really fun stuff. Unfortunately, in talking to a lot of people, I found they're not there yet.

Here's the problem. Direct is our catchall bucket, and we need to look at direct to get an idea of our dark social, organic social, whatever we want to call it. But if things are not tagged properly, we can't dig into this dark social side. Actually, we can't do anything. If things aren't tagged properly, you're not taking credit for the work that you're doing.

For your paid search, for your social media, for email marketing, whatever it is, you have to tag your links. Otherwise, you're not getting credit for the work that you're doing.

You know what really sucks, by the way? When you work really hard on a project and, at the last second, your boss takes credit for it. That was your project. You did all the work for it. Why is he taking your credit? It sucks!

What we're talking about right now is the digital marketing version of that. It's the online version, where you're giving your credit away for the work that you're doing. Honestly, you need that credit to keep your budget, to keep your job, to get a promotion, to get any of these things. You need to prove your value.

When we talk about tagging, it's using UTM parameters. Dark social, organic social, that's really sexy. It's fun. We can dig into that. UTM parameters are not sexy. They're not fun, but they're necessary. If you're not doing this, you're wasting your time and you're wasting your money. Now that sucks.
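To make that concrete, here is a minimal sketch of what tagging a link involves. The parameter names (utm_source, utm_medium, utm_campaign, utm_term, utm_content) are Google Analytics' standard UTM fields; the URL and the values used here are hypothetical placeholders.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_url(url, source, medium, campaign, term=None, content=None):
    """Append Google Analytics UTM parameters to a URL."""
    params = {
        "utm_source": source,      # where the click comes from, e.g. "twitter"
        "utm_medium": medium,      # the channel type, e.g. "social"
        "utm_campaign": campaign,  # the campaign or effort name
    }
    if term:
        params["utm_term"] = term        # typically the paid search keyword
    if content:
        params["utm_content"] = content  # which variation of the link or ad

    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # preserve any existing parameters
    query.update(params)
    return urlunparse(parts._replace(query=urlencode(query)))

tagged = tag_url("https://example.com/blog/post",
                 source="twitter", medium="social", campaign="launch")
# tagged ends with ?utm_source=twitter&utm_medium=social&utm_campaign=launch
```

The same helper works for email and social plugin links; only the source, medium, and campaign values change per channel.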

How are you wasting your time? If you're not doing this, you're putting all kinds of time, hopefully, into analysis, if you're looking at what you're doing, but your analysis is based on data that's not accurate. You're putting your time into marketing efforts that may not actually be working as well as you think they are. You're putting your money into marketing efforts. You need to know that your stuff's actually working. Keep doing that. Make your well-informed decisions to help the business and drive it forward.

Again, time is money. You need to make sure you get all this stuff right, so you can do all the other stuff.

Let's talk about a few examples of where tagging actually matters. If we're looking at Twitter, if you don't tag your links, things will still come in. You'll see t.co showing up. In your real-time traffic, you'll see Twitter as social coming in, and you'll see some of that in your multi-channel funnels as well.

If you tag your links, you're going to always know it's Twitter. You're going to know which campaign it was. You're going to know all the information you put into it. You're also going to be protected from the other side of it, which is when people use Twitter apps. For example, HootSuite doesn't come in as Twitter unless you've tagged it. People clicking on an untagged link that you post through HootSuite will usually come in as a HootSuite referral.

If you posted on TweetDeck, they're coming in as direct. By the way, I'm still playing with all of this, and it all changes. I've played with stuff that's changed before. So if this is different by the time it comes out, I apologize. Just keep up with it all the time.

That's our Twitter side. On Facebook, if we don't tag our links, they'll come in as Facebook referral. It's nice and easy. It's clean. We know what it is. The exception to that is if someone's trying to open a link in Facebook, they click on the link, it doesn't load fast enough, they're probably going to click Open in Safari if they really care about it. Once they open in Safari, that's a direct visit. We just lost the Facebook tracking in it.

There's also a missing piece here: if you do tag this stuff, you get an extra level to your analysis. You can say, "This is all the same campaign. It's the same effort, same content." You can tie it together across all these different platforms, and that helps.

We get to email. If you're putting time and money into your email marketing, you want to take your credit for it. If you're not tagging your email, it's usually going to come in one of two ways: as a referral from one of the various webmail domains, or as direct.

At least with the mail referral, where it says mail.yahoo.whatever, we know it's mail. We can't track it down to what you did versus what someone else sent, but we have some analysis on it. If it's direct, you lose everything. So tag your email.

Paid search. It's nice. AdWords actually makes it really easy for us to tag our paid search. We can connect Google Analytics and AdWords very easily, and they play really well together. It's awesome. The problem is when you don't tag your stuff. If you don't tag your paid search, either through AdWords or through your manual tracking parameters on other platforms as well, it comes in as organic.

This actually happened to us at SEER. One of our SEO clients, we were watching their traffic, and organic traffic spiked. The account manager went, "Hey, guys, this is awesome." To which the client responded, "Oh, we forgot to tell you we launched paid search," and the account manager discovered they weren't tagging their paid search. This paid search manager accidentally just gave away their credit. We don't want to have that happen.

Let's say you've actually tagged everything properly in your URLs. All this is done. These are just a few examples, but all of the other stuff is taken care of. Let's look at the tracking on the site itself. We see this happen pretty often with paid search landing pages, where we have to put this on our checklist that this is done immediately.

We'll create brand new landing pages that are optimized for paid search for conversion. They're different from the rest of the site. They're a totally new template, which means that if the Google Analytics code is in a template already for the site, it may not be in here. If we don't have someone add it back in, what's going to happen is paid search will drive all this traffic to the site, they'll get to that page, go to page two. Page two has the Google Analytics code, but they don't know where it came from. This is going to show up as direct. Paid search just gave away their credit. We can't have that happen. You worked too hard for that credit.

I've also seen it where people make little mistakes with the tracking on the site. Spotify did this a few months ago, and I sent them a message to help them out with it. They were tagging all of the links on their site with UTM parameters. When visitors would hit those different links, they'd reset the visit, and it would be a new visit with each one. Spotify, all their marketers, were giving away their credit through that.

Let's say you've got all this other stuff right. Good job. That's awesome. There's still stuff that you can't control unfortunately. There are a lot of things that can cause traffic to come in as direct when it really isn't. I have a short list that people have been adding to at [bitly/direct-rome]. If you have others, keep adding them because I want to have a giant list of all the things we can tackle and fix, but the list just keeps growing.

If you look at mobile traffic, for example, iOS 6, we can't tell if it's search or if it's direct. That's a problem. For me, if I'm doing an analysis and I really need that part, or I really need to know that part for sure, I may cut that out so it's not throwing off my data. There are different ways to deal with that, and that's a whole other topic.

The point is control whatever you can. Where you control the spread of information, make sure you're doing your part. If you're sharing a link socially, tag your links. That way, if people want to share it or retweet it, the tracking is already in place there. If your posts on the site have social plugins, put the tracking in your social plugins too. It makes it easy if someone wants to hit the share on Facebook or to share on Twitter. It already has the tracking. It goes through, people get to the site, your tracking's in place, and you can breathe a sigh of relief.

Now once you've done everything else up here, your tagging is right on your URLs, your tracking is right on the site, there's nothing you messed up by accident, you've controlled everything you can with these other issues, you kind of have to accept what's left. You know that there's stuff that you can't account for. There's direct in there that may have been shared through a text, through a chat, through any other thing. You don't know where it actually came from.

First off, that gives us dark social. We can now start doing our awesome analysis, like dark social or other things, because we have confidence in our data. We can trust that we're making the right decisions for our business, and we can save our time and our money this way.

If you have questions or thoughts, hit me up on Twitter or in the comments below, because I love talking about this stuff. Maybe another time, we'll talk about this organic social idea."

Video transcription by Speechpad.com



SEOmoz Daily SEO Blog

Post-Panda: Data Driven Search Marketing


“Now is the best and exciting time to be in marketing. The new data-driven approaches and infrastructure to collect customer data are truly changing the marketing game, and there is incredible opportunity for those who act upon the new insights the data provides” – Mark Jeffrey, Kellogg School of Management

I think Jeffrey is right – now is one of the best and most exciting times to be in marketing!

It is now cheap and easy to measure marketing performance, so we are better able to spot and seize marketing opportunities. If we collect and analyze the right data, we will make better decisions, and increase the likelihood of success.

As Google makes their system harder to game using brute force tactics, the next generation of search marketing will be tightly integrated with traditional marketing metrics such as customer retention, churn, profitability, and customer lifetime value. If each visitor is going to be more expensive to acquire, then we need to make sure those visitors are worthwhile, and the more we engage visitors post-click, the more relevant our sites will appear to Google.

We’ll look at some important metrics to track and act upon.

But first….

Data-Driven Playing Field

There is another good reason why every search marketer should know about data-driven thinking, even if some search marketers choose to take a different approach.

Google is a data-driven company.

If you want to figure out what Google is going to do next, then you need to think like a Googler.
Googlers think about – and act upon – data.


Douglas Bowman, a designer at Google, left the company because he felt they placed too much reliance on data over intuition when it came to visual design decisions.

Yes, it’s true that a team at Google couldn’t decide between two blues, so they’re testing 41 shades between each blue to see which one performs better. I had a recent debate over whether a border should be 3, 4 or 5 pixels wide, and was asked to prove my case. I can’t operate in an environment like that. I’ve grown tired of debating such miniscule design decisions. There are more exciting design problems in this world to tackle

Regardless of whether you think acting on data or intuition is the right idea, if you can relate to the data-driven mindset and the company culture that results, you will better understand Google. Searcher satisfaction metrics are writ large on Google’s radar, and they will only get more refined and granular as time goes on.

The Panda update was all about user engagement. If a site does not engage users, it is less likely to rank well.

As Jim Boykin notes, Google are interested in the “long click”:

On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the “long click”. This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google had successfully fulfilled the query. But unhappy users were unhappy in their own ways. Most telling were the “short clicks” where a user followed a link and immediately returned to try again. “If people type something and then go and change their query, you could tell they aren’t happy,” says (Amit) Patel. “If they go to the next page of results, it’s a sign they’re not happy. You can use those signs that someone’s not happy with what we gave them to go back and study those cases and find places to improve search.”

In terms of brand, the more well known you are, the more some of your traffic is going to be pre-qualified. Brand awareness can lower your bounce rate, which leads to better engagement signals.

Any site is going to have some arbitrary brand-related traffic and some generic search traffic. Where a site has good brand-related searches, those searches create positive engagement metrics which lift the whole of the site. The following chart is conceptual, but it drives the point home. As more branded traffic gets folded into the mix, aggregate engagement metrics improve.

If your site and business metrics look good in terms of visitor satisfaction – i.e. people are buying what you offer and/or reading what you have to say, and recommending you to their friends – it’s highly likely your relevancy signals will look positive to Google, too. People aren’t just arriving and clicking back. They are engaging, spending time, talking about you, and returning.

Repeat visits to your site, especially from logged-in Google users with credit cards on file, are yet another signal Google can look at to see that people like, demand and value what you offer.

Post-Panda, SEO is about the behavior of visitors post-click. In order to optimize for visitor satisfaction, we need to measure their behavior post-click and adjust our offering. A model that I’ve found works well in a post-Panda environment is a data-driven approach, often used in PPC. Yes, we still have to do link building and publish relevant pages, but we also have to focus on the behavior of users once they arrive. We collect and analyze behavior data and feed it back into our publication strategy to ensure we’re giving visitors exactly what they want.

What Is Data Driven Marketing?

Data driven marketing is, as the name suggests, the collection and analysis of data to provide insights into marketing strategies.

It’s a way to measure how relevant we are to the visitor, as the more relevant we are, the more positive our engagement metrics will be. A site can constantly be adapted, based on the behavior of previous visitors, in order to be made even more relevant.

Everyone wins.

The process involves three phases. Setting up a framework to measure and analyze visitor behaviour, testing assumptions using visitor data, then optimizing content, channels and offers to maximize return. This process is used a lot in PPC.

Pre-web, this type of data used to be expensive to collect and analyse. Large companies engaged market researchers to run surveys, focus groups, and go out on the street to gather data.

These days, collecting input from consumers and adapting campaigns is as easy as firing up analytics and creating a process to observe behaviour and modify our approach based on the results. High-value data analysis and marketing can be done on small budgets.

Yet many companies still don’t do it.

And many of those that do aren’t measuring the right data. By capturing and analysing the right data, we put ourselves at a considerable advantage to most of our competitors.

In his book Data Driven Marketing, Jeffrey notes that the lower performing companies in the Fortune 500 were spending 4% less than the average on marketing, and the high performers were investing 20% more than average. Low performers focused on demand generation – sales, coupons, events – whereas high performers spent a lot more on brand and marketing infrastructure. Infrastructure includes the processes and software tools needed to capture and analyse marketing data.

So the more successful companies are spending more on tools and process than lower performing companies.

When it comes to the small/medium sized businesses, we have most of the tools we need readily available. Capturing and analyzing the right data is really about process and asking the right questions.

What Are The Right Questions?

We need a set of metrics that help us measure and optimize for visitor satisfaction.

Jeffrey identifies 15 data-analysis areas for marketers. Some of these metrics relate directly to search marketing, and some do not. However, it’s good to at least be aware of them, as these are the metrics traditional marketing managers use, so they might serve as inspiration and get us thinking about where the crossovers into search marketing lie. I recommend his book to anyone who wants a crash course in data-driven marketing and a better understanding of how marketing managers think.

  • Brand awareness
  • Test Drive
  • Churn
  • Customer satisfaction
  • Take rate
  • Profit
  • Net Present Value
  • Internal Rate Of Return
  • Payback
  • Customer Lifetime Value
  • Cost Per Click
  • Transaction Conversion Rate
  • Return On Ad Dollars Spent
  • Bounce Rate
  • Word Of Mouth (Social Media Reach)

I’ll re-define this list and focus on a few metrics we could realistically use that help us optimize sites and offers in terms of visitor engagement and satisfaction. As a bonus, we’ll likely create the right relevancy signature Google is looking for which will help us rank well. Most of these metrics come directly from PPC.

First, we need a… dashboard! Obviously, a dashboard is a place where you can see how you’re progressing, at a glance, measured over time. There are plenty of third-party offerings, or you can roll your own, but the important thing is to have one and use it. You need a means to measure where you are, and where you’re going, in terms of visitor engagement.

1. Traffic Vs Leads

Traffic is a good metric for display and brand purposes. If a site is making money based on how many people see the site, then they will be tracking traffic.

For everyone else, combining the two can provide valuable insights. If traffic has increased but the site is generating the same number of leads – or whatever your desired engagement action may be; I’ll use the term “leads” to mean any desired action – then is that traffic worthwhile? Track how many leads are closed, and this will tell you if the traffic is valuable. If the traffic is high but engagement is low, then visitors are likely clicking back, and this is not a signal Google deems favorable.

This data is also the basis for adjusting and testing the offer and copy. Does engagement increase or decrease after you’ve adjusted the copy and/or the offer?
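The comparison above reduces to a lead rate tracked over time. A minimal sketch, with invented month-over-month figures, of the pattern to watch for:

```python
def lead_conversion_rate(visits, leads):
    """Leads (or any desired action) per visit."""
    return leads / visits if visits else 0.0

# Hypothetical month-over-month figures:
last_month = lead_conversion_rate(10_000, 250)  # 2.5% of visits convert
this_month = lead_conversion_rate(15_000, 255)  # traffic up 50%, leads flat

# this_month < last_month: the extra traffic isn't engaging,
# which is exactly the click-back pattern worth investigating.
```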

2. Search Channel Vs Other Channels

Does search traffic result in more leads than, say, social media traffic? Does it result in more leads vs any other channel? If so, then there is justification to increase spending on search marketing vs other channels.

Separate marketing channels out so you can compare and contrast.

3. Channel Growth

Is the SEM channel growing, staying the same, or declining vs other channels?

Set targets and incremental milestones. Create a process to adjust copy and offers and measure the results. The more conversions to desired action, the better your relevancy signal is likely to be, and the more you’ll be rewarded.

You can get quite granular with this metric. If certain pages are generating more leads than others as the direct result of keyword clicks, then you know which keyword areas to grow and exploit in order to grow the performance of the channel as a whole. It can be difficult to isolate if visitors skip from page to page, but it can give you a good idea which entry pages and keywords kick it all off.

4. Paid Vs Organic

If a search campaign is running both PPC and SEO, then split these two sources out. Perhaps SEO produces more leads. In which case, this will justify creating more blog posts, articles, link strategies, and so on.

If PPC produces more leads, then the money may be better spent on PPC traffic, optimizing offers and landing pages, and running A/B tests. Of course, the information gleaned here can be fed into your organic strategies. If the content works well in PPC, it is likely to work well in SEO, at least in terms of engagement.

5. Call To Action

How do you know if a call to action is working? Could the call to action be worded differently? Which version of the call to action works best? In which position does it work best? Does the color of the link make a difference?

This type of testing is common in PPC, but less so in SEO. If SEO pages are optimized in this manner, then we increase the level of engagement and reduce the click-back.

6. Returning Visitor

If all your visitors are new and never return, then your broader relevance signals aren’t likely to be great.

This doesn’t mean all sites must have a high number of return visitors in order to be deemed relevant – a one-off sales site would be unlikely to have return visitors, whereas a blog would – however, if your site is in a class of sites where every other site listed is receiving return visits, then your site is likely to suffer by comparison.

Measure the number of return visitors vs new visitors. Think about ways you can keep visitors coming back, especially if you suspect that your competitors have high return visitor rates.
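One simple way to put this on a dashboard is as a returning-visitor share, trended over time. The figures here are hypothetical:

```python
def returning_share(new_visits, returning_visits):
    """Fraction of all visits that come from returning visitors."""
    total = new_visits + returning_visits
    return returning_visits / total if total else 0.0

share = returning_share(new_visits=8_000, returning_visits=2_000)
# 0.2: one visit in five is a return visit. Trend this over time,
# and compare it against the likely profile of your site's class.
```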

7. Cost Per Click/Transaction Conversion Rate/Return On Ad Dollars Spent

PPC marketers are familiar with these metrics. We pay per click (CPC) and hope the visitor converts to a desired action. We get a better idea of the effectiveness of keyword marketing when we combine this metric with transaction conversion rate (TCR) – the percentage of customers who purchase after clicking through to your website – and return on ad dollars spent (ROA).

These are good metrics for SEOs to get their heads around, too, especially when justifying SEO spend relative to other channels. For cost per click, use the going rate on AdWords and assign it to the organic keyword if you want to demonstrate value. If you’re getting visitors in at a much lower price per click, the SEO channel looks great. The cost per click in SEO is also the total cost of the SEO campaign divided by clicks over time.
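All three metrics are simple ratios, which is part of why they travel so well between PPC and SEO reporting. A sketch with invented figures for a single keyword:

```python
def cpc(total_cost, clicks):
    """Cost per click; for SEO, campaign cost over the period / organic clicks."""
    return total_cost / clicks

def tcr(transactions, clicks):
    """Transaction conversion rate: share of clicks that purchase."""
    return transactions / clicks

def roa(revenue, ad_spend):
    """Return on ad dollars spent."""
    return revenue / ad_spend

# Hypothetical keyword over one month:
print(cpc(500.0, 1000))    # 0.5  -> $0.50 per click
print(tcr(30, 1000))       # 0.03 -> 3% of clicks convert
print(roa(1500.0, 500.0))  # 3.0  -> $3 revenue per $1 spent
```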

8. Bounce Rate

Widely speculated to be an important metric post-Panda. Obviously, we want to get this rate down, Panda or not.

If you’re seeing good rankings but high bounce rates for pages, it’s likely because the page content isn’t relevant enough. It might be relevant in terms of content as far as the algorithm sees it, but not relevant in terms of visitor intent. Such a page may drift down the rankings over time as a result, and it certainly doesn’t do other areas of your business any good.

9. Word Of Mouth (Social Media Reach/Brand)

Are other people talking about you? Do they repeat your brand name? Do they do so often? If you can convince enough people to search for you based on your name, then you’ll “own” that word. Google must return your site, else they’ll be seen as lacking.

Measuring word-of-mouth used to be difficult but it’s become a lot easier, thanks to social media and the various information mining tools available. Aaron has written a lot on the impact of brand in SEO, so if this area is new to you, I’d recommend reading back through The Rise Of Brand Over Time, Big Brands and Potential Brand Signals For Panda.

10. Profit

It’s all about the bottom line.

If search marketers can demonstrate they add value to the bottom line, then they are much more likely to be retained and have budget increased. This isn’t directly related to Panda optimization, other than in the broad sense that the more profitable the business, the more likely they are keeping visitors satisfied.

Profit = revenue – cost. Does the search marketing campaign bring in more revenue than it costs to run? How will you measure and demonstrate this? Is the search marketing campaign focused on the most profitable products, or the least? Do you know which products and services are the most profitable to the business? What value does your client place on a visitor?

There is no one way of tracking this. It’s a case of being aware of the metric, then devising techniques to track it and add it to the dashboard.

11. Customer Lifetime Value

Some customers are more important than others. Some customers convert, buy the least profitable service or product, and we never hear from them again. Some buy the most profitable service or product, and return again and again.

Is the search campaign delivering more of the former, or the latter? Calculating this value can be difficult, and relies on internal systems within the company that the search marketer may not have access to, but if the company already has this information, then it can help validate the cost of search marketing campaigns and to focus campaigns on the keyword areas which offer the most return.
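When the inputs are available, one common simplified formulation (by no means the only one) discounts a recurring margin by retention. The figures here are hypothetical:

```python
def customer_lifetime_value(margin, retention_rate, discount_rate):
    """CLV = m * r / (1 + i - r): margin per period m, retention rate r,
    discount rate i. A common textbook simplification, not the only model."""
    return margin * retention_rate / (1 + discount_rate - retention_rate)

# Hypothetical: $100 margin per year, 80% retention, 10% discount rate
clv = customer_lifetime_value(100.0, 0.80, 0.10)  # roughly $266.67
```

Comparing the CLV of customers delivered by different keyword groups is one way to focus a campaign on the segments that return the most.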

Some of these metrics don’t specifically relate to ranking – they’re about marketing value – but they illustrate how some of the traditional marketing metrics and those of search marketers are starting to overlap. The metrics I’ve outlined are just some of the many we could use, and I’d be interested to hear what other metrics you’re using, and how you’re using them.

Optimizing For Visitor Experience

If you test these metrics, then analyse and optimize your content and offers based on your findings, not only will this help the bottom line, but your signature on Google, in terms of visitor relevance, is likely to look positive because of what the visitor does post-click.

When we get this right, people are engaging. They are clicking on the link, they’re staying rather than clicking back, they’re clicking on a link on the page, they’re reading other pages, they’re interacting with our forms, they’re book-marking pages or telling others about our sites on social media. These are all engagement signals, and increased engagement tends to indicate greater relevance.

This is diving deeper than a traditional SEO-led marketing approach, which until quite recently worked, even if you only operated in the search channel and put SEO at the top of the funnel. It’s not just about the new user and the first visit, it’s also about the returning visitor and their level of engagement over time. The search visitor has a value way beyond that first click and browse.

Data-driven content and offer optimization is where SEO is going.


SEO Book

How To Prevent Content Value Gouging


What are the incentives to publish high-value content to the web?

Search engines, like Google, say they want to index quality content, but provide little incentive to create and publish it. The reality is that the publishing environment is risky, relatively poorly paid in most instances, and is constantly being undermined.

The Pact

There is little point publishing web content if the cost of publishing outweighs any profit that can be derived from it.

Many publishers, who have search engines in mind, work on an assumption that if they provide content to everyone, including Google, for free, then Google should provide traffic in return. It’s not an official deal, of course. It’s unspoken.

Rightly or wrongly, that’s the “deal” as many webmasters perceive it.

What Actually Happens

Search engines take your information and, if your information is judged sufficiently worthy that day, as the result of an ever-changing, obscure digital editorial mechanism known only to themselves, they will rank you highly, and you’ll receive traffic in return for your efforts.

That may all change tomorrow, of course.

What might also happen is that they could grab your information, amalgamate it, rank you further down the page, and use your information to keep visitors on their own properties.

Look at the case of Trip Advisor. Trip Advisor, frustrated with Google’s use of its travel and review data, filed a competition complaint against Google in 2012.

The company said: “We hope that the commission takes prompt corrective action to ensure a healthy and competitive online environment that will foster innovation across the internet.”

The commission has been investigating more than a dozen complaints against Google from rivals, including Microsoft, since November 2010, looking at claims that it discriminates against other services in its search results and manipulates them to promote its own products.

TripAdvisor’s hotel and restaurant review site competes with Google Places, which provides reviews and listings of local businesses. “We continue to see them putting Google Places results higher in the search results – higher on the page than other natural search results,” said Adam Medros, TripAdvisor’s vice president for product, in February. “What we are constantly vigilant about is that Google treats relevant content fairly.”

Similarly, newspapers have taken aim at Google and other search engines for aggregating their content, and deriving value from that aggregation, but the newspapers claim they aren’t making enough to cover the cost of producing that content in the first place:

In 2009 Rupert Murdoch called Google and other search engines “content kleptomaniacs”. Now cash-strapped newspapers want to put legal pressure on what they see as parasitical news aggregators.

Of course, it’s not entirely the fault of search engines that newspapers are in decline. Their own aggregation model – bundling news, sport, lifestyle and classifieds into one “place” – has been surpassed.

Search engines often change their stance without warning, or can be cryptic about their intentions, often to the detriment of content creators. For example, Google has stated they see ads as helpful, useful and informative:

In his argument, Cutts said, “We actually think our ads can be as helpful as the search results in some cases. And no, that’s not a new attitude.”

And again:

we firmly believe that ads can provide useful information

And again:

In entering the advertising market, Google tested our belief that highly relevant advertising can be as useful as search results or other forms of content

However, business models built around the ads-as-content idea, such as Suite101.com, got hammered. Google could argue these sites went too far, and that it is merely asserting editorial control, and that may be true, but such cases highlight the flaky and precarious nature of the search ecosystem as far as publishers are concerned. One day, what you’re doing is seemingly “good”; the next day it is “evil”. Punishment is swift and without trial.

Thom Yorke sums it up well:

In the days before we meet, he has been watching a box set of Adam Curtis’s BBC series, All Watched Over by Machines of Loving Grace, about the implications of our digitised future, so the arguments are fresh in his head. “We were so into the net around the time of Kid A,” he says. “Really thought it might be an amazing way of connecting and communicating. And then very quickly we started having meetings where people started talking about what we did as ‘content’. They would show us letters from big media companies offering us millions in some mobile phone deal or whatever it was, and they would say all they need is some content. I was like, what is this ‘content’ which you describe? Just a filling of time and space with stuff, emotion, so you can sell it?”

Having thought they were subverting the corporate music industry with In Rainbows, he now fears they were inadvertently playing into the hands of Apple and Google and the rest. “They have to keep commodifying things to keep the share price up, but in doing so they have made all content, including music and newspapers, worthless, in order to make their billions. And this is what we want? I still think it will be undermined in some way. It doesn’t make sense to me. Anyway, All Watched Over by Machines of Loving Grace. The commodification of human relationships through social networks. Amazing!”

There is no question the value of content is being depreciated by big aggregation companies. The overhead of creating well-researched, thoughtful content is the same whether search engines value it or not. And if they do value it, a lot of the value of that content has shifted to the networks, distributors and aggregators, and away from the creators.

Facebook’s value is based entirely on the network itself. Almost all of Google’s value is based on scraping and aggregating free content and placing advertising next to it. Little of this value gets distributed back to the creator, unless they take further, deliberate steps to try and capture some back.

In such a precarious environment, what incentive does the publisher have to invest and publish to the “free” web?

Content Deals

Google lives or dies on the relevancy of the information they provide to visitors. Without a steady supply of “free” information from third parties, they don’t have a business.

Of course, this information isn’t free to create. So if search engines do not provide you profitable traffic, then why allow search engines to crawl your pages? They cost you money in terms of bandwidth and may extract, and then re-purpose, the value you created to suit their own objectives.

Google has done content-related deals in the past. They did one in France in February whereby Google agreed to help publishers develop their digital units:

Under the deal, Google agreed to set up a fund, worth 60 million euros, or $80 million, over three years, to help publishers develop their digital units. The two sides also pledged to deepen business ties, using Google’s online tools, in an effort to generate more online revenue for the publishers, who have struggled to counteract dwindling print revenue.

This seems to fit with Google’s algorithmic emphasis on major web properties, seemingly as a means to sift the “noise in the channel”. Such positioning favors big, established content providers.

It may have also been a forced move as Google would have wanted to avoid a protracted battle with European regulators. Whatever the case, Google doesn’t do content deals with small publishers and it could be said they are increasingly marginalizing them due to algorithm shifts that appear to favor larger web publishers over small players.

Don’t Be Evil To Whom?

Google’s infamous catch-phrase is “Don’t Be Evil”. In the documentary “Inside Google”, Eric Schmidt initially thought the phrase was a joke. Soon after, he realized they took it seriously.

The problem with such a phrase is that it implies Google is a benevolent moral actor that cares about… what? You – the webmaster?

Sure.

“Don’t Be Evil” is typically used by Google in reference to users, not webmasters. In practice, it’s not even a question of morality, it’s a question of who to favor. Someone is going to lose, and if you’re a small webmaster with little clout, it’s likely to be you.

For example, Google appears to be kicking a lot of people out of AdSense and, as many webmasters are reporting, often acts as judge, jury and executioner, without recourse. That’s a very strange way of treating business “partners”, unless partnership has some new definition of which I’m unaware.

It’s telling when even their own previously supportive ex-employees switch to damning their behavior:

But I think Google as an organization has moved on; they’re focussed now on market position, not making the world better. Which makes me sad. Google is too powerful, too arrogant, too entrenched to be worth our love. Let them defend themselves, I’d rather devote my emotional energy to the upstarts and startups. They deserve our passion.

Some may call such behavior a long way from “good” on the “good” vs “evil” spectrum.

How To Protect Value

Bottom line: if your business model involves creating valuable content, you’re going to need a strategy to protect it and claw value back from aggregators and networks in order for a content model to be sustainable.

Some argue that if you don’t like Google, then block them using robots.txt. This is one option, but there’s no doubt Google still provides some value – it’s just a matter of deciding where to draw the line on how much value to give away.

What Google offers is potential visitor attention. We need to acquire and hold enough visitor attention before we switch visitors to the desired action. An obvious way to do this, of course, is to provide free, attention-grabbing content that offers some value, then lock the high-value content away behind a paywall. Be careful about page length. As HubPages CEO Paul Edmondson points out:

Longer, richer pages are more expensive to create, but our data shows that as the quality of a page increases, its effective revenue decreases. There will have to be a pretty significant shift in traffic to higher quality pages to make them financially viable to create

You should also consider giving the search engines summaries or the first section of an article, but block them from the rest.
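As a minimal sketch of that split: assuming a hypothetical site layout where free teasers live under `/summaries/` and full articles under `/full/`, a robots.txt along these lines keeps compliant crawlers to the free section:

```
# Hypothetical paths: teaser pages are crawlable, full articles are not.
User-agent: *
Disallow: /full/
Allow: /summaries/
```

A per-page alternative is a `<meta name="robots" content="noindex">` tag on the full-article pages. Note that robots.txt only requests that well-behaved crawlers stay out; it doesn’t enforce anything.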

Even if you decide to block search engines from indexing your content they still might pay others to re-purpose it:

I know a little bit about this because in January I was invited to a meeting at the A.P.’s headquarters with about two dozen other publishers, most of them from the print world, to discuss the formation of the consortium. TechCrunch has not joined at this time. Ironically, neither has the A.P., which has apparently decided to go its own way and fight the encroachments of the Web more aggressively (although, to my knowledge, it still uses Attributor’s technology). But at that meeting, which was organized by Attributor, a couple slides were shown that really brought home the point to everyone in the room. One showed a series of bar graphs estimating how much ad revenues splogs were making simply from the feeds of everyone in the room. (Note that this was just for sites taking extensive copies of articles, not simply quoting). The numbers ranged from $13 million (assuming a $.25 effective CPM) to $51 million (assuming a $1.00 eCPM)
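The range in that quote is straight eCPM arithmetic: revenue is impressions divided by 1,000, multiplied by the eCPM. A quick sketch, where the impression count is an assumption back-solved from the quoted figures rather than a number from the slides:

```python
def ad_revenue(impressions, ecpm):
    """Estimated ad revenue: eCPM is earnings per 1,000 impressions."""
    return impressions / 1000 * ecpm

# ~52 billion scraped-feed impressions (illustrative assumption)
impressions = 52_000_000_000

low = ad_revenue(impressions, 0.25)   # pessimistic $.25 eCPM -> $13M
high = ad_revenue(impressions, 1.00)  # optimistic $1.00 eCPM -> $52M
```

The same impression base gives roughly the quoted span; the slide’s exact figures presumably used slightly different impression estimates per scenario.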

You still end up facing the cost of policing “content re-purposing” – just one of the many costs publishers face when publishing on the web, and just one more area where the network is sucking out value.

Use multiple channels so you’re not reliant on one traffic provider. You might segment your approach by providing some value to one channel, and some value to another, but not all of it to both. This is not to say models entirely reliant on Google won’t work, but if you do rely on a constant supply of new visitors via Google, and if you don’t have the luxury of having sufficient brand reputation, then consider running multiple sites that use different optimization strategies so that the inevitable algorithm changes won’t take you out entirely. It’s a mistake to think Google cares deeply about your business.

Treat every new visitor as gold. Look for ways to lock visitors in so you aren’t reliant on Google in future for a constant stream of new traffic. Encourage bookmarking, email sign-ups, memberships, rewards – whatever it takes to keep them. Encourage people to talk about you across other media, such as social media. Look for ways to turn visitors into broadcasters.

Adopt a business model that leverages off your content. Many consultants write business books. They make some money from the books, but the books mainly serve as advertisements for their services or speaking engagements. Similarly, would you be better creating a book and publishing it on Amazon than publishing too much content to the web?

Business models focused on getting Google traffic and then monetizing that attention using advertising only work if the advertising revenue covers production cost. Some sites make a lot of money this way, but big-money content sites are in the minority. Given the low return of a lot of web advertising, other webmasters opt for cheap content production. But cheap content isn’t likely to get the attention required these days, unless you happen to be Wikipedia.

Perhaps a better approach for those starting out is to focus on building brand / engagement / awareness / publicity / non-search distribution. As Aaron points out:

…the sorts of things that PR folks & brand managers focus on. The reason being is that if you have those things…

  • the incremental distribution helps subsidize the content creation & marketing costs
  • many of the links happen automatically (such that you don’t need to spend as much on links & if/when you massage some other stuff in, it is mixed against a broader base of stuff)
  • that incremental distribution provides leverage in terms of upstream product suppliers (eg: pricing leverage) or who you are able to partner with & how (think about Mint.com co-marketing with someone or the WhiteHouse doing a presentation with CreditCards.com … in addition to celebrity stuff & such … or think of all the ways Amazon can sell things: rentals, digital, physical, discounts via sites like Woot, higher margin high fashion on sites like Zappos, etc etc etc)
  • as Google folds usage data & new signals in, you win
  • as Google tracks users more aggressively (Android + Chrome + Kansas City ISP), you win
  • if/when/as Google eventually puts some weight on social you win
  • people are more likely to buy since they already know/trust you
  • if anyone in your industry has a mobile app that is widely used & you are the lead site in the category you could either buy them out or be that app maker to gain further distribution
  • Google engineers are less likely to curb you knowing that you have an audience of rabid fans & they are more likely to consider your view if you can mobilize that audience against “unjust editorial actions”

A lot of the most valuable content on this site is locked-up. We’d love to open this content up, but there is currently no model that sufficiently rewards publishers for doing so. This is the case across the web, and it’s the reason the most valuable content is not in Google.

It’s not in Google because Google, and the other search engines, don’t pay.

Fair? Unfair? Is there a better way? How can content providers – particularly newcomers – grow and prosper in such an environment?


The Rise Of Paywalls


“Information wants to be free” was a phrase coined by Stewart Brand, a counter-culture figure and publisher of the Whole Earth Catalog.

This was the context of the quote:

On the one hand information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other

Brand talks about distribution cost, but not the production cost. Whatever our views on information freedom, I think everyone can agree that those who create information need to pay their bills. If creating information is how someone makes their living, then information must make an adequate return.

Information production is not free.

The distribution cost has been driven down to near zero on the internet, but it is the distributors, not content creators, who make most of the money. “Information wants to be free”, far from being an anti-corporate battle-cry, suits the business model of fat mega-corporations, like Google, who make money bundling “free” content and running advertising next to it. In this environment, the content creator can often struggle to make a satisfactory return.

So, content creators have been experimenting with models that reject the notion information must be free. One of these models involves the paywall, which we’ll examine today.

Content Disappearing Behind The Wall

More than 300 US dailies now have paywalls, and that number is growing. Big players, like the New York Times and the Financial Times, have reported increasing paid subscription numbers for their online content:

The FT reported that it has breached the 250,000 subscriber mark, having grown digital subscriptions 30% during the last year. The FT charges about $390 for an annual subscription to its website, which would indicate total digital subscription revenues of nearly $100 million if everyone was paying the full annual price. However, the actual total is almost certainly lower than that, since print subscribers pay a discounted fee and not all subscriptions are annual. However, the performance is still impressive. The FT said 100,000 of those subscriptions are from corporations

Their paywall experiment appears to be paying off. However, critics are quick to point out that those newspapers enjoy an established reputation, and that lesser-known media outlets might have trouble emulating such success.

Certainly, this seems to be the case for the Rupert Murdoch-owned The Daily, which went belly-up due to poor subscription numbers:

The Daily, a boldly innovative publication – in the platform sense – is over. It’s never pleasant to see a newspaper of any form go under. However, there are lessons to be made from its birth, growth, and eventual demise that have wide implications for the content industry that are worth discussing. Here’s the raw truth: The Daily lost too much money and didn’t have a clear path to profitability, or something close to it. News Corp stated this succinctly, saying that the paper’s key problem was that it “could not find a large enough audience quickly enough to convince us the business model was sustainable in the long-term.

Even with the clout of News Corporation behind it, The Daily folded in less than two years. It was reportedly losing an estimated $30 million annually.

But was size part of its problem? Did that paywall model fail due to high overhead and the relative inflexibility of a traditional media operation? Perhaps success involves leveraging off an existing reputation, innovating and running a tight ship?

Some smaller media start-ups have opted for the paywall approach. Well-known blogger Andrew Sullivan left the Daily Dish and went solo, offering readers a subscription based website.

Was this a big risk? Could Sullivan really make subscription content pay when the well-resourced Daily failed?

Sullivan did $333K in 24 hours.

“Basically, we’ve gotten a third of a million dollars in 24 hours, with close to 12,000 paid subscribers [at last count],” Sullivan wrote today. “On average, readers paid almost $8 more than we asked for. To say we’re thrilled would obscure the depth of our gratitude and relief.”

Sullivan doesn’t have the overhead of The Daily, so his break-even point is significantly lower. It looks like Sullivan may have hit on a model that works for him.

Another small media outfit, The Magazine, run by Marco Arment, started as an “iOS Newsstand publication for geeks”. Arment was known to his audience as the lead developer of Tumblr and the developer of Instapaper.

The Magazine publishes four articles every two weeks for $1.99 per month, with a 7-day free trial. It started off as an app for the iPad but has since migrated to the web, albeit behind a paywall.

There’s room for another category between individuals and major publishers, and that’s where The Magazine sits. It’s a multi-author, truly modern digital magazine that can appeal to an audience bigger than a niche but smaller than the readership of The New York Times. This is what a modern magazine can be, not a 300 MB stack of static page images laid out manually by 100 people. The Magazine supports writers in the most basic, conventional way that, in the modern web context, actually seems least conventional and riskiest: by paying them to write. Since I’m keeping production costs low, I’m able to pay writers reasonably today, and very competitively with high-end print magazines in the future if The Magazine gets enough subscribers. A risk, but I’m confident. Here goes

So how’s this niche publication doing?

Arment walked me through the numbers. He has 25,000 subscribers who pay $1.99 a month. Apple takes a 30 percent cut, leaving Arment about $35,000 a month. His cost of putting out the magazine is a bit over $20,000 per month. It comes out every two weeks, and each issue costs about $10,000. Roughly $4,000 goes to writers. The rest goes mostly to copy editors, illustrators, photographers and editors
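Those numbers hang together. As a quick sanity check, with the 30 percent store cut and per-issue costs taken from the quote above:

```python
def monthly_net(subscribers, price, store_cut=0.30):
    """Subscription revenue left after the app store's percentage cut."""
    return subscribers * price * (1 - store_cut)

net = monthly_net(25_000, 1.99)   # ~ $34,825, i.e. "about $35,000 a month"
costs = 2 * 10_000                # two issues a month at ~$10,000 each
margin = net - costs              # ~ $14,825/month before other overhead
```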

Then there is Paul Carr, ex-TechCrunch journalist, who started NSFW Corporation, a web publication that has, up until recently, sat entirely behind a paywall. It’s a general interest and humor site that, by Paul’s own admission, doesn’t need a ton of readers – just enough readers prepared to pay $3 a month for access. He figures if he gets 30K paying subscribers, that’s enough to break even.

Interestingly, he’s announced that they are diversifying into print. He claims NSFW will be profitable by the end of the year:

They’ll curse at SEO-driven headlines and at a public unwilling to pay even a few dollars for journalism that costs many thousand times that to produce. … Rather than mourning the loss of long-form investigative pieces, we’re combining an online subscription model with ebooks and even print to make that kind of journalism profitable again. Instead of resorting to cheap tricks to jack up page views to sell another million belly fat ads, we’re inventing sponsorship products that provide more value to sponsors as editorial quality (not quantity) increases…

It’s probably too early to draw many firm conclusions on the paywall experiment, although it’s clear that some operators are making it work.

News is a difficult form of content to monetize on the web. It’s ephemeral, time-sensitive and ultimately disposable. However, if you’re providing educational and consultancy content, then it should be easier. If you do publish this type of content, how much of it should you be giving away? And if you do, what return are you getting back? Do you have a way to measure it?

The answers will be different for everyone, but they are interesting questions to consider. Many publishers are making paywalls work. And these people are making money from web content without exclusively pandering to flaky search engines in the hope some traffic may come their way.

The free content in exchange for free traffic “deal” is simply no longer worthwhile for many publishers.

Paywalls Are Hard

Paywalls are difficult to get right.

When “The Magazine” launched, it placed too much content behind the paywall, in the form of an app, meaning people couldn’t link to it. This meant the conversation was happening elsewhere.

I hastily built a basic site while I was waiting for the app to be approved. I only needed it to do two things: send people to the App Store, and show something at the sharing URLs for each article. Since The Magazine had no ads, and people could only subscribe in the app, I figured there was no reason to show full article text on the site — it could only lose money and dilute the value of subscribing. That was the biggest mistake I’ve made with The Magazine to date

The Magazine is now offering one free article view per month. The casual reader can still assess the value, and conversation and interaction can still happen, whilst most of the valuable content sits behind a paywall, helping ensure content creators get paid.

Taking a different approach, The Times of London erected a “Berlin Wall”, locking content inside a fortress. How did that work out?

Not so well.

While the Times once had 10m monthly unique visitors, figures in September show that it has only managed to attract 100,000 digital-only subscribers, although print subscribers are able to access the site as well. As a result, Murdoch was recently forced to capitulate and allow Google and other search engines partial access to his content

When it comes to paywalls, mixed models appear to work best. Some content needs to appear where everyone can see it. Some content needs to appear in search engines and social media. The question is how much, and via what channel?

Some sites use a free-on-the-web model whilst charging for mobile access. Others use a freemium model where some content is free in order to entice people to pay for premium content. One of the more successful models, of late, has been a metered approach.

The New York Times allows you to view five free pages if you come via a search engine because they get some referral revenue from the search sites. If you come to the site via Facebook, Twitter, blogs or other social media it does not count towards your monthly allowance
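That metering logic boils down to a referrer check on each page view. A minimal sketch follows; the domain lists and the five-view allowance are illustrative assumptions, not the Times’ actual implementation:

```python
from urllib.parse import urlparse

# Hypothetical referrer lists; a real meter would match many more domains.
SEARCH_ENGINES = {"www.google.com", "www.bing.com"}
SOCIAL_SITES = {"www.facebook.com", "twitter.com", "t.co"}

FREE_SEARCH_VIEWS = 5  # monthly allowance for search-referred visits

def counts_toward_meter(referrer, search_views_used):
    """Return True if this page view should count against the paywall meter."""
    host = urlparse(referrer).netloc
    if host in SOCIAL_SITES:
        return False  # social referrals never count toward the allowance
    if host in SEARCH_ENGINES:
        return search_views_used >= FREE_SEARCH_VIEWS  # free until used up
    return True  # direct and other traffic is always metered
```

The design point is that each channel gets its own economics: search referrals earn the site referral revenue, so they get a free quota, while social sharing is encouraged by never metering it.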

People don’t like to be forced into paying for content, but don’t seem to mind paying once the value has been demonstrated. One of the most successful apps in the Apple store, Angry Birds, enticed people to pay by giving the basic game away. Once they could see the value, people were more willing to pay.

Fred Wilson labels this “ex post facto monetization” — “you get paid after the fact, not before.” Under this strategy, you let people receive the value of your product first, then pay later — because they want to. Those who do sign up willingly are likely to be long-term, loyal customers. Those who never sign up probably haven’t discovered enough personal value and would have unsubscribed after a month even if they had initially been forced to subscribe

Paywalls can also be difficult to get right on a technical level. Some paywalls are porous in that the content can be seen so long as you know enough to jump through a few digital hoops:

When we launched our digital subscription plan we knew there were loopholes to access our content beyond the allotted number of articles each month. We have made some adjustments and will continue to make adjustments to optimize the gateway by implementing technical security solutions to prohibit abuse and protect the value of our content

However, even if some content does leak – and let’s face it, anything on the net can leak as a cut n’ paste is only a few keystrokes away – at least an expectation of payment is being established. The message is that this content has a value attached to it.

Another way of approaching it could be to make content available in formats that are more difficult to crawl and replicate, such as streaming video, or Kindle books. Here’s a guide on how to self-publish on the Kindle.

Think about different ways to make it difficult for scrapers to extract all your value easily.

Paywalls Are Strategic

Paywalls are not just a sign-up form and a payment gateway. Paywalls are also a publishing strategy.

How much are you prepared to give away for free? How does giving away this content pay off?

A consultant may publish far and wide for free. The pay-off is more consulting gigs. The consultancy “content” sits behind a paywall in that you have to pay for that service. Not many SEO consultants give their detailed analysis away for free. The content we see in the public domain on SEO is a tiny fraction of the information held by the professionals in our niche, and that information may want to be free, but the owners, wisely, hold onto most of it, else they wouldn’t eat.

Be wary about giving away your labour in exchange for “awareness”. Here’s a story about how The Atlantic tried to get a journalist to work for nothing.

From the Atlantic:

Thanks for responding. Maybe by the end of the week? 1,200 words? We unfortunately can’t pay you for it, but we do reach 13 million readers a month. I understand if that’s not a workable arrangement for you, I just wanted to see if you were interested.

Thanks so much again for your time. A great piece!

From me:

Thanks Olga:

I am a professional journalist who has made my living by writing for 25 years and am not in the habit of giving my services for free to for profit media outlets so they can make money by using my work and efforts by removing my ability to pay my bills and feed my children…..

Such arrangements suit the publisher, of course, but all the risk sits with the content creator. Sometimes, those deals can work if they lead to payment in some other form, but ensure you have a means to track the pay-off.

Another way of thinking about a paywall is a switch of channel. We’re seeing the rise and rise of mobile computing and, as it turns out, mobile consumers are much more willing to pay for content than people who browse the web:

The upshot: paid content, it seems, is alive and well, but some media categories are doing a lot better than others. Taking just the use of paid content on tablets in Q4 2011, Nielsen found that in the U.S., a majority of tablet owners have already paid for downloaded music, books and movies, with 62 percent, 58 percent and 51 percent respectively saying they have already made such purchases

Could your content be better off pitched to a mobile audience? Made into an app? Published and promoted as a Kindle book?

The Hamster Wheel

This is not to say leaving content out in the open can’t pay the bills. Perhaps you don’t feel a paywall is right for you, but you’re growing tired of running faster just to stay in the same place.

Brian Lam used to be the editor of Gizmodo, Gawker media’s gadget blog. Gizmodo was run on a model familiar to search marketers where you first find a keyword stream then capture that stream by writing keyword-driven articles.

He likens this approach to a hamster on a wheel as he relentlessly churned out copy in order to drive more and more traffic.

It led to burn-out.

He loved the ocean, but his frantic digital existence meant his surfboard was gathering cobwebs. “I came to hate the Web, hated chasing the next post or rewriting other people’s posts just for the traffic,” he told me. “People shouldn’t live like robots.”

The problem with ad-supported media models, such as Adsense, is that they depend on scale. With advertising rates decreasing year by year as the market gets more and more fractured, content production increases just to keep pace.

Lam went in the opposite direction.

His new gadget site only posts 12 times a month, but goes deep. The majority of his income comes from Amazon’s affiliate program. He achieves a 10-20% click-thru rate.

Mr. Lam’s revenue is low, about $50,000 a month, but it’s doubling every quarter, enough to pay his freelancers, invest in the site and keep him in surfboards. And now he actually has time to ride them. In that sense, Mr. Lam is living out that initial dream of the Web: working from home, working with friends, making something that saves others time and money… The clean, simple interface, without the clutter of news, is a tiny business; it has fewer than 350,000 unique visitors a month at a time when ad buyers are not much interested in anything less than 20 million. But The Wirecutter is not really in the ad business. The vast majority of its revenue comes from fees paid by affiliates, mostly Amazon, for referrals to their sites. As advertising rates continue to tumble, affiliate fees could end up underwriting more and more media businesses

Is running on a search-driven hamster wheel, churning out more and more keyword content the most worthwhile use of your time? Lam is making more money by feeding the beast less in terms of quantity and going deep on quality.

Loss Leader For The Search Engines

But, hang on. This is an SEO site, isn’t it? Aren’t we all about getting content into the search engines and ranking well?

Of course.

SEO is still a great marketing channel, but this doesn’t mean everything we publish must appear in search engines. I hope this article prompts you to consider just how much you’re giving away compared to how much benefit you’re getting in return.

It all comes down to an ROI calculation. Does it cost me less to publish page X than I get in return? If you can publish pages cheaply enough, and if the traffic is worth enough, then great. If your publishing costs exceed your return, then there are other models worth considering.

This article is mainly concerned with deep, researched, unique content that doesn’t have a trivial production cost attached to it. If the search engines don’t deliver enough value to make deep content creation worthwhile, then publishers must look beyond the “free” web model many have been using up until now in order to be sustainable.

Don’t let distributors suck out all your value so only they can grow fat. A paywall is more than a physical thing; it’s a strategy. If you publish a lot of valuable information that isn’t getting a reasonable return, then think about ways to bundle that information into product form, and ask yourself if you should keep it out of the search engines. Decide on your loss-leader content and create a sales funnel to ensure there is a payday at the end. The existence of content farms showed that deep, free content often doesn’t pay. The way they made their content pay was to make it dirt cheap to produce and so useless that the advertising became the most relevant content on the page.

Content that relies heavily on search engine traffic is a high risk strategy. Some may recall a Mac site, called Cult Of Mac, that got hit by Panda. They were big enough, and connected enough, to have Google reinstate them, but the first comment in this thread tells it like it is:

It’s great news that Google reinstated Cult of Mac although that will not happen to other smaller genuine blogs and websites..

True, that.

It’s not enough to “publish quality content”. A lot of quality content gets hammered and tossed out of the search engines each day. And even if it stays listed, it may not make a return. There are no guarantees. Instead, build a brand and an audience. And then sell that audience something they can’t get for free.

Content may want to be free, but free doesn’t pay. For many publishers, the search engines aren’t giving enough back so be wary about how much you hand over to them.


SEO Book

SMX West 2013 Liveblog Schedule & Where We’ll Be


SMX West 2013 Liveblog Schedule & Where We’ll Be was originally published on BruceClay.com, home of expert search engine optimization tips.

Holy heatmap, have you seen the SMX West agenda? This is some next-level content. Getting to attend an educational series like the one going down next week in San Jose is truly one of the perks of the job. Of course it’s not all fun times and smart insights while at SMX; I earn my keep by liveblogging. And with an agenda like this, picking the sessions to attend has been the hardest part! But I did it, and below you’ll see the SMX West sessions you’ll find on the blog next week. Ad retargeting, YouTube PPC, Google Knowledge Graph, Facebook Graph Search, authorship and online identity — it’s all here! Before that, some details you should know if you’re lucky like me and will be at the conference!

How to Meet with Bruce Clay, Inc. at SMX West 2013

If you’re attending the conference, let’s connect. Here’s where we’ll be next week.

We are exhibiting at SMX West
  • If you’re getting into town Sunday, have a drink on us! BCI co-sponsors the Meet & Greet this Sunday from 6-7:30 and we’d love to see you there, get pumped for the conference and nerd out marketing style.
  • The team will be stationed at booth #406 in the expo hall. Visit us to enter to win a free seat in our 3-day SEO training course in Simi Valley, CA. We’ll also be giving away $400 off the cost of our 3-day SEO training in 2013 for you and 6 friends, so stop by!
  • Bruce will be in Theater A explaining Google ranking penalties and how businesses can get back in Google’s good graces on Monday at 12:00.
  • Register for the SEO Workshop on Thursday, March 14 for a search optimization intensive you can use to take your business’s optimization efforts from 0 to 100. The last two SEO Workshops Bruce presented at SMX conferences last year sold out. Bruce answers your SEO questions face-to-face in the workshop and equips you with a comprehensive, fundamental understanding of SEO. (P.S. You can deduct the cost of the workshop from our full SEO training course in Southern California if one day isn’t enough!)
  • If you decide to go to SMX, save yourself 10% with code SMXW13bruceclay.

Liveblogging SMX West

Day 1: Monday, March 11

BCI liveblog coverage:
  • 9:00 a.m. – Essential SEO Analytics: The Performance Metrics That Truly Count
  • 10:45 a.m. – Ready, Aim, Fire… Then Retarget!
  • 3:30 p.m. – The Search Police: Matt & Duane’s Excellent Search Engine Adventure
  • 5:00 p.m. – Enhancing AdWords For A Constantly Connected World


Day 2: Tuesday, March 12

BCI liveblog coverage:
  • 9:00 a.m. – Keynote Conversation: Grady Burnett, Facebook
  • 10:45 a.m. – Meet Facebook Graph Search
  • 3:30 p.m. – From Authorship To Authority: Why Claiming Your Identity Matters
  • 5:00 p.m. – Inside Google’s Game-Changing Knowledge Graph


Day 3: Wednesday, March 13

BCI liveblog coverage:
  • 10:45 a.m. – YouTube Words: Tying Your PPC Campaigns To YouTube
  • 1:00 p.m. – Social Media Ads
  • 2:30 p.m. – Google Enhanced Campaigns: What You Really Need To Know


Bruce Clay Blog

Creep-Free Retargeting and More with SMX West Speaker Susan Waldes


Creep-Free Retargeting and More with SMX West Speaker Susan Waldes was originally published on BruceClay.com, home of expert search engine optimization tips.

If you’re already investing in PPC, retargeting is a logical next step. Retargeting, or remarketing as it’s sometimes called, allows you to keep track of your visitors and serves up your ads as they browse the Web to remind them of your products and services. But there’s an art to remarketing, and today, as part of our interview series with SMX West speakers, Susan Waldes of PPC Associates discusses why you should do it, what makes a great campaign and how to serve up ads without giving off the creepy stalker vibe.

Jessica Lee: Why do online businesses need to consider retargeting?

Susan Waldes: Visitors to your site (or viewers of your YouTube videos, or people who open your emails) have expressed an interest not only in your offerings, but also in your brand. As such, this is usually the most valuable segment of people you can target. The ROI is almost always there with the right gentle reminder.


Beyond that, smart retargeting campaigns also look at increasing lifetime value of users, increasing loyalty, driving social engagement with the brand and increasing average order sizes.

What are some of the aspects of an effective retargeting campaign?

Creative is often the focus of retargeting, but retargeting can be successful even with your standard creative. The two things that advanced marketers focus on in retargeting campaigns are:

  • The right cookie length
  • The right impression cap

Are people seeing your ads at the right time in the purchase cycle, and with the right frequency to drive re-engagement without wearing them out through over-saturation?

What are some elements of great ads for remarketing?

Great remarketing ads are smart about providing gentle and compelling reminders while being mindful not to overdo the “creepy” factor and make users feel they are being stalked.

They acknowledge sensitivities, and offer some value-add to customers that re-engage, like a promo or free shipping offer that encourages people to pull the trigger.

They target people at the right time and with messaging that aligns with where they are in the funnel. They use the brand and logo prominently in the creative and appear in the right places.

Thanks for the insight, Susan. You can stay connected with Susan Waldes on Twitter @SuzyVirtual. If you’re headed to SMX West next week, don’t miss Susan in the session, “Ready, Aim, Fire … Then Retarget!” on March 11 at 10:45 a.m.

Bruce Clay Blog

We’re Going Google…


In the search ecosystem Google controls the relevancy algorithms (& the biases baked into those) as well as the display of advertisements and the presentation of content. They also control (or restrict) the flow of marketable data.

For example, a publisher might not get keyword referral data on organic search, but Google passes that data on via advertisements & passes a large amount of data on through their ad network to other ad networks. Consider this:

a DoubleClick tag on the site sent data to two other companies that collect it for various purposes — Rubicon and Casale Media, representing a “hop.” In a subsequent hop, Casale transferred the IMDB data to BlueKai, Optimax and Brandscreen, while Rubicon pushed it to TargusInfo, RocketFuel, Platform 161, Efficient Frontier and the AMP Platform. AMP then sent the data on to AppNexus and back to DoubleClick.

For about a decade, being relevant & focused created efficiencies that more than offset any “size = quality” biases that the Google engineers created. However, across many verticals that window is closing & it is never a good idea to wait until it is fully closed to adjust. 😉

This shift from relevancy to “size = quality” can be seen in the stock performance of mid-market companies like BankRate & Quinstreet.

Those companies were laser focused on the markets that have significant consumer intent & traffic value, but Google has eroded the affiliate base & ad networks of many of the direct marketing plays for a couple years straight now.

If Google’s algorithmic biases are strong enough to literally move the market on companies worth hundreds of millions to billions of dollars, one is naive to swim against the tide. The market is becoming more bifurcated.

This is why it is so hard to find a great SEO to recommend for small businesses. If that SEO really knows what they are doing & understands the market dynamics, then they probably won’t serve the small business end of the market very long, or if they do, they will do so in a way where their continued flow of payments is not tied to performance. It is hard to have a sustainable business operating in a closed ecosystem if you are swimming in the opposite direction of that ecosystem.

In terms of our membership site here, a good slice of our customer base is the expert end of the market.


It is a tiny sliver of the market, but it is a segment that is somewhat well aligned with independent affiliate types & the sort of direct marketing relevancy-minded folks that Google has spent a couple years trying to marginalize as they cater to branded advertisers. We could try to shift our site to make it more mass market, but I prefer to run a site where we both learn & teach, and fear that moving to lower the barrier to entry and push more mass market will destroy what makes the membership site unique & valuable in the first place.

In early Google research they warned about relevancy shifting toward the interest of advertisers.

Currently, the predominant business model for commercial search engines is advertising. The goals of the advertising business model do not always correspond to providing quality search to users. For example, in our prototype search engine one of the top results for cellular phone is “The Effect of Cellular Phone Use Upon Driver Attention”, a study which explains in great detail the distractions and risk associated with conversing on a cell phone while driving. This search result came up first because of its high importance as judged by the PageRank algorithm, an approximation of citation importance on the web [Page, 98]. It is clear that a search engine which was taking money for showing cellular phone ads would have difficulty justifying the page that our system returned to its paying advertisers. For this type of reason and historical experience with other media [Bagdikian 83], we expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.

Perform that same cellular phone search today & that original cited page is nowhere to be found. Today that same search includes Wal-Mart, T-mobile, Samsung, Amazon.com, Best Buy & other well known brands. Search for the more common phrase cell phones & you get the same brands plus local results and shopping results. Awareness is replacing precision.

I think Gabe Newell described it best:

Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction

As Google makes search more complex & mixes in more signals, it is becoming harder to win at the game if your operation is singularly focused on SEO & it is becoming easier to win if your business already has a strong footprint in many other channels which bleeds into your search profile. The following chart is conceptual, but it aims to get the issue across.

If one company is spending significant capital & effort trying to combat the Panda algorithm & another company automatically sees a ranking boost from Panda, then the company with the boost is typically going to see greater ROI from any further investments in SEO.

Having spilled all the above digital ink, back in 2007 we decided to shift away from an ebook model to run a membership site. On and off over the years we have done a bit of consulting outside of running this site, but haven’t put significant emphasis on it over the past couple years as we were pushing hard to keep up with the algorithms & keep this site growing. With all the above shifts in place we recently decided to offer SEO consulting again.

Some FAQs on that front…

  • If we work with you, who will be working on our project? The same people who write on the blog & run the community: Peter Da Vanzo, Eric Covino & Aaron Wall.
  • How many clients will you work with? Just a handful at any given time. We prefer to have a deep integration with a few clients rather than a bulk model.
  • Who are ideal clients? Those who know the value of search traffic & already have some general awareness & momentum in the marketplace. Examples of companies we have worked with in the past include: large ecommerce companies, tier 1 web portals, strong start ups & hedge funds invested in the web. Many of these clients already had an in-house SEO team & some were just actively beginning to leverage search.
  • I have a tiny company with a small budget. Could I still work with you? In some cases there might be a fit, but if you feel our consulting is beyond your budget you can of course still join our membership website. Consulting is for those who want a deeper engagement than we can provide through our current membership site model.
  • Can you name some past clients? For the most part, no. Our consulting projects typically come with nondisclosure agreements.
  • Can you fill out an RFP? Most likely not. If you are still shopping around for an SEO, we are probably not going to be a great fit. But if you have known of us for years & know you want to work with us, do get in touch.

SEO Book

UX Myths That Hurt SEO – Whiteboard Friday


Posted by randfish

User experience and SEO: friends or enemies? They've had a rocky past, but it's time we all realized that they live better in harmony. Dispelling the negative myths about how UX and SEO interact is the first step in improving both the look and search results of your website. 

In today's Whiteboard Friday, Rand talks about some persistent UX myths that we should probably ignore.

Have anything to add that we didn't cover? Leave your thoughts in the comments below!

Video Transcription

"Howdy, SEOmoz fans, and welcome to another edition of Whiteboard Friday. This week I wanted to talk a little about user experience, UX, and the impact that it has on SEO.

Now, the problem historically has been that these two worlds have had a lot of conflict, especially in the late '90s and early 2000s, and that conflict has stayed around a little longer than I think it should have. I believe the two are much more combined today. But there are a few things that many people, including those who invest in user experience, believe to be true about how people use the web, and about the SEO problems that certain types of functionality and certain design choices supposedly cause. So I want to dispel some of those myths and give you things that you can focus on and fix in your own websites and in your projects so that you can help not only your SEO, but also your UX.

So let's start with number one here. Which of these is better or worse? Let's say you've got a bunch of form fields that you need a user to fill out to complete some sort of a registration step. Maybe they need to register for a website. Maybe they're checking out of an e-commerce cart. Maybe they're signing up for an event. Maybe they're downloading something.

Whatever it is, is this better, putting all of the requests on one page so that they don't have to click through many steps? Or is it better to break them up into multiple steps? What research has generally shown and user experience testing has often shown is that a lot of the time, not all of the time certainly, but a lot of the time this multi-step process, perhaps unintuitively, is the better choice.

You can see this in a lot of e-commerce carts that do things very well. Having a single, simple, direct, one step thing that, oh yes, of course I can fill out my email address and give you a password. Then, oh yeah, sure I can enter my three preferences. Then, yes, I'll put in my credit card number. Those three things actually are more likely to carry users through a process because they're so simple and easy to do, rather than putting it all together on one page.

I think the psychology behind this is that this just feels very overwhelming, very daunting. It makes us sort of frustrated, like, "Oh, do I really have to go through that?"

I'm not saying you should immediately switch to one of these, but I would fight against this whole, "Oh, we're not capturing as many registrations. Our conversion rate is lower. Our SEO leads aren't coming in as well, because we have a multi-step process, and it should be single step." The real key is to usability test to get data and metrics on what works better and to choose the right path. Probably if you have something small, splitting it up into a bunch of steps doesn't matter as much. If you have something longer, this might actually get more users through your funnel.

Number two. Is it true that if we give people lots of choice, then they'll choose the best path for them, versus if we only give people a couple options that they might not go and take the action that they would have, had we given them those greater choices? One of my favorite examples from this, from the inbound marketing world, the SEO world, the sharing world, the social world is with social sharing buttons themselves. You'll see tons of websites, blogs, content sites, where they offer just an overwhelming quantity of tweet this, share this on Facebook, like this on Facebook, like us on Facebook, like our company page on Facebook, plus one this on Google+, follow us on Google+, embed this on your own webpage, link to this page, Pinterest this.

Okay. Yes, those are all social networks. Some of them may be indeed popular with many of your users. The question is:  Are you overwhelming them and creating what psychologists have often called the "paradox of choice," which is that we as human beings, when we look at a long list of items and have to make a decision, we're often worse at making that decision than we would be if we looked at a smaller list of options? This is true whether it's a restaurant menu or shopping for shoes or crafting something on the Internet. Etsy has this problem constantly with an overwhelming mass of choice and people spending lots of time on the site, but then not choosing to buy something because of that paradox of choice.

What I would urge you to do is not necessarily to completely get rid of this, but maybe to alter your philosophy slightly to the three or four or if you want to be a little religious about it, even the one social network or item that you think is going to have the very most impact. You can test this and bear it out across the data of your users and say, "Hey, you know what? 80% of our users are on Facebook. That's the network where most of the people take the action even when we offer them this choice. Let's see if by slimming it down to just one option, Twitter or Facebook or just the two, we can get a lot more engagement and actions going." This is often the case. I've seen it many, many times.

Number three. Is it true that it's absolutely terrible to have a page like this that is kind of text only? It's just text and spacing, maybe some bullet points, and there are no images, no graphics, no visual elements. Or should we bias to, hey let's have a crappy stock photo of some guy holding up a box or of a team smiling with each other?

In my experience, and a lot of the tests that I've seen around UX and visual design stuff, this is actually a worse idea than just going with a basic text layout. If for some reason you can't break up your blog post, your piece of content, and you just don't have the right visuals for it, I'd urge you to break it up by having different sections, by having good typography and good visual design around your text, and I'd urge you to use headlines and sub-headlines. I wouldn't necessarily urge you to go out and find crappy stock photos, or if you're no good at creating graphics, to go and make a no good graphic. This bias has created a lot of content on the web that in my opinion is less credible, and I think some other folks have experienced that through testing. We've seen it a little bit with SEOmoz itself too.

Number four. Is it true that people never scroll, that all the content that you want anyone to see must be above the fold on a standard web page, no matter what device you think someone might be looking at it on? Is that absolutely critical?

The research reveals this is actually a complete myth. Research tells us that people do scroll, that over the past decade, people have been trained to scroll and scroll very frequently. So content that is below the fold can be equally accessible. For you SEO folks and you folks who are working on conversion rate optimization and lead tracking, all that kind of stuff, lead optimization, funnel optimization, this can be a huge relief because you can put fewer items with more space up at the top, create a better visual layout, and draw the eye down. You don't have to go ahead and throw all of the content and all of the elements that you need and sacrifice some of the items that you wanted to put on the page. You can just allow for that scroll. Visual design here is obviously still critically important, but don't get boxed into this myth that the only thing people see is the above the fold stuff.

Last one. This myth is one of the ones that hurts SEOs the most, and I see it lots of times, especially when consultants, agencies, designers, or developers are fighting with people on an SEO team or a marketing team about, "Hey, we are aiming for great UX, not great SEO." I strongly disagree with this premise. This is a false dichotomy. These two, in fact, I think are so tied and interrelated that you cannot separate them. The findability, the discoverability, the ability for a page to perform well in search engines, which remains the primary way that we find new information on the Internet, that is absolutely as critically important as it is to have that great user experience on the website itself and through the website's pages.

If you're not tying these two together, or if you're like this guy and you think this is a fight or a competition, you are almost certainly doing one of these two wrong. Oftentimes it's SEO, right? People believe, hey we have to put this keyword in here this many times, and the page title has to be this big on the page. Or, oh we can't have this graphic here. It has to be this type of graphic, and it has to have these words on it.

Usually that stuff is not nearly as important as it was, say, a decade ago. You can have fantastic UX and fantastic SEO working together. In fact, they're almost always married.

If you're coming up with problems like these, please leave them in the comments. Reach out to me, tweet to me and let me know. I guarantee you almost all of them have a creative solution where the two can be brought together.

All right, gang, love to hear from you, and we will see you again next week for another edition of Whiteboard Friday. Take care."

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


SEOmoz Daily SEO Blog

How to Disavow Links in Google and Bing: An Instructional Guide


How to Disavow Links in Google and Bing: An Instructional Guide was originally published on BruceClay.com, home of expert search engine optimization tips.

To help our clients who would like to use the disavow links tools from Google and Bing, this is an instructional guide.

It’s important to note that Google strongly advises against using the disavow links tool unless it is the last available option and will be implemented by a highly technical power user of Webmaster Tools. Incorrect use of the disavow links tool can harm Google’s evaluation of your site’s rankings and is difficult to reverse.

Introduction to Disavow Links

In this 9+ minute video, Google’s ambassador to webmasters and SEOs Matt Cutts tells us why a disavow links tool exists, who might need to use it, and how to use it. It’s a helpful introduction to the topic of harmful links.

Who Might Consider Using Disavow Links Tools

  1. You’ve received a bad link warning in Google Webmaster Tools.
  2. Your SEO has identified that your site is affected by the Penguin Update or a manual action penalty that has removed you from search results.
  3. Or you may have identified negative SEO waged against your site.

In the video above, Matt gives some specific examples of the actions that could put you in category #2 in this list. If you’ve paid for links or used spammy comments or article directories to build backlinks, this is you.

First Course of Action: Link Removal

If inbound links are harming a site’s search engine standings, those links should be removed, or at least, an effort must be made to remove them. Bruce Clay, Inc.’s link pruning process is a vetted link removal method that we have used with success for many clients.

The following resources explain our link removal process, from identification of harmful links to contacting linking domains to tracking and reporting the process to Google for reconsideration:

Google and Bing’s Disavow Links Tools

After you have exhausted your link removal efforts and made the necessary reconsideration request to Google, the Disavow Links tools in Bing Webmaster Tools and Google Webmaster Tools may be a viable option for your situation.

Bing Webmaster Help and How-To use Disavow Links tool

  1. Go to “Configure my site” in Bing Webmaster Tools and then go to “Disavow links” in the following navigation.
  2. Use the Disavow Links tool to select a page, directory or domain you wish to disavow, and then enter the corresponding URL in the “Enter a URL” field.
  3. Click “Disavow”.
  4. The disavow submission will be listed below.
  5. You can delete disavow submissions by checking the box to the left of the listed selection and clicking the “Delete” button.

Bing Disavow Links tool

Google Webmaster Tools explanation of the Disavow links tool

  1. Create a text file (.txt) containing the URL of the links you want to disavow.
  2. Include only one link per line.
  3. To disavow all links from a whole domain, add “domain:” before the link URL of the domain home page (for example, “domain:example.com”)
  4. You may include additional information about links in a line beginning with “#” (for example, “# this webmaster won’t return my requests for removal”).
  5. Signed in to Google Webmaster Tools, visit https://www.google.com/webmasters/tools/disavow-links-main.
  6. Select the domain for which you are submitting a disavow links list from the drop-down menu and click the “Disavow Links” button.
  7. Click through the pop-up warning (Google warns against the dangers of improper Disavow use throughout the process) and upload the text file of links you want Google to ignore and click “Submit”.
  8. You’ll see your .txt file listed here. Click “Done” to finish the process.
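Putting the format rules above together, a disavow file might look like this (the domains and URLs below are invented purely for illustration):

```
# this webmaster won’t return my requests for removal
domain:spammy-directory.example
http://article-farm.example/paid-link-page.html
http://another-site.example/blog/comment-spam-post/
```

One entry per line: a "domain:" line disavows every link from that domain, while a full URL disavows only that page's links.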

Google Disavow Links tool

Expect it to take weeks before the disavowed links are no longer a factor in your site’s search engine valuation. Again, we stress not to use the Disavow Links tool without guidance from an expert.

Bruce Clay Blog

4 Ways to Start Optimizing Your Facebook Presence


4 Ways to Start Optimizing Your Facebook Presence was originally published on BruceClay.com, home of expert search engine optimization tips.

Companies will often have a Facebook presence but are still not quite sure what to do with it. And while the opportunities are seemingly endless, we’re gonna get back to the basics on this one and talk about how a few simple steps can help you lay the foundations for a more successful Facebook experience. Today we’ll go over:

  1. Understanding your Facebook Insights.
  2. Promoting your status updates.
  3. Optimizing your about section.
  4. Creating a schedule for posting.

1. Understand Your Facebook Insights

As a Facebook Page owner, you have access to Facebook’s analytics for your page, Facebook Insights. Get intimate with your Facebook Insights to understand what your audience is looking for. This is an area that should be monitored regularly to see how your community is engaging, and what sorts of things it responds to. The goal is to experiment, and give them what they want.

For a crash course, check out this document from Facebook circa 2011 on getting to know Insights.

The following snapshot shows the Insights landing page graph — what you first see when you go to the analytics from your page. There’s a lot of data to mine, but let’s just look at a couple of things you can learn from quickly.

Explore the data on the main graph on the Insights home page as well as the data below it. This is where you can see how the page is performing over a specified period of time, and which posts have proven to have the most engagement.

Facebook Post Insights

Find the posts that have a higher engagement percentage or “virality” (as indicated by the shadow boxes), analyze those updates and use that as fuel for creating more posts like those to see if they consistently receive higher engagement.

There could be many factors contributing to the success of a status update on Facebook. It could be the topic or an element within the post (like including an image or a particular tip, etc.). Consider creating a spreadsheet that breaks down common elements of your status updates to see if the more popular updates have anything in common.
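That spreadsheet comparison can also be scripted. A minimal sketch in Python, using invented post data rather than a real Insights export:

```python
# Compare average engagement rates for posts that share a common element.
# All post data below is hypothetical, standing in for an Insights export.
posts = [
    {"has_image": True,  "engaged": 120, "reach": 1000},
    {"has_image": True,  "engaged": 90,  "reach": 600},
    {"has_image": False, "engaged": 20,  "reach": 800},
    {"has_image": False, "engaged": 35,  "reach": 700},
]

def avg_engagement_rate(posts, **filters):
    """Average engaged/reach across posts matching the given element filters."""
    matching = [p for p in posts
                if all(p[k] == v for k, v in filters.items())]
    return sum(p["engaged"] / p["reach"] for p in matching) / len(matching)

with_image = avg_engagement_rate(posts, has_image=True)
without_image = avg_engagement_rate(posts, has_image=False)
print(f"with image: {with_image:.1%}, without: {without_image:.1%}")
```

The same helper works for any element you track (includes a tip, asks a question, posted in the morning, and so on): add the field to each post record and filter on it.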

2. Promote Your Status Updates

If and when a post becomes popular as defined by your Facebook Insights, consider using the “promote” option to get even more eyeballs on it. This has worked well in our experience and is a relatively inexpensive solution for visibility. Take care to promote only those status updates that you feel are important to your goals.

Facebook Post Promote

We’ve seen several posts enjoy more reach and bring in more likes to our page from the promote feature. Check out the results of this one:

Facebook Post Promoted

There’s been some controversy about this functionality. Mainly, people are worried that you’re going to have to pay to play in the future; however, a Facebook rep tells us in this post that they apply a similar approach to both paid and organic stories in news feeds:

“Regardless of whether you’re paying to promote a story or just posting one to your Page, the news feed will always optimize for stories that generate high levels of user engagement and filter out ones that don’t.”

3. Optimize Your About Section

The “About” section is a perfect place for branding your organization. Use this section wisely to clearly communicate what your organization is about and use keywords that are important to your company.

The arrow on the following image indicates the area on the home page where the short description you created in the About section will render. Use this space to quickly summarize what your organization does, and consider including your website URL so people can click through directly from the Facebook landing page.

Facebook About

The About section offers ample opportunity to go into more detail about your company:

  • Tell your story and highlight your unique value proposition (what makes you different).
  • If there’s a link you really want your community to check out, include it. But limit calls to action to the most important ones so you don’t split their attention.
  • Use keywords throughout that are important to your offerings.
  • Cross-promote your presence elsewhere on the Web so people can find more of your useful content. If you have other Facebook profiles or a YouTube or Twitter account, let your community know.

Here’s an example using The Bruce Clay Facebook About section:

Facebook Post About Section

4. Create a Schedule for Posting

Knowing how often to post is never easy, and typically comes with some experimentation. Posting too much can annoy your community, and not posting enough can leave you forgotten.

Using data from third-party tools can be very helpful in giving you a starting point for a schedule. EdgeRank Checker looks at historical data of your Facebook account and comes up with some suggestions on how often to post, and when is an optimal time for your particular community.

Facebook Post Overall Recommendations

And another suggestion:

Facebook Post Best and Worst Metrics

Use that data as a starting point for experimenting with how often you post and on which days. Here’s a post on social media scheduling that might help you, too.
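If you’d rather work from your own numbers than a third-party tool’s recommendation, a few lines of Python over your posting history can surface your best time slot. This is a hedged sketch, not a real Insights integration: the timestamps and engagement figures below are invented sample data standing in for an exported history.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical history: (posted_at, engaged_users) pairs, e.g. copied
# from an Insights export. All values here are illustrative.
HISTORY = [
    ("2013-01-07 09:15", 120),
    ("2013-01-08 09:40", 150),
    ("2013-01-09 14:05", 60),
    ("2013-01-10 14:30", 80),
    ("2013-01-11 20:10", 300),
    ("2013-01-12 20:45", 260),
]

def best_posting_hour(history):
    """Return the hour of day with the highest average engagement."""
    by_hour = defaultdict(list)
    for stamp, engaged in history:
        hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").hour
        by_hour[hour].append(engaged)
    return max(by_hour, key=lambda h: sum(by_hour[h]) / len(by_hour[h]))

print(best_posting_hour(HISTORY))  # -> 20 for this sample data
```

With a few months of real history, the same grouping idea extends naturally to day of week, giving you an evidence-based starting schedule to test against.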

I hope this has given you a starting point to audit your Facebook presence and start making more informed decisions right away. Comments welcomed below!

Bruce Clay Blog
